All of us have a nerd in our lives - the cousin, brother-in-law, or son who helps us beat back the Blue Screen of Death or whatever latest glitch infests our smartphone. While the younger generation is fairly familiar with how to handle personal electronics, for most people it's basically magic that works when you produce the right incantation.
There is, of course, a vast underlying reality of technology that allows our modern interconnected world of instant communication to function. Decades ago, a handful of guys in a garage could change the world.
It's not like that anymore - 99% of the time, it takes a large team of experts in different disciplines to make a new technology work. Once upon a time, software was like a book in that one guy could write it, or maybe a small team. Nowadays it's like writing an encyclopedia: there are hundreds of people involved, most of whom never stray from their own particular specialty.
As vital as information technology is to our world, you'd think its practitioners would all be pretty well paid. By national standards, they are; but by the standards of the past, current IT salaries are nowhere near as large as they once were.
Not a day goes by that a newspaper doesn't report on the reason why: we are now in competition with everyone else in the world, from Africa to India to China, and even against Africans and Indians and Chinese right here at home. This drives down wages just as Adam Smith would predict.
So what's a well-educated techie who likes his Mercedes to do, when there's some other fellow with an accent who'll do the same work for half the price?
It turns out that there's a pretty good answer. Most of us have found computer programs frustrating from time to time; they get better slowly over the years, but can still be surprisingly opaque and unreliable.
What you may not realize is that you are getting the best of user interface design. Remember the old command prompt of MS-DOS from your school days? Most of the world's Internet servers use an interface that looks almost the same. It's Unix instead of DOS, and the commands on offer are vastly more powerful than anything an IBM PC could do, but from the point of view of usability it's just as unintelligible to the uninitiated.
A reasonably intelligent person can, sort of, figure out how to use Microsoft Windows and Office just by poking around at the buttons on the screen. Nobody can figure out anything by randomly poking at a command prompt; how would you know what commands to enter or what they might do? You have to actively learn the commands. You must take a course, be taught by somebody else, or do a whole lot of Googling.
This puts a premium on knowledge. It is not possible to hire just anybody off the street to run a server farm; to do anything effective, system administrators must have training and experience. This, naturally, lets them demand more money, and makes it harder and riskier to replace them with cheaper people.
The result is what the cynical would expect: inside the tech world, there seems to be a positive glee in making essential software as difficult to use as possible. Let's consider just one example among many.
When you have a large team working on the same piece of software, it's impossible to keep everybody from stepping on each other's work without using a complex piece of support software called a version control system. This keeps track of all the changes everyone is making, so you can spread the work around or rip changes out easily if they're buggy.
By its nature, version control software is going to be a complicated affair. The most popular version control system today, an open-source product called "Git," revels in this complexity: it is so opaque that computer scientists get into shouting matches over how to execute the simplest of functions. The amusing article "10 Things I Hate About Git" gives you some idea of the problem.
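To get a feel for what the complaints are about, here is a minimal sketch of the round trip described above - recording a change, then ripping a bad one back out - assuming git is installed; the file and commit names are made up for illustration.

```shell
# Minimal version-control round trip: record a change, then undo it.
mkdir demo && cd demo
git init -q
git config user.name "Demo"               # an identity is required before committing
git config user.email "demo@example.com"

echo "first draft" > report.txt
git add report.txt                        # stage the file...
git commit -q -m "Add first draft"        # ...then record it; two separate steps

echo "a buggy change" >> report.txt
git commit -q -am "A change that turns out to be bad"

# Rip the bad change out again without losing history:
git revert --no-edit HEAD
cat report.txt                            # back to "first draft"
```

Even this happy path requires remembering the difference between add, commit, commit -a, and revert; the shouting matches in "10 Things I Hate About Git" concern far murkier corners such as rebasing and the "detached HEAD" state.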
Please remember, all the commenters breathing fire over the frustrations of using Git are well-paid software professionals who, in general, know what they are doing - or at least, they know enough to convince their employers that they do. None of them are "normal" people; all spend all day, every day, in front of a glowing screen. Yet they're confounded on a daily basis by obscurantist Git.
Why is this tolerated? Aren't there any software developers smart enough and annoyed enough to come up with something easier to use?
Sure there are, and in fact there are other version control systems that are reputed to be far easier to use. But very few projects use them. Why is this?
Consider how a new commercial software project is started. Some executive with money to invest has an idea and hires a chief software architect to make it happen. This top-flight nerd is the guy who knows how to "do software" - that's why the executive hired him. He's an expert, so naturally the business people follow his plans, within reason.
Is it in the chief software architect's best interest to design the software using tools that anybody can use? Of course not! It's vastly more prudent to use tools that only very smart people can understand and which routinely require his personal knowledge to unravel. If the business executives cannot do without him for more than the week of his annual vacation, they're unlikely to dare lay him off and replace him with someone in Bangalore.
Of course, normal people have to use the resulting software, so a halfway friendly interface is important to maintain sales. Under the hood, though, the messier the better for all involved, except for the next generation who'll have to somehow figure it out to improve it.
That's their problem, of course. For today's paycheck, chaos is power, and being an initiate in some obscure but essential black art - especially one made artificially and needlessly complex - is the royal road to a comfortable retirement.
Over the past five years, the editors have been secretly working on a book that summarizes the fundamental viewpoints of Scragged.