Have We Passed the Point of Maximum Useful Tech?
Let us briefly review some of the significant events in the history of personal computing.
1983: Steve Jobs lures John Sculley away from Pepsi to become CEO of Apple Computer, asking him:
Do you want to spend the rest of your life selling sugared water, or do you want a chance to change the world?
1984: Football fans watching the Super Bowl see a now famous advertisement. In this sixty-second commercial a blond, athletic, brightly attired young woman heaves a huge hammer towards a grey talking head broadcasting a message of oppressive unity to a seated, emaciated multitude. The hammer shatters the screen, terminating the transmission, and we then hear:
On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like “1984.”
1990: Steve Jobs remarks, in an interview for the Library of Congress:
The computer is the most remarkable tool that we’ve ever come up with. It’s the equivalent of a bicycle for our minds.
2018: My iPhone says:
Weekly Report Available: You averaged 57 minutes of screen time last week.
Whoa! What happened between 1990 and 2018? Who stuck the playing card between the spokes of my mind’s bicycle, turning a mental enabler into an annoying distraction?
I started programming mainframes in 1974. I can still remember the excitement of possessing my own personal computer back in the eighties. I can recall the thrill of learning how to use a graphical user interface and a mouse with my first Mac. I still vividly remember ripping my CD collection and loading it onto my first iPod, and being able to carry my entire music library in my pocket. And I have fond memories of installing my first iPhone apps from the App Store.
Perhaps most tellingly, I maintained a subscription to at least one monthly Mac magazine (starting with Macworld and MacUser and continuing with MacAddict) continuously from the mid-eighties until sometime in 2018, when I finally let my subscription to MacLife lapse.
In hindsight, I can see the following distinct models of computer usage that have emerged over the years, with varying degrees of utility.
- The Mainframe Model – Huge, expensive computers used to store and process organizational data.
- The Personal Computer Model – Small affordable computers that can be used at home by individuals.
- The Gaming Model – Well, you know – computers used to play games.
- The Office PC Model – Small affordable computers that can be used by office workers at their desks, powered by increasingly powerful and complex application suites especially designed to allow such workers to spend hours fiddling with obscure options, fonts, styles and formulas, all while claiming to be doing “work.”
- The iPod Model – Computers you can carry with you, used for mobile personal entertainment.
- The Internet Model – Everyone can communicate with everyone else, more or less instantaneously, and at little or no cost. What could possibly go wrong? Inevitably, the value of each individual communication decreases in inverse proportion to the total number of communications sent and received. Note that, by combining the Office PC Model with the Internet Model, office workers effectively freed themselves from the necessity of ever leaving their desks while “at work,” while at the same time effectively shrouding their efforts from any sort of managerial oversight.
- The All the World’s (Mis)Information at your Fingertips Model – Huge amounts of information become instantaneously accessible to everyone with a Web browser. In particular, think of Wikipedia and Google.
- The E-Commerce Model – Businesses use computers to sell stuff directly to consumers, as well as to other businesses.
- The User Interface as Fashion Statement Model – Applications and application upgrades (as well as operating system upgrades) are appreciated not just for their functionality or ease of use, but for their overall aesthetics. In 2000, when introducing Mac OS X, Steve Jobs said with pride that “We made the buttons on the screen look so good you’ll want to lick them.” We begin to talk about visual interfaces as becoming “stale” and needing to be “freshened.”
- The Content Farms Model – Vast troves of text, audio and video become easily and instantaneously accessible, either for free or at low monthly rental rates. The value of each individual piece of content diminishes, again in inverse proportion to the sheer volume available. Users no longer have much interest in individual works, but rather seek to immerse themselves in the collections, swimming mindlessly for hours at a time through seas of content.
- The Technology for Its Own Sake Model – Technology begins to be appreciated simply because it is new, and because it is affordable. This is the age of voice-activated speakers, programmable lighting, hackable front door locks and entertainment systems so complex that they require intensive training just to watch TV.
- The Ubiquitous Advertising Model – Most things are free, and everything (free or not) comes with advertising. We now have one set of technical forces working to continually show us the most effective advertising, while another works to block the ads. Net value created: zero.
- The Eyeballs for Sale Model – Computers are no longer delivered with languages that would allow their users to program them. Instead, computers are used to program their owners, telling them what to buy, where to buy it, and who to vote for. Whereas personal computers were originally intended to enhance and encourage individuality, they are now used to sort us into various predictable demographic groups.
- The Intrusive Model – It is no longer enough that computers be available to do our bidding. They now buzz, vibrate, play complex tonal sequences, show videos and even talk to us when they feel that it is time for us to pay attention to them. Everywhere we look, there is one device or another wanting us to see what is new or what is trending.
To be fair, all of these different usage models have some things to recommend them – none are completely evil or worthless. And yet, at some point along this progression, the amount of time and energy and money and attention demanded by all this technology seems, at least for me, to outweigh any conceivable benefits.
The net effect of all this is that, after a while, I begin to feel the need for a Jimmy Buffett-style rant:
I don’t want it, I don’t want that much organization in my life. I don’t want other people thinking for me. Where did the headphone jack go? I don’t want to buy another $25 dongle. Where did the home button go? I don’t want an edge-to-edge screen for $1,000. I WANT MY HOME BUTTON! Where did the Control Center go? Why do I have to learn a new place to find the flashlight button every time Jony Ive gets a hot flash? Why do I have to pay an annual fee to Microsoft just for the privilege of being able to read and write a bunch of Word docs? How did Mark Zuckerberg get as rich as Croesus by selling information about me? WHEN I’M STILL WAITING TO SEE THE FIRST RED CENT! We need more fruitcakes in this world and fewer Tech Titans! We need people that care! I’m mad as hell and I don’t want to take it anymore!
And so, these days I’m favoring a bit more simplicity in my life. I’ve practically become the Henry David Thoreau of the modern tech world, trading the shores of Walden Pond for those of Green Lake, sitting in my wooden cabin with nothing but my rustic iMac, a trusty text editor, and a workable copy of Markdown.
So when pundits rush to tell us how further advances are going to make technology even more pervasive in all of our lives, and eliminate all our jobs … well, I have my doubts. Here’s why:
- Computer hardware becomes obsolete quickly. If a piece of tech is still useful after three years, we consider it a minor miracle. I don’t see this changing.
- Computer software never lives up to all the hype. This is why, despite all of the expensive developers and vast server farms, the Internet mostly shows me ads for things I’ve already purchased, or things I would never buy in a million years.
- Software maintenance is crazy expensive. Most industry experts will readily attest that developers spend at least ten times as much time in maintenance as they do in development.
- Rewriting software is even more expensive and failure-prone. That’s why companies are still running software written in COBOL and Java, and are not in any hurry to rewrite it using the latest tech stack to be touted as the Next Big Thing.
- The best, i.e. most valuable, decisions are still made by smart, capable, experienced, knowledgeable people, not by big data and algorithms. An example: a couple of days ago I visited an Apple store to do some Christmas shopping. There was a line outside the door before they opened, and they were selling stuff left and right as soon as people entered. And then today I came across this interview with Ron Johnson, telling the story of how he, Steve Jobs and a few others at Apple designed the original Apple stores. Guess what: they built full-sized prototype stores, and the only computers they used were the ones customers would eventually see on display. Definitely worth a read.
- We will continue to see rebellion against the model of human life that reduces all experience to an unending stream of bits. This is why vinyl recordings are experiencing a resurgence, why independent bookstores are still around, why live theatre and concerts are still attracting audiences, and why the best colleges and universities are turning qualified students away, even with the increasing cost of tuition.
- Using computers to program customers is a very short-term strategy. It’s already damaging our society. One way or another, it’s not going to last long: either we’ll collectively turn towards a longer-term strategy, or there won’t be customers or companies around much longer.
- Ditto for apps that are constantly interrupting whatever we’re doing to tell us the latest bit of inanity.
- The entire goal of society is to create and support an ongoing, sustainable stream of healthy, happy humans. If that is not what society is doing, then some adjustments will need to be made.
- At their best, humans are far more adaptive and longer-lasting than any computing technology yet invented, or likely to be invented in the future. We don’t need better artificial intelligence – we do need smarter, better-educated humans.
December 17, 2018