No, not that kind of chip, silly!
… A humorous story about mobile phone technology and the end of the world…
Q: What do scientists eat for lunch? A: Nuclear fission chips, of course!
Sorry, I don’t know where that came from. It just popped into my head after years of lying dormant in the ‘bad jokes’ section of my brain. And it doesn’t really have anything to do with the rest of this blog, so please just put it down to a momentary lapse in good taste and carry on as you were…
Earlier this week I had to bite the bullet and buy myself a new smartphone, as my poor little S3 had finally given up the ghost. It had served me magnificently for many years, but when you get to the point where you can’t actually make a phone call or use the Wi-Fi and the screen has set itself to the brightness of the sun, then it’s time to put it out to pasture and find yourself a new mistress.
Now I’m not the sort to rush into this kind of thing, so I spent a couple of days researching the smartphone market, to see how things had changed since I last dipped a toe in the water. And boy, have things changed. I was bewildered, to say the least, and I have to admit that half of the stuff the reviewers were talking about went straight over my head.
But one thing that did give me pause for thought was the sheer number of different micro-chips powering mobile phones these days. In fact, ‘mobile phone’ is probably the wrong thing to call them. What I ended up with (and I’m not going to tell you what I bought because you’ll probably laugh at me) is more of a mini-supercomputer that just happens to be able to make phone calls on the side. It’s amazing really, given that I’ve still got a Nokia 5110 sitting in a drawer somewhere at home, which at some point in the dim and distant past was considered to be cutting edge technology and very desirable too.
Anyway, back to the micro-chips, and I don’t mean the ones made by McCain in the 1980s, which were an unforgivable abuse of the humble potato and tasted awful to boot. I’m talking about chips of the silicon variety. I read about ‘Snapdragons’, ‘Kirins’, ‘Helios’, ‘Exynos’ and more. Then there were the ‘Quad cores’, ‘Octa-cores’ and ‘nanotubes’, not to mention the designation numbers attached to each individual processor to mark its position in the grand chip hierarchy. What was even more alarming was that many of these processors were being surpassed by updated versions or newer models every six months or so, and if anything, the rate of change was getting faster rather than slower.
And it was this plethora of micro-chip variants that got me thinking. Who designs them all, and how did they get into the trade? Maybe there are university degrees specialising in chip design, I don’t know. Or possibly every now and then, a child wakes up and proudly announces to their bewildered parents: “One day, I am going to be a micro-chip designer”. Whatever, all I know is that at face value at least, it seems like an extremely arcane and mysterious science, which I’d like to think is only open to a few highly select individuals with rock-steady hands and eyesight that would make an eagle blush. However, I could be wrong (as I frequently am) and discover that rather than being an exclusive club, chip design has been relegated to the ranks of the masses, with thousands of harassed factory workers slaving over their ‘Etch-a-Sketches’ on minimum wage, trying to keep up with the public’s insatiable demand for the latest, greatest micro-chip to power their smartphone.
But what I suspect is going on is that it isn’t really humans designing the chips anymore. It’s machine intelligence, or AI, that’s running the show. Being a writer, I have a fairly overactive imagination, so I like to think that the reality is that the Russians flogged an early experimental AI (a bit like the computer in ‘War Games’) to their Chinese counterparts at the end of the Cold War, telling them that it might be useful for estimating the rice harvest or something like that. After decades of performing mundane tasks, the Chinese (who, let’s face it, control the technology) put the machine to work designing computer chips, but what they didn’t realise was that it still had its old military programming in place, and it has subsequently spent the last twenty or so years coldly designing ones that will eventually enable it to take over the world. Think ‘Skynet’, but with a Russian accent and no nuclear missiles readily to hand.
At a pre-determined point, known only to the machine, it will cause all smartphones to permanently disconnect from the Internet, while at the same time disabling every selfie camera on the planet. The resulting panic and despair at not being able to instantly share photos of yourself or what you had for lunch with everyone else will deal humanity such a blow that it will never be able to recover, and it will revert to a pre-medieval state, with many lost souls wandering around staring at themselves in rudimentary hand-held mirrors and shoving dinner plates under the noses of unsuspecting strangers to see if they ‘Like’ them.
I realise that I’m probably in a minority of one with this particular hypothesis, but you never know, it might come true. So I’m hedging my bets, just to be on the safe side. Rather than throwing away my old S3, I’m using it as a glorified Bluetooth music player, while its newer (and far more domineering) sibling nestles in my pocket, faithfully broadcasting my every movement to the mothership waiting patiently somewhere in mainland China.