Just when I thought I had enough to worry about, I heard a story on NPR about a threat far greater than nuclear holocaust, climate change and viral plagues put together. It’s called the AI SINGULARITY. The Artificial Intelligence Singularity is the point at which a computer becomes capable of improving itself: it can figure out how to increase its own computing power. According to the story, if left unchecked, the Singularity will result in worldwide catastrophe sometime between 30 and 60 years from now. A small group of scientists, programmers and big thinkers is busy trying to head off this disaster at the Singularity Institute for Artificial Intelligence in Berkeley, CA.
Machine Learning, a branch of Artificial Intelligence, might give us a glimpse into what it means for a computer to improve itself. A basic and early-stage example of machine learning is Amazon.com's recommendation engine, which suggests books, movies and music that I might like based on other people's similar purchases. Similarly, Pandora is an online radio service that builds libraries of music for me based on the music I like.
None of these sounds sinister or life-threatening. So what’s the concern about the “Singularity”?
Once that Singularity line is crossed, and experts insist that it will be, the rate at which computers learn and evolve could be so fast that humans can’t keep pace or stop it. The real problem is that computer self-evolution could just as easily produce life-threatening behaviors as life-benefiting ones. Think HAL in the movie “2001: A Space Odyssey.”
Because computer evolution could dramatically outpace the human ability to counter it, the worst-case scenario envisions a sudden geometric explosion of machine intelligence that is uncontrollable and harmful, bringing all life to an abrupt end.
The race is on to build an Emergency Stop Switch that will interrupt a runaway malevolent Singularity. Jason Murray, one of the programmers working on the problem, is not optimistic about their chances of correctly addressing the threat. “We have a low chance of solving this ridiculously hard problem, and it’s…” He paused to let out what seemed like a muffled cry for help. “It’s really, really bad.”
Maybe this out-to-save-the-world group in Berkeley needs a manufacturing engineer on the team. Stopping a runaway turning or machining center before it does too much damage is a problem that we in metalworking deal with every day. Didn’t the machine tool industry invent that big red emergency stop button prominently mounted on every machine tool? Machine tool crashes are inevitable. That’s why weak-link shear plates and pins are built into critical-load machine tool components. When the machine is pushed too hard, the weak link breaks and shuts down the machine in less than a second, containing the damage and avoiding major destruction. Broken-tool detectors and built-in servo overload devices are likewise designed not to prevent crashes or breaks, but to minimize their effects when they occur. All these devices work to prevent catastrophe.
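The shear-pin principle translates naturally into software: don't try to prevent every overload, just guarantee the system trips the moment a monitored load exceeds a hard limit. Here is a minimal Python sketch of that idea, with a made-up load threshold and readings:

```python
# A software "shear pin": a watchdog that halts a runaway process the
# moment a monitored load exceeds its limit, rather than trying to
# prevent the overload in the first place. Illustrative sketch only;
# the threshold and readings below are hypothetical.

LOAD_LIMIT = 100.0  # like a shear pin's rated breaking load

class ShearPinTripped(Exception):
    """Raised when the watchdog breaks the 'weak link'."""

def watchdog(load_readings, limit=LOAD_LIMIT):
    """Process readings normally until one exceeds the limit, then trip."""
    processed = []
    for load in load_readings:
        if load > limit:
            raise ShearPinTripped(f"load {load} exceeded limit {limit}")
        processed.append(load)  # normal operation continues
    return processed

try:
    watchdog([42.0, 87.5, 140.0, 60.0])
except ShearPinTripped as e:
    print(f"EMERGENCY STOP: {e}")  # fires on the 140.0 reading
```

The key design choice mirrors the machine shop's: the weak link is simple, independent of the complex system it protects, and fails fast. That is exactly the property you would want in any emergency stop for a runaway process.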
Isn’t the concern about computers evolving too fast for humans to keep pace much like the concern we have about an out-of-control machine tool? Then maybe the key to dealing with the inevitable Singularity, like machine tool crashes, is to focus on controlling it, not preventing it.