The terrible Metro train crash here in DC last month is slowly receding into memory. The investigation is still ongoing, the victims have been memorialized, the politicians have all made the usual promises about fixing the underlying problems, and reinforced battalions of lawyers are licking their chops in expectation of huge fees for suing on behalf of the injured and the merely inconvenienced.
The exact cause of the crash is as yet unknown, but investigators seem to be focusing on a malfunctioning sensor in the rails that prevented one train - operating in fully automated mode - from detecting that another train was stopped on the tracks ahead.
It was the computer's fault.
Automation can be a wonderful, and even a necessary thing. Much of the speed and convenience we enjoy in everyday life can be attributed to the quiet workings of networks of computers and sensors we generally don't know are even there. John can tell you about the use of computers in the air traffic control system that allows us to fly from point A to point B safely. Computers allow us to share our ramblings in blogs. They let us pay for gas at the pump and Big Macs at the register, and they let Andrea (and all the rest of us) enjoy digital music. They help manage traffic in major cities and facilitate the vast volume of transactions that let stockbrokers and hedge fund managers make our retirement funds disappear into other people's pockets.
But what happens when the computers don't do what they're supposed to? What happens when a sensor fails, or a line of code doesn't work quite right, or some worthless moron of a hacker makes the system go haywire? The best-case result is minor annoyance and irritation; the worst case is a fatal accident when computers fail the trains, planes, and cars that rely on them.
This article from The Washington Post points out the problem: Metro Crash May Exemplify Automation Paradox. The article quotes a human factors expert who says, "The problem is when individuals start to overtrust or overrely or become complacent and put too much emphasis on the automation." In other words, perhaps we allow computers to do too much. Perhaps we slavishly assume that everything works better when we take the human out of the system and just let the computers run it.
In the words of the song, "it ain't necessarily so."
In 1966, British author Dennis Feltham Jones wrote a book titled Colossus, which was made into a movie called Colossus: The Forbin Project - a story that, in many ways, prefigured the later Terminator franchise. The basic plot line was that Professor Charles Forbin had developed the ultimate, integrated defense system, which allowed a mighty computer (Colossus) to take over all nuclear weapons, removing the fingers of fallible humans from the triggers. Good idea. Except that the Russians had their own version of Colossus, called Guardian, and when Colossus and Guardian recognized each other and joined forces ... well ... let's just say things didn't work out quite the way good Professor Forbin intended. The original novel and film are, in my opinion, much better than the Terminator series, even if they lack the sophisticated special effects.
As cautionary tales go, Colossus is a pretty good one. While the deaths of nine people in a Metro crash ... or even the deaths of 228 people in the recent Air France crash ... don't compare to the deaths of hundreds of millions in the Terminator movies, they still point up the dangers of overreliance on automated systems.
Computers and automation have given us amazing levels of safety and convenience, but they can also give us awful levels of heartbreak and misery. There's no point in being a Luddite, but a healthy level of skepticism about automation isn't necessarily a bad thing.
I'm still flying and riding the Metro, after all ... feeling a little better knowing that John is on the job when I fly, and that the Metrorail controllers are awake and watching during my commute.
Have a good day. Your computer will help you do it.
Cartoon Saturday is coming tomorrow.