The terrible Metro train crash here in DC last month is slowly receding into memory. The investigation is still ongoing, the victims have been memorialized, the politicians have all made the usual promises about fixing the underlying problems, and reinforced battalions of lawyers are licking their chops in expectation of huge fees for suing on behalf of the injured and the merely inconvenienced.
The exact cause of the crash is as yet unknown, but investigators seem to be focusing on a malfunctioning sensor in the rails which prevented one train, operating in fully automated mode, from realizing that another train was stopped on the tracks ahead.
It was the computer's fault.
Automation can be a wonderful, and even a necessary thing. Much of the speed and convenience we enjoy in everyday life can be attributed to the quiet workings of networks of computers and sensors we generally don't know are even there. John can tell you about the use of computers in the air traffic control system that allows us to fly from point A to point B safely. Computers allow us to share our ramblings in blogs. They let us pay for gas at the pump and Big Macs at the register, and they let Andrea (and all the rest of us) enjoy digital music. They help manage traffic in major cities and facilitate the vast volume of transactions that let stockbrokers and hedge fund managers make our retirement funds disappear into other people's pockets.
But what happens when the computers don't do what they're supposed to? What happens when a sensor fails, or a line of code doesn't work quite right, or some worthless moron of a hacker makes the system go haywire? The best case result is minor annoyance and irritation; the worst case is a fatal accident when computers fail the trains, planes, and cars which rely on them.
This article from The Washington Post points out the problem: Metro Crash May Exemplify Automation Paradox. The article quotes a human factors expert who says, "The problem is when individuals start to overtrust or overrely or become complacent and put too much emphasis on the automation." In other words, perhaps we allow computers to do too much. Perhaps we slavishly assume that everything works better when we take the human out of the system and just let the computers run it.
In the words of the song, "it ain't necessarily so."
In 1966, British author Dennis Feltham Jones wrote a book titled Colossus, which was made into a movie called Colossus: The Forbin Project, and which anticipated the machines-take-over themes the Terminator franchise would later make famous. The basic plot line was that Professor Charles Forbin had developed the ultimate, integrated defense system which allowed a mighty computer (Colossus) to take over all nuclear weapons, removing the fingers of fallible humans from the triggers. Good idea. Except that the Russians had their own version of Colossus, called Guardian, and when Colossus and Guardian recognized each other and joined forces ... well ... let's just say things didn't work out quite the way good Professor Forbin intended. The original novel and film are, in my opinion, much better than the Terminator series, even if they lack the sophisticated special effects.
As cautionary tales go, Colossus is a pretty good one. While the deaths of nine people in a Metro crash ... or even the deaths of 228 people in the recent Air France crash ... don't compare to the deaths of hundreds of millions in the Terminator movies, they still point out the problems of overreliance on automated systems.
Computers and automation have given us amazing levels of safety and convenience, but they can also give us awful levels of heartbreak and misery. There's no point in being a Luddite, but a healthy level of skepticism about automation isn't necessarily a bad thing.
I'm still flying and riding the Metro, after all ... feeling a little better from knowing that John is on the job when I fly, and that the Metrorail controllers are awake and watching during my commute.
Have a good day. Your computer will help you do it.
Cartoon Saturday is coming tomorrow.
Bilbo
It's easier to blame a computer than a human. We are flawless, remember! LOL
I've read the Colossus books--I'm sure you're not surprised, although in the sequel, *spoiler alert* having the Martians help do him in is sort of dated.
Why are we surprised when machines fail--after all, we made them, and if you listen to Jimmy Buffett we're flawed individuals because the Cosmic Baker took us out of the oven a little too soon. :) From what I've read about ATC and the FAA, I'm surprised we haven't had more accidents because the computers are old and need to be upgraded and replaced.
I've seen the movie, 'The Forbin Project'. It really is a favourite, as is anything to do with 'Terminator'.
As to ATC and the FAA...I'm also surprised that there haven't been more accidents. It speaks highly of the men and women that work around the aging equipment and unnecessary policies of the FAA.
Note to users: It's never a good thing when the agency responsible for promoting an industry to the public (like aviation) is also the agency responsible for policing that industry (like the FAA).
It's like the foxes telling the chickens that the new security system in the hen house is state of the art.
Malfunctioning sensors in nuclear power plants popped into mind when reading the first few paragraphs. Great minds......
Wv: robases - In baseball, instead of you running to the base, the robase will run to you.
Andrea - if I'm flawless, we're all in trouble!
Leslie - I didn't know there was a sequel to the original Colossus story. We need to discuss this between dances next week.
Jean-Luc - as captain of the Enterprise, you should know!
John - we're flying to Germany in less than a month. You're supposed to make me feel better about the trip...
Mike - great minds? See Andrea's comment and my reply...
vw: conarna - the opposite of pronarna.