Column written for Interactions. © CACM, 2005. This is the author’s version of the work. It is posted here by permission of ACM for your personal use. It may be redistributed for non-commercial use only, provided this paragraph is included. The definitive version will be published in Interactions.
Consider this scenario: You are driving along, about to change lanes, when your car suddenly tenses up. The seatbelts tighten. The seat straightens up, the headrest moves forward. As you turn the wheel to the right, the car starts quivering, buzzing from the right side. “Calm down,” you say, “I know what I'm doing.”
A nervous, skittish car? A car distrustful of its driver? Yes, and often for good cause. Am I serious? Yes: everything I have just described already exists in some high-end automobiles.
Consider the modern automobile. It is a wonder of computation: multiple CPUs, hundreds of miles of cabling. Automatic this and automatic that. Lots of automatic stuff controlling the engine, but more and more of it affects the driver: automatic transmission; anti-skid braking; stability controls; adaptive cruise control; lane control; even automatic parking. Automatic payment for highway tolls, parking lots, and drive-through restaurants. Navigation systems, entertainment systems, HVAC systems — sometimes different for each passenger.
How do we automate sensibly, controlling some parts of the driving experience, but ensuring that drivers are kept alert and informed — “in the loop,” as this is described in aviation safety? How do we warn drivers who are about to change lanes that there is another vehicle in the way? What about an obstacle on the road, perhaps detected by the vehicle's systems, but still not visible to the driver? (The auto research labs are experimenting with systems that tell the car what is around the corner.)
What should be done when the car decides the best way to avoid a collision is to accelerate past the danger zone, but the driver steps on the brakes? Modern braking systems already disregard the driver's actions and instead act according to what the designer has decided are the driver's intentions. Should the car be allowed to ignore the brake pedal and just accelerate? Should the car prevent the driver from changing lanes when another vehicle is already there? Should the car prevent the driver from exceeding the speed limit? Or from going slower than the minimum limit, or from going too close to the car ahead? All this — and much more — is possible today. Asking the driver, or even giving the driver the relevant information, is not the answer: there simply isn't enough time.
Cars today can almost drive themselves. Take adaptive cruise control, which adjusts the auto's speed according to the distance of the car in front. Today, it arbitrarily relinquishes control below a certain speed (probably due to concerns from the lawyers). But it could easily control at any speed, including the slow crawl and stops of heavy traffic. Add lane control and automatic toll payment systems and the car could continue on a highway for hours while the driver slept. During a recent visit to an automobile company's research lab in Japan, I was startled to discover that all of these components now exist in commercial or near-commercial form. Put them together, and oops, we are training drivers to be inattentive. In other words, the driver is no longer “in the loop.” Even the research team was surprised when I pointed this out: they had never put it all together either.
In many of the classical fields studied by engineering psychologists and human factors engineers, there is a well-known and well-studied problem called overautomation. There have been accidents caused by the poor communication between the automatic equipment and the human operators.
I once argued (The Problem with Automation) that the current state of automation in aviation was fundamentally unsound because it was in the middle. Either have no automation or full automation, I argued, but what we have today is halfway automation. Even worse, the system takes over when the going is easy and gives up — usually without any warning — when the going gets tough. Just the reverse of what you would want. But in an airplane, when everything goes wrong and the plane starts crashing to earth, the highly skilled, professional crew has minutes to respond. In an automobile, the driver might have a second to respond.
The current designs for automobile automation are being done by engineers who are ignorant of the lessons learned from studies of automation. Here we go again. Each new industry fails to learn the lessons of the previous ones. So, once again, here is a field in desperate need of help, yet not quite realizing it. A field with new lessons to learn, and a lot of very old lessons that have to be taught once again. Unless we — Human-Computer Interaction scientists, the profession of interaction design — get involved, there are apt to be serious repercussions along the way.
Yes, there are many good behavioral scientists at work on these issues of driver automation in universities, government laboratories, and the research labs of auto companies. A panel at the 2003 international conference on HCI (CHI 2003) correctly called this area “The Next Revolution.” But we are not being consulted when the engineering decisions are being made. Sure, the research labs are active, but who makes the product decisions? I fear that all the errors of the past — errors in nuclear power control rooms, in process control rooms, in the control of ships, and in commercial aviation — will simply get repeated. All of the lessons will have to be re-learned for yet another industry.
There is an automobile in HCI's future — and the sooner the better.
Don Norman wears many hats, including co-founder of the Nielsen Norman Group, Professor at Northwestern University, and author, his latest book being Emotional Design. He lives at www.jnd.org.
NOTE: If you want to learn more about this area, an excellent starting point is the list of links provided by the Surface Transportation Technical Group of the Human Factors and Ergonomics Society: http://sttg.hfes.org/HFlinks.html and http://sttg.hfes.org/
In addition, I highly recommend the book by Bishop, Intelligent Vehicle Technology and Trends (even though it is expensive): see my review in the "Recommended Readings" section of this website.