As automation increasingly takes its place in industry, especially high-risk industry, it is often blamed for causing harm and increasing the chance of human error when failures occur. I propose that the problem is not the presence of automation, but rather its inappropriate design. Under normal operating conditions the automation performs appropriately, but it provides inadequate feedback and interaction to the humans who must control the overall conduct of the task. When situations exceed the capabilities of the automatic equipment, this inadequate feedback leads to difficulties for the human controllers.
The problem, I suggest, is that the automation is at an intermediate level of intelligence, powerful enough to take over control that used to be done by people, but not powerful enough to handle all abnormalities. Moreover, its level of intelligence is insufficient to provide the continual, appropriate feedback that occurs naturally among human operators. This is the source of the current difficulties. To solve this problem, the automation should either be made less intelligent or more so, but the current level is quite inappropriate.
The overall message is that it is possible to reduce error through appropriate design considerations. Appropriate design should assume the existence of error, it should continually provide feedback, it should continually interact with operators in an effective manner, and it should allow for the worst of situations. What is needed is a soft, compliant technology, not a rigid, formal one.
This essay was published 15 years ago, but it is still relevant, especially as automation moves into the auto industry.

Norman, D. A. (1990). The "problem" of automation: Inappropriate feedback and interaction, not "over-automation". In D. E. Broadbent, A. Baddeley & J. T. Reason (Eds.), Human factors in hazardous situations (pp. 585-593). Oxford: Oxford University Press.

The full paper is available as a PDF file: (The Problem of Automation: Inappropriate feedback and interaction, not over-automation)