Automation is supposed to make routine tasks safer and easier and to minimize the opportunity for error, and for the most part it does. But eliminating most opportunities for error has unintended consequences. In the aviation world, major airline Standard Operating Procedures (SOPs) dictate engaging the autopilot at a relatively low altitude in most situations, which means most pilots get very little time hand-flying the airplane. If everything goes well, this is not an issue. But it leads to a second unintended consequence: when things do go wrong, operators may not see trouble coming and have to “catch up” quickly under the worst of circumstances.
Tim Harford, author of the article “Crash: how computers are setting us up for disaster” linked below, explains it this way: “It is precisely because the digital devices tidily tune out small errors that they create the opportunities for large ones. Deprived of any awkward feedback, any modest challenges that might allow us to maintain our skills, when the crisis arrives we find ourselves lamentably unprepared.” This dependence, dubbed “the paradox of automation”, has the potential to mask incompetence, erode skills and produce rare but unexpected modes of failure. The paradox affects not only aviation but virtually any process that relies on algorithms for decision making, from monitoring nuclear power plants to predicting the weather.
The article offers examples suggesting that a certain amount of vagueness and confusion actually increases vigilance and sharpens skills, because more human input and decision-making are required.
Harford offers a solution worth considering: “An alternative solution is to reverse the role of computer and human. Rather than letting the computer fly the plane with the human poised to take over when the computer cannot cope, perhaps it would be better to have the human fly the plane with the computer monitoring the situation, ready to intervene. Computers, after all, are tireless, patient and do not need practice. Why, then, do we ask people to monitor machines and not the other way round?”
Case in point: a recent Aviation Week online article included video of an F-16 pilot whose life was saved by the Automatic Ground Collision Avoidance System (Auto-GCAS) after the pilot lost consciousness during a high-G training exercise. The onboard computer compares a prediction of the aircraft’s trajectory against a terrain profile generated from onboard terrain elevation data. If the predicted trajectory touches the terrain profile, an automatic recovery maneuver is performed without input from the pilot.
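The core decision logic described above, predicting a trajectory and checking it against a terrain profile, can be sketched in a few lines. This is purely illustrative: the straight-line prediction, the function names, and the 50 m safety margin are all assumptions for the sake of the example, not the actual F-16 implementation.

```python
def predict_trajectory(pos, vel, horizon_s=10.0, step_s=0.5):
    """Project (distance, altitude) forward, assuming constant velocity.

    pos: (x_m, altitude_m); vel: (horizontal_mps, vertical_mps).
    Real systems would integrate the aircraft's full flight dynamics.
    """
    steps = int(horizon_s / step_s)
    return [(pos[0] + vel[0] * i * step_s, pos[1] + vel[1] * i * step_s)
            for i in range(1, steps + 1)]

def needs_recovery(pos, vel, terrain_elevation, margin_m=50.0):
    """True if any predicted point comes within margin_m of the terrain.

    terrain_elevation(x) stands in for the lookup into onboard
    terrain elevation data mentioned in the article.
    """
    return any(alt <= terrain_elevation(x) + margin_m
               for x, alt in predict_trajectory(pos, vel))

# Flat terrain at 200 m elevation (hypothetical test data).
flat = lambda x: 200.0

# Level flight at 1000 m clears the terrain: no recovery needed.
print(needs_recovery((0.0, 1000.0), (250.0, 0.0), flat))    # False

# A 100 m/s dive from 600 m reaches the terrain profile within
# the 10-second horizon: the automatic recovery would trigger.
print(needs_recovery((0.0, 600.0), (250.0, -100.0), flat))  # True
```

A real system would also model the time and altitude consumed by the recovery maneuver itself, so the pull-up begins while there is still room to complete it.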
Please read the entire article here.