Automation in Aviation: Where to draw the line?

Parumita Sachdeva
Feb 25, 2021

From steam gauges to glass cockpits, automation in aviation has come a long way. It has become integral to flying, and pilots have grown increasingly dependent on systems like the autopilot, auto-throttle, and fly-by-wire controls. While these systems have greatly reduced the number of fatal crashes caused by manual error, they have also introduced new challenges and raised questions about where to draw the line and with whom overriding control should reside. In this article, I map the complex relationships between automation technologies and humans through the lens of Actor-Network Theory [1]. The article also discusses how pilots interact with automation systems and the role pilots play when automation fails.

About automation and its ubiquity

Automation in aviation, most commonly known through the autopilot, is the use of computers and technology to execute tasks along the flight path with minimal assistance from the pilots. According to the Federal Aviation Administration, the autopilot is “capable of many very time-intensive tasks, helping the pilot focus on the overall status of the aircraft and flight. This allows them to focus on broader aspects of operations such as monitoring the trajectory, weather, and systems. The autopilot determines which control movements are required to follow the flight profile entered by the pilot, and it moves the controls to affect tracking of the flight profile” [2].
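To make the FAA’s description concrete, here is a minimal, hypothetical sketch of the kind of feedback loop an autopilot runs: it compares the sensed state of the aircraft against the pilot-entered target and computes a corrective control movement. The function name, gains, and units below are illustrative assumptions, not any real autopilot’s implementation.

```python
# A toy altitude-hold loop: given a target altitude from the pilot-entered
# flight profile, compute an elevator command that nudges the aircraft
# toward it. Real autopilots are vastly more sophisticated.

def altitude_hold_command(target_altitude_ft: float,
                          current_altitude_ft: float,
                          vertical_speed_fpm: float,
                          kp: float = 0.002,
                          kd: float = 0.0005) -> float:
    """Return an elevator command in [-1, 1] (positive = pitch up)."""
    error = target_altitude_ft - current_altitude_ft
    # The proportional term chases the altitude error; the derivative term
    # damps the vertical speed so the aircraft does not overshoot the target.
    command = kp * error - kd * vertical_speed_fpm
    return max(-1.0, min(1.0, command))

# Example: aircraft 100 ft below its assigned altitude, climbing at 300 fpm.
print(altitude_hold_command(35_000, 34_900, 300))  # gentle pitch-up, ~0.05
```

The loop runs many times per second, which is exactly why it frees the pilot for the broader monitoring tasks the FAA describes.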

Automation systems have been used extensively in aircraft over the years, and pilots have grown heavily dependent on them. It has therefore become vital to measure the negative impacts of these technologies and to consider the human factors that accompany their increasing complexity.

Actor-Network Theory and Automation Systems as “Actors”

According to Cila, N., Smit, I., and Giaccardi, E., Actor-Network Theory (ANT) treats humans and non-humans as equal actors in a network, whose agencies can be continuously transformed into one another. They use agency as a metaphorical term to characterize networked products, distinguishing them as Collectors, Actors, and Creators [1]. ANT is a fluid concept and maps naturally onto aviation: agency flows between the pilot and the automation systems across the legs of a flight. Pilots take agency in critical situations like take-offs, landings, and emergencies, while the autopilot acquires agency during cruise, performing repetitive tasks and keeping the aircraft level.
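To make this flow of agency concrete, the sketch below models it as a simple hand-off rule over flight phases. The phase names and the rule are illustrative assumptions in the spirit of the ANT framing, not an operational policy.

```python
# A toy model of how agency might flow between pilot and autopilot across
# flight phases, in the spirit of ANT's "equal actors".

from enum import Enum

class Phase(Enum):
    TAKEOFF = "takeoff"
    CLIMB = "climb"
    CRUISE = "cruise"
    DESCENT = "descent"
    LANDING = "landing"
    EMERGENCY = "emergency"

def agency_holder(phase: Phase) -> str:
    """Decide which actor currently holds primary agency."""
    # Pilots keep agency in the critical, judgment-heavy phases;
    # the autopilot takes over the repetitive, stable ones.
    if phase in (Phase.TAKEOFF, Phase.LANDING, Phase.EMERGENCY):
        return "pilot"
    return "autopilot"

for phase in Phase:
    print(f"{phase.value:>10}: {agency_holder(phase)}")
```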


In much of this sense, the autopilot plays the role of the “Actor”. The pilots interact with the automation system to input data about the flight path, and the autopilot then performs a sequence of tasks to keep the plane airborne.

“The last type of agency is drawn from near-future scenarios, in which the products will become the Creator of futures” [1]. A controversial but important question follows: should autopilots assume roles in which they are given full authority over their actions? Would such a decision be detrimental to aviation, or would it make flying safer?

Safety of Automation Systems

With the increasing layers of complexity and the amount of control handed to technology, it has become crucial for pilots to monitor these systems carefully and regularly, since there are factors that only humans can detect and discern and that automation cannot anticipate. To illustrate, in the documentary series ‘Mayday’, radar could not track a military drone, whereas the captain spotted it with his naked eye.

Accident Analysis: Autopilots as Actors in Qantas Flight 72

According to investigation reports, a faulty sensor reading on the Qantas flight caused the automation to pitch the airplane down violently. Although the airplane was perfectly level, the system assumed that something was wrong. The captain recognized the problem and immediately pulled back on his stick to save the plane from diving. A few minutes later, the system went haywire again and the airplane dived toward the ocean. “It’s the worst thing that can happen when you are in an airplane — when you are not in control,” said Captain Sullivan in The Sydney Morning Herald [8].

The article further describes the accident: “The flight control computers — the brains of the plane — are supposed to keep the plane within an ‘operating envelope’: maximum altitude, maximum and minimum G-force, speed, etc. Yet against the pilots’ will, the computers were making commands that were imperiling all on board. For reasons unknown to the pilots, the computer system had switched on ‘protections’” [8].
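To illustrate this failure mode, the hypothetical sketch below shows how a simplistic envelope “protection” rule can be fooled by a single spiked sensor reading, much as the QF72 investigation traced the pitch-downs to faulty angle-of-attack data. The threshold and logic are invented for illustration; they are not Airbus’s actual flight control laws.

```python
# A deliberately simplified sketch of envelope "protection" logic, showing
# how one bad reading can make an automated system act against a perfectly
# level aircraft.

def protection_command(angle_of_attack_deg: float,
                       stall_aoa_deg: float = 15.0) -> str:
    """Return the flight control computer's reaction to the sensed AoA."""
    if angle_of_attack_deg > stall_aoa_deg:
        # The computer trusts the sensor: it "protects" the aircraft by
        # commanding a pitch-down, regardless of what the pilot intends.
        return "PITCH DOWN (stall protection engaged)"
    return "no intervention"

true_aoa = 2.0          # the aircraft is level and flying normally
spiked_reading = 50.0   # a transient data spike from a faulty unit

print(protection_command(true_aoa))        # no intervention
print(protection_command(spiked_reading))  # PITCH DOWN (stall protection engaged)
```

The point of the sketch is that the logic does exactly what it was designed to do; the hazard comes from trusting a single corrupted input.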

“The plane is not communicating with me. It’s in meltdown. The systems are all vying for attention but they are not telling me anything,” Sullivan recalls. “It’s high-risk and I don’t know what’s going to happen” [8].

This incident is a clear example of the autopilot system behaving as an actor and producing adverse consequences. Although the pilots had entered the flight-path data and everything was going as planned, a sudden failure of the automation created a near-crash situation.

The automation overruled the pilots’ instructions and did not cooperate. It was Captain Sullivan’s situational awareness and quick thinking that saved the 315 lives on board.

Conclusion

This article has discussed the significance of automation systems in aviation over the years. Automation has considerably reduced the number of accidents caused by manual error and has become fundamental to flying today. However, with increasing dependence on these systems, new kinds of problems have emerged.

Autopilot systems and pilots work collaboratively, exchanging important information throughout the flight. While the autopilot can take agency in uneventful flight conditions, in unusual conditions and emergencies the agency should flow back to the pilots, who make the final decisions based on their cognitive skills, situational awareness, and years of experience.

However, a question arises about the future of automation. New technologies are becoming ever more complex and automated, making it harder for the human mind to understand and interpret them in their entirety.

The key question is: is it ethical to place majority control of flying an aircraft in the hands of technology, or should pilots retain the authority to make overriding decisions?

To conclude, it is crucial to understand the implications of cockpit design and how better design and usability can help pilots monitor systems and make decisions effectively. Designers and technology experts in every industry have a vital role to play here. It is important to understand not only design and usability but also the agency we assign to the systems we design. The moral and ethical consequences should be weighed before creating any technology, and autonomous systems should be introduced only after their actions have been thought through.

References

[1] Cila, N., Smit, I., Giaccardi, E., & Kröse, B. (2017). Products as Agents. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17), 448–452. doi:10.1145/3025453.3025797

[2] Federal Aviation Administration. (n.d.). Chapter 4: Automated Flight Control. In Advanced Avionics Handbook. Retrieved from https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/advanced_avionics_handbook/media/aah_ch04.pdf

[3] Chialastri, A. (2012). Automation in aviation. Automation. doi: 10.5772/49949

[4] Brown, J. P. (2017). The Effect of Automation on Human Factors in Aviation. The Journal of Instrumentation, Automation and Systems, 3(2), 31–46. doi: 10.21535/jias.v3i2.916

[5] Norman, D. A. (1990). The ‘problem’ with automation: inappropriate feedback and interaction, not ‘over-automation.’ In Human Factors in Hazardous Situations: Proceedings of a Royal Society Discussion Meeting Held on 28 and 29 June 1989, 137–146. doi:10.1093/acprof:oso/9780198521914.003.0014

[6] Sullenberger, C. C. (2012, July 9). Air France 447: Final report on what brought airliner down. Retrieved from https://www.youtube.com/watch?v=kERSSRJant0.

[7] Bødker, S., Lyle, P., & Saad-Sulonen, J. (2017). Untangling the Mess of Technological Artifacts. Proceedings of the 8th International Conference on Communities and Technologies (C&T ’17). doi:10.1145/3083671.3083675

[8] O’Sullivan, M. (2017, May 12). The untold story of QF72: What happens when ‘psycho’ automation leaves pilots powerless? The Sydney Morning Herald. Retrieved from https://www.smh.com.au/lifestyle/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170511-gw26ae.html
