Lion Air accident in 2018.
In modern society, the human being is the central element in the design of non-fully automated control and safety systems, as well as the major source of and contributor to accidents. Human modelling must therefore be accounted for in design processes and safety assessments to ensure appropriate consideration of human factors. The book “Human Modelling in Assisted Transportation” (HMAT) provides an overview of the state of the art in human behaviour modelling in transportation systems, and confronts the models, methods and tools developed with ongoing research carried out worldwide.
The optimistic view in HMAT underlines that most ‘human errors’ have their origin in useful and adaptive processes. Such methodologies relate ‘human error’ to the very mechanisms that support the human capacity to manage complex, ambiguous and unverifiable circumstances. The view of human performance as basically competent focuses on the correspondence between capabilities and the situation or its demands. Human performance is competent because we can identify the relevant and regular features of a task and use them to optimise resource usage. Since the environment is constantly changing, this ‘strategy’ will sometimes lead to failures, on either a small or a large scale. But the underlying performance adjustments are in themselves correct.
The fact that the humans operating the ill-fated machine were not informed of a design feature, which then malfunctioned and produced ambiguous indications, goes against the theory stated above. The optimistic view is borne out by the efforts of the crew to control the aircraft soon after departure. The cockpit environment in such a situation is unimaginable and can be perceived only by those at the controls.
The investigation must lean towards the optimistic view of pilots and their aircraft-handling capabilities. A purely technical investigation would be biased towards the manufacturer; consideration of the human factor is therefore imperative.
In a simplistic model of root cause analysis there are two possible perspectives: Theory X reaches a technical conclusion, while Theory Y reaches a human-factor cause. A single-cause AOA error could have been handled by the pilots with ease, but the ambiguous indications, coupled with a physically out-of-phase control surface malfunction, complicated the situation. The control surface forces increased beyond the physical capability of the humans in charge, leading to the accident.
Theory ‘X’, a 5-step simplistic model of root cause analysis:
- Why did the accident occur? Aircraft upset/undesirable aircraft state (UAS).
- Why did the upset/UAS take place? The pilot could not counter the stab trim malfunction.
- Why could the pilot not counter the stab trim? Due to a Maneuvering Characteristics Augmentation System (MCAS) malfunction.
- Why did MCAS malfunction? AOA disagree and erroneous AOA indication.
- Why did the AOA indication show an error? The root cause.
Theory ‘Y’, a 5-step simplistic model of root cause analysis:
- Why did the accident occur? Aircraft upset/UAS.
- Why did the upset take place? The pilot could not counter the stab trim malfunction.
- Why could the pilot not counter the stab trim? The pilot did not deactivate MCAS as per the stab runaway/malfunction procedure.
- Why did the pilot not deactivate MCAS as per the stab runaway/malfunction procedure? The pilot could not comprehend the malfunction.
- Why could the pilot not comprehend the malfunction? The root cause.
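The two chains above can be sketched as ordered question/answer pairs, where the answer to the final ‘why’ is the candidate root cause. The following is a minimal illustrative sketch (the function name `root_cause` and the abbreviated answer strings are this example's own, not from any investigation report):

```python
def root_cause(chain):
    """Return the answer to the last 'why' in a five-why chain."""
    return chain[-1][1]

# Theory X: the chain terminates in a technical cause.
theory_x = [
    ("Why did the accident occur?", "Aircraft upset/UAS"),
    ("Why did the upset/UAS take place?", "Pilot could not counter the stab trim malfunction"),
    ("Why could the pilot not counter the stab trim?", "MCAS malfunction"),
    ("Why did MCAS malfunction?", "AOA disagree and erroneous AOA indication"),
    ("Why did the AOA indication show an error?", "Technical root cause"),
]

# Theory Y: the same accident, but the chain terminates in a human factor.
theory_y = [
    ("Why did the accident occur?", "Aircraft upset/UAS"),
    ("Why did the upset take place?", "Pilot could not counter the stab trim malfunction"),
    ("Why could the pilot not counter the stab trim?", "MCAS not deactivated per runaway procedure"),
    ("Why was MCAS not deactivated?", "Pilot could not comprehend the malfunction"),
    ("Why could the pilot not comprehend the malfunction?", "Human-factor root cause"),
]

print(root_cause(theory_x))  # prints "Technical root cause"
print(root_cause(theory_y))  # prints "Human-factor root cause"
```

The sketch makes the essay's point concrete: both chains start from the identical first ‘why’, and the root cause reached depends entirely on which branch the analyst follows at step three.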
An article on the unintended consequences of automation presents a viewpoint on the accident. The automation aspect highlights the fact that the user is neither involved in the design and development nor informed of the consequences of design changes and their effects.
Papers written by experts on automation have highlighted that users have to be taught about the use, misuse, disuse and abuse of automation. At present, the user is taught to operate the automation and given only a sketchy background of the system, based on the OEM's philosophy of disclosure. EASA has made a safety case for revising the examination methodology, faced with the alarming fact that over 70% of pilots undergoing a type rating course did not have a conceptual understanding of the subject.
The bottom line is that there are no shortcuts in safety, training and quality. We have to invest time, money and effort to see the long-term effects. Training curricula have to be revised to introduce automation as a separate subject, users have to be a part of the design process, and SMS change management has to highlight changes and the risks associated with them.