GoAir flight 338, an A320, was accelerating for takeoff on 21 June 2017 when the aircraft suffered a bird strike on Engine 2. The crew continued the takeoff but noticed abnormal sounds and vibrations. After takeoff, the crew did not follow the documented procedure; as a result, they identified the wrong engine and shut down Engine 1. The co-pilot read out the N1 as Engine 1, thereby giving incorrect input to the Captain. The aircraft climbed to 2,500 feet before the crew realised their mistake and attempted to restart Engine 1. The first attempt failed, and by the time the engine was eventually restarted, the aircraft had descended to 1,800 feet and lost speed to near stall, at which point the Airbus stall protection feature, alpha floor, activated to prevent the impending stall. The aircraft thereafter landed safely.
Errors are typically defined as deviations from a criterion of accuracy. However, no clear standard of "correctness" may exist in naturalistic contexts. First, the "best" decision may not be well defined, as it often is in a highly structured laboratory task. Second, there is a loose coupling between event outcome and decision process, so outcomes cannot be used as reliable indicators of the quality of the decision. Redundancies in the system can "save" a poor decision or error. Conversely, even the best decision may be overwhelmed by events over which the decision maker has no control, resulting in an undesirable outcome.
Intuitive decisions based on the well-known concept of Thin Slicing (Gladwell, 2005) enable faster decisions from limited exposure. Sometimes the term refers to using only a small slice of the available information for decision-making and ignoring the rest. At other times it implies compressing a great deal of information into a simple underlying pattern to be used in snap decision-making. The term is also used for the simple underlying pattern itself, and sometimes for the thin slice of time in which rapid cognition occurs. Thin slicing works well in day-to-day life when risk levels are low; however, when the risk is high, normative decisions and deliberately slowed-down thinking (Kahneman, 2011) improve cognitive ease and the accuracy of actions.
On 8 January 1989, a British Midland B-737 was climbing through 28,300 feet when one blade of the fan in Engine 1 detached. This resulted in shuddering of the airframe, ingress of smoke into the cockpit, and a compressor stall in Engine 1. The crew suspected that Engine 2 was the cause and shut down Engine 2. Interestingly, after the brief period of high vibration, Engine 1 operated normally through the descent towards the nearest airport.
The approach to the airport of intended landing was normal until the vibration started again, resulting in engine fire and loss of power. Efforts to restart Engine 2 were unsuccessful. The aircraft crashed short of the runway.
The TransAsia ATR 72 was accelerating on the runway on 4 February 2015 when an intermittent signal discontinuity caused Engine 2 to auto-feather. Feathering stops the propeller from producing thrust, a feature designed to reduce propeller drag when an engine has failed. The crew did not reject the takeoff and continued. After takeoff, the crew did not follow the documented procedure for identifying the failed engine and shut down the working Engine 1. The aircraft suffered a series of stalls and crashed.
mindFly human factor analysis
Decisions in aviation typically are prompted by cues that signal an off-nominal condition that may require an adjustment of the planned course of action. Orasanu and Fischer (1997) described a decision process model for aviation that involves two components: situation assessment (SA) and choosing a course of action (CoA). In aviation, situation assessment involves defining the problem, as well as assessing the levels of risk associated with it and the amount of time available for solving it. Once the problem is defined, a course of action is chosen based on options available in the situation. Building on Rasmussen (1985), Orasanu and Fischer specified three types of option structures: rule-based, choice, and creative. All involve application of knowledge, but vary in the degree to which the response is determined.
Thus, there are two major ways in which error may arise. Pilots may (a) develop an incorrect interpretation of the situation, which leads to a poor decision — an SA error, or (b) establish an accurate picture of the situation, but choose an inappropriate course of action — a CoA error. Situation assessment errors can be of several types: cues may be misinterpreted, misdiagnosed, or ignored, resulting in a wrong picture of the problem; risk levels may be misassessed (Orasanu, Dismukes & Fischer, 1993); or the amount of available time may be misjudged (Orasanu & Strauch, 1994). One problem is that when conditions change gradually, pilots may not update their situation models.
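The two-stage structure described above can be made concrete with a toy sketch. This is purely illustrative; the function and field names are my own, and the option-ranking logic is a placeholder, not part of the Orasanu and Fischer model:

```python
from dataclasses import dataclass
from enum import Enum, auto

class OptionStructure(Enum):
    """Orasanu & Fischer's three option structures (after Rasmussen, 1985)."""
    RULE_BASED = auto()  # a condition-action rule fully prescribes the response
    CHOICE = auto()      # several candidate options must be compared
    CREATIVE = auto()    # no ready option exists; a novel course must be invented

@dataclass
class Situation:
    cues: list            # off-nominal cues, e.g. ["vibration", "abnormal sound"]
    risk: str             # assessed risk level, e.g. "low" or "high"
    time_available_s: int # judged time available to act, in seconds

def assess_situation(raw_cues, risk, time_s):
    """Stage 1 (SA): build a picture of the problem.
    Misreading or ignoring cues at this stage produces an SA error."""
    return Situation(cues=sorted(set(raw_cues)), risk=risk, time_available_s=time_s)

def choose_course_of_action(situation, options):
    """Stage 2 (CoA): pick a response given the assessed situation.
    Even a sound assessment can yield a CoA error if a poor option is chosen."""
    if not options:
        # No ready-made option: a novel response must be constructed.
        return ("invent a new course of action", OptionStructure.CREATIVE)
    if len(options) == 1:
        # A single prescribed response: apply the rule.
        return (options[0], OptionStructure.RULE_BASED)
    # Several options: compare them (placeholder ranking for this sketch).
    return (sorted(options)[0], OptionStructure.CHOICE)

sit = assess_situation(["vibration", "abnormal sound"], risk="high", time_s=120)
action, structure = choose_course_of_action(sit, ["run engine-failure checklist"])
print(action, structure.name)
```

The point of the sketch is the separation of stages: an SA error corrupts the `Situation` object itself (wrong cues, wrong risk, wrong time), so even a flawless Stage 2 acts on a false picture; a CoA error leaves the `Situation` accurate but picks a poor option in Stage 2.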
Pilots work under conditions of divided attention. When they cannot devote all perceptual and decision-making resources to one input or output on a continuous basis, attention must be divided; e.g., monitoring speed while attending to signals or directions at the same time. Under conditions of high stress and arousal, the scanning or sampling rate may increase, but the pattern of scanning narrows to a smaller range of inputs:
• Attention is restricted to the primary task
• This can lead to important information being missed, because the stress response restricts attention to the primary task or to a perceived primary aspect of the problem.
This effect is sometimes called ‘CONING OF ATTENTION’ or ‘NARROWING OF ATTENTION’
Therefore, it is important that when a critical decision, like shutting down an engine, is required, the crew reduce stress and slow down. This widens the scan area and supports proper identification and fault analysis, leading to a safe outcome.