The human factor in aviation is relevant to every stage of flight. The aviation industry’s focus is on reducing unstable approaches; above all, the objective is to mitigate the risk posed by human factors during approach and landing. A Boeing analysis of runway excursions from 2003 to 2010 found that 68 per cent of landing excursions followed stable approaches, while overruns accounted for 47 per cent of landing excursions.
The crew and the unstabilised approach
The crew of an aircraft on final approach can fail to achieve the stabilised parameters at the 1000’AAL gate. Ideally, they should identify the situation, process it and mitigate the risk by executing a go-around. Why, then, would a crew continue the approach and risk the possibility of an overrun?
Risk is defined as the chance of a given loss or injury (Merriam-Webster, 2013). Risk perception is how individuals think and feel about the risks they face, and it is an important determinant of humans’ protective behaviour. A neurological or psychological perspective can help us understand it.
Harvard Medical School published a study on risk perception, which found that it is based on a combination of cognitive skills and emotional appraisals (the human factor); the mixture is rarely rational. David Ropeik has identified 14 factors that affect the perception of danger.
Examples of risk perception
A few of the most relevant ones are listed below.
- Imposed vs voluntary: An imposed risk makes people more afraid, for example the driver in the car next to us using his cell phone, compared with the same risk taken voluntarily, such as using a cell phone while we drive ourselves.
- Catastrophic vs chronic: We tend to be more afraid of things that can kill many of us suddenly, violently and all in one place, such as a plane crash, than of something like heart disease, which causes hundreds of thousands more deaths, but one at a time, over time and not all in the same place.
- The dread factor: The worse the potential outcome of a risk, such as being eaten alive by a shark, the more afraid of it we are. This helps explain our excessive fear of carcinogens or potential carcinogens; cancer ranks high on the dread scale.
- Awareness: When the news is full of stories about a given risk, like ozone depletion, our fear of that risk is greater.
- Does it affect me?: We don’t perceive a risk to “them,” to society, as fearfully as we do risks to ourselves.
- Control vs no control: If a person feels as though he or she can control the outcome of a hazard, that individual is less likely to be afraid. Driving is one obvious example, as is riding a bike and not wearing a helmet.
The human factor in the presentation of risk
A risk proxy is another way of presenting risk. In human factors terms, proxies are typically not perfect representations of risk but correlates: they bear a statistical relation to risk, can serve as representations of it and can function as predictors of it to some degree.
Despite the common notion that “brains are computers,” people rarely calculate risk the way a computer program might. Instead, they usually rely on risk proxies and heuristics: simple rules, often applied unconsciously (Fraizer, 2015).
A primary flight display (PFD) typically shows the artificial horizon in the centre, with airspeed, altitude and vertical speed on either side, the bank and slip indicators at the top, and the heading at the bottom. The parameters required to manoeuvre the aircraft change with the stage of flight and the configuration of flaps/slats and landing gear. Risk proxies let individual pilots estimate their own risk from the available information.
Since risk itself is rather difficult to represent directly, it is typically represented indirectly through a proxy, in which some stimulus attribute such as colour stands in for the more abstract quality of risk.
A typical example is the Traffic Collision Avoidance System (TCAS). When manoeuvring to avoid a potential traffic conflict, the vertical speed indicator (VSI) displays a green band and a red band. The pilot manoeuvres smoothly to stay in the green band. The colour green therefore represents the safe zone where the risk is eliminated, while the red band marks the area of high risk, best avoided.
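The colour-as-proxy idea can be illustrated with a minimal sketch. The band limits and function names below are hypothetical illustrations, not real TCAS logic:

```python
# Illustrative sketch of colour as a risk proxy on the VSI.
# The advisory range and scale limits are hypothetical, not real TCAS values.

def vsi_bands(required_min_fpm, required_max_fpm,
              scale_min=-6000, scale_max=6000):
    """Split the VSI scale into one green (safe) band and red (avoid) bands."""
    green = (required_min_fpm, required_max_fpm)
    red = [(scale_min, required_min_fpm), (required_max_fpm, scale_max)]
    return green, red

def band_for(vertical_speed_fpm, green):
    """Colour proxy: green means the conflict is resolved, red means avoid."""
    lo, hi = green
    return "green" if lo <= vertical_speed_fpm <= hi else "red"

# Example: an advisory asking for a climb of 1500-2000 fpm
green, red = vsi_bands(1500, 2000)
print(band_for(1800, green))   # inside the commanded band
print(band_for(-300, green))   # outside it, in the red zone
```

The pilot never computes the numbers behind the bands; the single colour attribute carries the risk judgement, which is exactly what makes the proxy effective.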
Comparing the two examples above, risk perception is more natural when risk is represented with colour than when it must be assembled from numerical digits across different instrument readouts.
Many object qualities are not tangible things that can be perceived directly in the same way colour and taste are. Instead, they are often inferred from physical stimuli that usefully correlate with the quality in question. For instance, one often unconsciously uses visual clarity to judge how close an object is, because closer objects tend to look sharper, while distant objects look blurrier (Kahneman, 1982).
If one can perceive risks directly or via correlates, or can mentally construct risk estimates, then the threshold of those qualities below which no behavioural alteration is felt necessary can be called one’s risk tolerance.
Many of these risk estimates are surprisingly accurate. Yet the mind appears particularly poor at estimating high-impact, low-probability events (Camerer, 1989). Aviation accidents, and the events leading up to them, fall into this category, making aviation safety not only an area of great practical concern but also one of considerable interest for decision theory.
Providing appropriate risk information has implications for the effectiveness of the resultant decisions. When people are confronted with risk information they have to decide whether to accept or reject the message it conveys (Stroebe and Jonas 1997). As such, the content and amount of information in the risk message needs to be provided in a way that enables the individual to comprehend fully the (risky) situation they are facing.
There is presently a substantial amount of information available that makes specific recommendations for, or could be applied to, the design of risk messages.
mindFly human factor analysis
The crew of an aircraft on final approach must interpret the numbers on the primary flight display at 1000’AAL, then form a mental picture and relate it to the consequence of a runway overrun. Human factors studies have shown that statistical values are difficult to relate to consequences; the brain interprets audio/visual inputs far more rapidly.
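The gate decision the crew faces can be sketched as a simple check. The limits below are hypothetical examples for illustration, not any operator’s actual stabilised-approach criteria:

```python
# Illustrative sketch of a 1000 ft AAL stabilisation gate.
# All limits are hypothetical, not any operator's actual SOP.

def stabilised_at_gate(speed_dev_kt, sink_rate_fpm,
                       landing_config, on_glidepath):
    """True only if every stabilised-approach criterion is met."""
    return (
        abs(speed_dev_kt) <= 10        # within +/-10 kt of approach speed
        and sink_rate_fpm <= 1000      # sink rate not excessive
        and landing_config             # gear down, landing flaps set
        and on_glidepath               # within glidepath/localiser limits
    )

def gate_decision(**params):
    """Map the raw instrument numbers to the required action."""
    return "continue" if stabilised_at_gate(**params) else "go around"

print(gate_decision(speed_dev_kt=5, sink_rate_fpm=800,
                    landing_config=True, on_glidepath=True))
print(gate_decision(speed_dev_kt=18, sink_rate_fpm=1200,
                    landing_config=True, on_glidepath=True))
```

The check itself is trivial for a machine; the article’s point is that a human crew must reconstruct this judgement from separate numeric readouts under time pressure, which is where perception breaks down.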
Humans are cognitive misers. As a result, they rely on heuristics such as satisficing: we do not always lay out all the options, and while analysing a situation the cognitive process shuts down the moment a reasonably acceptable solution is reached.
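The contrast between satisficing and exhaustive evaluation can be shown in a few lines. The option names and scores are invented purely for illustration:

```python
# Minimal sketch of satisficing: accept the first option that clears an
# aspiration level, instead of evaluating every option for the best one.

def satisfice(options, score, aspiration):
    """Return the first 'good enough' option; the search stops there."""
    for option in options:
        if score(option) >= aspiration:
            return option
    return None  # nothing met the aspiration level

def optimise(options, score):
    """The exhaustive alternative: examine everything, keep the best."""
    return max(options, key=score)

options = ["plan A", "plan B", "plan C"]
quality = {"plan A": 0.6, "plan B": 0.8, "plan C": 0.95}
print(satisfice(options, quality.get, 0.75))  # stops at the first acceptable plan
print(optimise(options, quality.get))         # finds the true optimum
```

A satisficing crew that has already found an “acceptable” reading of the situation stops searching, even though a better option, the go-around, is still on the table.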
Studies of the Dunning–Kruger effect speak of a cognitive bias in which people with low ability at a task overestimate that ability. It is related to the bias of illusory superiority and stems from people’s inability to recognise their own lack of ability.
DeJoy contends that it is not uncommon for individuals to accurately assess the hazards and risks presented in a situation, yet overestimate their ability to manage it effectively (DeJoy, 1992). This unwarranted optimism leads individuals to discount the dangers of the task, and this exaggerated sense of skill places such pilots at an increased risk of an accident or incident.
In part 2, I will link the neurological aspects and age with risk perception. If you don’t see or feel the risk, how will you act on it?