People are an indispensable element of space systems

Human dependability: how to deal with human error

To err is human, goes the saying – one that space mission planners can never forget. Human error has spelt the doom of numerous missions during the last half century. Mistakes will inevitably be made in any system that involves people, and space is extremely unforgiving of them.

The expression ‘unmanned spacecraft’ is ultimately misleading. Human beings are an indispensable part of space missions. The systems up in orbit may usually be automated, but this ‘space segment’ is only the tip of a very large iceberg; the rest is the ‘ground segment’: all the infrastructure, computer systems and human personnel down on Earth that keep the mission running.

Human dependability concerns the contribution that humans in a space system make to its safety and reliability. Machines can fail; so can the ‘people in the loop’ of space systems, sometimes with catastrophic consequences. ESA’s Dependability and Safety Section therefore has a longstanding interest in the subject of human dependability: how can the incidence of human error be reduced, and its effects minimised?

A standardised approach

Bad design increases the likelihood of human error

This interest has been boosted in recent years as the scope of ESA’s space activities has expanded (as with the possibility of providing safety-critical systems and services on Earth through EGNOS and Galileo). Human errors have the potential to put not only multi-million euro spacecraft at risk, but also human lives.

ESTEC hosted a Human Dependability Workshop – called HUDEP – in September 2009. HUDEP was attended by more than 40 participants from ESA programmes and operations, as well as space and non-space industry. The purpose was to share current best practices within the space sector, and ‘spin-in’ non-space expertise on this extremely complex subject.

The subject owes its origin to aircraft accident investigations in the 1940s, out of which the very term ‘human error’ was coined. Today, human dependability is an extremely important consideration within safety-critical technological fields such as nuclear energy, air traffic control and high-speed railway transport.

The workshop conclusions are being used as the basis for an initiative to establish human dependability as a discipline at ESA, in order to provide and coordinate support to Agency projects.

Taking lessons from the air

Managing human error is central to the airline industry

Humans are involved at every stage of the development, management and operation of space systems, so potential human errors need to be identified early and prevented. Air traffic makes for a good comparison: both pilots and air traffic controllers can make mistakes with fatal consequences.

In contrast to machines, which are good at performing endless repetitive tasks, the human intellect can respond creatively to unexpected situations and problems – as when a pilot makes a split-second decision during an emergency. However, humans tend to become bored and tired when performing repetitive tasks for too long, and are more likely to make serious mistakes in extreme stress situations.

Research by Prof. Heiner Bubb of the Technical University of Munich shows that the ‘mean time between human failures’ decreases dramatically as the complexity and stress of a task increase – from roughly one mistake every half hour for simple, well-practised activities to one every 30 seconds for the hardest, seldom-tried procedures.

Avoiding error

“What we know is that human failure rate is high,” Captain Eugen H. Buehle of Lufthansa told the workshop. “And airplane operation is risk management. When it comes to the people flying planes, pilot selection is one key factor and pilot training is the other.”

The company’s pilots are given a minimum of four simulator sessions per year as well as special training scenarios and coaching. The aim is to keep their ability to handle emergency situations sharply honed.

Poor human-machine interface designs, an unsuitable operator environment, insufficient training, an overly ambitious operational schedule or even time of day – all of these factors make mistakes more likely to occur.

Air traffic controllers

Sylvie Figarol of France’s Air Navigation Technical Centre recounted how simulator-based tests of air traffic controllers and pilots fed back into improvements to the Traffic Collision Avoidance System (TCAS), a highly reliable automated system that raises the alarm when aircraft risk coming too close to each other.

The widely used TCAS has saved many lives, but miscommunication or errors around its alarms have occasionally triggered dangerous actions. A 2002 mid-air collision over Überlingen, Germany, took place after an air traffic controller instructed an aircrew to do the opposite of what TCAS had automatically commanded.

Actual incidents were recreated in test simulations, with participants wired up so that their physiological responses could be monitored. Extreme stress increased the tendency to act, though not always in a well-judged way. The test results inspired improvements to TCAS training and simplified procedures. “Design of such a safety net must include human aspects,” said Ms Figarol.
