27 April 2017
Video cameras designed to use facial recognition to identify criminals, terrorists or other individuals of concern are already in use on a trial basis at Berlin Südkreuz railway station. This project of the German Ministry of the Interior has sparked fierce controversy around the questions of the legal basis for the use of smart video surveillance and its implications for data protection.
Since August 2017, three inconspicuous cameras have been training their lenses on an entrance, an exit and an escalator in Berlin’s Südkreuz railway station. Visually speaking, they are indistinguishable from the video cameras already in use at train stations in many places. And yet their blue information boxes draw the onlooker’s attention to a special feature: “Facial recognition pilot project. Detection area” is the legend printed here. The point is that the videos recorded are processed by software whose task is to identify the faces of 300 test subjects who have been registered in advance. With this pilot project, the German Ministry of the Interior aims to test whether the technology can be used to automatically and reliably identify criminals, terrorists or other potentially dangerous individuals. The authority’s expectations are summed up in a press release of the Ministry of the Interior, according to which “the use of intelligent facial recognition systems will in the future bring about significant improvements to public safety.” It is hoped that video surveillance with facial detection will reinforce the fight against criminal activity and speed up the process of solving crimes. In December 2017, the then Minister of the Interior Thomas de Maizière reported positive interim results, with gratifyingly high detection rates. His conclusion: a positive end result could pave the way to the widespread introduction of automatic facial recognition at train stations and airports.
An effective means to combat crime?
Privacy advocates see the issue, in part, from a different perspective. They fear that the state could use intelligent video surveillance to plot the movement patterns of the population as a means of tracking their every footstep. The assertion that this will offer greater security has been greeted with varying levels of doubt. After all, what would happen if a terrorist were identified via facial recognition? The police would then have to get to the scene fast enough to prevent them from going about their deadly business. As lawyer Dennis-Kenji Kipker admits: “Intelligent video surveillance can at best help in manhunts and might serve to identify patterns of action. But it would be of very little use in actually preventing crimes,” says the Scientific Director of the Institute for Information, Health and Medical Law (IGMR) at the University of Bremen, who is also a member of the board of the European Academy for Freedom of Information and Data Protection (EAID). “The chance of identifying individuals is much higher than with standard cameras. But you always have to store other data in advance to enable the identification to take place”: an aspect that also gives rise to misgivings on the part of Berlin’s most senior data protection officer, Maja Smoltczyk. Jörg Schlißke, data protection expert at TÜViT, goes a step further: “The fundamental right of informational self-determination must not cede any ground to generalised suspicion directed at members of the public.” After all, anyone who passes through the cameras’ field of vision on a daily basis could in principle be deemed suspicious from the point of view of the state. Worse still, they might be wrongfully placed under suspicion because of technical malfunctions.
"Mere reference to an abstract threat or comparable cases does not meet the requirements of the GDPR."
The freedom to move anonymously in public: a fundamental right?
According to Jörg Schlißke, whether and to what extent intelligent video surveillance should be used is open to legal debate. The General Data Protection Regulation (GDPR) and the revised version of the German data protection law stipulate that each individual case must be assessed on its own merits and require evidence of a real and specific threat, he explains. “In other words, mere reference to an abstract threat or comparable cases does not meet the requirements of the GDPR.” What this means in practice is that the mere fact that the terror threat has generally increased everywhere in Germany does not justify the preventive installation of facial recognition cameras in every place and every railway station. The Federal Government and the Ministry of the Interior would dearly like to extend the use of video surveillance to football stadia, shopping centres, further railway stations and public transport. It was with this in mind that the German Data Protection Act was amended in 2017. The crucial question here, too, is whether citing a theoretical risk of terrorism would in future suffice, for example, to permit operators of shopping centres to install security cameras, or whether they would need to furnish evidence of a specific threat. Video surveillance of football stadia and concerts can, from a security standpoint, be legitimate in individual cases, concludes the data protection expert from TÜViT. But as far as he is concerned, the use of automatic facial recognition in shopping centres, for instance, is out of the question. “Such a scenario runs counter to the principle of necessity. It would be the equivalent of using a cannon to shoot sparrows; there simply has to be a more proportionate means that doesn’t require biometric evaluation.”
“Intelligent video surveillance is highly risky from a data protection perspective too”
Also questionable is whether the use of facial recognition by public authorities is fundamentally compatible with the data protection regulation. After all, the GDPR does in principle impose strict limits on the processing of biometric data. It is allowed only if the subjects give their express consent, as was the case with the Südkreuz guinea pigs, or if identification using biometric data is required for reasons of significant public interest. The security services might be able to invoke the latter, for instance, in the case of the prosecution of serious crime. This is an argument which the Ministry of the Interior would also be able to fall back on.
Generally, however, from the perspective of privacy advocates like Kipker and Schlißke, the possible use of facial recognition raises a slew of unanswered questions: Where do the biometric data in the databases used for matching come from? For how long will the data of the members of the public identified in this way be kept? Where will they be stored? Who has access to the data? How will the security of the data be guaranteed? How can we be sure that the data will be used only for the stated purpose and not passed on, for instance, to foreign intelligence services? “This hasn’t all been clarified yet,” Schlißke says.
Before any possible use of smart video surveillance, Kipker advocates the introduction of uniform standards across Germany. One example would be an independent trust authority: this, rather than individual police authorities, would be the central repository of the data, and both the police and concerned citizens would be able to access it. It is a model that Schlißke from TÜViT, too, believes to be workable in principle, although he also recommends further technical and organisational measures.
As one of the leading testing service providers for IT security and data protection, TÜV Informationstechnik (TÜViT) works on a daily basis with such approaches to solving the problem. Sustainable solutions that strike a balance between the interests of public security and the fundamental rights of citizens can already be developed at an early stage in the planning and implementation of video surveillance, says Jörg Schlißke. “We privacy advocates always base our arguments on the necessity principle: if there’s a less drastic way of doing things that doesn’t impact so strongly on the protective rights of the public, such measures should always be employed.” Those responsible for surveillance need, for instance, to examine the extent to which surveillance time can be limited and which areas can be hidden or pixelated. Data protection needs to be “built in” to the technology as early as its procurement and installation. Functions that are not absolutely necessary, such as free swivelling, zoom capability and wireless transmission, should not be supported by the technology used. Data protection experts Schlißke and Kipker agree that it is essential not to store data for any longer than is absolutely necessary.
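The pixelation measure described above can be illustrated with a minimal sketch. The function below is a hypothetical helper, not part of any surveillance product: it coarsens a defined privacy zone of a video frame into block averages, destroying the fine detail that facial recognition depends on while leaving the rest of the frame usable.

```python
import numpy as np

def pixelate_zone(frame: np.ndarray, top: int, left: int,
                  height: int, width: int, block: int = 16) -> np.ndarray:
    """Return a copy of `frame` in which the given rectangle is replaced
    by per-block averages, so individual features inside the privacy
    zone are no longer distinguishable."""
    out = frame.copy()
    for y in range(top, top + height, block):
        for x in range(left, left + width, block):
            y2 = min(y + block, top + height)
            x2 = min(x + block, left + width)
            # Replace each block with its mean intensity
            out[y:y2, x:x2] = frame[y:y2, x:x2].mean(axis=(0, 1))
    return out

# Toy example: a 64x64 greyscale "frame" with one bright detail
# inside the privacy zone; after pixelation the detail is smeared
# across its block and the original frame is left untouched.
frame = np.zeros((64, 64))
frame[10, 10] = 255.0
masked = pixelate_zone(frame, top=0, left=0, height=32, width=32)
```

In a real deployment the zone coordinates would come from the camera configuration, and the masking would have to happen before any recording or transmission for it to count as data protection by design.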
Jörg Schlißke is a product manager for data protection training at TÜViT. With his team, the qualified business lawyer is responsible for data protection advice, privacy assessments and certification for data protection and data security. He also runs the special office for data protection experts at the independent State Centre for Data Protection in Schleswig-Holstein.
Dennis-Kenji Kipker is the Scientific Director of the Institute for Information, Health and Medical Law (IGMR) at the University of Bremen and also a member of the board of the European Academy for Freedom of Information and Data Protection (EAID) in Berlin. The expert in information law deals primarily with IT security, privacy and government surveillance activities. Kipker also advocates an “informational equality of arms” between security agencies and citizens: the use of surveillance technology should always be offset by measures for the benefit of citizens, to ensure that any interference with their fundamental rights is kept in proportion.