HUMAN SCIENCE

When security brings fear

© Shutterstock

Security is at the top of every political agenda. But can the same be said of respect for the vital individual freedoms needed for democracy to function? Anxious to reassure its citizens – and to gain a technological competitive advantage – the Union has not yet defined a clear policy. As so often happens, the technology is progressing faster than the debate on its potential abuses. It is a gap that the Commission is trying to bridge, in particular with the aid of human science.

Since the attack on the World Trade Center’s twin towers on 11 September 2001, the world has been in the grip of a security frenzy. The New York attack, followed shortly by the Madrid and London attacks, heightened the public’s fear still further. It is perhaps this irrational fear that has led to the widespread introduction of new security measures (like America’s Patriot Act) that undermine many of the fundamental rights essential to our democracies.

In the face of this diffuse threat, the European Commission has grouped under the umbrella term ‘security’ the prevention of a huge number of risks of diverse nature and origin (ranging from natural disasters, terrorism and the regulation of illegal immigration to computer infrastructure security), and technical solutions have come to be considered essential allies. “Things have moved very fast since 2001. There has been enormous political pressure for organising a rapid response to security threats, from both the international community and citizens at home. Although it is true to say that the Commission has responded highly effectively, its zeal has helped to obscure the security research agenda,” explains J. Peter Burgess, a researcher at PRIO, the Peace Research Institute (NO). “The Commission’s designated security threats are so diverse that it is extremely complex to identify all the impacts and ethical implications of the various technologies concerned.”

A clear technological lead

J. Peter Burgess currently coordinates INEX(1), a research project financed under the ‘security’ portfolio of the Seventh Framework Programme (FP7). The project aims to determine the ethical implications of recent developments in security strategies, characterised mainly by a gradual rapprochement between defence and internal security policies. “Our goal is to place people, that is to say citizens, back at the centre of the security issue. As is so often the case, legislation lags far behind innovation. This sometimes means that new security devices are introduced without first conducting an exhaustive study to evaluate their ethical and legal implications,” adds J. Peter Burgess. “The Commission appears to be keeping a watchful eye on the situation though, as the second FP7 call for security proposals includes more human science projects on such subjects.

“Similarly, the European Security Research and Innovation Forum (ESRIF), which was set up in September 2008 to develop a strategy for civil security research and innovation in Europe through public/private dialogue, includes an ethics working group. In spite of all this, more research is needed to strictly control the use of these new technologies. Human science and legislation are clearly lagging behind security solutions.”

Christiane Bernard, who is in charge of ethics at the Security Research and Development Unit of the European Commission’s Directorate-General for Enterprise and Industry, agrees but wishes to set our minds at rest: “Although it is true that the ethical aspects lag somewhat behind technological advances, the Commission takes special care to ensure that ‘security’ projects are developed in a way that respects individual rights and freedoms. Ethical compliance is studied for the first time upstream when projects are selected,(2) a second time when the mid-term report is reviewed and a final time downstream, when the research results are evaluated. As a general rule, the Commission also ensures that the most sensitive projects systematically include an ethics committee, to enable researchers to develop technologies that respect individual freedoms.”

Catching up…

Ethical problems are particularly acute when it comes to surveillance technologies, the long list of which includes cameras equipped with facial recognition programmes, computer systems capable of intercepting telephone calls and e-mails, radar that can see through walls, and databases. But what about the right to privacy? Researchers for PRISE(3), one of the few human science projects to be launched under the Preparatory Action on the enhancement of the European industrial potential in the field of Security Research (PASR), which was in operation prior to the FP7 security portfolio, have devised a methodology to guarantee that, throughout their development, security solutions comply with this fundamental right to privacy. “It is in the interests of the industries involved in security research to respect individual freedoms so as to guarantee support from civil society. Otherwise these technologies may never be introduced, with a huge loss of potential earnings,” stresses Walter Peissl, deputy director of the Institute of Technology Assessment (AT), the research centre responsible for coordinating PRISE.

As part of the PRISE project, a consultation was conducted to ascertain which boundaries citizens place on security technologies. There was general consensus among the respondents that under no circumstances does the terrorist threat justify restricting the right to privacy, and that devices that invade a person’s physical privacy are not to be tolerated. Respondents also stressed the danger of security technologies being misappropriated for criminal ends or for purposes that contravene fundamental rights, and they deemed it unacceptable for technology or personal information to be used for purposes other than those for which it was originally introduced or collected. Most believed that the use of security devices should be strictly controlled by the judiciary.

…before it’s too late?

“As the consultation gathered the views of 160 people who had been fully apprised of security development issues in Europe, it could never be described as representing all of Europe’s citizens,” Walter Peissl explains. “Nevertheless it does provide us with useful insights into the public’s expectations regarding security.” On the basis of these results, supplemented by a study of legislation on the respect of privacy and other fundamental rights that could be affected by surveillance technologies, the researchers were able to draw up two analytical matrices, one for research proposers and the other for Commission evaluators. The aim? To develop surveillance devices that comply with the rules on individual rights.

This is essential if citizens’ interests are to be kept at the centre of new security system developments. Nevertheless, the PRISE methodology was not used when implementing the FP7 security portfolio. “The simple reason is that the project’s results have only just been submitted and the Commission has yet to approve them,” explains Christiane Bernard.

Meanwhile, the gap between security technology and fundamental rights seems to be growing inexorably. Even though PRISE consultation participants defined people’s physical privacy as a boundary that security technologies should not cross, this has not prevented a number of European airports, including Luton (London) and Schiphol (Amsterdam), from introducing body scanners that can see through passengers’ clothing…

Julie Van Rossom

  1. Converging and conflicting ethical values in the internal/external security continuum in Europe.
  2. See «La fin des savants fous?» (“The end of the mad scientists?”), page 30 of this edition.
  3. Privacy enhancing shaping of security research and technology – A participatory approach to develop acceptable and accepted principles for European Security Industries and Policies.
