
British public want restrictions on the use of facial recognition technology

Published by the Ada Lovelace Institute, the survey shows that the British public are prepared to accept the use of facial recognition technology in some instances, where there is a clear public benefit and appropriate safeguards are in place, but they also want the government to impose restrictions on its use.

The Ada Lovelace Institute is calling for companies to temporarily stop selling and using facial recognition technology while the public is consulted on its use. It will lead public consultation by establishing the Citizens’ Biometrics Council, a citizens’ assembly supported by the Information Commissioner’s Office.

The nationally representative survey of 4,109 adults was undertaken in partnership with YouGov and reveals:

  • A majority of people (55%) want the government to impose restrictions on police use of facial recognition technology. Nearly one third (29%) of the public are uncomfortable with police using facial recognition technology in any circumstance.
  • Nearly half the public (46%) want the right to opt out of the use of facial recognition technology. This figure is higher for people from minority ethnic groups (56%), for whom the technology is less accurate.
  • Most people oppose the use of facial recognition technology by companies for commercial benefit, often because they don’t trust the technology will be used ethically. For example, 77% are uncomfortable with facial recognition technology being used in shops to track customers and 76% are uncomfortable with it being used by HR departments in recruitment.
  • The public supports companies voluntarily pausing sales of facial recognition technology to police (50%) and schools (70%) to allow for further public consultation.
  • People fear the normalisation of surveillance but are prepared to accept facial recognition technology when there is a clear public benefit, provided safeguards are in place. For example, nearly half (49%) support the use of facial recognition technology in day-to-day policing, assuming appropriate safeguards are in place, but the majority are opposed to its use in schools (67%) or on public transport (61%).

Findings from the survey are summarised in a new report from the Ada Lovelace Institute, Beyond face value: public attitudes to facial recognition technology. It provides much-needed evidence on public attitudes at a time when police use of facial recognition technology is being challenged in the courts and the Information Commissioner is investigating its deployment in the King’s Cross area of central London.

Carly Kind, Director of the Ada Lovelace Institute, said: “These findings show that companies and the government have a responsibility to act now. The UK is not ready for facial recognition technology. As a first step, a voluntary moratorium by all those selling and using facial recognition technology would enable a more informed conversation with the public about limitations and appropriate safeguards. To model the values we espouse, we are setting up a Citizens’ Biometrics Council, a citizens’ assembly that will bring experts and citizens together to ensure the public can play an active role in shaping the use of facial recognition and other similar biometric technologies – an approach that has been endorsed by the Information Commissioner.”

Jonathan Bamford, Director of Strategic Policy at the Information Commissioner’s Office, said: “The Information Commissioner’s Office welcomes this valuable insight on public attitudes to facial recognition technology and is very pleased to support the Ada Lovelace Institute’s Citizens’ Biometrics Council, following on from its own extensive citizen engagement work through the use of citizen juries to understand public attitudes to explainability in AI. The Council will bring much-needed informed public perspective to the development of public policy on emerging biometric technologies such as facial recognition and fingerprint scanning.”

About the survey

4,109 adults aged 16 and over in the UK responded to the online survey, administered by YouGov, between 12 and 16 July 2019. The national sample was weighted to the following UK demographics: gender, age, region and social grade. Other UK demographics were not fully captured by the weighting, but could still be analysed. Black, Asian and minority ethnic (BAME) groups were under-represented (although still large enough in number for analysis), with an unweighted base size of 236, which forms 6% of the total survey response.

About the Ada Lovelace Institute

The Ada Lovelace Institute (Ada) is an independent research and deliberative body with a mission to ensure data and AI work for people and society. It aims to: build evidence and foster rigorous research and debate on how data and AI affect people and society; convene diverse voices to create a shared understanding of the ethical issues arising from data and AI; and define and inform good practice in the design and deployment of data and AI.

Ada is funded by the Nuffield Foundation, an independent charitable trust with a mission to advance social well-being. The Institute was established in early 2018, in collaboration with the Alan Turing Institute, the Royal Society, the British Academy, the Royal Statistical Society, the Wellcome Trust, the Omidyar Network for Citizens and Governance, techUK and the Nuffield Council on Bioethics.

