On September 4, 2019, the High Court of England and Wales dismissed a challenge to South Wales Police’s use of Automated Facial Recognition technology (“AFR”). The Court determined that the police’s use of AFR had been a necessary and proportionate means of fulfilling their statutory obligations.
An individual (Mr. Bridges) had brought judicial review proceedings after South Wales Police launched a project involving the use of AFR (“AFR Locate”). The technology was deployed at certain events and in certain public locations where crime was likely to occur, and was able to capture up to 50 faces per second. The police would then compare the captured images, using biometric analysis of facial features, against watchlists of wanted persons drawn from their own databases. Where an image did not match any of these watchlists, it was immediately and automatically deleted.
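The judgment does not describe the system’s internal design, but the workflow set out above (detect faces in a live feed, derive a biometric template from each face, compare the templates against watchlists, and discard anything that does not match) can be illustrated with a minimal sketch. Everything in it, including the function names, the similarity measure, and the matching threshold, is a hypothetical assumption rather than a detail of the actual AFR Locate system.

```python
# Hypothetical sketch of the AFR Locate matching flow described in the judgment.
# All names, thresholds, and data structures are illustrative assumptions, not
# details of the actual South Wales Police deployment.

from dataclasses import dataclass

MATCH_THRESHOLD = 0.8  # assumed similarity cut-off for reporting a match


@dataclass
class WatchlistEntry:
    person_id: str
    template: list[float]  # biometric template of a wanted person


def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two biometric templates."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def process_frame(face_templates: list[list[float]],
                  watchlist: list[WatchlistEntry]) -> list[str]:
    """Compare every face template from one video frame against the watchlist.

    Templates that match no entry are simply not kept: nothing outside this
    function retains them, mirroring the immediate, automatic deletion of
    non-matches described in the judgment.
    """
    alerts = []
    for template in face_templates:
        best = max(watchlist,
                   key=lambda entry: similarity(template, entry.template),
                   default=None)
        if best is not None and similarity(template, best.template) >= MATCH_THRESHOLD:
            alerts.append(best.person_id)  # only matches are kept for officer review
    return alerts
```

The sketch also makes the deletion point concrete: unmatched templates exist only for the duration of the loop and are never stored anywhere.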
Bridges had not been identified as a wanted person but had likely been captured by AFR Locate during its pilot deployment in Cardiff. He considered this to be unlawfully intrusive, specifically under Article 8 of the European Convention on Human Rights (“ECHR”) (right to respect for private and family life) and data protection law in the UK, including both the Data Protection Act 1998 (“DPA 1998”) and the Data Protection Act 2018 (“DPA 2018”). With regard to the DPA 1998, Bridges claimed that the prior use of AFR Locate had infringed Section 4(4), as it failed to comply with the data protection principles. He further claimed that future use would constitute a failure to comply with Section 35 of the DPA 2018, which requires that processing of personal data for law enforcement purposes be lawful and fair, and that the police had failed to carry out an adequate data protection impact assessment (“DPIA”), as required under Section 64(1).
The Court found that the use of AFR did interfere with an individual’s rights under Article 8 of the ECHR, and that this type of biometric data has an intrinsically private character, similar to DNA, as it enables “the extraction of unique information and identifiers about an individual allowing … identification with precision in a wide range of circumstances.” Even though the images were immediately deleted, the process constituted an interference with Article 8 of the ECHR: it was sufficient that the data was stored, however momentarily.
The Court nevertheless found that the interference was carried out in accordance with the law, as it fell within the police’s common law powers to prevent and detect crime. The Court also found that the use of the AFR system was proportionate and met existing criteria that the technology be deployed openly and transparently and with significant public engagement. It was deployed only for a limited period and for a specific purpose, and was publicized before its use (for example, on Facebook and Twitter). The Court also pointed to the fact that the pilot had been successful in identifying wanted individuals, noting that “this new technology has resulted in arrests or disposals in 37 cases where the individual in question had not been capable of location by existing methods.”
With regard to data protection law, the Court considered that the images of individuals captured (even those not matched with wanted persons lists) did constitute personal data, as the technology singled them out and made them distinguishable from others. The Court specified that AFR is more complex than simple CCTV, stating:
“AFR technology uses … digital information to isolate pictures of individual faces, extract information about facial features from those pictures, compare that information with … watchlist information, and indicate matches between faces captured through the CCTV recording and those held on the watchlist.”
By its nature, AFR had to make all captured images distinguishable from one another in order to attempt to match them to a watchlist. In fact, the processing was judged to constitute “sensitive processing” under Section 35(8)(b) of the DPA 2018, even though there was no intention on the part of the police to identify individuals not present on any of their watchlists.
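The Court’s point can be restated in technical terms: before any watchlist comparison is possible, a biometric template must be derived from every detected face, and it is that template which singles each person out. A toy illustration follows; the extract_template function is a hypothetical stand-in for whatever feature extractor the real system used.

```python
# Hypothetical illustration of why every captured individual is made
# distinguishable: a per-face template is computed unconditionally,
# before (and regardless of) any watchlist matching.

def extract_template(face_pixels: list[list[int]]) -> list[float]:
    """Toy feature extractor reducing a face image to a numeric template.

    A real system would use a trained facial-recognition model; this version
    just averages pixel rows so the example stays self-contained.
    """
    return [sum(row) / len(row) for row in face_pixels if row]


def frame_to_templates(detected_faces: list[list[list[int]]]) -> list[list[float]]:
    # Every detected face is converted to a template (the step that renders
    # each individual distinguishable), whether or not a match is ever found.
    return [extract_template(face) for face in detected_faces]
```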
However, the processing did not infringe the relevant principle under the DPA 1998, for the same reasons the Court discussed regarding Article 8 of the ECHR: the Court found that the processing satisfied the conditions of lawfulness and fairness, and was necessary for the police’s legitimate interest in preventing and detecting crime, as required by their common law obligations. The requirement under Section 35(5) of the DPA 2018 that the processing be strictly necessary was also satisfied, as was the requirement that the processing be necessary for the exercise of the police’s functions.
The final requirement under Section 35(5) of the DPA 2018 was that there be an appropriate policy document in place to govern the processing. Although the Court found the relevant policy document in this case to be brief and lacking in detail, it declined to rule on whether the document was adequate, leaving that assessment to the police in light of more detailed guidance to be issued by the UK Information Commissioner’s Office (“ICO”).
Finally, the Court determined that the impact assessment carried out by the police had been sufficient to meet the requirements under Section 64 of the DPA 2018.
The ICO, which recently completed its own investigation into the police’s piloting of this technology, emphasized that it would be reviewing the judgement carefully: “This new and intrusive technology has the potential, if used without the right privacy safeguards, to undermine rather than enhance confidence in the police… Any police forces or private organisations using these systems should be aware that existing data protection law and guidance still apply.”
The ICO stated that it will take the High Court’s judgement into consideration when finalizing its recommendations and guidance regarding the use of Live Facial Recognition systems.