On August 11, 2020, the Court of Appeal of England and Wales overturned the High Court’s dismissal of a challenge to South Wales Police’s use of Automated Facial Recognition technology (“AFR”), finding that its use was unlawful and violated human rights.
In September 2019, the UK’s High Court had dismissed the challenge to the use of AFR, determining that its use was necessary and proportionate to achieve South Wales Police’s statutory obligations. Mr. Bridges, the civil liberties campaigner who originally brought judicial review proceedings after South Wales Police launched a project involving the use of AFR (“AFR Locate”), appealed the High Court’s dismissal. With AFR Locate, South Wales Police deployed AFR technology at certain events and in certain public locations where crime was considered likely to occur, capturing images of up to 50 faces per second. The police subsequently matched the captured images with “watchlists” of wanted persons in police databases using biometric data analysis. Where a match was not made with any of these watchlists, the images were immediately and automatically deleted.
Mr. Bridges challenged AFR Locate on the basis that it was unlawfully intrusive, including under Article 8 of the European Convention on Human Rights (“ECHR”) (right to respect for private and family life) and data protection law in the UK. His appeal was based on the following five grounds:
- The High Court had erred in its conclusion that South Wales Police’s use of AFR and interference with Mr. Bridges’ rights was in accordance with the law under Article 8(2) of the ECHR.
- The High Court had incorrectly concluded that the use of AFR and interference with Mr. Bridges’ rights was proportionate under Article 8(2) of the ECHR.
- The High Court was wrong to consider the Data Protection Impact Assessment (“DPIA”) carried out in relation to the processing sufficient for the purposes of Section 64 of the Data Protection Act 2018 (“DPA 2018”).
- The High Court should not have declined to reach a conclusion as to whether South Wales Police had an “appropriate policy document” in place regarding the use of AFR Locate that was within the meaning of Section 42 of the DPA 2018 for carrying out sensitive data processing.
- The High Court was wrong to hold that South Wales Police had complied with the Public Sector Equality Duty (“PSED”) under Section 149 of the Equality Act 2010, on the grounds that the Equality Impact Assessment carried out was “obviously inadequate” and failed to recognize the risk of indirect discrimination on the basis of sex or race.
The Court of Appeal allowed the appeal on grounds 1, 3 and 5, but rejected grounds 2 and 4.
Ground 1
On the first ground, the Court of Appeal overturned the High Court’s determination, finding “fundamental deficiencies” in the legal framework around the use of AFR, specifically the policies that governed its use. The Court found that South Wales Police’s policies gave too much discretion to individual police officers to determine which individuals were placed on watchlists and where AFR Locate could be deployed. The Court commented that “the current policies do not sufficiently set out the terms on which discretionary powers can be exercised by the police and for that reason do not have the necessary quality of law.” The Court further described the discretion as “impermissibly wide”, for example because the deployment of the technology was not limited to areas in which it could be thought on reasonable grounds that individuals on a watchlist might be present. The Court implied that this should be a significant factor in determining where AFR Locate should be deployed, stating, “it will often, perhaps always, be the case that the location will be determined by whether the police have reason to believe that people on the watchlist are going to be at that location.”
Ground 2
Since the Court decided that AFR Locate’s use was not lawful, it was not necessary for the Court to decide the second ground of appeal on proportionality. Nevertheless, the Court chose to address the question and rejected the ground. Mr. Bridges argued that the balancing test between the rights of the individual and the interests of the community, which forms part of the proportionality analysis, should not only consider the impact on Mr. Bridges, but also the impact on all other individuals whose biometric data was processed by the technology on the relevant occasions. The Court of Appeal disagreed, commenting that Mr. Bridges had only detailed the impact on himself, not the wider public, in his original complaint and that the impact on each of the other relevant individuals was as negligible as the impact on Mr. Bridges and should not be considered cumulatively. The Court stated, “An impact that has very little weight cannot become weightier simply because other people were also affected. It is not a question of simple multiplication. The balancing exercise which the principle of proportionality requires is not a mathematical one; it is an exercise which calls for judgement.”
Ground 3
On the third ground of appeal relating to South Wales Police’s failure to carry out a sufficient DPIA, Mr. Bridges argued that the DPIA was defective in three specific ways. First, it failed to recognize that the personal data of individuals not present on a watchlist (whose data was therefore immediately and automatically deleted) was nonetheless “processed” within the meaning of data protection law. Second, the DPIA also did not acknowledge that the rights of individuals under Article 8 of the ECHR were engaged by the processing, and third, it was silent as to other risks that may have been raised by AFR Locate’s use, such as the right to freedom of expression or freedom of assembly.
The UK Information Commissioner’s Office (“ICO”), an intervener in the case, also criticized the DPIA undertaken by South Wales Police on the basis that it did not contain an assessment of “privacy, personal data and safeguards,” failed to acknowledge that AFR involves the collection of personal data on a “blanket and indiscriminate basis” and that the risk of false-positive results may in fact result in longer retention periods rather than data being immediately deleted. In addition, the DPIA failed to address potential gender and racial bias that could arise from AFR Locate’s use. As such, the ICO stated that the DPIA failed to appropriately assess the risks and mitigation of them as required under Section 64 of the DPA 2018.
The Court of Appeal did not accept all of these arguments. For example, it highlighted that the DPIA had specifically referred to the relevance of Article 8 of the ECHR. However, based on its conclusion that the deployment of the technology was not lawful, the Court found that South Wales Police was wrong to conclude in its DPIA that Article 8 of the ECHR was not infringed. The Court of Appeal stated, “The inevitable consequence of those deficiencies is that, notwithstanding the attempt of the DPIA to grapple with the Article 8 issues, the DPIA failed properly to assess the risks to the rights and freedoms of data subjects and failed to address the measures envisaged to address the risks arising from the deficiencies we have found, as required by section 64(3)(b) and (c) of the DPA 2018.”
Ground 4
With regard to the requirement to have an “appropriate policy document” in place under Section 42 of the DPA 2018, Mr. Bridges argued that the assessment of the document’s sufficiency should not have been referred back to South Wales Police for consideration in light of guidance from the ICO; instead, the High Court should have found it to be insufficient. The Court of Appeal rejected this argument on the basis that, at the time of AFR Locate’s deployment, the DPA 2018 was not yet in force, and therefore, there could not have been a failure to comply with the law. In relation to AFR Locate’s future use and the requirement for an appropriate policy document, the Court of Appeal commented that, “[A] section 42 document is an evolving document, which, in accordance with section 42(3), must be kept under review and updated from time to time.” Since ICO guidance had not been issued on the drafting of this type of document at the time of the High Court hearing, and given that South Wales Police had updated the document in light of the ICO’s subsequently published guidance, the Court of Appeal found that the High Court’s approach in this respect had been appropriate. It also referred to the fact that the ICO had repeatedly expressed the view that the original version of the document met Section 42 requirements, though it would ideally contain more detail.
Ground 5
On the final ground of appeal concerning the PSED under Section 149 of the Equality Act 2010, the Court found that South Wales Police had not gathered sufficient evidence to establish whether or not AFR Locate was inherently biased prior to its use for two reasons: (1) because the data of individuals whose images did not match those on the watchlists were automatically deleted (and therefore could not be analyzed for the purpose of assessing bias), and (2) because South Wales Police was not aware of the dataset on which AFR Locate had been trained and could not establish whether there had been a demographic imbalance in the relevant training data. Although it was not alleged that AFR Locate produced biased results, the Court determined that South Wales Police, “never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex.” The Court added, “We would hope that, as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias.”
South Wales Police has stated that it will not appeal the decision. The Court of Appeal’s full judgment is publicly available.