On August 14, 2014, the Centre for Information Policy Leadership at Hunton & Williams (the “Centre”) submitted its response to the National Telecommunications and Information Administration’s (“NTIA’s”) request for public comment on big data and consumer privacy issues. The NTIA’s request, which follows the White House’s recent study of big data, the May 2014 Big Data Report, and the associated President’s Council of Advisors on Science and Technology Report, seeks further public input on how big data affects the Consumer Privacy Bill of Rights and whether that framework should be modified to account for big data.
In its submission, the Centre recommends clarifying or modifying the principles of “individual control” and organizational “accountability” in the Consumer Privacy Bill of Rights to better reflect the opportunities and challenges associated with big data. While the principle of “individual control” should continue to encompass notice and consent where appropriate, it should evolve into a broader “focus on the individual” that includes additional protections for situations where consent is impractical, impossible or illusory, as is increasingly the case in the big data context. Further, the principle of “accountability” should be amended to recognize these additional protections as integral components of organizational accountability.
Such additional protections focused on the individual include the “risk-based approach” to privacy and the concept of “legitimate interest” found in EU privacy law. Including these two concepts also would improve the viability and effectiveness of self-regulatory and co-regulatory efforts to implement the Privacy Bill of Rights.
The Risk-Based Approach
The risk-based approach to privacy involves organizations undertaking risk assessments of their proposed data processing to understand the potential impacts of that processing on individuals. Although such risk assessments should be performed in connection with all data processing, they are particularly well-suited to facilitating sound decisionmaking about data use in the big data context, where individuals may not be in a position to exercise individual control.
Risk assessments allow organizations to identify and quantify the possible risks and harms associated with their proposed data processing, devise appropriate mitigations and controls, and then make decisions about whether and how to proceed with processing in light of any residual harms and the countervailing benefits that would flow from the processing. As such, risk assessments shift the burden of privacy protections to the organizations (and away from individuals) in contexts where individual control and consent would be impracticable or impossible.
Risk assessments also allow organizations to prioritize their privacy controls and resources to reflect the likelihood and potential severity of harm, thereby contributing to the overall effectiveness of privacy protections. De-identification of data is an important risk mitigation mechanism in this context, but it should be employed with additional appropriate safeguards to ensure its effectiveness.
In its submission, the Centre notes its Privacy Risk Framework Project, an ongoing multiyear project on the risk-based approach to privacy. The project seeks to develop a comprehensive privacy risk framework as well as consensus on what is meant by privacy harms, how to quantify and mitigate privacy harms, and how to weigh residual harms against countervailing benefits.
Legitimate Interest
The Centre notes that even the more restrictive European privacy law regime includes a concept that permits data processing where consent is not feasible: the “legitimate interest” ground. Processing for “legitimate interests” is closely related to the risk-based approach and should be included in the Privacy Bill of Rights. The legitimate interest ground permits organizations to collect, use or share information when it is in their legitimate interest to do so and the collection, use or sharing does not prejudice individuals’ rights and freedoms.
The test for whether an organization may process data on the basis of its “legitimate interest” includes consideration of the impacts of the proposed processing on the individual, and balancing the respective rights and interests of the organization and the individual. The Centre explains that the legitimate interest analysis:
- Facilitates data collection, use, sharing and disclosure where consent is not feasible, practicable or effective;
- Enables new uses of information beyond the original purposes stated at the time of collection, provided there is no harm to consumers;
- Is consistent with the responsible use model and the accountability principle pursuant to which organizations implement safeguards in the entire lifecycle of information; and
- Ensures the protection of individuals’ privacy, while allowing organizations to pursue the benefits of new technologies, products and services.
According to the Centre, incorporating the risk-based approach to privacy and a “legitimate interest” provision would make the Privacy Bill of Rights more technology-neutral and ensure its continued relevance in the face of constant technological innovation and change.