On November 13, 2019, the European Data Protection Board (“EDPB”) published its draft guidelines 4/2019 (the “Guidelines”) on the obligation of Data Protection by Design and by Default (“DPbDD”) set out under Article 25 of the EU General Data Protection Regulation (“GDPR”).
Background and Scope
Article 25 of the GDPR requires all data controllers, irrespective of their size, to implement:
- appropriate technical and organizational measures and necessary safeguards, which are designed to implement the basic data protection principles of Article 5 of the GDPR and to protect individuals’ data protection rights laid down in Articles 12-22 of the GDPR, as well as individuals’ freedoms set out in Recital 4 of the GDPR and the EU Charter of Fundamental Rights (“Data protection by design”); and
- appropriate technical and organizational measures for ensuring that, by default, only personal data, which is necessary for each specific purpose of the data processing, is processed (“Data protection by default”).
Both DPbDD requirements serve the same objective, i.e., the effective implementation of the GDPR data protection principles and individuals’ data protection rights and freedoms regarding the processing of their personal data.
The Guidelines aim to provide guidance on what DPbDD means in practice and on how to apply DPbDD in the context of the data protection principles set out in Article 5(1) of the GDPR. In this respect, the Guidelines list key design and default elements to effectively implement those principles and also provide practical cases for illustration. The Guidelines further address the possibility of establishing a certification mechanism to demonstrate compliance with Article 25 of the GDPR, and how EU data protection authorities may enforce that provision. Finally, although the Guidelines are directly addressed to data controllers, they explicitly recognize data processors and technology providers as key enablers for DPbDD, and further provide 11 recommendations on how data controllers, data processors and technology providers can cooperate to achieve DPbDD as well as leverage it as a competitive advantage. We have summarized and assessed the key aspects of the Guidelines below.
Data protection by design
- The technical or organizational measures to be implemented by data controllers refer to any methods or means that they may use when processing personal data, ranging from the use of advanced technical solutions to the basic training of personnel, e.g., on how to handle personal data. These measures must be appropriate, i.e., they must be fit to implement the GDPR data protection principles effectively by reducing the risks of infringing individuals’ rights and freedoms. In addition, safeguards act as a second tier to ensure the effectiveness of these principles throughout the life-cycle of the personal data being processed and to secure individuals’ rights and freedoms in the data processing activity. Examples of necessary safeguards include enabling individuals to intervene in the data processing; providing automatic and repeated information about what personal data is being stored; having a retention reminder in a data repository; and implementing a malware detection system on a computer network or storage system, in addition to training employees on phishing and basic “cyber hygiene.” An example of a technical measure or safeguard is pseudonymization of personal data, such as through hashing or encryption (a purely illustrative sketch appears after this list).
- Data controllers are not required to implement any prescribed technical and organizational measures or safeguards, so long as the measures and safeguards they have chosen are in fact appropriate for implementing data protection into the data processing.
- Data controllers must be able to demonstrate that the measures and safeguards they have implemented achieve the desired effect in terms of data protection. To do so, data controllers may determine appropriate key performance indicators to demonstrate compliance, including quantitative or qualitative metrics to demonstrate the effectiveness of the measures in question. Alternatively, data controllers may provide the rationale behind their assessment of the effectiveness of the chosen measures and safeguards.
- When determining the appropriate technical and organizational measures, data controllers must take into account the “state of the art,” i.e., they must stay up to date on technological progress. Technology providers should play an active role in ensuring that the “state of the art” criterion is met, and should notify data controllers of any changes to the “state of the art” that may affect the effectiveness of the measures they have in place. Data controllers should include this requirement as a contractual clause to ensure they are kept up to date.
- Data controllers must take into account the cost and resources required for the effective implementation and continued maintenance of all data protection principles throughout the processing operation. This includes the potential cost of monetary fines as a result of non-compliance with the GDPR.
- Data controllers must take into account the nature, scope, context and purpose of data processing, and the risk posed to individuals’ rights and freedoms by such processing. With respect to risk assessment, the Guidelines refer to the EDPB guidelines on Data Protection Impact Assessment (“DPIA”), which provide guidance on how to assess data protection risks and can also be applied in other situations where the GDPR requires such an assessment, such as in the context of DPbDD.
- The GDPR requires data controllers to implement data protection by design when they are in the process of determining which means or design elements (e.g., the architecture, procedures, protocols, layout and appearance) must be incorporated into the data processing. The Guidelines go further and recommend thinking of DPbDD from the initial stages of planning a data processing operation, even before the time of determination of the means of data processing.
- Data controllers have a continued obligation to maintain DPbDD once the data processing operations have started. This includes any data processing operations carried out by data processors, which should be regularly reviewed and assessed to ensure that they enable continual compliance with the DPbDD principles and support the data controller’s obligations.
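To make the pseudonymization example above concrete, here is a minimal, hedged Python sketch of keyed hashing as one possible pseudonymization technique; the function name, field names and key handling are assumptions made for illustration and are not prescribed by the Guidelines.

```python
import hmac
import hashlib

# Hypothetical illustration only: pseudonymize a direct identifier with a
# keyed hash (HMAC-SHA256). The key must be stored separately from the
# pseudonymized records and placed under its own access controls.
def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Return a stable pseudonym for the given identifier."""
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example record; the field names are invented for this sketch.
record = {"email": "jane.doe@example.com", "purchase": "subscription"}
key = b"replace-with-a-securely-managed-key"

# Store the pseudonym instead of the raw email address.
record["email"] = pseudonymize(record["email"], key)
print(record)
```

Keeping the key separate from the pseudonymized records preserves the possibility of controlled re-identification, which is what distinguishes pseudonymization from anonymization under the GDPR.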
Data protection by default
- The obligation to only process personal data that is necessary for a specific purpose applies to the following elements:
- the amount of personal data collected—data controllers must consider both the volume of personal data and the types, categories and level of detail of the personal data required for the purposes of data processing;
- the extent of processing;
- the period of storage—if personal data is not needed after its initial processing, it must by default be deleted or anonymized. Any retention should be objectively justifiable and demonstrable by the data controller in an accountable way. Data controllers must have systematic procedures for data deletion (or anonymization) embedded in the data processing (see the illustrative sketch after this list); and
- accessibility—access controls must be observed for the whole data flow during processing. Data controllers must ensure that personal data is readily accessible to those who need it when necessary, e.g., in critical situations.
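As an illustration of the systematic deletion procedures mentioned above, the following hedged Python sketch drops records whose documented retention period has lapsed; the data model, field names and 12-month retention period are assumptions made for this example, not requirements taken from the Guidelines.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical example: each record carries a "collected_at" timestamp and the
# controller has documented a 12-month retention period for this purpose.
RETENTION_PERIOD = timedelta(days=365)

def purge_expired(records):
    """Keep only records still within the documented retention period."""
    now = datetime.now(timezone.utc)
    kept = []
    for record in records:
        if now - record["collected_at"] <= RETENTION_PERIOD:
            kept.append(record)
        # Expired records are simply dropped here; a real system would also log
        # the deletion (without the personal data) for accountability purposes.
    return kept

records = [
    {"user_id": "a1", "collected_at": datetime(2019, 1, 2, tzinfo=timezone.utc)},
    {"user_id": "b2", "collected_at": datetime(2019, 11, 1, tzinfo=timezone.utc)},
]
records = purge_expired(records)  # run on a schedule embedded in the processing
```

In practice such a routine would be scheduled as part of the processing operation itself, so that deletion or anonymization happens by default rather than through ad hoc manual clean-ups.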
The EDPB is accepting comments on these Guidelines until January 16, 2020.