On April 15, 2019, the UK Information Commissioner’s Office (the “ICO”) issued for public consultation a draft code of practice, “Age Appropriate Design,” that will regulate the provision of online services likely to be accessed by children in the UK. Given the extraterritorial reach of the UK Data Protection Act 2018, organizations based outside of the UK may be subject to the code, which is expected to take effect by the end of 2019. The deadline for responding to the public consultation is May 31, 2019.
The draft code was published in accordance with the ICO’s obligation under section 123 of the Data Protection Act 2018 to prepare a code of practice on standards of age-appropriate design of online services likely to be accessed by children. The scope of the draft code is broad; it covers social media platforms, apps, online games, messaging services, search engines, online marketplaces, streaming services, news and educational websites, connected toys or devices, and any websites offering goods and services over the Internet. Free services (e.g., funded by advertising revenue) are covered, as are not-for-profit services that would normally be provided for remuneration.
The code will apply to any service that a child (defined as anyone under the age of 18) is likely to access, regardless of whether the service provider intends to target children. Even where a service is ostensibly aimed at adults, the service provider must be able to demonstrate, with specific documented evidence, that children are not likely to access the service.
The draft code is based on 16 headline standards of age-appropriate design and aims to protect the best interests and privacy of children. The standards are cumulative and interdependent, so that each one must be met in order for a service provider to demonstrate compliance with the code.
Many of the standards expand requirements already included in the EU General Data Protection Regulation (“GDPR”), with a view to providing additional, specific safeguards for children. For example, standard 8 provides that children’s personal data should not be disclosed unless there is a compelling reason for disclosure, considering the best interests of the child. Generalized data sharing for the purposes of commercial reuse is unlikely to meet this standard. The transparency standard (standard 3) reflects the transparency requirement of the GDPR, but specifies that “bite-sized” explanations of how personal data is used should be provided to children “at the point that use is activated.” Information must be provided in “clear language suited to the age of the child.”
The standards also require that all profiling and geolocation settings be set to “off” by default, and that a website or app’s settings be “high privacy” by default, meaning that children’s personal data should be visible or accessible to other users only to the extent that the child actively selects those options (standards 6, 9 and 11). Children should be informed of any parental monitoring of their online activities (standard 10). When conducting a data protection impact assessment (“DPIA”) (standard 15), companies are encouraged to take into account the additional risk factors relevant to children accessing online services, such as features that may encourage excessive screen time or increase exposure to online grooming.
The draft code emphasizes that the best interests of the child should be a primary consideration in the design of online services (standard 1), and that data should not be processed in a way that could be detrimental to a child’s physical or psychological well-being (standard 4). Further, the draft code states that the interests of the processing organization are unlikely to outweigh a child’s right to privacy.
To meet the draft code’s requirement to deliver services in an age-appropriate manner, service providers must either apply the code’s standard of protection to all users or implement robust age-verification mechanisms to distinguish children from adult users. The ICO notes that “asking users to self-declare their age or age range does not in itself amount to a robust age-verification mechanism under this code.” The draft code recommends that service providers deliver a child-appropriate service to all users and offer age-verification options that allow adults to opt out of the code’s protections, thereby reducing children’s incentive to lie about their age.
Companies must also avoid using “nudge techniques” designed to encourage users to select the option favored by the service provider (often a lower-privacy option). Such nudges occur, for example, where a service provider frames one option in more positive language than another, or makes an alternative option more cumbersome to select. The draft code encourages the use of nudges towards higher-privacy options, particularly for younger children.
The finalized code will be enforced by the ICO under the Data Protection Act 2018. Processing children’s personal data in breach of the code is likely to result in regulatory action, including enforcement notices and administrative fines of up to €20 million or 4% of annual worldwide turnover, whichever is greater. The ICO will take the code into account when considering whether an online service has complied with its obligations under the GDPR and the Privacy and Electronic Communications Regulations.
The ICO anticipates that the code, when finalized, will become an international benchmark.