On October 24, 2024, the Irish Data Protection Commission announced that it had issued a fine of €310 million against LinkedIn Ireland Unlimited Company for breaches of the EU GDPR related to transparency, fairness and lawfulness in the context of the company’s processing of its users’ personal data for behavioral analysis and targeted advertising.
On October 4, 2024, the Court of Justice of the European Union issued its judgment in case C‑446/21, assessing whether the GDPR imposes limits on Meta Platforms Ireland’s use of personal data collected outside of the Facebook social network for advertising purposes.
On October 3, 2024, Texas Attorney General Ken Paxton announced a lawsuit against TikTok for operating its platform in violation of the Texas Securing Children Online through Parental Empowerment (“SCOPE”) Act.
Coming on the heels of its Social Media Data Practices report, the FTC announced that it will hold a virtual workshop on February 25, 2025, examining “The Attention Economy: Monopolizing Kids’ Time Online.” The event will convene researchers, technologists, child development and legal experts, consumer advocates and industry professionals to discuss design features that keep children and teens engaged online.
On September 27, 2024, the Irish Data Protection Commission announced it had issued a fine of €91 million and a reprimand against Meta Ireland for inadvertently storing passwords of certain users in plaintext on its internal systems.
Last week, the House Energy and Commerce Committee advanced the Kids Online Safety Act (H.R. 7891) and the Children and Teens’ Online Privacy Protection Act (H.R. 7890).
On September 19, 2024, the Federal Trade Commission announced the publication of a staff report entitled, A Look Behind the Screens: Examining the Data Practices of Social Media and Video Streaming Services. The Report documents the data collection and use practices of major social media and video streaming services and provides recommendations for better protecting users’ data and privacy, with a particular focus on children and teens.
On September 10, 2024, the U.S. District Court for the District of Utah issued an Order granting a Motion for a Preliminary Injunction, prohibiting the Utah Attorney General from implementing and enforcing the Utah Minor Protection in Social Media Act, which was set to take effect October 1, 2024.
On September 4, 2024, the Irish High Court dismissed proceedings against X related to X’s use of personal data for its AI tool Grok.
On August 1, 2024, the Office of the New York State Attorney General released two Advance Notices of Proposed Rulemaking (“ANPRM”) for the SAFE for Kids Act and the Child Data Protection Act.
On August 2, 2024, the UK Information Commissioner’s Office issued a statement confirming that it has identified 11 social media and video sharing platforms that must improve their children’s privacy practices.
On August 2, 2024, the U.S. sued ByteDance, TikTok and their affiliates for violating the Children’s Online Privacy Protection Act of 1998 and the Children’s Online Privacy Protection Rule.
On July 30, 2024, Texas AG Ken Paxton announced that Meta agreed to pay $1.4 billion to settle a lawsuit over allegations that Meta processed facial geometry data of Texas residents in violation of Texas law, including the Texas Capture or Use of Biometric Identifier Act (“CUBI”).
On June 20, 2024, New York Governor Kathy Hochul signed into law Senate Bill S7694, the Stop Addictive Feeds Exploitation (“SAFE”) for Kids Act. The Act is the first of its kind to regulate the provision of addictive social media feeds to minors.
On June 18, 2024, the Federal Trade Commission announced its referral of a complaint to the U.S. Department of Justice (“DOJ”) against TikTok and its parent company regarding their compliance with a 2019 privacy settlement. The complaint is related to the FTC’s investigation into potential violations of the Children’s Online Privacy Protection Act (“COPPA”) and the FTC Act.
On June 17, 2024, the United States Court of Appeals for the Ninth Circuit issued an opinion in Zellmer v. Meta Platforms, Inc., No. 22-16925 (9th Cir. June 17, 2024), affirming the Northern District of California’s order granting summary judgment in favor of Meta and dismissing the action for lack of standing. Clayton Zellmer, an individual who had never used Facebook, brought claims against the social media company under the Illinois Biometric Information Privacy Act (“BIPA”), alleging that Meta had improperly obtained his biometric data from photos Zellmer’s friends had uploaded to the platform. Zellmer alleged that Facebook’s “Tag Suggestions” feature, which created a “face signature” using photos of Zellmer, violated Sections 15(a) and 15(b) of BIPA by collecting, using, and storing his biometric identifiers without first obtaining his written consent or establishing a public retention schedule. On appeal, the Ninth Circuit held that “face signatures” are not biometric information or identifiers, and thus are not subject to BIPA.
On June 7, 2024, the New York legislature passed a bill (S.B. S7694A), the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, addressing children’s use of social media platforms. The bill is pending Governor Kathy Hochul’s signature.
On April 17, 2024, the European Data Protection Board adopted its non-binding Opinion 08/2024 on Valid Consent in the Context of Consent or Pay Models Implemented by Large Online Platforms, stating that such models generally are not compliant with the GDPR, though their use should be considered on a case-by-case basis.
On March 25, 2024, Florida Governor Ron DeSantis signed into law a bill prohibiting minors under the age of 14 from having accounts on social media platforms.
Last week, Utah Governor Spencer J. Cox signed three privacy-related bills into law. The bills are focused on, respectively, protection of motor vehicle consumer data, regulations on social media companies with respect to minors, and access to protected health information by third parties. The Utah legislature appears to be focused on data-related legislation this session, as Governor Cox signed two other bills related to AI into law last week as well.
On February 12, 2024, a federal court in the Southern District of Ohio issued an order granting a Motion for a Preliminary Injunction, prohibiting the Ohio Attorney General from implementing and enforcing the Parental Notification by Social Media Operators Act, Ohio Rev. Code § 1349.09(B)(1) (the “Act”).
On January 9, 2024, an Ohio federal judge placed a temporary restraining order on Ohio’s Parental Notification by Social Media Operators Act, Ohio Rev. Code § 1349.09(B)(1) (the “Act”), which was approved in July 2023 and was set to go into effect on January 15, 2024. Under the Act, parents must provide consent for children under 16 to set up an account on social media and online gaming platforms. The platform operators must also provide parents with a list of content moderation features.
On October 26, 2023, the UK Online Safety Act (the “Act”) received Royal Assent, making it law in the UK. The Act seeks to protect children from online harm and imposes obligations on relevant organizations, including social media platforms, to prevent and remove illegal and harmful content. In a press release, the UK Government stated that the Act “takes a zero-tolerance approach to protecting children from online harm, while empowering adults with more choices over what they see online.” For example, the Act requires relevant organizations to:
On October 15, 2023, a proposal was published regarding Utah’s social media regulation law, S.B. 152, which was signed into law earlier this year.
On September 29, 2023, the Supreme Court of the United States (“SCOTUS”) accepted petitions challenging the constitutionality of social media laws in Florida and Texas. Florida’s law, S.B. 7072, prohibits “a social media platform from willfully deplatforming a [political] candidate.” Texas’s law, H.B. 20, refers to social media platforms as “common carriers” that are “central public forums for public debate,” and requires common carriers to publicly disclose information related to the common carrier’s method of recommending content to users, content moderation efforts, use of algorithms to determine search results, and the common carrier’s ordinary disclosures to its users on user performance data for each of its platforms. Both of these laws were challenged by NetChoice, LLC, a national trade association of large online businesses, which has had recent success in blocking several laws, including the California Age-Appropriate Design Code and a similar social media law in Arkansas.
On July 5, 2023, Ohio Governor Mike DeWine signed into law House Bill 33, which includes the Social Media Parental Notification Act (the “Act”).
On September 6, 2023, the European Commission designated six companies as gatekeepers under Article 3 of the Digital Markets Act (“DMA”). The new gatekeepers are Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft. Jointly, these companies provide 22 core platform services, including social networks, internet browsers, operating systems and mobile app stores.
On August 24, 2023, 12 data protection authorities published a joint statement calling for the protection of personal data from unlawful data scraping. The statement was issued by the authorities of Argentina, Australia, Canada, Colombia, Hong Kong, Jersey, Mexico, Morocco, New Zealand, Norway, Switzerland and the UK. The joint statement reminds organizations that personal data that is publicly accessible is still subject to data protection and privacy laws in most jurisdictions, and highlights the risks facing such data, including increased risk of social engineering or phishing attacks, identity fraud, and unwanted direct marketing or spam.
On June 28, 2023, Louisiana Governor John Bel Edwards signed into law H.B. 61, which requires interactive computer services to get parental consent (or consent from a legal representative of a minor) to enter into a contract or other agreement, including the creation of an online account, with minors younger than 18 years of age. The Act comes after similar laws enacted in Texas, Utah and Arkansas. H.B. 61 will take effect on August 1, 2024.
On May 4, 2023, the Florida Senate and House of Representatives voted in favor of sending the Florida Digital Bill of Rights (“FDBR”) and other amendments related to government moderation of social media and protection of children in online spaces (S.B. 262) to Governor Ron DeSantis for signature. Unlike the other comprehensive state privacy laws that have been enacted, the FDBR applies to a much narrower subset of entities.
On April 12, 2023, Arkansas Governor Sarah Huckabee Sanders signed into law S.B. 396 creating the state’s Social Media Safety Act (the “Act”). The Act comes after Utah’s similar social media laws enacted in March.
On March 3, 2023, the U.S. Department of Justice (“DOJ”) released an update to its Evaluation of Corporate Compliance Programs guidance (“ECCP Guidance”). The ECCP Guidance serves as a guidance document for prosecutors when evaluating a corporate compliance program. Among other updates, the ECCP Guidance now includes new guidance for assessing how companies govern employees’ use of personal devices, communication platforms and messaging applications.
On March 16, 2023, the Federal Trade Commission announced it issued orders to eight social media and video streaming platforms seeking Special Reports on how the platforms review and monitor commercial advertising to detect, prevent and reduce deceptive advertisements, including those related to fraudulent healthcare products, financial scams and the sale of fake goods. The FTC issued the orders pursuant to a Commission resolution directing the use of all available compulsory process to inquire into this topic, and under its Section 6(b) authority, which allows the FTC to conduct studies that do not have a specific law enforcement purpose.
On March 1-3, 2023, the Utah legislature passed two bills, S.B. 152 and H.B. 311, regarding social media usage by minors. For social media companies with more than five million users worldwide, S.B. 152 would require parental permission for social media accounts for users under age 18, while H.B. 311 would hold social media companies liable for harm minors experience on the platforms. Both bills have been sent to the governor’s desk for signature.
On February 24, 2023, following public consultation, the European Data Protection Board (EDPB) published the following three sets of adopted guidelines:
- Guidelines on the Interplay between the application of Article 3 and the provisions on international transfers as per Chapter V GDPR (05/2021) (final version);
- Guidelines on certification as a tool for transfers (07/2022) (final version); and
- Guidelines on deceptive design patterns in social media platform interfaces (03/2022) (final version).
On February 14, 2023, the U.S. Senate Committee on the Judiciary held a hearing titled, “Protecting Our Children Online.” Chaired by Sen. Durbin, the hearing examined the potentially harmful effects of social media use on young people, and represented a renewal of the Committee’s efforts to pass legislation to protect children and teenagers online. In 2022, the Senate Judiciary Committee approved several bills designed to enhance the online safety and wellbeing of children and teenagers, among them the Kids Online Safety Act (“KOSA”), but the bills did not receive a floor vote. During the hearing, Democratic and Republican senators expressed their commitment to pass bills that would limit the immunity of social media companies under Section 230 of the Communications Decency Act, and would require website and app developers to design products that protect young people from cyberbullying, online sexual exploitation, social media addiction, and other harms.
On January 12, 2023, the French Data Protection Authority (the “CNIL”) announced a €5,000,000 fine against the social network TikTok for violations of applicable cookie rules. The fine was imposed at the end of 2022.
On November 25, 2022, Ireland’s Data Protection Commission (“DPC”) released a decision fining Meta Platforms, Inc. (“Meta”) €265 million for a 2019 data leak involving the personal information of approximately 533 million Facebook users worldwide.
On November 21, 2022, Meta Platforms, Inc. (“Meta”) announced updated practices designed to protect the privacy of young people on Facebook and Instagram, including default privacy settings for new accounts, measures to limit unwanted interactions with adult users, and a tool to limit the spread of teens’ intimate images online.
On October 26, 2022, House Energy and Commerce Committee and Consumer Protection and Commerce Subcommittee leaders (“Committee Leaders”) sent letters to several toy manufacturers, including Bandai Namco, Hasbro, Mattel, MGA Entertainment, LEGO Group and the Toy Association, asking how they plan to protect children and their information from Big Tech companies like TikTok and YouTube. Given the shift of marketing efforts from traditional television outlets to social media platforms, Committee Leaders are concerned about failure to protect children’s privacy, security and mental health on social media platforms.
On October 17, 2022, the French Data Protection Authority (the “CNIL”) imposed a €20 million fine on Clearview AI for unlawful use of facial recognition technology. The fine was imposed after the CNIL’s prior formal notice remained unaddressed by Clearview AI.
On September 26, 2022, the UK Information Commissioner’s Office (“ICO”) confirmed in a statement that it issued TikTok Inc. and TikTok Information Technologies UK Limited (together, “TikTok”) a notice of intent to potentially impose a £27 million fine for failing to protect children’s privacy. This notice of intent follows an investigation by the ICO finding that TikTok may have breached UK data protection law between May 2018 and July 2020 by failing to protect children’s privacy when using the TikTok platform.
On September 15, 2022, California Governor Gavin Newsom signed into law the California Age-Appropriate Design Code Act (the “Act”). The Act, which takes effect July 1, 2024, places new legal obligations on companies with respect to online products and services that are “likely to be accessed by children” under the age of 18.
On September 5, 2022, the Irish Data Protection Commissioner (the “DPC”) imposed a €405,000,000 fine on Instagram (a Meta-owned social media platform) for violations of the EU General Data Protection Regulation’s (“GDPR’s”) rules on the processing of children’s personal data.
On June 10, 2022, New York became the first state to require attorneys to complete at least one credit of cybersecurity, privacy and data protection training as part of their continuing legal education (“CLE”) requirements. The new requirement will take effect July 1, 2023.
On July 6, 2022, the Better Business Bureau National Programs’ Children’s Advertising Review Unit (“CARU”) announced that it had found Outright Games in violation of the Children’s Online Privacy Protection Act (“COPPA”) and CARU’s Self-Regulatory Guidelines for Advertising and Guidelines for Children’s Online Privacy Protection. Outright Games owns and operates the Bratz Total Fashion Makeover app, which CARU determined to be a “mixed audience” child-directed app subject to COPPA and CARU’s Guidelines due to the app’s subject matter, bright colors, visual content, lively audio and gameplay features.
On June 3, 2022, the Federal Trade Commission announced it is seeking public comment on its 2013 guidance, “.com Disclosures: How to Make Effective Disclosures in Digital Advertising” (the “Guidance”). The FTC indicated that it is updating the Guidance to better protect consumers against online deceptive practices, particularly because some companies have interpreted the current version of Guidance to “justify practices that mislead consumers online.” For example, the FTC explains that companies have wrongfully claimed they can avoid FTC Act liability by placing required disclosures behind hyperlinks. The updated Guidance will address issues such as advertising on social media, in video games, in virtual reality environments, and on mobile devices and applications, as well as the use of dark patterns, manipulative user interface designs, multi-party selling arrangements, hyperlinks and online disclosures.
On May 25, 2022, Twitter reached a proposed $150 million settlement with the Department of Justice (“DOJ”) and the Federal Trade Commission to resolve allegations that the company deceptively used nonpublic user contact information obtained for account security purposes to serve targeted ads to Twitter users. In a complaint filed in federal court, the government alleged that Twitter violated both the FTC Act and a 2011 FTC Order by misrepresenting the extent to which the company maintained and protected users’ nonpublic contact information. The proposed settlement would require Twitter to pay $150 million in civil penalties and implement a comprehensive privacy and information security program “with extensive procedures to safeguard user information and assess internal and external data privacy risks.”
On April 23, 2022, the European Commission announced that the European Parliament and EU Member States had reached consensus on the Digital Services Act (“DSA”), which establishes accountability standards for online platforms regarding illegal and harmful content.
On March 2, 2022, eight states announced a bipartisan, nationwide investigation into whether TikTok operates in a way that causes or exacerbates harm to the physical and mental health of children, teens and young adults. The probe will further consider whether the company violated state consumer protection laws and put the public at risk.
On March 1, 2022, President Biden, in his first State of the Union address, called on Congress to strengthen privacy protections for children, including by banning online platforms from excessive data collection and targeted advertising for children and young people. President Biden called for these heightened protections as part of his unity agenda to address the nation’s mental health crisis, especially the growing concern about the harms of digital technologies, particularly social media, to the mental health and well-being of children and young people. President Biden not only urged for stronger protections for children’s data and privacy, but also for interactive digital service providers to prioritize safety-by-design standards and practices. In his address, President Biden called on online platforms to “prioritize and ensure the health, safety and well-being of children and young people above profit and revenue in the design of their products and services.” President Biden also called for a stop to “discriminatory algorithmic decision-making that limits opportunities” and impacts the mental well-being of children and young people.
On February 18, 2022, the Texas Attorney General’s Office (the “Texas AG”) announced that it had issued two Civil Investigative Demands (“CIDs”) to TikTok Inc. The Texas AG’s investigation focuses on TikTok’s alleged violations of children’s privacy and facilitation of human trafficking, along with other potential unlawful conduct.
On February 14, 2022, Texas Attorney General Ken Paxton brought suit against Meta, the parent company of Facebook and Instagram, over the company’s collection and use of biometric data. The suit alleges that Meta collected and used Texans’ facial geometry data in violation of the Texas Capture or Use of Biometric Identifier Act (“CUBI”) and the Texas Deceptive Trade Practices Act (“DTPA”). The lawsuit is significant because it represents the first time the Texas Attorney General’s Office has brought suit under CUBI.
On December 27, 2021, the Federal Trade Commission sought public comment on a petition filed by Accountable Tech calling on the FTC to use its rulemaking authority to prohibit “surveillance advertising” as an “unfair method of competition” (“UMC”). Accountable Tech is a non-profit organization that advocates for social media companies to strengthen the integrity of their platforms.
On December 15, 2021, the European Parliament adopted its position on the proposal for a Digital Markets Act (“DMA”), ahead of negotiations with the Council of the European Union.
The DMA introduces new rules for certain core platform services in the digital sector acting as “gatekeepers” (including search engines, social networks, online advertising services, cloud computing, video-sharing services, messaging services, operating systems and online intermediation services), and aims to prevent them from imposing unfair conditions on businesses and consumers and to ensure the openness of important digital services.
On June 9, 2021, President Biden signed an Executive Order on Protecting Americans’ Sensitive Data from Foreign Adversaries (the “EO” or “Biden EO”). The Biden EO elaborates on measures to address the national emergency regarding the information technology supply chain declared in 2019 by the Trump administration in Executive Order 13873. Simultaneously, the Biden EO also revokes three Trump administration orders (Executive Orders 13942, 13943 and 13971) that sought to prohibit transactions with TikTok, WeChat, their parent companies and certain other “Chinese connected software applications.” In their place, the Biden EO provides for (1) cabinet-level assessments and future recommendations to protect against risks from foreign adversaries’ (a) access to U.S. persons’ sensitive data and (b) involvement in software application supply and development; and (2) the continuing evaluation of transactions involving connected software applications that threaten U.S. national security.
On December 14, 2020, the Federal Trade Commission announced that it had issued orders to nine social media and video streaming companies, requesting information on how the companies collect, use and present personal information, their advertising and user engagement practices and how their practices affect children and teens. The orders will assist the FTC in conducting a study of these policies, practices and procedures. The FTC issued the orders pursuant to Section 6(b) of the FTC Act, which allows the agency to undertake broad studies separate from its law enforcement activities.
On December 15, 2020, the Irish Data Protection Commission (“DPC”) announced its fine of €450,000 against Twitter International Company (“Twitter”), following its investigation into a breach resulting from a bug in Twitter’s design. The fine is the largest issued by the Irish DPC under the EU General Data Protection Regulation (“GDPR”) to date and is also its first against a U.S.-based organization.
On November 27, 2020, New Mexico Attorney General Hector Balderas filed a notice of appeal to the U.S. Court of Appeals for the Tenth Circuit in the lawsuit he brought against Google on February 20, 2020, regarding alleged violations of the federal Children’s Online Privacy Protection Act (“COPPA”) in connection with G-Suite for Education (“GSFE”). As we previously reported, the U.S. District Court of New Mexico had granted Google’s motion to dismiss, in which it asserted that its terms governed the collection of data through GSFE and that it had complied with COPPA by using schools both as “intermediaries” and as the parent’s agent for parental notice and consent, in line with Federal Trade Commission Guidance.
On September 25, 2020, the District Court of New Mexico granted Google’s motion to dismiss a lawsuit filed on February 20, 2020, by New Mexico Attorney General Hector Balderas alleging, among other claims, that the company violated the federal Children’s Online Privacy Protection Act (“COPPA” or the “Act”) by using G Suite for Education to “spy on New Mexico students’ online activities for its own commercial purposes, without notice to parents and without attempting to obtain parental consent.”
On September 18, 2020, the U.S. Department of Commerce (“Commerce”) announced detailed sanctions relating to the mobile applications WeChat and TikTok. These prohibitions were issued in accordance with President Trump’s Executive Orders issued on August 6, 2020, imposing economic sanctions against the platforms under the International Emergency Economic Powers Act (50 U.S.C. § 1701 et seq.) and the National Emergencies Act (50 U.S.C. § 1601 et seq.). These orders, if they become fully effective, will (1) prohibit mobile app stores in the U.S. from permitting downloads or updates to the WeChat and TikTok mobile apps; (2) prohibit U.S. companies from providing Internet backbone services that enable the WeChat and TikTok mobile apps; and (3) prohibit U.S. companies from providing services through the WeChat mobile app for the purpose of transferring funds or processing payments to or from parties. The sanctions do not target individual or business use of the applications but are expected to degrade the ability of persons in the United States to use the apps for the purposes they were designed to serve.
On September 7, 2020, the European Data Protection Board (the “EDPB”) published Guidelines on the Targeting of Social Media Users (the “Guidelines”). The Guidelines aim to provide practical guidance on the role and responsibilities of social media providers and those using targeting services, such as for targeted advertising, on social media platforms (“targeters”).
UPDATE: On September 29, 2020, California Governor Gavin Newsom vetoed AB 1138.
On September 8, 2020, AB 1138, the Parent’s Accountability and Child Protection Act, was enrolled and presented to the California Governor for signature. If signed into law by the Governor, the bill would require a business that operates a social media website or application, beginning July 1, 2021, to obtain verifiable parental consent for California-based children that the business “actually knows” are under 13 years of age (hereafter, “Children”). The bill defines “social media” to mean an electronic service or account held open to the general public to post, on either a public or semi-public page dedicated to a particular user, electronic content or communication, including but not limited to videos, photos or messages intended to facilitate the sharing of information, ideas, personal messages or other content.
The Age Appropriate Design Code (the “code”) created by the UK Information Commissioner’s Office (the “ICO”) has completed the Parliamentary process and was issued by the ICO on August 12, 2020. It will come into force on September 2, 2020, with a 12-month transition period for online services to conform to the code.
On August 6, 2020, President Trump signed executive orders imposing new economic sanctions under the International Emergency Economic Powers Act (50 U.S.C. § 1701 et seq.) and the National Emergencies Act (50 U.S.C. § 1601 et seq.) against TikTok, a video-sharing mobile application, and WeChat, a messaging, social media and mobile payments application. The orders potentially affect tens of millions of U.S. users of these applications and billions of users worldwide.
On May 19, 2020, the Belgian Data Protection Authority (the “Belgian DPA”) announced that the Litigation Chamber had imposed a €50,000 fine on a social media provider for unlawful processing of personal data in connection with the “invite-a-friend” function offered on its platform.
In part two of our podcast by Never Stop Learning, Lisa Sotto, partner and chair of Hunton Andrews Kurth’s Privacy and Cybersecurity practice, and Eric Friedberg, Co-President of Stroz Friedberg, LLC, and Aon’s Cyber Solutions Group, discuss the fragmented nature of data security law in the U.S. and abroad. Sotto notes that the “patchwork quilt of regulation” in the U.S. regarding data security makes it difficult for companies to know what rules to implement. She stresses that the severity of cyber attacks has increased significantly over the past decade.
Facebook disclosed on January 29, 2020, that it had agreed to pay $550,000,000 to resolve a biometric privacy class action filed by Illinois users under the Biometric Information Privacy Act (“BIPA”). BIPA is an Illinois law enacted in 2008 that governs the collection, use, sharing, protection and retention of biometric information. In recent years, numerous class action lawsuits have been filed under BIPA seeking statutory damages ranging from $1,000 per negligent violation to $5,000 per reckless or intentional violation.
On January 21, 2020, the UK Information Commissioner’s Office (“ICO”) published the final version of its Age Appropriate Design Code (“the code”), which sets out the standards that online services need to meet in order to protect children’s privacy. It applies to providers of information services likely to be accessed by children in the UK, including applications, programs, websites, social media platforms, messaging services, games, community environments and connected toys and devices, where these offerings involve the processing of personal data.
On December 11, 2019, an updated version of India’s draft data privacy bill was introduced in the Indian Parliament (the “Draft Bill”) by the Ministry of Electronics and Information Technology (“MeitY”). The Draft Bill updates a prior version submitted to MeitY in July 2018.
On October 30, 2019, Facebook reached a settlement with the UK Information Commissioner’s Office (“ICO”) under which it agreed to pay (without admission of liability) the £500,000 fine imposed by the ICO in 2018 in relation to the processing and sharing of its users’ personal data with Cambridge Analytica.
On July 29, 2019, the Court of Justice of the European Union (the “CJEU”) released its judgment in case C-40/17, Fashion ID GmbH & Co. KG vs. Verbraucherzentrale NRW eV. The Higher Regional Court of Düsseldorf (Oberlandesgericht Düsseldorf) requested a preliminary ruling from the CJEU on several provisions of the former EU Data Protection Directive of 1995, which was still applicable to the case since the court proceedings had started before the implementation of the EU General Data Protection Regulation (“GDPR”).
On June 14, 2019, the United States Court of Appeals for the Ninth Circuit affirmed summary judgment in favor of Facebook, holding that the company did not violate the Illinois Biometric Information Privacy Act (“BIPA”) (740 ILCS 14/15, 20).
On June 28, 2019, the French data protection authority (the “CNIL”) published its action plan for 2019-2020 to specify the rules applicable to online targeted advertising and to support businesses in their compliance efforts.
Social media platforms, file hosting sites, discussion forums, messaging services and search engines in the UK are likely to come under increased pressure to monitor and edit online content after the UK Department of Digital, Culture, Media and Sport (“DCMS”) announced in its Online Harms White Paper (the “White Paper”), released this month, proposals for a new regulatory framework to make companies more responsible for users’ online safety. Notably, the White Paper proposes a new duty of care owed to website users, and an independent regulator to oversee compliance.
On February 12, 2019, the European Data Protection Board (the “EDPB”) released its work program for 2019 and 2020 (the “Work Program”). Following the EDPB’s endorsement of the Article 29 Working Party guidelines and continued guidance relating to new EU General Data Protection Regulation (“GDPR”) concepts, the EDPB plans to shift its focus to more specialized areas and technologies.
On June 12, 2018, Vietnam’s parliament approved a new cybersecurity law that contains data localization requirements, among other obligations. Technology companies doing business in the country will be required to operate a local office and store information about Vietnam-based users within the country. The law also requires social media companies to remove offensive content from their online service within 24 hours at the request of the Ministry of Information and Communications and the Ministry of Public Security’s cybersecurity task force. Companies could face ...
On January 28, 2018, Facebook published its privacy principles and announced that it will centralize its privacy settings in a single place.
On October 24, 2017, an opinion issued by the EU’s Advocate General Bot (“Bot”) rejected Facebook’s assertion that its EU data processing activities fall solely under the jurisdiction of the Irish Data Protection Commissioner. The non-binding opinion was issued in relation to the CJEU case C-210/16, under which the German courts sought to clarify whether the data protection authority (“DPA”) in the German state of Schleswig-Holstein could take action against Facebook with respect to its use of web tracking technologies on a German education provider’s fan page without first providing notice.
On May 16, 2017, the Governor of the State of Washington, Jay Inslee, signed into law House Bill 1493 (“H.B. 1493”), which sets forth requirements for businesses who collect and use biometric identifiers for commercial purposes. The law will become effective on July 23, 2017. With the enactment of H.B. 1493, Washington becomes the third state to pass legislation regulating the commercial use of biometric identifiers. Previously, both Illinois and Texas enacted the Illinois Biometric Information Privacy Act (740 ILCS 14) (“BIPA”) and the Texas Statute on the Capture or Use of Biometric Identifier (Tex. Bus. & Com. Code Ann. §503.001), respectively.
On October 3, 2016, the Texas Attorney General announced a $30,000 settlement with mobile app developer Juxta Labs, Inc. (“Juxta”) stemming from allegations that the company violated Texas consumer protection law by engaging in false, deceptive or misleading acts or practices regarding the collection of personal information from children.
On August 25, 2016, WhatsApp announced in a blog post that the popular mobile messaging platform updated its Terms of Service and Privacy Policy to permit certain information sharing with Facebook. After Facebook acquired WhatsApp in 2014, the Director of the FTC’s Bureau of Consumer Protection wrote a letter to both Facebook and WhatsApp that discussed the companies’ obligations to honor privacy statements made to consumers in connection with the acquisition.
On December 27, 2015, the Standing Committee of the National People’s Congress of the People’s Republic of China published the P.R.C. Anti-Terrorism Law. The law was enacted in response to a perceived growing threat from extremists and terrorists, particularly in regions in Western China, and came into effect on January 1, 2016.
On October 15 and 16, 2015, Hunton & Williams is pleased to sponsor PDP’s 14th Annual Data Protection Compliance Conference in London. Bridget Treacy, Head of the UK Privacy and Cybersecurity practice at Hunton & Williams, chairs the conference, which features speakers from the data protection industry, including Christopher Graham, UK Information Commissioner, and Rosemary Jay, senior consultant attorney at Hunton & Williams.
On August 7, 2015, Delaware Governor Jack Markell signed four bills into law concerning online privacy. The bills, drafted by the Delaware Attorney General, focus on protecting the privacy of website and mobile app users, children, students and crime victims.
Recent class actions filed against Facebook and Shutterfly are the first cases to test an Illinois law that requires consent before biometric information may be captured for commercial purposes. Although the cases focus on biometric capture activities primarily in the social media realm, these cases and the Illinois law at issue have ramifications for any business that employs biometric-capture technology, including those that use it for security or sales-and-marketing purposes. In a recent article published in Law360, Hunton & Williams partner Torsten M. Kracht and associate Rachel E. Mossman discuss how businesses already using these technologies need to keep abreast of new legislation that might affect the legality of their practices, and how businesses considering the implementation of these technologies should consult local rules and statutes before deploying biometric imaging.
On June 9, 2015, Max Schrems tweeted that the Advocate General of the European Court of Justice (“ECJ”) will delay his opinion in Europe v. Facebook, a case challenging the U.S.-EU Safe Harbor Framework. The opinion was previously scheduled to be issued on June 24. No new date has been set.
On May 13, 2015, the Belgian Data Protection Authority (the “DPA”) published a recommendation addressing the use of social plug-ins associated with Facebook and its services (the “Recommendation”). The Recommendation stems from the recent discussions between the DPA and Facebook regarding Facebook’s privacy policy and the tracking of individuals’ Internet activities.
On May 11, 2015, the French Data Protection Authority (“CNIL”) and the UK Information Commissioner’s Office (“ICO”) announced that they will participate in a coordinated online audit to assess whether websites and apps that are directed toward children, and those that are frequently used by or popular among children, comply with global privacy laws. The audit will be coordinated by the Global Privacy Enforcement Network (“GPEN”), a global network of approximately 50 data protection authorities (“DPAs”) from around the world.
On January 1, 2015, Finland’s Information Security Code (2014/917, the “Code”) became effective. The Code introduces substantial revisions to Finland’s existing electronic communications legislation and consolidates several earlier laws into a single, unified text. Although many of these earlier laws remain unchanged, the Code includes extensive amendments in a number of areas.
On January 14, 2015, the data protection authority of the German federal state of Schleswig-Holstein (“Schleswig DPA”) issued an appeal challenging a September 4, 2014 decision by the Administrative Court of Appeals, which held that companies using Facebook’s fan pages cannot be held responsible for data protection law violations committed by Facebook because the companies do not have any control over the use of the data.
As reported in the Hunton Employment & Labor Perspectives Blog:
In Purple Communications, Inc., a divided National Labor Relations Board (“NLRB”) held that employees have the right to use their employers’ email systems during non-working time for statutorily protected communications, including communications concerning self-organization and other terms and conditions of employment. In making this determination, the NLRB reversed its divided 2007 decision in Register Guard, which held that employees have no statutory right to use their employer’s email systems for Section 7 purposes.
On September 30, 2014, California Governor Jerry Brown announced the recent signings of several bills that provide increased privacy protections to California residents. The newly-signed bills are aimed at protecting student privacy, increasing consumer protection in the wake of a data breach, and expanding the scope of California’s invasion of privacy and revenge porn laws. Unless otherwise noted, the laws will take effect on January 1, 2015.
On April 21, 2014, the Securities and Exchange Commission’s Division of Corporation Finance published new Compliance and Disclosure Interpretations (“C&DIs”) concerning the use of social media in certain securities offerings, business combinations and proxy contests. Notably, the C&DIs permit the use of an active hyperlink to satisfy the cautionary legend requirements in social media communications when the social media platform limits the text or number of characters that may be included (e.g., Twitter). The C&DIs also clarify that postings or messages re-transmitted by unrelated third parties generally will not be attributable to the issuer (so issuers will not be required to ensure that third parties comply with the guidance). In addition, requirements regarding cautionary legends contemplated by the C&DIs apply to both issuers and other soliciting parties in proxy fights or tender offers. Accordingly, although the new guidance will allow issuers to communicate with their shareholders and potential investors via social media, it also may prove useful to activists in proxy fights and tender offers.
On April 10, 2014, the Federal Trade Commission announced that the Director of the FTC’s Bureau of Consumer Protection had notified Facebook and WhatsApp Inc., reminding both companies of their obligation to honor privacy statements made to consumers in connection with Facebook’s proposed acquisition of WhatsApp.
On March 28, 2014, the 87th Conference of the German Data Protection Commissioners concluded in Hamburg. This biannual conference provides a private forum for the 17 German state data protection authorities (“DPAs”) and the Federal Commissioner for Data Protection and Freedom of Information, Andrea Voßhoff, to share their views on current issues, discuss relevant cases and adopt Resolutions aimed at harmonizing how data protection law is applied across Germany.
Join us in New York City on May 19-20, 2014, for the Privacy, Policy & Technology Summit – A High Level Briefing for Today’s Top Privacy Executives. Lisa Sotto, partner and head of the Global Privacy and Cybersecurity practice at Hunton & Williams LLP will be a featured speaker at the session on “Cybersecurity: Insider Tips for Proactively Protecting Your Company and Its Data While Reducing Downstream Regulatory and Litigation Exposure.”
On December 16, 2013, the French Data Protection Authority (“CNIL”) released a set of practical FAQs (plus technical tools and relevant source code) providing guidance on how to obtain consent for the use of cookies and similar technologies in compliance with EU and French data protection requirements (the “CNIL’s Guidance”). Article 5.3 of the revised e-Privacy Directive 2002/58/EC imposes an obligation to obtain prior consent before placing or accessing cookies and similar technologies on web users’ devices. Article 32-II of the French Data Protection Act transposes this obligation into French law.
On October 22, 2013, the Federal Trade Commission announced a proposed settlement with Aaron’s, Inc. (“Aaron’s”) stemming from allegations that it knowingly assisted its franchisees in spying on consumers. Specifically, the FTC alleged that Aaron’s facilitated its franchisees’ installation and use of software on computers rented to consumers that surreptitiously tracked consumers’ locations, took photographs of consumers in their homes, and recorded consumers’ keystrokes in order to capture login credentials for email, financial and social media accounts. The FTC had previously settled similar allegations against Aaron’s and several other companies.
As reported in the Hunton Employment & Labor Perspectives Blog:
The U.S. District Court for the District of New Jersey recently ruled that non-public Facebook wall posts are protected under the federal Stored Communications Act (the “SCA”) in Ehling v. Monmouth-Ocean Hospital Service Corp., No. 2:11-CV-3305 (WMJ) (D.N.J. Aug. 20, 2013). The plaintiff was a registered nurse and paramedic at Monmouth-Ocean Hospital Service Corp. (“MONOC”). She maintained a personal Facebook profile and was “Facebook friends” with many of her coworkers but none of the MONOC managers. She adjusted her privacy preferences so that only her “Facebook friends” could view the messages she posted to her Facebook wall. Unbeknownst to the plaintiff, a coworker who was also a “Facebook friend” took screenshots of the plaintiff’s wall posts and sent them to a MONOC manager. When the manager learned of a wall post in which the plaintiff criticized Washington, D.C. paramedics for their response to a museum shooting, MONOC temporarily suspended the plaintiff with pay and delivered a memo warning her that the wall post reflected a “deliberate disregard for patient safety.” The plaintiff subsequently filed suit alleging violations of the SCA, among other claims.