As reported on the Hunton Employment & Labor Perspectives blog, on October 24, 2024, the Consumer Financial Protection Bureau (“CFPB”) issued a policy statement (known as a Circular) to explain the link between the Fair Credit Reporting Act (“FCRA”) and employers’ growing use of artificial intelligence (“AI”) to evaluate, rank and score applicants and employees. Employers should take note that the FCRA does not apply only to criminal history or credit reports. As the use of advanced data analysis and AI rises, employers should ensure that they are not running afoul of the FCRA’s requirements.
On October 21, 2024, the U.S. Department of Justice National Security Division issued a Notice of Proposed Rulemaking implementing Executive Order 14117 that would restrict certain transactions with high-risk countries.
On October 16, 2024, the New York Department of Financial Services (“NYDFS”) issued an Industry Letter warning companies to update their AI-related security procedures around multifactor authentication, which is potentially vulnerable to deepfakes and AI-supplemented social engineering attacks.
On September 13, 2024, the Colorado Department of Law issued proposed draft amendments to the Colorado Privacy Act (“CPA”) Rules and a notice of proposed rulemaking addressing biometric data, minors’ online privacy, and a framework for opinion letters and interpretative guidance.
On September 3, 2024, the Dutch Data Protection Authority announced a €30.5 million fine against Clearview AI for the processing of personal data related to its biometric data database.
As reported on the Hunton Retail Law resource blog, on August 2, 2024, Illinois amended its Biometric Information Privacy Act (“BIPA”), curbing the potential for massive damages and modernizing the law’s written consent provisions. On their face, the amendments are not retroactive. It remains unclear, however, whether this change in Illinois law will nonetheless be applied retroactively by the courts.
On July 30, 2024, Texas AG Ken Paxton announced that Meta agreed to pay $1.4 billion to settle a lawsuit over allegations that Meta processed facial geometry data of Texas residents in violation of Texas law, including the Texas Capture or Use of Biometric Identifier Act (“CUBI”).
In May 2024, Colorado Governor Jared Polis signed into law a bill that amends the Colorado Privacy Act and introduces new obligations for processors of biometric data. The law goes into effect on July 1, 2025.
On June 17, 2024, the United States Court of Appeals for the Ninth Circuit issued an opinion in Zellmer v. Meta Platforms, Inc., No. 22-16925 (9th Cir. June 17, 2024), affirming the Northern District of California’s order granting summary judgment in favor of Meta and dismissing the action for lack of standing. Clayton Zellmer, an individual who had never used Facebook, brought claims against the social media company under the Illinois Biometric Information Privacy Act (“BIPA”), alleging that Meta had improperly obtained his biometric data from photos Zellmer’s friends had uploaded to the platform. Zellmer alleged that Facebook’s “Tag Suggestions” feature, which created a “face signature” using photos of Zellmer, violated Sections 15(a) and 15(b) of BIPA by collecting, using, and storing his biometric identifiers without first obtaining his written consent or establishing a public retention schedule. On appeal, the Ninth Circuit held that “face signatures” are not biometric information or identifiers, and thus are not subject to BIPA.
On May 16, 2024, the Illinois House of Representatives passed S.B. 2979, following the bill’s passage in the Illinois Senate in April. S.B. 2979 would amend certain definitions in the Illinois Biometric Information Privacy Act and limit liability for businesses with multiple duplicative BIPA violations relating to the same individual.
On May 23, 2024, the European Data Protection Board adopted an Opinion on the use of facial recognition technologies by airport operators and airline companies to streamline the passenger flow at airports.
The Maryland legislature recently passed the Maryland Online Data Privacy Act of 2024 (“MODPA”), which was delivered to Governor Wes Moore for signature and, if enacted, will impose robust requirements with respect to data minimization, the protection of sensitive data, and the processing and sale of minors’ data.
The Centre for Information Policy Leadership (“CIPL”) at Hunton Andrews Kurth recently released a report on Enabling Beneficial and Safe Uses of Biometric Technology Through Risk-Based Regulations (the “Report”). The Report examines global laws and regulations that target biometric data and encourages adoption of a risk-based approach. According to the Report, biometric technology applications are growing and can provide societal and economic benefits. However, there are recognized concerns over potential harms for individuals and their rights, and data protection and privacy laws are increasingly targeting the collection and use of biometric data.
On April 7, 2024, U.S. Sen. Maria Cantwell (D-WA) and U.S. Rep. Cathy McMorris Rodgers (R-WA) released a discussion draft of the latest federal privacy proposal, known as the American Privacy Rights Act (“APRA” or the “Act”). The APRA builds upon the American Data Privacy and Protection Act (“ADPPA”), which was introduced as H.R. 8152 in the 117th Congress and advanced out of the House Energy and Commerce Committee but did not become law. As the latest iteration of a federal privacy proposal, the APRA signals that some members of Congress continue to seek to create a federal standard in the wake of—and in spite of—the ever-growing patchwork of state privacy laws.
On February 28, 2024, President Biden released an Executive Order (“EO”) “addressing the extraordinary and unusual national security threat posed by the continued effort of certain countries of concern to access Americans’ bulk sensitive personal data and certain U.S. Government-related data.” In tandem with the EO, the Department of Justice’s (“DOJ’s”) National Security Division is set to issue an advance notice of proposed rulemaking (“ANPRM”) pursuant to the EO, which directs the DOJ to “establish, implement and administer new and targeted national security programming” to address the threat. The DOJ regulations will identify specific categories of “data transactions” that are prohibited or restricted due to their “unacceptable risk to national security.”
On February 23, 2024, the UK Information Commissioner’s Office (the “ICO”) reported that it had ordered public service providers Serco Leisure, Serco Jersey and associated community leisure trusts (jointly, “the Companies”) to stop using facial recognition technology (“FRT”) and fingerprint scanning (“FS”) to monitor employee attendance.
On January 22, 2024, a draft of the final text of the EU Artificial Intelligence Act (“AI Act”) was leaked to the public. The leaked text substantially diverges from the original proposal by the European Commission, which dates back to 2021. The AI Act includes elements from both the European Parliament’s and the Council’s proposals.
On May 18, 2023, the Federal Trade Commission issued a policy statement on “Biometric Information and Section 5 of the Federal Trade Commission Act.” The statement warns that the use of consumer biometric information and related technologies raises “significant concerns” regarding privacy, data security, and bias and discrimination, and makes clear the FTC’s commitment to combatting unfair or deceptive acts and practices related to the collection and use of consumers’ biometric information and the marketing and use of biometric information technologies.
On May 17, 2023, the European Data Protection Board (“EDPB”) adopted the final version of its Guidelines on facial recognition technologies in the area of law enforcement (the “Guidelines”). The Guidelines address lawmakers at the EU and EU Member State level, as well as law enforcement authorities and their officers implementing and using facial recognition technology.
On February 17, 2023, the Illinois Supreme Court issued an opinion in Cothron v. White Castle Systems, Inc., in response to a certified question from the Seventh Circuit, ruling that the plain language of Sections 15(b) and 15(d) of the Illinois Biometric Information Privacy Act (“BIPA”) shows that a claim accrues under BIPA with every scan or transmission of biometric identifiers or biometric information without prior informed consent.
On February 10, 2023, an Illinois federal district court ordered the dismissal of a putative class action lawsuit alleging that an online tool that allowed users to virtually try on sunglasses violated the Illinois Biometric Information Privacy Act (“BIPA”).
On February 2, 2023, the Illinois Supreme Court reversed in part and remanded a judgment of the lower appellate court in a class action lawsuit alleging violation of the Illinois Biometric Information Privacy Act (“BIPA”).
On January 3, 2023, an Illinois state court entered a preliminary approval order for a settlement of nearly $300,000 in a class action lawsuit against Whole Foods for claims that the company violated the Illinois Biometric Information Privacy Act (“BIPA”). The plaintiffs alleged that Whole Foods unlawfully collected voiceprints from employees who worked at the company’s distribution centers.
On December 31, 2022, Baltimore’s ordinance banning the private sector’s use of facial recognition technology expired. The ordinance, which was enacted in 2021, banned private entities and individuals within the city limits from using facial recognition technology, including obtaining, retaining, accessing or using a “face surveillance system” or any information obtained from such system. The Baltimore ordinance followed a similar ban on the use of facial recognition technology by private sector companies in Portland, Oregon, enacted in 2020. New York City also passed an ordinance in 2021 regulating commercial establishments’ use of biometric technology.
On December 21, 2022, the Colorado Attorney General published an updated version of the draft rules to the Colorado Privacy Act (“CPA”). The draft, which follows the first iteration of the proposed rules published on October 10, 2022, solicits comments on five topics: (1) new and revised definitions; (2) the use of IP addresses to verify consumer requests; (3) a proposed universal opt-out mechanism; (4) streamlining the privacy policy requirements; and (5) bona fide loyalty programs.
On December 20, 2022, a former employee in Illinois brought a class action suit against Five Guys Enterprises, LLC (“Five Guys”), a burger chain, alleging that Five Guys violated the Illinois Biometric Information Privacy Act (“BIPA”).
On November 30, 2022, the Second District Appellate Court of Illinois reversed and remanded a grant of summary judgment in favor of the defendant, J&M Plating, Inc., in a suit alleging violation of the Illinois Biometric Information Privacy Act (“BIPA”). In Mora v. J&M Plating, Inc., the plaintiff claimed that J&M Plating had violated BIPA by collecting workers’ fingerprints without a proper data retention and destruction policy for biometric information.
On November 22, 2022, the Department of Commerce’s National Telecommunications and Information Administration (“NTIA”) announced that it filed comments with the Federal Trade Commission that call for new limits on how companies can collect and use personal information about consumers. The comments were filed in response to the FTC’s request for public comment on its Advance Notice of Proposed Rulemaking on commercial surveillance and lax data security practices.
On October 1, 2022, the Colorado Attorney General’s Office submitted an initial draft of the Colorado Privacy Act Rules (“CPA Rules”), which will implement and enforce the Colorado Privacy Act (“CPA”). The CPA Rules, which are currently about 38 pages, address many recent issues in state data privacy regulation, including data profiling, data protection, automated data processing, biometric data, universal opt-out mechanisms and individual data rights.
On October 20, 2022, Texas Attorney General Ken Paxton brought suit against Google alleging various violations of Texas’s biometric privacy law, including that the company unlawfully collected and used the biometric data of millions of Texans without obtaining proper consent. The lawsuit alleges that, since 2015, Google has collected millions of biometric identifiers of Texas consumers, such as voiceprints and records of face geometry, through Google’s various products, including Google Photos, Google Assistant and Nest Hub Max, in violation of Texas’s biometric privacy law. Texas’s biometric privacy law prohibits the collection of biometric identifiers for a commercial purpose unless the individual whose biometric identifiers are collected is informed of the collection and provides consent. The law also requires companies to destroy biometric identifiers within a reasonable time, but not later than the first anniversary of the date the purpose for collecting the biometric identifier expires (except in limited circumstances).
On October 17, 2022, the French Data Protection Authority (the “CNIL”) imposed a €20 million fine on Clearview AI for unlawful use of facial recognition technology. The fine was imposed after the CNIL’s prior formal notice remained unaddressed by Clearview AI.
On October 12, 2022, a federal jury found that BNSF Railway, operator of one of the largest freight railroad networks in North America, violated the Illinois Biometric Information Privacy Act (“BIPA”) in the first-ever BIPA case to go to trial. In Richard Rogers v. BNSF Railway Company (Case No. 19-C-3083, N.D. Ill.), truck drivers’ fingerprints were scanned for identity verification purposes when visiting BNSF rail yards to pick up and drop off loads. The jury found that BNSF recklessly or intentionally violated the law 45,600 times when it collected such fingerprint scans without written, informed permission or notice.
On October 12, 2022, the UK Information Commissioner's Office (“ICO”) launched a public consultation on its draft guidance on employers’ obligations when monitoring at work (“Draft Guidance”). In addition, the ICO has published an impact scoping document, which outlines some of the context and potential impacts of the Draft Guidance (“Impact Scoping Document”).
On July 28, 2022, a federal judge approved TikTok’s $92 million class action settlement of various privacy claims made under state and federal law. The agreement will resolve litigation that began in 2019 and involved claims that TikTok, owned by the Chinese company ByteDance, violated the Illinois Biometric Information Privacy Act (“BIPA”) and the federal Video Privacy Protection Act (“VPPA”) by improperly harvesting users’ personal data. U.S. District Court Judge John Lee of the Northern District of Illinois also awarded approximately $29 million in fees to class counsel.
On June 1, 2022, Thailand’s Personal Data Protection Act (“PDPA”) entered into force after three years of delays. The PDPA, originally enacted in May 2019, provides for a one-year grace period, with the main operative provisions of the law originally set to come into force in 2020. Due to the COVID-19 pandemic, however, the Thai government issued royal decrees to extend the compliance deadline to June 1, 2022.
As reported in the Hunton Employment & Labor Perspectives Blog:
Assembly Bill 1651, or the Workplace Technology Accountability Act, a new bill proposed by California Assembly Member Ash Kalra, would regulate employers and their vendors regarding the use of employee data. Under the bill, data is defined as “any information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular worker, regardless of how the information is collected, inferred, or obtained.” Examples of data include personal identity information; biometric information; health, medical, lifestyle, and wellness information; any data related to workplace activities; and online information. The bill confers certain data rights on employees, including the right to access and correct their data.
On February 14, 2022, Texas Attorney General Ken Paxton brought suit against Meta, the parent company of Facebook and Instagram, over the company’s collection and use of biometric data. The suit alleges that Meta collected and used Texans’ facial geometry data in violation of the Texas Capture or Use of Biometric Identifier Act (“CUBI”) and the Texas Deceptive Trade Practices Act (“DTPA”). The lawsuit is significant because it represents the first time the Texas Attorney General’s Office has brought suit under CUBI.
On November 2, 2021, Facebook parent Meta Platforms Inc. announced in a blog post that it will shut down its “Face Recognition” system in coming weeks as part of a company-wide move to limit the use of facial recognition in its products. The company cited the need to “weigh the positive use cases for facial recognition against growing societal concerns, especially as regulators have yet to provide clear rules.”
On October 1, 2021, Connecticut’s two new data security laws become effective. As we previously reported, the new laws modify Connecticut’s existing breach notification requirements and establish a safe harbor from certain Connecticut Superior Court assessed damages for businesses that create and maintain a written cybersecurity program.
On September 17, 2021, in Tims v. Black Horse Carriers Inc., Ill. App. Ct., 1st Dist., No. 1-20-563, the Illinois Appellate Court, in a case of first impression at the appellate level, addressed the statute of limitations under the state’s Biometric Information Privacy Act (“BIPA”), holding that a five-year period applies to BIPA claims that allege the failure to (1) provide notice of the collection of biometric data, (2) take care in storing or transmitting biometric data, or (3) develop a publicly available retention and destruction schedule for biometric data. The Court also held that a one-year period applies to claims alleging the improper disclosure of, or improper sale, lease, trade or profit from, biometric data.
On September 14, 2021, the Federal Trade Commission authorized new compulsory process resolutions in eight key enforcement areas: (1) Acts or Practices Affecting United States Armed Forces Members and Veterans; (2) Acts or Practices Affecting Children; (3) Bias in Algorithms and Biometrics; (4) Deceptive and Manipulative Conduct on the Internet; (5) Repair Restrictions; (6) Abuse of Intellectual Property; (7) Common Directors and Officers and Common Ownership; and (8) Monopolization Offenses.
On August 9, 2021, Baltimore joined Portland, Oregon and New York City in enacting a local ordinance regulating the private sector’s use of facial recognition technology. Baltimore’s ordinance will become effective on September 8, 2021. Read our earlier post for more details about Baltimore’s ban on the use of facial recognition technology by private entities and individuals within its city limits.
On September 1, 2021, the South Korean Personal Information Protection Commission (“PIPC”) issued fines against Netflix and Facebook for violations of the Korean Personal Information Protection Act (“PIPA”).
On August 20, 2021, China’s 13th Standing Committee of the National People’s Congress passed the Personal Information Protection Law (the “PIPL”). As we previously reported, the PIPL is China’s first comprehensive data protection law. It is modeled, in part, on other jurisdictions’ omnibus data protection regimes, including the EU General Data Protection Regulation (“GDPR”). The PIPL will become effective on November 1, 2021. Below are some of the key provisions under the PIPL.
Connecticut recently passed two cybersecurity laws that will become effective on October 1, 2021. The newly passed laws modify Connecticut’s existing breach notification requirements and establish a safe harbor for businesses that create and maintain a written cybersecurity program that complies with applicable state or federal law or industry-recognized security frameworks.
On July 27, 2021, the Spanish Data Protection Authority (the “AEPD”) imposed a €2,520,000 fine on Spanish supermarket chain Mercadona, S.A. for unlawful use of a facial recognition system.
On June 14, 2021, the Baltimore City Council passed a bill that would ban the use of facial recognition technology by private entities and individuals within the city limits. If the bill is signed into law, Baltimore, Maryland would become the latest U.S. city to enact stringent regulations governing the use of facial recognition technology in the private sector.
On April 23, 2021, the National Information Security Standardization Technical Committee of China published a draft standard (in Chinese) on Security Requirements of Facial Recognition Data (the “Standard”). The Standard, which is non-mandatory, details requirements for collecting, processing, sharing and transferring data used for facial recognition.
On April 23, 2021, the Centre for Information Policy Leadership (“CIPL”) at Hunton Andrews Kurth submitted its response to the European Data Protection Board (“EDPB”) consultation on draft guidelines on virtual voice assistants (the “Guidelines”). The Guidelines were adopted on March 12, 2021 for public consultation.
Building upon its April 2020 business guidance on Artificial Intelligence and algorithms, on April 19, 2021, the FTC published new guidance focused on how businesses can promote truth, fairness and equity in their use of AI.
On April 21, 2021, the European Commission (the “Commission”) published its Proposal for a Regulation on a European approach for Artificial Intelligence (the “Artificial Intelligence Act”). The Proposal follows a public consultation on the Commission’s white paper on AI published in February 2020. The Commission simultaneously proposed a new Machinery Regulation, designed to ensure the safe integration of AI systems into machinery.
On March 15, 2021, China’s State Administration for Market Regulation (“SAMR”) issued Measures for the Supervision and Administration of Online Transactions (the “Measures”) (in Chinese). The Measures implement rules for the E-commerce Law of China and provide specific rules for addressing registration of an online operation entity, supervision of new business models (such as social e-commerce and livestreaming), platform operators’ responsibilities, protection of consumers’ rights and protection of personal information.
On March 12, 2021, the European Data Protection Board (“EDPB”) published its Guidelines 01/2021 on Virtual Voice Assistants for consultation (the “Guidelines”). Virtual voice assistants (“VVAs”) understand and execute voice commands or coordinate with other IT systems. These tools are available on most smartphones and other devices and collect significant amounts of personal data, including through user commands. In addition, VVAs require a terminal device equipped with a microphone and transfer data to remote services. These activities raise compliance issues under both the General Data Protection Regulation (“GDPR”) and the e-Privacy Directive.
On January 11, 2021, the FTC announced that Everalbum, Inc. (“Everalbum”), developer of the “Ever” photo storage app, agreed to a settlement over allegations that the company deceived consumers about its use of facial recognition technology and its retention of the uploaded photos and videos of users who deactivated their accounts.
On January 10, 2021, New York City enacted a new biometrics ordinance that regulates the commercial use and sale of biometric identifier information.
On December 22, 2020, New York Governor Andrew Cuomo signed into law legislation that temporarily bans the use or purchase of facial recognition and other biometric identifying technology in public and private schools until at least July 1, 2022. The legislation also directs the New York Commissioner of Education (the “Commissioner”) to conduct a study on whether this technology is appropriate for use in schools.
On November 12, 2020, Chief Judge Nancy J. Rosenstengel of the U.S. District Court for the Southern District of Illinois rejected Apple Inc.’s (“Apple’s”) motion to dismiss a class action alleging its facial recognition software violates Illinois’ Biometric Information Privacy Act (“BIPA”). Judge Rosenstengel agreed with Apple, however, that the federal court lacks subject matter jurisdiction over portions of the complaint.
On September 9, 2020, Portland, Oregon became the first jurisdiction in the country to ban the private-sector use of facial recognition technology in public places within the city, including stores, restaurants and hotels. The city Ordinance was unanimously passed by the Portland City Council and will take effect on January 1, 2021. The City Council cited as rationale for the Ordinance documented instances of gender and racial bias in facial recognition technology, and the fact that marginalized communities have been subject to “over surveillance and [the] disparate and detrimental impact of the use of surveillance.”
On August 4, 2020, Senators Jeff Merkley (OR) and Bernie Sanders (VT) introduced the National Biometric Information Privacy Act of 2020 (the “bill”). The bill would require companies to obtain individuals’ consent before collecting biometric data. Specifically, the bill would prohibit private companies from collecting biometric data—including eye scans, voiceprints, faceprints and fingerprints—without individuals’ written consent, and from profiting from biometric data. The bill provides individuals and state attorneys general the ability to institute legal proceedings against entities for alleged violations of the act.
Texas Attorney General Ken Paxton is investigating Facebook Inc. (“Facebook”) for alleged violations of the Texas Business and Commercial Code, which contains provisions governing the collection, retention and disclosure of biometric data. As we previously reported, Facebook recently reached a $650 million settlement for alleged violations of Illinois’ Biometric Information Privacy Act for its use of facial recognition software without permission from affected users.
On July 1, 2020, amendments to Vermont’s data breach notification law, signed into law earlier this year, will take effect along with Vermont’s new student privacy law.
Facebook disclosed on January 29, 2020, that it had agreed to pay $550 million to resolve a biometric privacy class action filed by Illinois users under the Biometric Information Privacy Act (“BIPA”). BIPA is an Illinois law enacted in 2008 that governs the collection, use, sharing, protection and retention of biometric information. In recent years, numerous class action lawsuits have been filed under BIPA seeking statutory damages ranging from $1,000 per negligent violation to $5,000 per reckless or intentional violation.
On October 11, 2019, California Governor Gavin Newsom signed into law AB 1130, which expands the types of personal information covered by California’s breach notification law to include, when compromised in combination with an individual’s name: (1) additional government identifiers, such as tax identification number, passport number, military identification number, or other unique identification number issued on a government document commonly used to verify the identity of a specific individual; and (2) biometric data generated from measurements or technical analysis of human body characteristics (e.g., fingerprint, retina, or iris image) used to authenticate a specific individual. Biometric data does not include a physical or digital photograph unless used or stored for facial recognition purposes.