Online Privacy Act of 2026 to Provide Personal Information Rights


On March 19, 2026, H.R. 8014, the “Online Privacy Act of 2026,” was introduced in the House and referred to the Energy and Commerce Committee. The bill currently has a single sponsor and no cosponsors: Rep. Zoe Lofgren [D-CA].

The stated purpose of the Bill is “To provide for individual rights relating to privacy of personal information, to establish privacy and security requirements for covered entities relating to personal information, and to establish an agency to be known as the Digital Privacy Agency to enforce such rights and requirements, and for other purposes.”

The bill would create a comprehensive, rights‑based federal privacy framework, impose broad data‑minimization and governance duties on most commercial data handlers, and establish a dedicated Digital Privacy Agency with robust enforcement tools, significantly tightening the baseline for data privacy in the U.S. relative to today’s largely sectoral regime.

The Data Privacy Compliance Group at Troutman Amin is here to provide you with a synopsis of the Bill.

Core structure and scope

  • The Act covers any “covered entity” that intentionally collects, processes, or maintains personal information and sends or receives it over the internet or similar networks, excluding natural persons engaged only in de minimis commercial activity and defining “small business” with detailed revenue, employee, and data‑volume thresholds.
  • “Personal information” is broadly defined to include any information linked or reasonably linkable to an individual or device, expressly including de‑identified personal information where the entity retains the means to re‑identify, while excluding truly non‑linkable derived data and lawfully available public information.
  • The Act applies to U.S. residents’ data regardless of where the processing occurs, and it prohibits covered entities from collecting, maintaining, processing, or disclosing personal information via channels of interstate commerce unless they comply with all requirements of the Act.

Individual rights

  • The bill grants a suite of GDPR‑style rights: access, correction, deletion, portability, human review of automated decisions, notice when data are collected from third‑party sources, and a “right to impermanence” limiting retention to periods expressly consented to by the individual.
  • Access must include categories of personal information, communication contents, sources of data, and lists of third parties, subsidiaries, and corporate affiliates to which information has been disclosed, along with the purposes for processing and disclosing.
  • Individuals can dispute accuracy where data are processed in ways that may increase reasonably foreseeable significant privacy harms, and entities must correct or explain why correction is denied (including by reference to defined exceptions).
  • Deletion rights extend to data collected from third parties and to inferences, subject to exemptions for legal obligations, safety, free expression, and transaction integrity, which the Director is empowered to cabin via rulemaking; service providers are generally exempt from direct rights obligations, which fall on the data‑controlling covered entities.
  • The portability right is unusually strong: for “portable categories” of services (defined by user scale and concentration metrics), entities must support both user download and a real‑time API for transmitting all personal information and communications contents to another covered entity, with a self‑certification and dispute system overseen by the Digital Privacy Agency.
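The bill does not prescribe any particular API shape, but the kind of machine‑readable export the “portable categories” requirement contemplates can be sketched as follows. This is a hypothetical illustration in Python; the field names (`subject_id`, `categories`, `sources`, `disclosed_to`) and the `build_export` helper are invented for the example and do not come from the bill text.

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class PortableExport:
    """Hypothetical machine-readable export for one individual's data."""
    subject_id: str
    generated_at: str
    categories: dict = field(default_factory=dict)   # category -> records held
    sources: list = field(default_factory=list)      # where the data came from
    disclosed_to: list = field(default_factory=list) # third parties/affiliates

def build_export(subject_id: str, store: dict) -> str:
    """Assemble everything held on one individual as portable JSON.

    `store` stands in for whatever internal systems actually hold the data;
    its layout (keys "categories", "sources", "disclosed_to") is illustrative.
    """
    export = PortableExport(
        subject_id=subject_id,
        generated_at=datetime.now(timezone.utc).isoformat(),
        categories=store.get("categories", {}),
        sources=store.get("sources", []),
        disclosed_to=store.get("disclosed_to", []),
    )
    return json.dumps(asdict(export), indent=2)
```

In practice the same structure could back both the user‑download path and the real‑time transmission API the bill requires, since both must carry the full set of personal information and communications contents.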

Algorithmic autonomy and communications limits

  • The Act creates an explicit “right to individual autonomy” by requiring affirmative express consent before using an individual’s personal information to create or operate behavioral personalization systems designed to alter, influence, guide, or predict behavior, or to filter and rank content at an individualized level.
  • Consent for behavioral personalization must be obtained before a behaviorally personalized product or service is provided and at least annually; if the user does not consent, the entity must provide a non‑behaviorally‑personalized version or, where that is infeasible, at least a core non‑personalized version, with limited ability to deny access when a core feature cannot function without personalization.
  • Separate from personalization, the Act sharply limits the collection, processing, maintenance, and disclosure of the contents of communications, allowing such processing only for narrow purposes such as transmission and display, security, user‑requested processing, legal obligations, commercial spam filtering, and enforcement of abuse‑based bans.
  • Communications protections expressly preserve end‑to‑end encryption: entities may not prevent users from using encryption that the entity cannot decrypt, nor require users to disclose keys or circumvention methods, and are barred from treating themselves as “intended recipients” in many user‑to‑user contexts.
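The consent‑then‑fallback logic for behavioral personalization described above — affirmative express consent before a personalized product is served, re‑confirmed at least annually, with a non‑personalized version as the default — can be sketched as follows. All names here (`serve_feed`, `CONSENT_VALIDITY`) are hypothetical; only the annual re‑consent rule and the non‑personalized fallback come from the bill.

```python
from datetime import datetime, timedelta, timezone

# The bill requires consent to be refreshed at least annually.
CONSENT_VALIDITY = timedelta(days=365)

def serve_feed(consented_at, personalized_items, non_personalized_items, now=None):
    """Serve the behaviorally personalized feed only with current consent.

    `consented_at` is the timestamp of the user's last affirmative express
    consent, or None if consent was never given or was revoked.
    """
    now = now or datetime.now(timezone.utc)
    consent_current = (
        consented_at is not None and now - consented_at <= CONSENT_VALIDITY
    )
    if consent_current:
        return personalized_items
    # No consent, or consent older than a year: fall back to the
    # non-behaviorally-personalized version of the service.
    return non_personalized_items
```

Note that under the bill a denial of access is permitted only in the narrow case where a core feature cannot function at all without personalization; the default path is the fallback, not a lockout.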

Data‑minimization and governance duties

  • Title II imposes an overarching duty of “minimization”: entities must have a reasonable, articulated basis for data collection and processing that balances business need against intrusion, potential privacy harms, and reasonable expectations, and they may not collect more data than reasonably needed to provide requested services.
  • Processing, maintaining, and disclosing must remain tied to the purposes for which data were originally collected (or, for service providers, as directed by another covered entity), with only tightly defined ancillary processing allowed without consent; many ancillary uses require notice, and higher‑risk ancillary uses require both notice and consent.
  • Entities must, where feasible without unreasonable effort or substantial loss of utility, substitute de‑identified, random, or artificial personal information for identified data, thereby pushing toward privacy‑preserving computation wherever possible.
  • Employee and contractor access to personal information must be minimized and logged in detail when breaches or data‑sharing abuses could foreseeably lead to increased privacy harms, including who accessed what, when, and about whom; this creates an audit trail for insider misuse and incident investigations.
  • De‑identification comes with strict conditions (technical safeguards, business processes against re‑identification, no attempts to re‑identify, and the Director’s power to deem methodologies insufficient), and there is an explicit prohibition on re‑identifying de‑identified data except in narrow research contexts approved by the Director.
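The access‑logging duty above — recording who accessed what, when, and about whom — amounts to an append‑only audit trail. A minimal sketch, assuming an in‑memory list as the log (the schema and function names are invented; the bill mandates the log’s substance, not any particular format):

```python
from datetime import datetime, timezone

def log_access(log, employee_id, subject_id, fields_accessed, purpose):
    """Append one audit-trail entry: who accessed what, when, about whom."""
    log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "employee_id": employee_id,
        "subject_id": subject_id,
        "fields_accessed": ";".join(sorted(fields_accessed)),
        "purpose": purpose,
    })

def entries_about(log, subject_id):
    """For incident investigation: every access touching one individual."""
    return [e for e in log if e["subject_id"] == subject_id]
```

A real implementation would write to tamper‑evident storage, but even this shape shows how the requirement supports both insider‑misuse detection and the individual‑level accounting the Act’s access rights contemplate.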

Cross‑border data flows and third‑party risk

  • Covered entities are generally barred from disclosing personal information to entities outside the U.S. jurisdiction unless the recipients independently comply with the Act or have formal agreements with the Digital Privacy Agency; for foreign entities, DPA agreements require consent to U.S. jurisdiction and enforceability of U.S. judgments in their home countries, with termination triggers for non‑compliance.
  • When disclosing data to another covered entity, the disclosing entity must, to avoid liability, conduct and document due diligence, enter into contracts with robust privacy, security, audit, and indemnity provisions, file those terms and compliance programs with the Agency for approval and publication, and agree to accept and comply with orders and individual requests relating to the recipient’s actions.
  • If a receiving entity violates the Act, the disclosing entity is jointly liable for violations involving the disclosed information unless it was the first to report the violation to the Agency, in which case liability is several, creating strong incentives for upstream oversight and early reporting.
  • A non‑localization clause clarifies that nothing requires data to be stored or processed in the U.S., but the contractual and jurisdictional conditions functionally limit cross‑border transfers to entities willing to operate under U.S. privacy law and enforcement.

Enforcement architecture and remedies

  • The bill creates an independent Digital Privacy Agency with a Director and Deputy Director, broad investigatory powers, audit and reporting obligations, an Office of Civil Rights, complaint intake, and advisory boards, and authorizes appropriations for its operation.
  • Title IV gives the Agency administrative discovery, adjudication, and litigation authority; state attorneys general and “State privacy regulators” receive concurrent enforcement powers, and individuals gain a private right of action with defined relief (injunctive relief, statutory or actual damages, and other remedies), alongside whistleblower protections.
  • The Act prohibits waivers of its protections and bans predispute arbitration agreements for claims under it, thereby invalidating mandatory arbitration clauses and class‑action waivers as applied to privacy claims within its scope.
  • There is a new federal criminal offense for doxxing: knowingly disclosing personal information via interstate or foreign commerce with intent to threaten, intimidate, harass, incite, or facilitate violence, or place a person in reasonable fear of death or serious bodily injury, punishable by up to 15 years, with the Digital Privacy Agency supporting DOJ through complaint referral and technical assistance.

Relation to other laws and preemption

  • The Act defines a set of “Federal privacy laws,” devotes Title V to its relation to other federal and state law, and allows the Director to coordinate with other agencies; the text indicates the Act would sit alongside existing sectoral laws rather than replace them, while a “relation to State law” section will determine the degree of preemption.
  • Government entities face particular constraints: they may not disclose non‑redacted personal information in government records via interstate commerce without an agreement prohibiting recipients from selling the information absent express individual consent, though inter‑governmental disclosures remain permissible.

Likely impact on U.S. data privacy

  • Substantively, the Act would shift the U.S. from a fragmented, sector‑specific model toward a comprehensive, rights‑based regime closer to the EU’s GDPR or California’s CCPA/CPRA but with more aggressive mandates on minimization, retention, and behavioral personalization, particularly for communications content and algorithmic profiling.
  • For businesses, compliance would require end‑to‑end data‑lifecycle governance: granular purpose specification; default minimization; structured rights handling (access, correction, deletion, portability, human review); retention consent flows and schedules; internal access logging; third‑party and cross‑border contracting that meets the Act’s standards; and mechanisms for consent revocation and dark‑pattern‑free interfaces.
  • The combination of a dedicated privacy regulator, joint liability for downstream misuse, strong private rights of action, arbitration bans, and heightened obligations around behavioral personalization and communications content would materially raise enforcement risk and expected damages exposure, making privacy and data ethics a primary compliance and litigation vector for digital and data‑driven firms.
  • For individuals, the Act would significantly expand practical control over personal data: easier transparency into who holds what and why, meaningful ability to delete and port data, stronger ability to resist manipulative personalization and discriminatory processing, and clearer safeguards around intimate communications and government‑originated records, together likely reducing some forms of data‑driven harm and chilling effects if effectively enforced.

Given that the Bill has a single sponsor and was not introduced on a bipartisan basis, it is unlikely to gain much traction or attention in Congress. Still, it is a step in the right direction toward establishing a national law. My preference would be something similar to the UCC, which was drafted by the Uniform Law Commission and the American Law Institute. The UCC is a comprehensive model act governing commercial transactions (sales, leases, banking, secured transactions) in the US, designed to standardize state laws. There is no reason the same cannot be done for data privacy and AI governance. Alas, to dream the impossible dream.


