Data Privacy Day offers a natural checkpoint to take stock of a fast‑moving legal landscape. As of January 1, 2026, several significant U.S. state privacy laws and regulatory updates are now live, with additional U.S. and global milestones queued up throughout 2026. Below we summarize important changes already in effect and highlight issues to monitor as the year unfolds.

What’s new as of January 1, 2026

Several comprehensive state privacy statutes and California rule changes took effect with the New Year, expanding the U.S. “patchwork” and sharpening enforcement expectations.

First, the Indiana Consumer Data Protection Act [1], the Kentucky Consumer Data Protection Act [2], and the Rhode Island Data Transparency and Privacy Protection Act [3] went into effect on January 1, 2026. These regimes generally align with state privacy laws already in effect — providing access, correction, deletion, and portability rights, along with opt‑out rights for targeted advertising, sales, and certain profiling — but contain differences in thresholds, sensitive data definitions, and cure periods that complicate multi‑state compliance. In particular, Rhode Island stands out for omitting a cure period, not requiring recognition of Universal Opt‑Out Mechanisms, and imposing heightened transparency on data sales (e.g., disclosure of the third party to which personal data is sold), while Indiana and Kentucky hew more closely to Virginia‑style baselines. Indiana published helpful guidance (entitled “Data Consumer Bill of Rights”) covering the basic rights of consumers and the obligations of businesses. Several 2025 amendments to existing state privacy statutes also took effect on January 1, 2026: the Oregon Consumer Privacy Act now expressly bans the sale of precise geolocation data and of personal data of consumers under the age of 16, and the Connecticut Data Privacy Act now explicitly bans the sale of personal data of, or targeted advertising to, minors when the controller has actual knowledge (or reasonably should know) that the consumer is a minor.[4] Businesses handling personal data at scale should validate applicability and update notices, rights workflows, and data protection assessment practices accordingly.

Second, California’s updated CCPA regulations [5] are in force, launching a new era of operational requirements and signaling stepped‑up scrutiny by CalPrivacy (the rebranded California Privacy Protection Agency). The new regulations govern cybersecurity audits, risk assessments, and automated decision‑making technology (ADMT), and make changes to existing obligations, which we have previously discussed here.

California also tightened breach notification timelines to require consumer notice within 30 days of discovery and, when more than 500 Californians are notified, notice to the Attorney General within 15 days thereafter. In parallel, CalPrivacy and the Attorney General have emphasized rigorous, technically accurate consent and opt‑out implementations in recent enforcement activity. Companies operating in or marketing to California should reassess risk assessment triggers, ADMT governance, and incident response timelines now.

Third, California’s “DELETE Act” has entered its operational phase. [6] CalPrivacy launched the centralized Delete Request and Opt‑out Platform (DROP) in January. Registered data brokers must open accounts, complete registration, and pay fees by January 31, 2026, and beginning August 1, 2026, must check the platform at least every 45 days to process global deletion requests. The statute authorizes daily penalties both for failure to register and for failing to delete, making classification as a “data broker” and readiness for DROP workstreams immediate 2026 action items.
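For data brokers building out DROP workflows, the 45‑day access cadence described above lends itself to a simple compliance check. The sketch below is purely illustrative — the function name and logic are our own, not anything prescribed by the statute or CalPrivacy — and measures only whether a broker’s last platform check falls within the statutory window beginning August 1, 2026.

```python
from datetime import date, timedelta

# Illustrative only: registered data brokers must access DROP at least once
# every 45 days beginning August 1, 2026, to process deletion requests.
DROP_START = date(2026, 8, 1)
CHECK_INTERVAL = timedelta(days=45)


def drop_check_overdue(last_check: date, today: date) -> bool:
    """True when the broker is past the 45-day window and must access DROP.

    Hypothetical helper for internal monitoring; it does not capture every
    statutory nuance (e.g., registration and fee deadlines).
    """
    if today < DROP_START:
        return False  # the periodic-access obligation has not started yet
    return (today - last_check) > CHECK_INTERVAL
```

A monitoring job could run this daily and alert the privacy team before the window lapses, rather than relying on calendar reminders alone.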

Organizations should confirm that GPC and other recognized signals are detected and propagated consistently across web, mobile, and downstream systems and that user‑facing indications of opt‑out status are accurate and reliable.
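On the server side, detecting the GPC signal is mechanically simple — per the Global Privacy Control specification, an asserting browser sends the HTTP request header `Sec-GPC: 1` (the companion JavaScript property is `navigator.globalPrivacyControl`). The sketch below shows only the detection step; honoring the signal across downstream systems is the harder engineering and governance problem.

```python
def gpc_opt_out(headers: dict) -> bool:
    """Return True when the request carries a Global Privacy Control signal.

    Per the GPC specification, an asserting browser sends the header
    `Sec-GPC: 1`; any other value, or its absence, means no signal.
    """
    # HTTP header names are case-insensitive; normalize before checking.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

Detection like this should sit at the edge of the request path so the opt‑out state can be propagated to tag managers, ad servers, and analytics vendors consistently.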

Children’s and teens’ privacy and online safety

Children’s privacy and youth online safety remain focal points for U.S. and international regulators in 2026. Within the United States, attorneys general and specialized privacy agencies are expected to intensify enforcement around profiling, targeted advertising to minors, and opaque or manipulative consent flows. With California classifying under‑16 data as sensitive personal information and multiple states adding youth‑specific restrictions, companies offering online services with material youth audiences should reassess age‑appropriate data practices (e.g., design and age verification requirements), profiling limits, and youth‑specific DPIAs in light of evolving state standards and case law. Expect continuing rulemaking and litigation around age‑verification obligations and related First Amendment concerns, as states refine legal theories that can withstand constitutional challenge.

At the federal level, the FTC’s Division of Privacy and Identity Protection continues to prioritize children’s privacy under updated COPPA regulations, with close attention on how platforms collect, share, and monetize youth data. Businesses should anticipate heightened scrutiny on “actual knowledge,” default settings, and third‑party tracking in youth contexts, as well as tighter expectations for parental consent mechanisms and data minimization. FTC leadership has underscored a case‑by‑case enforcement posture, reinforcing the importance of clear disclosures, limitations on sharing, and technical truth in consent and opt‑out implementations for services likely to be accessed by minors.

Cybersecurity disclosure and breach notification acceleration

States continue to compress breach reporting timelines and expand breach definitions. California’s SB 446, [11] effective January 1, 2026, now imposes a 30‑day consumer notice deadline and a 15‑day post‑notice filing to the Attorney General when 500+ residents are affected, aligning with similar changes in other jurisdictions. The operational takeaway is clear: incident response plans should be retuned with these fixed statutory timelines in mind, including escalation protocols, outside counsel engagement, and regulator‑ready notice templates prepared in advance. Expect regulators to evaluate not just whether a notification was issued, but also how quickly an organization investigated, contained, and communicated the incident and whether the board exercised demonstrable governance over cyber risk.
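Because these deadlines are now fixed by statute, incident response runbooks can compute them automatically at the moment of discovery. The sketch below is an illustrative worst‑case calculator of our own design (not a statutory formula): it measures the Attorney General filing from the consumer‑notice deadline, whereas the actual 15‑day clock runs from the notice as given, which may be earlier.

```python
from datetime import date, timedelta

def breach_deadlines(discovery: date, residents_notified: int):
    """Worst-case California notice deadlines under the timelines above.

    Consumer notice is due within 30 days of discovery; when 500 or more
    residents are notified, an Attorney General filing is due within 15
    days after the consumer notice (modeled here from the 30-day deadline
    as a conservative outer bound).
    """
    consumer_deadline = discovery + timedelta(days=30)
    ag_deadline = (
        consumer_deadline + timedelta(days=15)
        if residents_notified >= 500
        else None  # AG filing not triggered below the 500-resident threshold
    )
    return consumer_deadline, ag_deadline
```

Embedding calculations like this in the incident response playbook removes one source of error during a live event, when teams are least able to do careful calendar math.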

At the federal securities level, the SEC’s cyber disclosure framework continues to shape incident response and governance for public companies. Issuers must disclose material cyber incidents on Form 8‑K within four business days of determining materiality and provide annual disclosures on risk management, strategy, and governance. Staff guidance has emphasized using Item 8.01 to address non‑material events and reserving Item 1.05 for material incidents. Broker‑dealers, investment advisers, and transfer agents face updated Regulation S‑P obligations, including 30‑day customer notification and enhanced safeguards programs as compliance periods run. Enforcement will likely continue to focus on internal disclosure controls, timeliness, and accuracy of statements about cybersecurity. For more information on the updated Regulation S‑P obligations, see our previous post, “With Compliance Date for Reg S‑P Amendments Looming, Is Your Firm Ready Yet?”

Automated decision‑making and AI governance

Regulators are converging on risk‑based oversight of automated decision‑making. As noted above, in California, 2026 ushers in new CCPA risk assessment triggers around significant decisions, inferences from sensitive contexts, and training ADMT, foreshadowing broader scrutiny of profiling, fairness, and transparency across industries. Texas’ Responsible Artificial Intelligence Governance Act, which creates a framework for the development, deployment, and oversight of AI systems, took effect on January 1, 2026. In addition, the Colorado Artificial Intelligence Act will go into effect on June 30, 2026. Plaintiffs’ counsel and state enforcers will be probing whether companies meaningfully honor privacy rights requests related to data used to train AI models, including opt‑outs from profiling and targeted advertising, particularly where youth or sensitive data are implicated. At the same time, August 2, 2026 is the key compliance date for most obligations under the EU AI Act, which will pressure multinationals to map AI systems, classify risk, conduct assessments, document training data provenance, and ensure human oversight. Organizations deploying high‑risk AI systems should align AI governance with existing privacy programs—treating AI inventories, risk assessments, and control testing as core privacy artifacts rather than parallel tracks.

Universal opt‑out signals

Universal opt‑out mechanisms—browser or device‑level signals such as Global Privacy Control (GPC)—continue to migrate from best practice to a requirement in multiple states. Connecticut [7] and Oregon [8] join the roster of states requiring recognition of a universal opt‑out signal in 2026, adding to existing requirements in California, Colorado, and others. State enforcement agencies have already penalized companies for ineffective or non‑functional opt‑out mechanisms, and regulators have cautioned that reliance on a consent management platform does not excuse technical failures. [9]

In 2025, Governor Newsom signed the California Opt Me Out Act, the first law in the nation to require web browsers to include a built‑in feature that lets consumers tell every website they visit, with a single selection, not to sell or share their personal information.

Enforcement in 2025 set the tone for 2026. Regulators will “look under the hood” to confirm that consent and opt‑out flows actually work. Cases against retailers and brands have highlighted non‑functional opt‑out forms, failure to process GPC signals, and inconsistent suppression of third‑party trackers. For 2026, regulators have telegraphed broader sweeps and coordinated actions—particularly in California, Colorado, and Connecticut—around dark patterns, malfunctioning opt‑out mechanisms, and failures in vendor governance that result in continued tracking after a user opts out. Companies should test their consent implementations end‑to‑end, verify propagation across web and mobile, audit vendor tags and pixels for leakage of sensitive categories, and ensure visible confirmation of opt‑out status where required.
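End‑to‑end testing of this kind can be partly automated. The sketch below is a hypothetical audit helper of our own design — the tag names are placeholders, not real vendor identifiers — that compares the third‑party tags still firing after a simulated opt‑out against a documented list of essential (non‑advertising) tags, flagging anything that leaks through.

```python
# Hypothetical essential-tag allow-list an organization might maintain:
# tags permitted to fire even after a user opts out (placeholder names).
ESSENTIAL_TAGS = {"fraud-detection", "load-balancer-beacon"}


def leaked_tags(tags_fired_after_opt_out: set[str]) -> set[str]:
    """Return tags that still fired post-opt-out but are not essential.

    A non-empty result indicates incomplete suppression -- exactly the
    failure mode regulators have flagged in recent enforcement actions.
    """
    return tags_fired_after_opt_out - ESSENTIAL_TAGS
```

In practice, the input set would come from a crawler or browser‑automation run that records network requests after exercising the opt‑out flow, with any leaked tag routed to vendor governance for remediation.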

Pixel tracking, session replay, and chat tools: litigation heat remains on

Expect continued class‑action pressure against the use of pixels, SDKs, session replay, and chat widgets, with theories under CIPA and other wiretap statutes, the VPPA, and state consumer laws. Though some courts have tried to rein in these claims, most early‑stage decisions have allowed claims to proceed. There is heightened exposure where sensitive health, financial, or video‑viewing data is alleged. Email “spy pixel” cases emerged in 2024, and plaintiffs are increasingly targeting financial institutions alongside health and retail. Risk controls include strict tag governance, data‑minimizing configurations, banner‑gated firing, and tailored notices that accurately describe third‑party data flows. On January 26, 2026, the U.S. Supreme Court agreed to decide whether the Video Privacy Protection Act (VPPA) applies to consumers who subscribe to non‑audiovisual content, a question that should help define “consumer” under the VPPA. [10]

Data breach litigation outlook

Plaintiffs’ firms continue to file large numbers of breach‑related putative class actions, with consolidation in mega‑events and evolving standing doctrines post‑TransUnion. Settlements remain sizable, and courts are focusing on the sufficiency of security measures, timeliness and content of notices, and post‑incident communications. Expect more fiduciary‑duty and securities theories where public statements or controls are challenged, alongside continued CCPA private‑action claims tied to “reasonable security.” Build litigation‑ready playbooks, preserve privilege around forensics, and document decision‑making contemporaneously.

Biometrics and facial recognition

Biometric data remains a top litigation and enforcement vector. Illinois BIPA continues to drive high‑exposure cases, while Texas and New York activity is rising. Notably, Texas’ biometric law was amended effective January 1, 2026, and now includes exclusions for the training, development, and deployment of artificial intelligence. The FTC has framed misuse of facial recognition and broader biometric practices as potential unfair or deceptive acts, and recent orders highlight accuracy, bias, disclosure, and security expectations. Companies deploying biometrics should refresh notice/consent, retention schedules, vendor diligence, and accuracy testing, and avoid covert capture or privacy‑invasive defaults.

Cyber insurance: pricing, coverage, and claims trends

Cyber insurers report higher frequency and severity in data and privacy claims, with a noticeable rise in “non‑attack” privacy class actions tied to tracking technologies and consent failures. Underwriting continues to emphasize MFA, EDR, backups, vendor controls, and incident response maturity; policyholders should review panel requirements, privilege considerations for forensics, and exclusions that may apply to tracking‑technology claims. Expect closer scrutiny of representations around cookies/pixels and privacy program governance during renewal.

SEC cyber developments and the crypto intersection

Beyond issuer disclosures, SEC registrants, including broker‑dealers, registered investment advisers, and transfer agents, face expanded SEC cyber expectations under Reg S‑P and sector‑specific proposals. Enforcement has tested theories around disclosure controls, internal controls, and timeliness in cyber events. Public companies and SEC registrants operating in crypto and digital‑asset markets should expect continued scrutiny of cyber risk disclosures, custody controls, and incident reporting, as well as privacy questions arising from blockchain analytics, wallet identifiers, and on‑chain advertising/attribution. Firms operating in crypto asset markets should align cyber, privacy, and financial‑crime programs and ensure incident narratives are consistent across SEC, state, and consumer notifications.

Practical next steps for 1H 2026

Given the breadth of changes already in force, most organizations should focus first on execution:

  • Confirm state law applicability and align notices and rights workflows with 2026 changes;
  • Conduct an online tracker audit to identify and mitigate risk related to wiretapping, invasion of privacy, and similar claims;
  • Review cyber and privacy insurance policies to ensure coverage for emerging risks;
  • Validate universal opt‑out signal detection and end‑to‑end suppression;
  • Refresh data protection assessments for high‑risk processing;
  • In California, operationalize compliance with the new CCPA regulations (cybersecurity audits, risk assessments, and ADMT requirements);
  • Tune identity‑verification and appeals pathways for consumer rights;
  • Rehearse incident response timelines to meet updated notification deadlines (e.g., California’s 30‑day and 15‑day deadlines);
  • If knowingly processing or sharing minors’ personal information, update your privacy program to reflect updated restrictions and prohibitions on the collection, use, disclosure, and sale of minors’ information;
  • For data brokers and companies adjacent to the data broker ecosystem, complete DROP registration steps and operationalize deletion request handling well ahead of August; and
  • For AI and automated decision‑making, build a single governance spine that covers ADMT under CCPA and risk‑based AI obligations under the EU AI Act to avoid duplicative, siloed controls.

Our team will continue to track enforcement priorities, rulemakings, and emerging litigation that shape how these laws apply in practice. If you have questions about how these developments impact your business or would like a readiness assessment tailored to your data footprint, we are here to help.


[1] https://iga.in.gov/ic/2024/Title_24/Article_15.pdf

[2] https://apps.legislature.ky.gov/law/statutes/chapter.aspx?id=39092

[3] https://legiscan.com/RI/text/S2500/2024

[4] See https://olis.oregonlegislature.gov/liz/2025R1/Downloads/MeasureDocument/HB2008/Enrolled and https://www.cga.ct.gov/2025/ACT/PA/PDF/2025PA-00113-R00SB-01295-PA.PDF.

[5] https://cppa.ca.gov/regulations/pdf/ccpa_statute_eff_20260101.pdf

[6] https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202320240SB362

[11] https://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=202520260SB446

[7] See https://portal.ct.gov/ag/sections/privacy/the-connecticut-data-privacy-act

[8] https://olis.oregonlegislature.gov/liz/2023R1/Downloads/MeasureDocument/SB619/Enrolled

[9] See CPPA Orders Clothing Retailer Todd Snyder to Pay Six-Figure Fine, Overhaul Privacy Practices.

[10] See Michael Salazar v. Paramount Global, d/b/a 247Sports, case number 25-459.