Emerging Trends as Neural Data Legislation Gains Momentum Across the States

17 Mar 2026
Client Alert

Legislative sessions are underway, and neural data privacy remains top of mind for lawmakers across the country. As was the case last year (see our prior client alert), states continue to focus on how to regulate neural data, and on what exactly they are trying to regulate, and a few models are starting to take shape. What is notable in early 2026 is not only the volume of proposals, but also the range of regulatory strategies through which they are advancing. While the final form of these bills may shift as sessions progress, key themes are emerging.

1.  Legislators Are Trying to Determine What Exactly Neural Data Is and in What Contexts It Matters Most.

States are attempting to regulate neural data, and to define what exactly it is, across a variety of sectors and statutory frameworks, reflecting a wide range of priorities and a degree of uncertainty. Most proposed bills define neural data as information generated by measuring the activity of an individual’s central or peripheral nervous system. What that means in practice, however, varies by state, with some proposals expressly recognizing neural data as including information derived from non-neural information (e.g., behavioral data), and others excluding such inferred data. In their efforts to draw these boundaries, states have taken inconsistent approaches (see our prior client alert regarding the varied definitions of neural data in enacted laws).

Against that backdrop, states are experimenting with where neural data protections should live and how the data should be categorized. Is neural data health data, biometric data, employment data, consumer data, or maybe something entirely new?

States have different answers, and we are seeing proposals that:

  • Amend comprehensive state consumer privacy laws, such as Virginia’s HB654, which would classify neural data as “biometric data” (a category of sensitive data under Virginia’s current law), and Connecticut’s SB5, which would expressly categorize neural data as “sensitive data,” in each case subjecting the data to heightened consent and enhanced protections under existing privacy frameworks;
  • Create standalone neural data privacy protections, such as Alabama’s HB263, reflecting a view that neural data warrants its own statutory framework and applying broadly to any entity that maintains, owns, or licenses neural data in the course of business;
  • Address neural data in the employment context, such as California’s AB1883, which would impose restrictions on the use of workplace surveillance tools that monitor and collect neural data about workers for employee monitoring or performance evaluation and to infer protected characteristics;
  • Target data brokers, such as New York’s S9008/A10008, which would incorporate neural data into the data broker regulatory regime and impose consent and disclosure requirements before such data can be sold or shared;
  • Expand genomic privacy laws, such as Illinois’ SB2994, which would fold neural data into existing genetic privacy frameworks applicable to insurers and employers and trigger private rights of action and statutory damages for noncompliance;
  • Incorporate neural data into health privacy laws, such as Vermont’s H814, which would regulate neural data collected through a brain-computer interface (BCI) in health-related contexts; and
  • Regulate government use of neural data, such as Vermont’s H791, which would restrict the collection, maintenance, use, and disclosure of neural data by public-sector agencies, departments, boards, and commissions.

These disparate state approaches increase the likelihood that the end result will be a patchwork of obligations, creating a complex compliance maze around an already complicated category of data. Companies operating across multiple jurisdictions would be well served to track these legislative proposals closely, determine how they will navigate this new terrain, and assess what they should do now if they are collecting, interacting with, or otherwise developing what could constitute neural data.

2. Despite Structural Differences, Core Protections Are Emerging.

Across proposals, lawmakers are increasingly treating neural data as a sensitive category of personal information and requiring heightened protections, transparency, and individual rights.

Common requirements for handling, processing, storing, or using neural data include:

  • Clear, prominent notice regarding the collection and use of neural data;
  • Express, opt-in consent prior to collection, processing, and disclosure of neural data;
  • The right for individuals to revoke consent at any time;
  • Purpose limitation, restricting use of neural data to disclosed purposes and purposes necessary to perform services or provide the goods expected or requested;
  • Restrictions on sales or sharing of neural data with third parties; and
  • Retention limits, including requirements to delete neural data once it is no longer necessary or within specified time frames after consent is revoked.

Although some proposals include outliers (e.g., the prohibition in Vermont’s H814 on using a BCI to bypass conscious decision-making without specific, written informed consent), the principles are generally familiar, and many of the contemplated controls are already key components of existing data protection and compliance programs. This is good news: organizations may not need to build separate neural data compliance frameworks and can instead adapt existing privacy governance programs and policies to account for neural data.

3. Enforcement Mechanisms and Penalties Are Increasing.

A third emerging trend is the inclusion of meaningful enforcement mechanisms. Depending on the legislative vehicle, consequences for noncompliance with the applicable rules may include:

  • Civil penalties enforced by state regulators;
  • Private rights of action and statutory damages; and
  • Enforcement tied to existing consumer protection or privacy regulators.

The inclusion of civil penalties and, in certain models, private litigation risk reflects lawmakers’ intent to ensure that compliance is taken seriously. Organizations should anticipate active enforcement as these proposals become law.

Key Takeaways

Organizations developing, deploying, or integrating neurotechnology—or otherwise collecting or inferring what may constitute neural data—should expect continued legislative activity in 2026 and beyond. Even where proposals do not pass in their current form, they are shaping a policy baseline that may influence future state and federal action. Organizations should proactively align practices with heightened security, transparency, and consent standards to prepare for the changes ahead.

We are Morrison Foerster — a global firm of exceptional credentials. Our clients include some of the largest financial institutions, investment banks, and Fortune 100, technology, and life sciences companies. Our lawyers are committed to achieving innovative and business-minded results for our clients, while preserving the differences that make us stronger.

Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Prior results do not guarantee a similar outcome.