More States Propose Privacy Laws Safeguarding Neural Data

17 Mar 2025
Client Alert

Neural data privacy has continued to be a hot topic in 2025, with more states turning their focus to the issue in their new legislative sessions.

Connecticut, Massachusetts, Minnesota, Illinois, and Vermont are the latest to propose bills aimed at protecting neural data, and California, which already took action in 2024 to protect neural privacy, is looking to add more protections (read our MoFo Minute detailing California’s prior amendment). If passed, these bills would put their states alongside California and Colorado (see our MoFo Minute regarding Colorado’s amendment) in regulating the collection or processing of neural data. Companies developing or providing technologies such as wearable devices, virtual reality headsets, and other brain-computer interfaces should take careful note.

While the bills generally track current neural data privacy laws’ definitions of neural data, this new wave covers novel data sets and creates new avenues for enforcement. Below, we detail the new state proposals aimed at protecting neural data and how they differ from the current Colorado and California laws:

California

California has introduced two new bills to address neural data: (1) the Neural Data Protection Act (SB-44), and (2) the No Robo Bosses Act (SB-7).

The Neural Data Protection Act would amend the California Consumer Privacy Act (CCPA) to include new protections for neural data. The amendment would require covered businesses (i.e., those that make available a brain-computer interface to a person in California) to use neural data collected through a brain-computer interface only for the purpose for which it was collected and to delete the neural data when such purpose is accomplished.

The No Robo Bosses Act, which aims to regulate the use of automated decision systems (“ADS”) in employment, would prohibit employers or vendors engaged by employers from using an ADS that collects or infers a worker’s neural data. The bill does not define neural data.

Connecticut

Connecticut’s An Act Concerning Data Privacy, Online Monitoring, Social Media, and Data Brokers (SB 1356) would revise the existing state consumer privacy law, the Connecticut Data Privacy Act (CTDPA), to include neural data as a type of sensitive data under the law. Similar to the Colorado law, the Connecticut bill would define neural data as including “any information that is generated by measuring the activity of an individual’s central or peripheral nervous system.” However, Connecticut’s definition goes beyond Colorado’s, as it is not limited to data that is used or intended to be used “for identification purposes” (it remains unclear what “for identification purposes” will mean in this context). Under the CTDPA, sensitive data is subject to heightened requirements, including the need to obtain opt-in consent before processing and to perform data protection impact assessments for the processing. SB 1356 would also prohibit the sale of sensitive data unless individual consent is obtained.

Illinois

Illinois’s HB 2984 would amend the Biometric Information Privacy Act (BIPA), one of the most hotly litigated privacy statutes in the country, to include neural data as a “biometric identifier.” Like the California law, the Illinois bill defines neural data as “information that is generated by the measurement of activity of an individual’s central or peripheral nervous system, and that is not inferred from non-neural information.” As a biometric identifier, neural data would be subject to the same stringent regulations as other biometric identifiers, like fingerprints and scans of facial geometry. Among other things, organizations would be required to provide individuals with notice regarding how neural data is collected and stored, and for how long, and to obtain explicit written consent before collecting, storing, or using neural data.

Massachusetts

Massachusetts’s Neural Data Privacy Protection Act (HD 4127) would be a significant step for the Bay State as it does not currently have a comprehensive consumer privacy law. The Massachusetts proposal narrowly defines neural data to exclude information inferred from non‑neural information, similar to California’s approach. Nonetheless, the bill puts neural data under the broader category of “sensitive covered data,” and prohibits covered entities from:

  • Collecting or processing neural data unless it is strictly necessary to provide or maintain a product or service requested by the individual whose data is in question;
  • Transferring an individual’s neural data to a third party unless the transfer is (i) made pursuant to the consent of the individual before the transfer takes place; (ii) necessary to comply with an obligation under federal law; or (iii) necessary to protect an individual from imminent injury; and
  • Processing neural data for targeted advertising.

The Massachusetts bill also requires that covered entities provide consumers with various individual rights—including access, correction, and deletion of neural data—in line with the California and Colorado laws.

Minnesota

Minnesota’s 2025 proposal (SF 1240) is a revival of the state’s attempt to codify neural data privacy protections in 2023. Unlike the California and Colorado laws, the Minnesota bill would not amend the Minnesota Consumer Data Privacy Act, which takes effect in July 2025, but would be a standalone law providing separate protections for neural data and mental privacy. The bill establishes certain prohibited practices and rights for both government and private entities, such as:

  • Prohibiting government entities from collecting data transcribed directly from brain activity without informed consent and from interfering with the free and competent decision-making of individuals when making neurotechnology decisions;
  • Requiring companies to provide individuals with the right to change a decision regarding neurotechnology, the right to protect against unauthorized access to or manipulation of an individual’s brain activity, and the right to protect against unauthorized neurotechnological alterations in mental functions critical to personality;
  • Requiring companies connecting individuals to brain-computer interfaces to provide notice of (1) the uses of their data, and (2) the third parties with which the data will be shared; such companies must obtain individual consent for each use and third-party sharing using a separate consent form; and
  • Prohibiting companies from using a brain-computer interface to bypass conscious decision making by an individual.

Vermont

Vermont’s legislature has introduced three bills capturing neural data protections: (1) the Age‑Appropriate Design Code Act, (2) the Data Privacy and Online Surveillance Act, and (3) An Act Relating to Neurological Rights.

The Age-Appropriate Design Code Act (H.210) focuses on protecting minors under the age of 18 online and represents the first proposed law aimed explicitly at children’s neural data. Like the Colorado law, the Vermont bill defines neural data broadly as “information that is collected through biosensors and that could be processed to infer or predict mental states,” therefore including neural data inferred from non-neural information. Neural data would constitute sensitive data under the law.

The Data Privacy and Online Surveillance Act (H.208) is a comprehensive consumer privacy bill that, like the California and Colorado laws, includes specific protections for neural data. The bill includes the same broad definition of neural data as the Vermont Age-Appropriate Design Code Act described above. Neural data would constitute sensitive data under the law, creating heightened protections similar to those seen in other state privacy laws (e.g., prohibitions on the sale of sensitive data and requirements to perform data protection impact assessments for the processing of sensitive data).

An Act Relating to Neurological Rights (H.366) would create a number of rights with respect to an individual’s mental privacy, similar to Minnesota’s SF 1240 mentioned above. H.366 defines neural data differently than the other two Vermont bills: “information that is generated by the measurement of the activity of an individual’s central or peripheral nervous systems and that can be processed by or with the assistance of a device.” Specifically, the bill would:

  • Prohibit the collection of neural data from a brain-computer interface without providing notice regarding how the neural data will be used and obtaining written informed consent;
  • Prohibit the sharing of neural data from a brain-computer interface with a third party unless notice is provided regarding the neural data to be shared with a third party and for what purposes, including the name and address of the third party, and written informed consent is obtained;
  • Allow individuals who have consented to the collection or sharing of their neural data to revoke their consent, which must be as easy as providing consent; and
  • Require destruction of all neural data within 10 days after receiving a revocation of consent, and in the case of a revocation of consent to sharing, cease sharing neural data with all third parties immediately upon receipt and inform all third parties with whom neural data has been shared that the consent has been revoked.

These legislative developments reflect a growing trend toward strengthening neural data privacy protections in the United States. Businesses operating in these states should monitor these proposals and proactively review their neural data practices in anticipation of these emerging legal standards.

We are Morrison Foerster — a global firm of exceptional credentials. Our clients include some of the largest financial institutions, investment banks, and Fortune 100, technology, and life sciences companies. Our lawyers are committed to achieving innovative and business-minded results for our clients, while preserving the differences that make us stronger.

Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Prior results do not guarantee a similar outcome.