California Passes Law To Protect Consumer 'Brain Data' (govtech.com)

On September 28, California amended the California Consumer Privacy Act of 2018 to recognize the importance of mental privacy. "The law marks the second such legal protection for data produced from invasive neurotechnology, following Colorado, which incorporated neural data into its state data privacy statute, the Colorado Privacy Act (CPA), in April," notes Law.com. GovTech reports: The new bill amends the California Consumer Privacy Act of 2018, which grants consumers rights over personal information collected by businesses. The term "personal information" already included biometric data (such as your face, voice, or fingerprints); now it also explicitly includes neural data. The bill defines neural data as "information that is generated by measuring the activity of a consumer's central or peripheral nervous system, and that is not inferred from nonneural information." In other words, data collected from a person's brain or nerves.

The law prevents companies from selling or sharing a person's data and requires them to make efforts to deidentify the data. It also gives consumers the right to know what information is collected and the right to delete it. "This new law in California will make the lives of consumers safer while sending a clear signal to the fast-growing neurotechnology industry [that] there are high expectations that companies will provide robust protections for mental privacy of consumers," Jared Genser, general counsel to the Neurorights Foundation, which cosponsored the bill, said in a statement. "That said, there is much more work ahead."
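For illustration only (not from the article or the statute): "deidentify" in privacy law generally means stripping or transforming anything that could link a record back to a person. Below is a minimal Python sketch of what that might look like for a neural-data record; every field name and value is invented for the example and does not reflect any real product or the law's actual requirements.

    import hashlib
    import uuid

    # Hypothetical record a neurotech vendor might store (all fields invented).
    record = {
        "user_email": "alice@example.com",     # direct identifier
        "device_serial": "NX-449-2210",        # indirect identifier (dropped below)
        "session_start": "2024-10-05T06:22:00Z",
        "eeg_samples": [12.1, 11.8, 13.0],     # the neural data itself
    }

    def deidentify(rec, salt):
        # Replace the direct identifier with a salted one-way hash,
        # coarsen the timestamp to the day, and drop every other field
        # except the signal.
        pseudonym = hashlib.sha256((salt + rec["user_email"]).encode()).hexdigest()[:16]
        return {
            "subject_pseudonym": pseudonym,             # unlinkable without the salt
            "session_date": rec["session_start"][:10],  # day-level granularity only
            "eeg_samples": rec["eeg_samples"],
        }

    print(deidentify(record, salt=str(uuid.uuid4())))

The catch, as a commenter below argues, is that neural activity itself may be as unique as a fingerprint, in which case scrubbing the metadata alone does not fully deidentify the record.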


Comments Filter:
  • Won't matter. (Score:4, Insightful)

    by Petersko ( 564140 ) on Saturday October 05, 2024 @06:22AM (#64841473)

    Whether it's through a data breach, an internal leak, a partner with an NDA who ignores their agreement... that data will escape. A medical firm going bankrupt has enough chaos in which to "lose" the information.

    I'm not saying there shouldn't be such rules in place. But they're far more likely to be used as punishment than prevention. Consider it "in the wild" from the start.

  • Brain data (Score:3, Funny)

    by Rosco P. Coltrane ( 209368 ) on Saturday October 05, 2024 @07:11AM (#64841503)

    MAGA folks are safe.

  • This basically removes scientists' ability to do any research. You could make a case for the actual brain imaging (e.g. using defacing/deskulling techniques, which are problematic in themselves), but neural activity is as unique as a fingerprint, and any sort of brain injury affects regions too. On the other hand, it is not identifiable either, as we currently do not have the technology to do things like functional MRI at scale.

    • Here is the text: https://leginfo.legislature.ca... [ca.gov]
      It puts it in these terms: "Research with personal information that may have been collected from a consumer in the course of the consumer’s interactions with a business’ service or device". This does not apply to the relationship between a scientist and their subjects. Research subjects hired by a university or a private company are not consumers.

      What seems to be the intent of the prohibition (building on an example from TFA) is: you, a consumer, pu

      • by Lehk228 ( 705449 )
        On one hand that does sound creepy; on the other hand, they already have my search history.
      • Sure, but that scenario is so far into the realm of sci-fi, it would be laughable even if you put it in the Star Trek universe.

        Cortical brain waves carry minimal to no information about your thoughts and desires. At best, brain activity measurements show regions of interest for further imaging, and nobody is putting an MRI on their head (it will never be physically possible to even make a device that can do MRI at the scale of a VR headset). Even implants like Neuralink require massive training which basically c

  • It'll spend years in court with conflicting interpretations and result in absolutely no protection of anything.

    • by sodul ( 833177 )

      It might be used in the courts regarding lie detectors, for a start. The classic machines do record nerve activity, and that has been used by police.

  • by backslashdot ( 95548 ) on Saturday October 05, 2024 @11:29AM (#64841821)

    I understand wanting to make a name for yourself, but seriously stop making stupid laws that add layers of bureaucracy and bullshit to an industry that is just trying to get started.

    • Forgot to add: this is the type of regulation that scares off investors; that's what's so dangerous about it. Many people need this kind of technology, but it will be slowed down or never get developed when you have laws like this. They are reducing the size of the investor pool and the amount of money available.

  • It also gives consumers ... the right to delete it.

    Hand me another beer.
