Government Privacy

California Passes Law To Protect Consumer 'Brain Data' (govtech.com) 28

On September 28, California amended the California Consumer Privacy Act of 2018 to recognize the importance of mental privacy. "The law marks the second such legal protection for data produced from invasive neurotechnology, following Colorado, which incorporated neural data into its state data privacy statute, the Colorado Privacy Act (CPA) in April," notes Law.com. GovTech reports: The new bill amends the California Consumer Privacy Act of 2018, which grants consumers rights over personal information that is collected by businesses. The term "personal information" already included biometric data (such as your face, voice, or fingerprints). Now it also explicitly includes neural data. The bill defines neural data as "information that is generated by measuring the activity of a consumer's central or peripheral nervous system, and that is not inferred from nonneural information." In other words, data collected from a person's brain or nerves.

The law prevents companies from selling or sharing a person's data and requires them to make efforts to deidentify the data. It also gives consumers the right to know what information is collected and the right to delete it. "This new law in California will make the lives of consumers safer while sending a clear signal to the fast-growing neurotechnology industry there are high expectations that companies will provide robust protections for mental privacy of consumers," Jared Genser, general counsel to the Neurorights Foundation, which cosponsored the bill, said in a statement. "That said, there is much more work ahead."


Comments Filter:
  • Won't matter. (Score:4, Insightful)

    by Petersko ( 564140 ) on Saturday October 05, 2024 @06:22AM (#64841473)

    Whether it's through a data breach, an internal leak, a partner with an NDA who ignores their agreement... that data will escape. A medical firm going bankrupt has enough chaos in which to "lose" the information.

    I'm not saying there shouldn't be such rules in place. But they're far more likely to be used as punishment than prevention. Consider it "in the wild" from the start.

    • I'm not saying there shouldn't be such rules in place. But they're far more likely to be used as punishment than prevention. Consider it "in the wild" from the start.

      This, exactly. Except for the punishment part, which depends on your definition of the word "punishment". If you consider Cost of Business a punishment, then yes. Otherwise, no.

    • Sometimes just discouraging something is enough; we don't have to kill the good enough in the quest for the perfect.

    • "A medical firm going bankrupt has enough chaos in which to "lose" the information." Yeah, it would be VERY tempting for a lower ranking employee facing termination and the possible homelessness and who has access to that information to try to sell it to a 3rd party to keep his/her head above water for maybe an extra month. When the cost of living has reached insane, almost "no mere mortal can afford..." levels, people tend to start putting their principals/morals/law abiding aside and do whatever they need
  • Brain data (Score:3, Funny)

    by Rosco P. Coltrane ( 209368 ) on Saturday October 05, 2024 @07:11AM (#64841503)

    MAGA folks are safe.

  • This basically removes scientists' ability to do any research. You could make a case for the actual brain imaging (e.g., using defacing/deskulling techniques, which are problematic in themselves), but neural activity is as unique as a fingerprint, and any sort of brain injury affects regions too. On the other hand, it is not identifiable either, as we currently do not have the technology to do things like functional MRI at scale.

    • by test321 ( 8891681 ) on Saturday October 05, 2024 @10:16AM (#64841699)

      Here is the text: https://leginfo.legislature.ca... [ca.gov]
      It is formulated in these terms: "Research with personal information that may have been collected from a consumer in the course of the consumer’s interactions with a business’ service or device". This does not apply to the relationship between a scientist and their subjects. Research subjects hired by a university or a private company are not consumers.

      What seems to be the intent of the prohibition (developing on an example from TFA) is this: you, a consumer, purchased a VR headset to play games; it happens to measure brain waves; through your brain-wave reactions when presented with some images, it determines you must be gay (even if you don't know or accept it), so it registers you in its database as such and starts showing you relevant ads.

      • by Lehk228 ( 705449 )
        On one hand that does sound creepy, on the other hand they already have my search history
      • Sure, but that scenario is so far into the realm of sci-fi, it would be laughable even if you put it in the Star Trek universe.

        Cortical brain waves carry minimal to no information about your thoughts and desires. At best, brain activity measurements show regions of interest for further imaging, and nobody is putting an MRI on their head (it will never be physically possible to even make a device that can do MRI at the scale of a VR headset). Even implants like Neuralink require massive training which basically c

        • that scenario is so far into the realm of sci-fi,

          It was an example cited in TFA, based on actual research. "Deep Learning in the Identification of Electroencephalogram Sources Associated with Sexual Orientation" https://karger.com/nps/article... [karger.com]

          • That study is rife with issues, not least that they basically used AI as a buzzword here and don't seem to know how it works. The text reads like they changed their methodology when they couldn't get the expected result. From a cursory reading, it seems they trained their network on group 1 and tested the network against the same group; despite having a larger sample, they did some validation (k-1), but it is light on the details.

            Basically if this were true, you would expect morphological
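
            For what it's worth, the subject-leakage complaint above is easy to illustrate. Here is a minimal sketch, assuming scikit-learn and using random arrays as stand-ins for EEG features (nothing below comes from the paper): a grouped split such as GroupKFold keeps all of one participant's recordings on the same side of the train/test boundary, so a classifier can't score well just by recognizing individuals.

            # Minimal sketch: subject-wise cross-validation for EEG-style data.
            # All arrays are random stand-ins, not data from the paper.
            import numpy as np
            from sklearn.linear_model import LogisticRegression
            from sklearn.model_selection import GroupKFold, cross_val_score

            rng = np.random.default_rng(0)
            n_epochs, n_features, n_subjects = 200, 64, 20

            X = rng.normal(size=(n_epochs, n_features))            # one row per EEG epoch
            y = rng.integers(0, 2, size=n_epochs)                  # binary label per epoch
            subjects = rng.integers(0, n_subjects, size=n_epochs)  # subject ID per epoch

            # GroupKFold guarantees that no subject's epochs land in both the
            # training and the test fold, so the model cannot score well merely
            # by recognizing individual subjects -- the leakage described above.
            scores = cross_val_score(
                LogisticRegression(max_iter=1000),
                X, y,
                groups=subjects,
                cv=GroupKFold(n_splits=5),
            )
            print(f"subject-wise CV accuracy: {scores.mean():.2f}")

            On these random stand-ins the score should hover near chance (0.5); a genuine effect would have to survive this kind of split before a claim like the paper's holds up.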

  • It'll spend years in court with conflicting interpretations and result in absolutely no protection of anything.

    • by sodul ( 833177 )

      It might be used in the courts on lie detectors, for starters. The classic polygraph machines do record nervous-system activity, and that has been used by police.

  • I understand wanting to make a name for yourself, but seriously stop making stupid laws that add layers of bureaucracy and bullshit to an industry that is just trying to get started.

    • Forgot to add: this is the type of regulation that scares off investors; that's what's so dangerous about it. Many people need this kind of technology, but it will be slowed down or never get developed when you have laws like this. They are reducing the size of the investor pool and the amount of money available.

      • This is the type of regulations that scare off investors

        GOOD.

        I'll say it again: GOOD. Not everything needs to be open season for "investors." I'd rather see those "investors" shot dead and their companies burned to the ground along with all of their assets by a lynch mob than allow them to start demanding direct read / write access to my brain. You should too, at least you should if you want to be able to enjoy your gains a little longer. After all, it would only take a few writes to make it so you'd be happy to give them everything for nothing.

  • It also gives consumers ... the right to delete it.

    Hand me another beer.

  • This is nice and all, but I want more than some theoretical privacy rights that will net me a couple of bucks in a class-action settlement years down the road. I want open protocols, the ability to change service providers, the ability to activate/set up the product without interacting with the manufacturer, and the ability to review/control/roll back product updates. Full control, full repairability, damn it!
  • Do not need to worry.
