California Passes Law To Protect Consumer 'Brain Data' (govtech.com) 20
On September 28, California amended the California Consumer Privacy Act of 2018 to recognize the importance of mental privacy. "The law marks the second such legal protection for data produced from invasive neurotechnology, following Colorado, which incorporated neural data into its state data privacy statute, the Colorado Privacy Act (CPA) in April," notes Law.com. GovTech reports: The new bill amends the California Consumer Privacy Act of 2018, which grants consumers rights over personal information that is collected by businesses. The term "personal information" already included biometric data (such as your face, voice, or fingerprints). Now it also explicitly includes neural data. The bill defines neural data as "information that is generated by measuring the activity of a consumer's central or peripheral nervous system, and that is not inferred from nonneural information." In other words, data collected from a person's brain or nerves.
The law prevents companies from selling or sharing a person's data and requires them to make efforts to deidentify the data. It also gives consumers the right to know what information is collected and the right to delete it. "This new law in California will make the lives of consumers safer while sending a clear signal to the fast-growing neurotechnology industry there are high expectations that companies will provide robust protections for mental privacy of consumers," Jared Genser, general counsel to the Neurorights Foundation, which cosponsored the bill, said in a statement. "That said, there is much more work ahead."
Re: (Score:1)
California sure likes to pass laws on any topic imaginable. I assume they are the champion, at least in North America, although Trudeau is serious competition.
Re: (Score:1)
Sorry, Republicans are banning people from wearing masks [justia.com] to protect themselves and others.
In most states it is against the law to wear a mask -- these laws date from more than 100 years ago. This, of course, had nothing to do with Republicans or Democrats. Can you show any case where a random person was arrested just for wearing a mask in modern times? Recently, some of these laws have been amended to accommodate Muslims.
Is your complaint that Republicans have amended an old law so that it accommodates religious expression? And what might that have to do with privacy laws about brain scans?
Won't matter. (Score:4, Insightful)
Whether it's through a data breach, an internal leak, a partner with an NDA who ignores their agreement... that data will escape. A medical firm going bankrupt has enough chaos in which to "lose" the information.
I'm not saying there shouldn't be such rules in place. But they're far more likely to be used as punishment than prevention. Consider it "in the wild" from the start.
Re: (Score:2)
I'm not saying there shouldn't be such rules in place. But they're far more likely to be used as punishment than prevention. Consider it "in the wild" from the start.
This, exactly. Except for the punishment part, which depends on your definition of the word "punishment". If you consider Cost of Business a punishment, then yes. Otherwise, no.
Re: Won't matter. (Score:2)
Medical firm?
https://www.msn.com/en-us/mone... [msn.com]
And murderers kill even though it's illegal (Score:2)
Sometimes just discouraging something is enough; we don't have to kill the good enough in the quest for the perfect.
Brain data (Score:3, Funny)
MAGA folks are safe.
Neural activity is not deidentifiable (Score:2)
This basically removes scientists' ability to do any research. You could make a case for the actual brain imaging (e.g., using defacing/deskulling techniques, which are problematic in themselves), but neural activity is as unique as a fingerprint, and any sort of brain injury affects specific regions too. On the other hand, it is not identifiable either, as we currently do not have the technology to do things like functional MRI at scale.
Re: (Score:2)
Here is the text: https://leginfo.legislature.ca... [ca.gov]
It is formulated in these terms: "Research with personal information that may have been collected from a consumer in the course of the consumer’s interactions with a business’ service or device". This does not apply to the relationship between a scientist and the subjects. Research subjects hired by a university or a private company are not consumers.
What seems to be the intent of the prohibition (developing on an example from TFA) is you a consumer pu
Re: Neural activity is not deidentifiable (Score:1)
Sure, but that scenario is so deep into the realm of sci-fi, it would be laughable even if you put it in the Star Trek universe.
Cortical brain waves carry minimal to no information about your thoughts and desires. At best, brain activity measurements show regions of interest for further imaging, and nobody is putting an MRI on their head (it will never be physically possible to make a device that can do MRI at the scale of a VR headset). Even implants like Neuralink require massive training which basically c
Completely unenforceable (Score:2)
It'll spend years in court with conflicting interpretations and result in absolutely no protection of anything.
Re: (Score:2)
It might be used in the courts on lie detectors, for a start. The classic machines do record nerve activity, and that has been used by police.
Stop making stupid laws! (Score:3)
I understand wanting to make a name for yourself, but seriously stop making stupid laws that add layers of bureaucracy and bullshit to an industry that is just trying to get started.
Re: (Score:3)
Forgot to add: This is the type of regulation that scares off investors, and that's what's so dangerous about it. Many people need this kind of technology, but it will be slowed down or never get developed when you have laws like this. Laws like this reduce the size of the investor pool and the amount of money available.
Brain data (Score:2)
It also gives consumers ... the right to delete it.
Hand me another beer.