EU Lawyers Say Plan To Scan Private Messages For Child Abuse May Be Unlawful (theguardian.com) 68
An anonymous reader quotes a report from The Guardian: An EU plan under which all WhatsApp, iMessage and Snapchat accounts could be screened for child abuse content has hit a significant obstacle after internal legal advice said it would probably be annulled by the courts for breaching users' rights. Under the proposed "chat controls" regulation, any encrypted service provider could be forced to survey billions of messages, videos and photos for "identifiers" of certain types of content where it was suspected a service was being used to disseminate harmful material. The providers issued with a so-called "detection order" by national bodies would have to alert police if they found evidence of suspected harmful content being shared or the grooming of children.
Privacy campaigners and the service providers have already warned that the proposed EU regulation and a similar online safety bill in the UK risk end-to-end encryption services such as WhatsApp disappearing from Europe. Now leaked internal EU legal advice, which was presented to diplomats from the bloc's member states on 27 April and has been seen by the Guardian, raises significant doubts about the lawfulness of the regulation unveiled by the European Commission in May last year. The legal service of the council of the EU, the decision-making body led by national ministers, has advised the proposed regulation poses a "particularly serious limitation to the rights to privacy and personal data" and that there is a "serious risk" of it falling foul of a judicial review on multiple grounds.
The EU lawyers write that the draft regulation "would require the general and indiscriminate screening of the data processed by a specific service provider, and apply without distinction to all the persons using that specific service, without those persons being, even indirectly, in a situation liable to give rise to criminal prosecution." The legal service goes on to warn that the European court of justice has previously judged the screening of communications metadata is "proportionate only for the purpose of safeguarding national security" and therefore "it is rather unlikely that similar screening of content of communications for the purpose of combating crime of child sexual abuse would be found proportionate, let alone with regard to the conduct not constituting criminal offenses." The lawyers conclude the proposed regulation is at "serious risk of exceeding the limits of what is appropriate and necessary in order to meet the legitimate objectives pursued, and therefore of failing to comply with the principle of proportionality". The legal service is also concerned about the introduction of age verification technology and processes to popular encrypted services. "The lawyers write that this would necessarily involve the mass profiling of users, or the biometric analysis of the user's face or voice, or alternatively the use of a digital certification system they note 'would necessarily add another layer of interference with the rights and freedoms of the users,'" reports the Guardian.
"Despite the advice, it is understood that 10 EU member states -- Belgium, Bulgaria, Cyprus, Hungary, Ireland, Italy, Latvia, Lithuania, Romania and Spain -- back continuing with the regulation without amendment."
The old adage still applies. (Score:4, Insightful)
Re: (Score:1)
Never send anything over the internet that you wouldn't want to see published on the front page of your local newspaper the next morning.
There's nothing magical, special, or unique about the internet to set it apart from any other approach to communications, really. If it's true that you should never send anything over the internet that you wouldn't want to see made public, then you, likewise, should never send anything by ANY means whatsoever because anything sent in any fashion, if intercepted, could end up becoming public knowledge. The only reason that old adage even exists is because once upon a time, there were no good means for peop
Re: (Score:2, Interesting)
You can in Utah starting at age 15 with parent approval.
Re:The old adage still applies. (Score:4, Insightful)
Re: (Score:2)
Never send anything over the internet ...
The modern version is "never send anything to a cloud service": Your data is decrypted (transmission-only security) before being forwarded or saved, or you're depending on the cloud service not handing the keys to law-enforcement until they receive a warrant. Very few personal-use cloud services apply end-to-end encryption.
Re: (Score:2)
Or rather, make sure your own encryption is good and never, ever depend on some for-profit to provide the only encryption layer. Of course, the problems that stem from sending anything to somebody else persist.
"Screening" (Score:5, Interesting)
It looks like those countries wishing to proceed are arguing that it's not the same as bulk interception of data or metadata for the purposes of screening. Instead, the device itself would have a database of known illegal images and compare images the user posts against that database. In that way, privacy can be claimed to have been maintained, since nothing is checked externally or by anyone but the user's own device, which already has to process the content anyway.
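A minimal sketch of that on-device flow, assuming a simple exact-match digest database (real proposals use perceptual hashes such as Apple's NeuralHash, not SHA-256; the sample entry below is just the well-known digest of empty input, standing in for a database record):

```python
import hashlib

# Hypothetical on-device check: digest each outgoing image and compare
# it against a locally stored set of known-bad digests. Nothing leaves
# the device unless a match is found. The entry below is SHA-256 of
# empty input, a placeholder for a real database record.
KNOWN_BAD_DIGESTS = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def should_report(image_bytes: bytes) -> bool:
    """True if the image matches the local database exactly."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_DIGESTS

print(should_report(b""))               # matches the placeholder digest
print(should_report(b"holiday photo"))  # no match
```

Note the weakness the thread goes on to discuss: an exact digest only catches byte-identical copies.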
I think that's a bit weak, and the lawyers are probably right. Furthermore, Apple tried it and it didn't work. Within hours people had demonstrated how two different images can result in the same identity hash, and how trivial changes can alter the hash enough to avoid detection.
Also, I don't know how tech savvy paedophiles are in general, but presumably if this law did come in they would stop using services that implement this scanning.
Re: (Score:2)
Apple at least can keep the code for the visual hash obscured and rate-limit attempts at probing an implementation in the secure enclave. Most current Android phones and PCs don't have any significant capability for running encrypted code (I doubt, say, the Intel ME could run a visual hash at a reasonable rate).
Re: (Score:2)
Apple's much-vaunted "secure enclave" is just the same ARM tech that Android phones use.
Anyway, keeping the algorithm secret just makes it worse. We need to be able to check these technologies to make sure they perform reasonably well, and don't send cops after innocent people. We know from experience what the police are like - they don't bother checking the information properly, they just go arrest and sort it out later. People have been arrested because some cop fat-fingered an IP address, or because facial
Re: (Score:2)
The ISA isn't what's relevant. The isolated processor needs the instructions per second and the operating memory and bandwidth to run the visual hash at a decent rate.
Re: (Score:2)
In this case it is, because it doesn't run arbitrary code. That would be a massive security flaw. It only performs certain cryptographic functions.
If secure code needs to be executed then it can be used to validate it via a certificate. That's how it validates the OS image at boot time, for example. The actual code runs on the main CPU though.
Re: (Score:2)
It doesn't need to run arbitrary code, it just needs to run the perceptual hash implemented by Apple.
The entire perceptual hash code needs similar security guarantees as a single private key, but requires orders of magnitude more processing and memory ... that's the problem: you can't just run it on some microcontroller with a couple kB of embedded RAM like you can the cryptographic key functions.
Re: (Score:2)
The ISA is relevant from the standpoint that it tells us approximately the capabilities of the processor. That is, we know how fast an ARM processor can be.
As for resources, don't forget the ability to store a massive database of naughty hashes.
Re: (Score:2)
There's a whole constitutional aspect here. A company that stumbles across something and reports it is fine. A company that scans and reports as a matter of process can violate the 4th Amendment's warrant requirements if its relationship to the government is too cozy. So they play this silly dance instead of being two ends of a defined pipeline.
Re: (Score:1)
Since this is the EU the 4th Amendment does not apply.
Companies have a legitimate interest in policing their networks, because becoming known as a hot-bed of child pornography is not good for business. Of course, if they choose to go down that route, they must respect user privacy.
Re: "Screening" (Score:5, Interesting)
Even things as benign as traffic tickets are being generated by dubiously crafted computer algorithms and sent out to the wrong people.
I recently got into it with the city of New York when they mailed me a citation for using a toll road without paying. They even attached photographic proof of my dastardly deed as their evidence.
It would have worked out well for them save for a few minor discrepancies, like how the picture showed a Honda Goldwing motorcycle with New Jersey plates but the included form had the license plate number from a nearly 50-year-old Italian car that I own, which is registered in a state some 2000 miles away. Or the fact that I have never, in all my life, set foot in New York or New Jersey and have never owned a Honda Goldwing.
What had happened was that their computer did a hasty analysis of the grainy 300px photograph, transposed one character of the actual offender's license plate, and ran the result through a national database of registered vehicles. My name came up, and they simply barfed out a ticket with my name on it and mailed it off.
Re: (Score:2)
I suspect New Jersey has a public/private partnership with the mafia. I’ve similarly had issues with New Jersey, they constantly send me tickets for some other guy, first time I called and the lady asked me if I wanted to do a “cease and desist” and I did. Next time an actual attorney sent them a letter.
They spam my shit up still.
Before that they also sent me a bill for a rental car that was supposedly on their bridge but was actually parked in another state and the keys were in my pocke
Re: (Score:2)
Reminds me of the time someone got a speeding ticket for driving 307km/h in his Peugeot 307, a car which definitely didn't have a top speed of 307km/h.
But then what happened? You see, errors are only a problem when there's no recourse to deal with them. Did you pay the fine? I'm guessing no, and the situation got dealt with, right?
Re: (Score:2)
I called them when I had a moment of spare free time because I was not about to invest a huge amount of effort into correcting their mistake. Fortunately it was a 10 minute phone call. The woman I got on the phone asked how I'd like to pay for the fine. I politely informed her I was not going to be paying the fine and simply calling them to inform them of their error and give them the opportunity to correct it. I pointed out that the license plate number on the citation didn't even match the license plate i
Re: (Score:2)
The algorithm for these things isn't secret. It's actually just ... hashes. SHA256 most likely.
That's it. Maybe regions of the image are hashed, in case someone tries the simple change-a-pixel-to-change-the-hash trick, but in general the automated scanning is just comparing hashes.
Sure, you can come up with a false positive - it's hashes, you almost certainly will have a collision by definition. Thing is, if it's just one random image colliding, it likely isn't an image of interest. Child porn is hoarde
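One wrinkle with plain SHA-256, for what it's worth: a one-bit edit (the "change a pixel" trick) yields a completely unrelated digest, so an exact hash only catches byte-identical copies. A quick illustration with a made-up byte string standing in for an image file:

```python
import hashlib

original = b"...image bytes..."   # stand-in for a real image file
tweaked = bytearray(original)
tweaked[0] ^= 1                   # flip one bit: the "edit one pixel" trick

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()
print(h1 == h2)                   # False: the digests share nothing useful
```

This is why deployed scanners lean on perceptual hashes rather than cryptographic ones.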
Re: (Score:2)
They were going to use NeuralHash and it's not that.
The problem with collisions is that adversaries can DOS the human reviewers, it's just not practical regardless of security.
Re: (Score:2)
*cough cough* Samsung Knox *cough*
Re: (Score:1)
It looks like those countries wishing to proceed are arguing that it's not the same as bulk interception of data or metadata for the purposes of screening.
"Our peeping is not peeping because we wouldn't do that."
"Our peeping is not peeping because we're the government."
"Our peeping is not peeping because..." ad nauseam.
Just like spammers. [rhyolite.com]
Instead the device itself would have a database of known illegal images, and compare images the user posts to that database.
Either it's local, so it can be reversed or oracled until collisions are found, and then you can weaponize it. Or it's not, and you have a centralised database of badness that can still be oracled for collisions.
Either way, you have to keep the database non-public, because publishing it would enable bad actors, but they're enabled anyway, an
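The "oracled until collisions" point can be illustrated with a birthday search against a deliberately tiny 16-bit hash. Real perceptual hashes are longer (NeuralHash is 96 bits), so the search is costlier, but the principle is the same; `tiny_hash` is a made-up stand-in, not any production algorithm:

```python
import hashlib
import itertools

def tiny_hash(data: bytes) -> bytes:
    # Deliberately truncated 16-bit digest standing in for a short
    # perceptual hash; small output spaces make oracle-style
    # collision searches cheap.
    return hashlib.sha256(data).digest()[:2]

seen = {}
for i in itertools.count():
    candidate = str(i).encode()
    h = tiny_hash(candidate)
    if h in seen:
        collision = (seen[h], candidate)  # two distinct inputs, same hash
        break
    seen[h] = candidate

print(collision)
```

With only 65536 possible outputs, the birthday bound means a collision typically appears after a few hundred queries.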
Re: (Score:2)
As could already be observed with URL blocklists, other material, including material that is actually legal, would quickly be added to the screening lists. Something that can scan images can typically also scan for strings in text, for example, or requires very little modification to do so.
Re: (Score:1)
Exactly, legislators use the child abuse excuse to try to push scanning/snitching systems onto user devices. Certainly it isn't a coincidence that UK and EU are trying to do this at the same time. Later they will be able to scan for whatever they want. Not only in the name of "national security", but sooner or later for regular police work and investigation of petty crimes.
Or perhaps also for espionage and ambushing innocent people to force them into some sort of cooperation with intelligence agencies.
In short, these efforts must be stopped at all costs.
Re: (Score:2)
In short, these efforts must be stopped at all costs.
Indeed. The alternative, longer term, is surveillance-fascism.
Re:"Screening" Edit the image by one pixel. (Score:2)
https://www.geeksforgeeks.org/... [geeksforgeeks.org]
Re:"Screening" Edit the image by one pixel. (Score:5, Interesting)
There are image hashing algorithms that are resilient to small changes, cropping, rotation, scaling, brightness changes and so forth. Unfortunately they are still relatively easy to defeat, both in terms of stopping them matching images and in terms of making them match to completely different images.
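A toy difference-hash (dHash) sketch shows why such hashes survive small changes: each bit records only whether a pixel is brighter than its right-hand neighbour, so a uniform brightness shift cannot flip any bit. The 9x8 grid and the +10 shift below are synthetic inputs for illustration, not any production algorithm:

```python
# Toy difference-hash (dHash): one bit per horizontal pixel pair,
# set when the left pixel is brighter than its right neighbour.
def dhash(pixels):
    bits = []
    for row in pixels:                        # 8 rows of 9 values
        for left, right in zip(row, row[1:]): # 8 comparisons per row
            bits.append(1 if left > right else 0)
    return bits                               # 64-bit hash

# Synthetic 9x8 grayscale grid, then the same grid 10 levels brighter.
image = [[(r * 9 + c) % 23 for c in range(9)] for r in range(8)]
brighter = [[v + 10 for v in row] for row in image]

h1, h2 = dhash(image), dhash(brighter)
hamming = sum(a != b for a, b in zip(h1, h2))
print(hamming)  # 0: the brightness change did not flip a single bit
```

The same relative-comparison structure is what makes these hashes attackable: an adversary only has to nudge enough pixel pairs across their comparison thresholds.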
Machine Learning will never work for this. Even a human can't make that determination reliably. Even if it was possible, how would anyone not familiar with the individual know if a photo was just of their own child in the bath, or for medical purposes?
Re: (Score:2)
Even the best trained humans will probably fail to catch the most careful criminals. Sometimes the crime is perfect in its execution and the perpetrator will get away with their misdeeds. But th
Re: (Score:2)
Also, I don't know how tech savvy paedophiles are in general
I'd expect the same level as every crook, or the rest of the population for that matter: Some ain't, some are, and you'll catch the ones that ain't and feel good about it while the ones that are keep on going.
Re: (Score:2)
Re: (Score:2)
Furthermore, Apple tried it and it didn't work.
To be fair to the human race if we gave up every endeavour when we tried to do something and failed the first time we'd be freezing to death in caves hoping that our deity of choice will bless us with a lightning strike to create that fire thing we still haven't figured out. The fact Apple's implementation failed is irrelevant to the discussion.
I think that's a bit weak, and the lawyers are probably right.
I'm not sure it is weak. EU privacy rules are universally predicated on reasonableness. There is a difference between screening data by a 3rd party, and breaking enc
Re: (Score:2)
Let's just call it "extremely weak", and the CJEU has a habit of asking actual experts as to the facts of the matter.
Fortunately, this court has so far resisted attempts to corrupt it. Unfortunately, it is the only real protection layer against the efforts to establish surveillance-fascism. In the US, the Supreme Court has been successfully compromised and we are now going to see the establishment of totalitarianism, and likely a religio-fascism after that, in slow motion. Let's hope the EU countries remain suffi
Who is paying these people? (Score:4, Insightful)
Where is all the money and lobbying coming from to push this absurdity?
This is clearly a Hail Mary shot at getting a totalitarian surveillance machinery through the broken EU apparatus. It requires all the countries with veto power to make stupid decisions, the EU parliament to be a stupid lapdog and the ECJ to set a stupid precedent. It can certainly happen, but it's very unlikely. If they wanted something practical which would have a good chance to pass they'd just create a legal obligation for messaging/email providers to have lawful intercept ability. It would still prevent E2E, but there is precedent (phone) and proportionality ... it actually has a good chance of passing, without relying on the fact that EU politics is broken.
The total balls to the wall effort at totalitarianism is suspect, only money can move politicians to this.
Re:Who is paying these people? (Score:5, Insightful)
When politicians are failing and think they might lose the next election, this is the kind of bullshit they start pitching. Other popular targets include migrants/asylum seekers, people receiving benefits/welfare, and LGBT+ people.
Any objection is difficult to put across in the era of soundbites and slogans. "We will protect your children" is a very easy sell, and opens up anyone arguing against them to claims that they don't care about child safety, or worse.
Re: (Score:2)
LGBT+ people? A popular target in the EU? In Eastern Europe maybe, but not in the rest of the EU.
Re: (Score:2)
Yes, that is what I meant by "Eastern Europe": Poland and Hungary, for example.
Re: (Score:2)
Details https://notesfrompoland.com/20... [notesfrompoland.com]
Re: (Score:2)
Re: (Score:2)
I get what you mean and you are probably right. That said, there are probably easier targets in Western Europe. But what you said seems plausible. Scary.
Re: (Score:2)
"We will protect your children" ...
Answer:
I want to help, give me your user names and passwords so I can save your children from pedophiles.
What do you mean "No"? So, you don't want your children to be safe?
But this is your plan, we spy on each other like the Stasi.
Only you can act like the Stasi? So, the real message is, 'everyone is an arsehole except you.'
Re: (Score:2, Interesting)
Right wingers need the constant threat of a boogeyman or else their dogma stops working. Like Fox telling viewers that anytime their doorbell rings it’s someone trying to cause harm. Or turning around in the wrong driveway, or playing hide and seek. https://abcnews.go.com/amp/US/... [go.com]
Re: (Score:3)
The main proponent of this scheme is (or was) in prison for accepting bribes. Makes you wonder.
Re: (Score:2)
"only money can move politicians to this."
Not so; votes can also do it. It's called "playing to the gallery." This explanation has the advantage of explaining why they're doing something that has no chance of getting enacted. Enacting it isn't the point; the point is being able to go to your target audience and say, "Look at this wonderful thing I tried to do to protect the children."
I'm tired of this straw man... (Score:5, Insightful)
Nobody likes the idea of child abuse.
But damn, I'm so sick of hearing that it is the perfect excuse to erode all of our privacy and personal liberties in the name of finding abusers.
I'm behind any punishment we want to dole out to someone who abuses a child. I'm just not behind letting cops or anyone else sift through my life to prove that I'm not an abuser.
Re:I'm tired of this straw man... (Score:5, Insightful)
If they actually cared about child abuse the people participating in Epstein island would be exposed.
This is about eavesdropping on citizens.
Re: (Score:2)
If they actually cared about child abuse the people participating in Epstein island would be exposed.
This is about eavesdropping on citizens.
Obviously. And it never was about anything else.
Re: (Score:2)
Even career criminals will beat the hell out of a child abuser in jail. It really tells you something about our political class, who would instead let them escape justice, or even keep abusing children, to try to extort them.
How could this even get so far? (Score:2)
The EU has been trying to push a lot of insanity lately, with the upload filters and this message scan.
Can anyone explain to me how this could have remotely gotten this far?
Every dictatorship in the world would kill to have this. How was this not shot down on the first day in the EU???
And we all know they will keep trying...
Re: (Score:3)
Because the EU is last to the party...
US has this (cf Snowden), China has this, US having this implies all the parties to the 5 eyes agreement have this, Israel has it...
The EU is partially blind to its own citizens. A "state" can't have that. *
Furthermore, it appears EU courts (and those of quite a lot of member states) don't play ball with the argument that state security trumps privacy.
*It could also mean that in a weird twist of events EU internal surveillance agencies get a substantial amount of their
Funny (Score:4)
It's funny how the EU is all about privacy and data protection, until it comes time for the govt to have access.
Re: (Score:2)
Certain factions would prefer to divert your privacy concerns toward stopping corporations from figuring out whether you're more likely to buy Depends or Pampers, and away from government holding the holy grail of tyranny: a panopticon.
Re: (Score:2)
It's funny how the EU is all about privacy and data protection, until it comes time for the govt to have access.
There is no single "EU", as there is no single "United States of America".
US has political parties, government departments, civil liberties organizations, corporations - all with different agendas and end goals.
EU (as a central organization) has Parliament, Council, Commission and so on. Actual lawmaking happens in Commission. Which is split up into DG-s, Directorates General. And those DG-s fight each other for power and influence.
In telecommunications security, two DG-s are involved. DG Home, for "home af
Parents (Score:5, Interesting)
After thousands of dollars in legal fees and several months, they finally won the right to keep their baby... Why do parents take such pictures? I do not know. What I do know is that the great majority of parents are probably not paedophiles; some small percentage probably are. So sweeping generalizations here are both right and wrong at the same time.
Hence this is most likely a bad law and will result in kids taken from parents who are innocent.
Re:Parents (Score:4, Insightful)
There also was a comment by an expert in Germany that most such pictures sent are not actually abuse material but teens sending pictures of themselves because they can. It is also high time to stop criminalizing that. Teens are horny and stupid. Now give them a smartphone with a camera....
As for the claim by the surveillance proponents that the availability of such material (however created) increases child abuse, there is reliable evidence to the contrary. Instead it works as all porn does: the more available the porn, the less rape. I am not saying this material should be legal to sell, obviously, because that could create a market for abuse for money. And if such material shows actual child abuse, and not some horny teen who thinks this is a good idea, the ones being abused and the ones doing the abuse should be identified and the abusers stopped. But that is it. This whole scanning push is about good intentions and the road to hell, and, for those who know it is not actually going to help, about lying by misdirection and establishing a surveillance state.
Re: (Score:2)
Hence this is most likely a bad law and will result in kids taken from parents who are innocent.
Hey, the EU can finally pick up some pointers from the US on treating its citizens with absolute disdain! I'm sure the government officials would rather take away a million babies from innocent parents than risk one abuser getting away with it! I mean, that's certainly the way these laws are always pushed.
I'm not sure how the world got so flipped around, but this concept that we're all guilty until / unless we can absolutely 100% prove that we aren't is seriously fucked up and has got to stop. Having our go
Re: (Score:1)
Re: (Score:2)
There was also a more recent case not involving film, but Google. A parent took and sent pictures of their sick child's rash to, and at the request of, their pediatrician. This was during COVID when everyone was still being all fussy about actually physically going to the doctor. I don't recall if it was code in Android itself spying on the parent and camera, or if the images were scanned when uploading to Google's cloud photo library or when they were sent via text or email... but something was broken i
Re: (Score:2)
It was on Slashdot a while ago: https://tech.slashdot.org/stor... [slashdot.org]
How Would it Differentiate Abuse from Ageplay/BDSM? (Score:1)
Any evidence of widespread child-abuse? (Score:2)
I shake my head at the money spent trying to track down abusers, and even more so at the attempts to hamstring technology to better spy on all users in the name of finding child-abusers, but is there any evidence to justify all of these measures?
It seems the child-abuse issue is paraded about, but I strongly doubt their expensive and intrusive measures are aimed only at child-abuse issues. It seems like they are looking for a way to monitor everyone's day-to-day activity to prosecute any sort of infraction
Too Late to Help Me (Score:2)
to alert police if they found evidence of .. the grooming of children.
I used to HATE when my mom tried to brush my hair or wipe my face, especially in public.
Had this law been enacted a couple of months ago, I could have made her stop!