UK Pulls Back From Clash With Big Tech Over Private Messaging (ft.com) 20
The UK government will concede it will not use controversial powers in the online safety bill to scan messaging apps for harmful content until it is "technically feasible" to do so, postponing measures that critics say threaten users' privacy. Financial Times: A planned statement to the House of Lords on Wednesday afternoon will mark an eleventh-hour bid by ministers to end a stand-off with tech companies, including WhatsApp, that have threatened to pull their services from the UK over what they claimed was an intolerable threat to millions of users' security. The statement is set to outline that Ofcom, the tech regulator, will only require companies to scan their networks when a technology is developed that is capable of doing so, according to people briefed on the plan. Many security experts believe it could be years before any such technology is developed, if ever.
"A notice can only be issued where technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content," the statement will say. The online safety bill, which has been in development for several years and is now in its final stages in parliament, is one of the toughest attempts by any government to make Big Tech companies responsible for the content that is shared on their networks.
"A notice can only be issued where technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content," the statement will say. The online safety bill, which has been in development for several years and is now in its final stages in parliament, is one of the toughest attempts by any government to make Big Tech companies responsible for the content that is shared on their networks.
Technically Feasible (Score:5, Insightful)
I hope they are willing to accept "Never", as in "it'll never be technically feasible."
"There's no way to build a digital lock that only angels can open and demons cannot. Anyone saying otherwise is either ignorant of the mathematics, or less of an angel than they appear." [ref: CGP Grey [youtube.com]]
Re: (Score:2, Insightful)
Re: Technically Feasible (Score:1)
It's not always the gammons or Trumpsters. Regardless of which party is running the show, the UK skews towards an authoritarian state, always looking to spy ever more, regardless of how infeasible it may be.
Re: (Score:2)
Sort of. We've moved from a conservative government to a populist one. They will announce absolutely anything that they think will give them positive press, and then either drop it, roll it back, or do the opposite when they think nobody is looking.
Re: (Score:3)
Re: Technically Feasible (Score:4, Funny)
Re: (Score:2)
Yes they will. This is just a way of dropping the plan without actually admitting it.
Two within a fortnight! (Score:3)
It hasn't been a fortnight since another attempt to regulate a tech issue was pulled back over technical difficulties. The post about that was https://tech.slashdot.org/stor... [slashdot.org].
However, I now remember this story ( https://tech.slashdot.org/stor... [slashdot.org] ) of a father who was locked out of Google services and reported to the police because a picture intended for a doctor was flagged as child abuse.
I CALL BULLSHIT (Score:5, Insightful)
If it isn't possible, then don't make the law until it is. This is just a stalling tactic, and the issue WILL arise again, because of course it will. They have this law and they MUST use it.
Re: I CALL BULLSHIT (Score:4, Insightful)
Yep. We'll be told it's to catch the terrorists, never to be abused, yet within months we'll learn it's being used to catch people putting out their bins on the wrong day. That and benefits fraud.
If military hardware intended for defence was as prone to feature creep then Raytheon would be on first name terms with all local councils in the UK, the councils claiming the Paveway was always intended for use in managing street trading without a loicence.
Hurray for Freedom! (Score:3)
Hurray for Freedom!
It's already technically feasible because... (Score:5, Interesting)
Right, anyone with enough access to the hardware (like a government) could set up a pipeline of screenshots fed into text recognition and then searched for keywords.
Bingo, bango, bongo. Feasible AF. Consumer grade encryption can't actually exist.
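A minimal sketch of the keyword stage of the pipeline the parent describes, assuming the screenshot and OCR steps have already produced plain text. The watchlist terms and message are invented placeholders:

```python
# Toy keyword-matching stage for an OCR'd-screenshot pipeline.
# WATCHLIST terms are hypothetical examples, not from any real system.

WATCHLIST = {"password", "meet at", "wire transfer"}

def flag_text(extracted_text: str) -> set:
    """Return the watchlist terms that appear in the extracted screen text."""
    lowered = extracted_text.lower()
    return {term for term in WATCHLIST if term in lowered}

hits = flag_text("Let's meet at the usual place, I'll bring the password.")
print(sorted(hits))  # ['meet at', 'password']
```

The point being: none of this needs to break the encryption itself, it just reads the screen after decryption, which is why "consumer grade encryption can't actually exist" against an adversary with that level of device access.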
Re: (Score:1)
Only works if a secondary app can get access to a hardware bus or the screen buffer. Why do you think Apple (for all their anti-repair shenanigans) is code-locking everything? Part of it is to stonewall the repair market. Part of it is to prevent someone slipping in an after-market piece of hardware to siphon data out of one of the hardware buses.
Personally, I think the technology exists to develop a process whereby:
If you physically have the hardware (phone).
And are the manufacturer.
And have received
Re: (Score:3)
Only works if a secondary app can get access to a hardware bus or the screen buffer. Why do you think Apple (for all their anti-repair shenanigans) is code-locking everything? Part of it is to stonewall the repair market. Part of it is to prevent someone slipping in an after-market piece of hardware to siphon data out of one of the hardware buses.
Personally, I think the technology exists to develop a process whereby: If you physically have the hardware (phone). And are the manufacturer. And have received a court order. You (the manufacturer) should be able to unlock and extract its contents.
The key thing being physical possession of the device. Like some sort of cryptographic key (unrelated to the serial number) that is physically stored on the phone, which you have to take the phone apart to obtain and which cannot be obtained through electronic means (software, API, etc.).
If the hole exists, no matter how secure we all think it might be, it'll be abused. Period. Don't create the hole. Not for the "manufacturer only." Not for the government judgment brigade. Not for the wanna-be nannies. Not for anybody.
Re: (Score:3)
Personally, I think the technology exists to develop a process whereby: If you physically have the hardware (phone). And are the manufacturer. And have received a court order. You (the manufacturer) should be able to unlock and extract its contents.
Of course the technology exists to create such an unlock method. This was the crux of the FBI vs Apple thing a few years back... The feds tried to compel Apple to create a way to break into a dead terrorist's iPhone, and Apple refused.
This was upheld on the basis that you cannot (in the USA) compel someone to do something outside the course of their ordinary business, i.e. the government cannot compel them to provide something that they do not otherwise offer as a service. -If they had already created a bypa
Re: (Score:3)
Allow me to introduce you to Australia's "Assistance and Access Bill (2018)". The act defines "Assistance" as "we can order any software company to provide us with software we specify (for which they will be recompensed)". Not stated explicitly is that this covers "the software shall not be reported as a virus or malware, and could be modified versions of (say) your screen and touch/keyboard drivers that report everyth
Eventually, this is what they'll mandate: (Score:1)
Eventually, they will simply mandate that all traffic which is encrypted must be encrypted both to the government's public key, as well as the intended recipients' public key(s). Laws will be passed making failure to include the government's key in any encrypted traffic a crime with serious penalties. A few examples will be made via aggressive prosecution of the first violators of this law "pour encourager les autres" [merriam-webster.com].
It will be claimed that the traffic will not be decrypted by the government unless they
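The structure the parent predicts is just multi-recipient hybrid encryption: one content key, wrapped once per recipient, so a mandated government key is simply one more wrap in the header. A toy sketch of that structure (the XOR "wrapping" here stands in for real public-key wrapping like RSA-OAEP and is NOT actual cryptography; all names are invented):

```python
# Toy multi-recipient envelope: a random content key is "wrapped"
# once per recipient. A mandated government key is one more entry.
# XOR stands in for real key wrapping; do not use this for anything.

import secrets

def wrap(content_key: bytes, recipient_secret: bytes) -> bytes:
    # Stand-in for public-key wrapping (XOR is its own inverse).
    return bytes(a ^ b for a, b in zip(content_key, recipient_secret))

def encrypt_for(recipients: dict, plaintext: bytes):
    content_key = secrets.token_bytes(32)
    # One wrapped copy of the content key per recipient.
    header = {name: wrap(content_key, sk) for name, sk in recipients.items()}
    # Toy stream "cipher": XOR plaintext against the repeated key.
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, content_key * 64))
    return header, ciphertext

recipients = {
    "alice": secrets.token_bytes(32),
    "gchq":  secrets.token_bytes(32),   # the hypothetical mandated extra key
}
header, ct = encrypt_for(recipients, b"hello")
print(sorted(header))  # ['alice', 'gchq']
```

Anyone holding any one of the recipient secrets can unwrap the content key and read the message, which is exactly the objection: the "extra key" holder has the same access as the intended recipient, forever.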
Uh-huh (Score:4, Insightful)
The UK government will concede it will not use controversial powers
Until it changes its mind.
The real purpose (Score:2)
Has the censor defined this? Is it a percentage, and how is it calculated? You can be sure the law says none of this, allowing the government in power to label anyone a criminal, anytime. Let's not forget that nudity is common in small groups: Parents send bath-time photos of children to themselves and family members, or naked-person photos to medical staff. Also, teens send risque photos of themselves, although that's already controlled.
The real purpose of this bill isn't detecting CS-AM, it's allowi