UK Police's Porn-Spotting AI Keeps Mistaking Desert Pics for Nudes (gizmodo.com) 144
An anonymous reader quotes Gizmodo:
London's Metropolitan Police believes that its artificial intelligence software will be up to the task of detecting images of child abuse in the next "two to three years." But, in its current state, the system can't tell the difference between a photo of a desert and a photo of a naked body... "Sometimes it comes up with a desert and it thinks it's an indecent image or pornography," Mark Stokes, the department's head of digital and electronics forensics, recently told The Telegraph. "For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin colour."
The article concludes that the London police software "has yet to prove that it can successfully differentiate the human body from arid landscapes."
I guess it's down to camels now: (Score:5, Funny)
One hump, or two?
Re: (Score:1)
What do you call a three-humped camel?
Re: (Score:3, Funny)
Pregnant!
Re: (Score:3)
Beavis: That chick has three boobies!
Butthead: That's her belly fartknocker.
Re: (Score:2)
You mean they weren't watching Total Recall?
Re: (Score:2)
Re: (Score:2)
All the humps. I love fucking sand!! It got in my wife's Yahoo one summer and now I can't stay erect without it. We are divorced, but nothing sends me over the moon like sand dunes. In the shape of tits. Or hyperbolae. Or straight lines. Or curved straight lines. Or sandboxes.
Hmmmm sand box.
Re: (Score:2)
I didn't think there were many people left who were willing to admit to being a sand-botherer in public. Not since the convictions of the pavement-masturbator.
Not one of the above concepts is invented - all real "kinks".
Re: (Score:2)
Camels? ... Oh!
I was thinking of cherries on ice-cream scoops.
Eeeeeh?! (Score:5, Insightful)
""For some reason, lots of people have screen-savers of desert"
Are they really openly admitting to RAT-surveilling everyone?! What am I missing?
Re: (Score:2)
This seems to be an automated porn scanner that they can throw on a PC they collected as evidence. I guess the idea is to save them some time looking for porn manually.
I seem to recall that deserts were one of the Windows 7 wallpaper packs. Windows 8 included them by default too, and Windows 10 displays a random landscape from Bing on the login screen.
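For a sense of what that kind of triage tool might look like, here is a minimal Python sketch; score_image is a hypothetical stand-in for whatever classifier the Met actually uses, not its real API:

```python
import os
from pathlib import Path

IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif", ".bmp"}

def score_image(path):
    """Stand-in for the real classifier: returns a 0..1 'suspicion' score.

    Hypothetical placeholder -- plug in whatever model the forensic tool uses.
    """
    return 0.0

def triage(root):
    """Walk a mounted copy of a seized disk and rank pictures for human review."""
    scored = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            p = Path(dirpath) / name
            if p.suffix.lower() in IMAGE_EXTS:
                scored.append((score_image(p), p))
    # Highest-scoring images go to the top of the reviewer's queue.
    return sorted(scored, key=lambda item: item[0], reverse=True)
```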
UK police scanning your screen saver images! (Score:5, Insightful)
WTF are police doing scanning the images used for screen savers? Those aren't, generally, on a public server.
If the UK cops have a trojan on UK computers, I think the rest of the world should get to work burying their servers in noise. Antivirus definitions need updating ASAP.
Re:UK police scanning your screen saver images! (Score:4, Funny)
Who is General Failure and what is he doing reading my hard drive?
Re: (Score:1)
He is the superior of Major Malfunction, and he is looking for Private Pictures.
Re:UK police scanning your screen saver images! (Score:4, Funny)
Re: UK police scanning your screen saver images! (Score:2)
This is the UK, where they pronounce it correctly.
Re: (Score:2)
Re: (Score:2)
In this case it sounds like they need to call on Private Browsing.
Re: (Score:2)
Most Windows distributions come with Remote Administration enabled. That allows for remote monitoring. A relative had a problem with her PC, called up technical support and got a technician to fix her PC remotely. She saw the cursor whizzing around, windows opening and closing, text being typed in. She thought it was marvelous. Didn't quite understand that meant anyone could be monitoring what she typed. Then some applications seem to stream telemetry via Amazon Web Services (AWS). Even a web browser has ac
Re: (Score:3)
Re: (Score:2)
GET OUT!
Re: (Score:2)
Okay how's this for an idea: FUSE python and PIL to generate an infinite directory tree containing random image data which matches the skin filter without being actually pornographic...
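Leaving the FUSE filesystem part aside, the image-generation half of that idea is a few lines of PIL; a toy sketch, with RGB ranges that are only a guess at what a naive skin-colour filter keys on:

```python
import random
from PIL import Image

def random_skin_noise(width=640, height=480):
    """Generate an image of random pixels that sit in a typical 'skin tone' RGB band.

    The colour ranges below are assumptions about what a naive skin-colour
    filter might match; they are not taken from any real forensic tool.
    """
    img = Image.new("RGB", (width, height))
    px = img.load()
    for y in range(height):
        for x in range(width):
            r = random.randint(180, 255)
            g = random.randint(120, 190)
            b = random.randint(90, 160)
            px[x, y] = (r, g, b)
    return img

if __name__ == "__main__":
    random_skin_noise().save("desert_or_is_it.png")
```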
Re: (Score:2)
Okay so there are some false positives.
The reason for the false positives is almost certainly a dearth of desert pictures in the NN training set. So download 10,000 desert images, and retrain.
Doesn't this really mean they need more people to sort the results? There is no way the algorithm alone will be enough for a conviction.
Umm. The purpose of the program is to flag suspicious images for human review. There is no fricken way that the program's output is going to be directly admissible in court.
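For what it's worth, the "download 10,000 desert images and retrain" suggestion above is easy to sketch with a toy colour-histogram classifier; the folder names are hypothetical and this bears no resemblance to a production forensic tool:

```python
import numpy as np
from pathlib import Path
from PIL import Image
from sklearn.linear_model import LogisticRegression

def colour_histogram(path, bins=8):
    """Reduce an image to a coarse, normalised RGB histogram feature vector."""
    img = Image.open(path).convert("RGB").resize((64, 64))
    pixels = np.asarray(img).reshape(-1, 3)
    hist, _ = np.histogramdd(pixels, bins=(bins,) * 3, range=((0, 256),) * 3)
    return hist.ravel() / pixels.shape[0]

def load_folder(folder, label):
    paths = sorted(Path(folder).glob("*.jpg"))
    return [colour_histogram(p) for p in paths], [label] * len(paths)

# Hypothetical folders: flagged material, ordinary negatives, and the newly
# downloaded desert photos added as extra negative examples.
X, y = [], []
for folder, label in [("positives/", 1), ("benign/", 0), ("deserts/", 0)]:
    feats, labels = load_folder(folder, label)
    X += feats
    y += labels

clf = LogisticRegression(max_iter=1000).fit(np.array(X), np.array(y))
```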
Re: (Score:1)
BARE NAKED DUNES WAITING FOR YOU! (Score:5, Interesting)
African, Asian, European desert lands all uncovered and untouched by man. Clean shaven of all foliage. Aching for a BBC (big black camera) to shoot all over their majestic vistas. They want long lenses and offer roads for deep penetrations into their wildest areas. Only $9.95/mo gets you all access to hours of video and thousands of pictures. Round, brown mounds of dirt, golden, flowing sands, fiery red sunsets are all waiting for you!
Re: (Score:2)
"Round, brown mounds of dirt, golden, flowing sands, fiery red sunsets are all waiting for you!"
Don't forget the snake holes.
Re: (Score:2)
A naked person in a desert kills the system ... (Score:1)
Re: (Score:2)
Nobody likes a sandy vagina.
Re: (Score:2)
Nobody likes a sandy vagina.
Africans do, apparently.
http://www.dailymail.co.uk/hea... [dailymail.co.uk]
https://www.vice.com/en_au/art... [vice.com]
Re: (Score:2)
Once you've fed "Paradise (1982)" into the AI, all arid scenes will trigger that response. It doesn't matter the age. Same as humans.
The big question (Score:5, Interesting)
The big question is why the hell do they bother to detect nudes? I would even ask why the hell do they want to spot pornography?
Both are very common human expressions and they should not be a reason for police investigation.
Re: (Score:2)
Considering that they're specifically examining the computers of people suspected of harming children, it would make a lot of sense that it would be done with the goal of protecting those children.
It is one thing if what they're doing has nothing to do with children, and they use it as an excuse anyways. But if it does involve children, and you're saying that anyways, it makes me wonder if they should be looking at your computer, too!
Re: (Score:1)
Indeed, I imagine the ultimate goal of this software is to automatically detect childhood abuse pictures on seized computers. This way they don't have to traumatize and burn out human staff on visually checking the images on a computer. It also saves them the time of having to look through the millions of pictures (mostly cats) possibly hosted on the average computer.
Re: (Score:2)
Indeed, I imagine the ultimate goal of this software is to automatically detect childhood abuse pictures on seized computers. This way they don't have to traumatize and burn out human staff on visually checking the images on a computer. It also saves them the time of having to look through the millions of pictures (mostly cats) possibly hosted on the average computer.
Yes, it would save them the time of having to look through millions of pictures, but the software alone cannot be used to form a conviction. At some point a real person is going to have to review the images to confirm their contents as flagged by the computer. This won't prevent trauma and burnout among human staff, but it will speed up investigations greatly.
Re: (Score:2)
Re:The big question (Score:4, Insightful)
Well, if they're doing it by looking for a superabundance of Caucasian skin tones, they're doing a fucking horrible job of it. And apparently black kids don't count.
Re: (Score:1)
I think what they are looking for specifically are images of Child Sexual Abuse. (Which is not pornography, it is abuse, which is why the police are involved).
And yet in no other case in our entire legal code is mere possession of photographic or video evidence of a crime itself a crime. Pictures of ISIS beheading people in Syria? Totally legal. Picture of dead body at scene of drive by shooting? Totally legal. Why should this one crime be carved out from among all of the others as a special case? Perhaps you would argue that victims of abuse suffer knowing that images of that abuse are still out there but then do the victims of other crimes suffer less? Is one f
Re: (Score:2)
Two reasons (Score:2)
Re: (Score:2)
Haven't they heard the good news about Sithrak [oglaf.com]?
Re: (Score:2)
The big question is why the hell do they bother to detect nudes? I would even ask why the hell do they want to spot pornography? Both are very common human expressions and they should not be a reason for police investigation.
Limited time and resources? Here's a suspect's computer, it has 10k images on it. Run it through the system and rank the photos most likely to contain illegal pornography. I'd be surprised if they didn't try to make some kind of automated ranking. Narrowing it down to porn would be a start. That said, if they're triggering on deserts they must be doing something really stupid, like training it exclusively on porn - which would have a lot of those colors - and then trying to apply that to general images. Th
Re: (Score:2)
They say they're scouting for child porn. How can their software tell the difference between an adult body and a child body if it can't differentiate a nude from a dune?
Legit usage (Score:5, Funny)
There are legit use cases for this software. I, for one, would not trust people who jack off to images of deserts. They should be put on a police list of some sort, so that I know to avoid them.
I played San Andreas.... (Score:2)
Would that be "white" sand? (Score:2)
Racists!
It's a simple coding error (Score:5, Funny)
DUNE is a misspelling of NUDE.
Re: (Score:2)
Nude.
Nude Messiah.
Children of Nude.
God Emperor of Nude.
Heretics of Nude.
Chapterhouse: Nude.
Re: (Score:2)
Funny coincidence, I read the Dune series when I was 14, too!
I think my reading went about the same way. Never figured out what the boring parts were supposed to be.
Re: (Score:3)
Oh, it gets better.
The series was translated into my native language. The titles as well.
Do you know what a dune is called in Danish?
Klit.
I'm not joking.
Obesity (Score:2)
Perhaps this says something about the obesity epidemic...
Legs or Hot Dogs? (Score:2)
AI? (Score:1)
Re: (Score:2)
No, they're not trying to get the computer to do all the police work, they're just trying to use it to narrow it down to nude images so that the amount of work they have to do is much smaller. This is for searching people who are already suspects.
Re: (Score:2)
Generally speaking they're not going to be trying to distinguish a 17 year old from an 18 year old. They're going to be trying to distinguish a 10 year old from an 18 year old. And there's definitely some obvious physiological differences there (in most instances.)
Re: (Score:2)
Re: (Score:2)
That's kind of my point. It's not easy, even for a human, to distinguish a 17 year old from an 18 year old based on pure visual cues. But other than extreme cases of precocious puberty, it's usually pretty easy for a human to tell a 10 year old from an 18 year old. Meaning that at least in principle, the latter should be doable by a smart enough AI (which is not to say such an AI is necessarily easy to build, of course, just that it should be possible given how easy it is for us humans to do the job).
Is there something left unsaid here? (Score:2)
How else would they become aware of the types of screensavers people are using? Not sure but I find this more than slightly creepy.
Is this more of that famous Nanny State® at work again? "Think Of The Children" gone wrong...
Re: (Score:2)
I mean, correct me if I am wrong, but doesn't the only way to see what images people have on their desktops involve the Metropolitan Police poking around and doing some sort of unauthorized remote access into people's machines?
Where did you dredge up the word "unauthorized?" Why do you just presume that? Did you even know that child sex abuse is illegal, and that a warrant granted to search for evidence of the crime would be a legit authorization? Surely this is a tool to help the Metropolitan police poke around during authorized access.
Re: (Score:2)
He gets it from the TFS being kind of misleading and making it sound like the police are scanning everyone's computers.
Which may of course be authorized if the government is shitty enough... and not implausible in the UK these days, as there's been a push there for a full-on surveillance state for five or so years now.
Re: (Score:2)
When you're not presented with a boundary, don't assume an absolute is implied.
Re: (Score:2)
And with the way governments around the world have been going in the past few years with regard to mass surveillance, failing to at least consider the absolute would be naive... sometimes even when you are presented with a boundary.
Re: (Score:2)
No, extremist hyperbole does not actually change the logical value of fake absolutes at all.
It is only the shape your stupidity takes, it is not any sort of excuse.
Re: (Score:2)
I'm uncertain which is more creepy: the police remotely accessing your computer to check your screen saver, or people having child porn pictures as a screen saver.
Spotted Dick (Score:2)
Skin detection algorithm (Score:2)
Re: (Score:2)
I came across an extremely simple skin detection algorithm some time ago and made a GLSL implementation... It is actually quite robust if the lighting conditions of the scene don't change too much.
How robust is it if the subject is clad from head to toe in black latex?
As for deserts, google image search "sexy iceberg".
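The rule hinted at in the next comment's title (50 < R - G < 250) is simple enough to try with NumPy and PIL; a minimal per-pixel sketch, assuming that threshold, and not the commenter's actual GLSL shader:

```python
import numpy as np
from PIL import Image

def skin_fraction(path):
    """Fraction of pixels whose red-minus-green difference falls in a crude
    'skin' band (the 50 < R - G < 250 rule from the thread below; real skin
    detectors are considerably more involved)."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    diff = rgb[..., 0] - rgb[..., 1]          # R - G per pixel
    mask = (diff > 50) & (diff < 250)
    return float(mask.mean())

# A desert photo and a nude can both score high on a filter like this,
# which is exactly the failure mode the article describes.
```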
50 < R - G < 250 (Score:2)
Re: 50 < R - G < 250 (Score:2)
Re: (Score:2)
Yeah, mistaking a desert for porn. (Score:2)
banana split [pinimg.com]
Deserts and Virgins (Score:1)
This is a tremendous advance! (Score:2)
Deserts in Nevada are legal... (Score:2)
...if you are into that.
Re: (Score:2)
Oh no, the Mojave desert is very young compared to most; only 10,000 years ago there were lakes and marshes there. Only deserts that are millions of years old should be attractive to you, you pedo!
Maybe it's dyslexic (Score:1)
Maybe it's dyslexic, and can't tell the difference between small nudes and small dunes.
ooh (Score:1)