AI | Crime | United Kingdom

UK Police's Porn-Spotting AI Keeps Mistaking Desert Pics for Nudes (gizmodo.com) 144

An anonymous reader quotes Gizmodo: London's Metropolitan Police believes that its artificial intelligence software will be up to the task of detecting images of child abuse in the next "two to three years." But, in its current state, the system can't tell the difference between a photo of a desert and a photo of a naked body... "Sometimes it comes up with a desert and it thinks it's an indecent image or pornography," Mark Stokes, the department's head of digital and electronics forensics, recently told The Telegraph. "For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin colour."
The article concludes that the London police software "has yet to prove that it can successfully differentiate the human body from arid landscapes."
This discussion has been archived. No new comments can be posted.

  • by fyngyrz ( 762201 ) on Saturday December 23, 2017 @02:51PM (#55796759) Homepage Journal

    One hump, or two?

    • What do you call a three-humped camel?

    • And how many toes?
    • All the humps. I love fucking sand!! It got in my wife's Yahoo one summer and now I can't stay erect without it. We are divorced, but nothing sends me over the moon like sand dunes. In the shape of tits. Or hyperbolae. Or straight lines. Or curved straight lines. Or sandboxes.

      Hmmmm sand box.

      • Ah. An arenophile. How ... quaint.

        I didn't think there were many people left who were willing to admit to being a sand-botherer in public. Not since the convictions of the pavement-masturbator.

        Not one of the above concepts is invented - all real "kinks".

    • by quenda ( 644621 )

      Camels? ... Oh!
        I was thinking of cherries on ice-cream scoops.

  • Eeeeeh?! (Score:5, Insightful)

    by Anonymous Coward on Saturday December 23, 2017 @02:52PM (#55796763)

    ""For some reason, lots of people have screen-savers of desert"

    Are they really openly admitting to RAT-surveilling everyone?! What am I missing?

    • by AmiMoJo ( 196126 )

      This seems to be an automated porn scanner that they can throw on a PC they collected as evidence. I guess the idea is to save them some time looking for porn manually.

      I seem to recall that deserts were one of the Windows 7 wallpaper packs. Windows 8 included them by default too, and Windows 10 displays a random landscape from Bing on the login screen.

  • by HornWumpus ( 783565 ) on Saturday December 23, 2017 @02:53PM (#55796771)

    WTF are police doing scanning the images used for screen savers? Those aren't, generally, on a public server.

    If the UK cops have a trojan on UK computers, I think the rest of the world should get to work burying their servers in noise. Antivirus definitions need updating ASAP.

    • by PPH ( 736903 ) on Saturday December 23, 2017 @03:19PM (#55796891)

      Who is General Failure and what is he doing reading my hard drive?

    • by mikael ( 484 )

      Most Windows distributions come with Remote Administration enabled. That allows for remote monitoring. A relative had a problem with her PC, called up technical support and got a technician to fix her PC remotely. She saw the cursor whizzing around, windows opening and closing, text being typed in. She thought it was marvelous. Didn't quite understand that meant anyone could be monitoring what she typed. Then some applications seem to stream telemetry via Amazon Web Services (AWS). Even a web browser has ac

    • Reading the articles it seems that the police are using the AI to search through the machines of _suspects_. This implies that they have custody of the suspect's physical machines.
      • Reading the articles

        GET OUT!

      • Okay how's this for an idea: FUSE python and PIL to generate an infinite directory tree containing random image data which matches the skin filter without being actually pornographic...
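
        Something along these lines, assuming Pillow (PIL); the colour ranges below are only a guess at what a naive skin filter might match, and the FUSE mount itself (e.g. with fusepy) is left out for brevity:

```python
# Sketch: generate random "skin coloured" noise images with Pillow.
# The RGB ranges are assumptions about what a naive skin filter might
# match, not the Met's actual thresholds; serving these through an
# infinite FUSE tree is left as an exercise.
import random
from PIL import Image

def skin_noise_image(width=256, height=256):
    img = Image.new("RGB", (width, height))
    img.putdata([
        (random.randint(150, 255),   # R: warm, skin-ish tones
         random.randint(80, 180),    # G
         random.randint(60, 150))    # B
        for _ in range(width * height)
    ])
    return img

if __name__ == "__main__":
    for i in range(10):              # stand-in for "infinite"
        skin_noise_image().save(f"noise_{i:03d}.png")
```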

  • by SeaFox ( 739806 ) on Saturday December 23, 2017 @03:02PM (#55796823)

    African, Asian, European desert lands all uncovered and untouched by man. Clean shaven of all foliage. Aching for a BBC (big black camera) to shoot all over their majestic vistas. They want long lenses and offer roads for deep penetrations into their wildest areas. Only $9.95/mo gets you all access to hours of video and thousands of pictures. Round, brown mounds of dirt, golden, flowing sands, fiery red sunsets are all waiting for you!

    • "Round, brown mounds of dirt, golden, flowing sands, fiery red sunsets are all waiting for you!"

      Don't forget the snake holes.

    • To each their own, but you can keep your strip malls. It's only bush country for me.
  • A picture of a nude person in a desert must render this thing catatonic ;)
  • The big question (Score:5, Interesting)

    by Teun ( 17872 ) on Saturday December 23, 2017 @03:16PM (#55796871)
    The big question is why the hell do they bother to detect nudes?
    I would even ask why the hell do they want to spot pornography?

    Both are very common human expressions and they should not be a reason for police investigation.
    • I think what they are looking for specifically are images of Child Sexual Abuse. (Which is not pornography, it is abuse, which is why the police are involved).
      • by Chris Mattern ( 191822 ) on Saturday December 23, 2017 @04:51PM (#55797279)

        I think what they are looking for specifically are images of Child Sexual Abuse.

        Well, if they're doing it by looking for a superabundance of Caucasian skin tones, they're doing a fucking horrible job of it. And apparently black kids don't count.

      • by Anonymous Coward

        I think what they are looking for specifically are images of Child Sexual Abuse. (Which is not pornography, it is abuse, which is why the police are involved).

        And yet in no other case in our entire legal code is mere possession of photographic or video evidence of a crime itself a crime. Pictures of ISIS beheading people in Syria? Totally legal. Picture of a dead body at the scene of a drive-by shooting? Totally legal. Why should this one crime be carved out from among all of the others as a special case? Perhaps you would argue that victims of abuse suffer knowing that images of that abuse are still out there, but then do the victims of other crimes suffer less? Is one f

      • Looking for CP is just the excuse, since we're nowhere near an AI that could distinguish between a naked child and a naked adult, especially since we insist on calling teenager sexts CP. Consider where this is: the UK, where they have a long-standing history of trying to restrict adult porn too.
    • First, Christians. There's a concept I call "Aggregate Sin" because I can't think of a better name for it. Christians (the ones who take the Bible literally) believe that if there's too much sin in the world God will punish us, e.g. God punishes the Faithful for the sins of the heretic. Therefore sin, any sin, is an existential threat to them. If you extrapolate from the tales of Sodom & Gomorrah & Noah's Ark this makes sense. You see this most commonly when a natural disaster strikes and a bunch of th
    • by Kjella ( 173770 )

      The big question is why the hell do they bother to detect nudes? I would even ask why the hell do they want to spot pornography? Both are very common human expressions and they should not be a reason for police investigation.

      Limited time and resources? Here's a suspect's computer, it has 10k images on it. Run it through the system and rank the photos most likely to contain illegal pornography. I'd be surprised if they didn't try to make some kind of automated ranking. Narrowing it down to porn would be a start. That said, if they're triggering on deserts they must be doing something really stupid, like training it exclusively on porn - which would have a lot of those colors - and then trying to apply that to general images. Th
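
      As a rough sketch of that kind of automated triage (my own illustration, not the Met's method, and assuming Pillow): rank every image on the machine by the fraction of pixels that fall inside a crude, assumed skin-colour range. Note that a plain desert photo scores very high under a rule like this, which is exactly the failure mode in the summary.

```python
# Sketch: rank images by fraction of "skin coloured" pixels.
# A crude triage heuristic for illustration only; the colour rule is
# an assumption, not whatever the Met's classifier actually does.
from pathlib import Path
from PIL import Image

def skin_fraction(path):
    img = Image.open(path).convert("RGB").resize((64, 64))  # downscale for speed
    pixels = list(img.getdata())
    skin = sum(1 for r, g, b in pixels
               if r > 95 and g > 40 and b > 20 and r > g and r > b)
    return skin / len(pixels)

def rank_folder(folder):
    paths = [p for p in Path(folder).iterdir()
             if p.suffix.lower() in {".png", ".jpg", ".jpeg"}]
    return sorted(paths, key=skin_fraction, reverse=True)

if __name__ == "__main__":
    # "suspect_images" is a hypothetical folder name for the example.
    for p in rank_folder("suspect_images")[:20]:
        print(p)
```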

    • They say they're scouting for child porn. How can their software tell the difference between an adult body and a child body if it can't differentiate a nude from a dune?

  • Legit usage (Score:5, Funny)

    by Ecuador ( 740021 ) on Saturday December 23, 2017 @03:25PM (#55796915) Homepage

    There are legitimate use cases for this software. I, for one, would not trust people who jack off to images of deserts. They should be put on a police list of some sort, so that I know to avoid them.

  • by bobstreo ( 1320787 ) on Saturday December 23, 2017 @03:35PM (#55796953)

    DUNE is a misspelling of NUDE.

    • by Calydor ( 739835 )

      Nude.
      Nude Messiah.
      Children of Nude.
      God Emperor of Nude.
      Heretics of Nude.
      Chapterhouse: Nude.

      • Funny coincidence, I read the Dune series when I was 14, too!

        I think my reading went about the same way. Never figured out what the boring parts were supposed to be.

        • by Calydor ( 739835 )

          Oh, it gets better.

          The series was translated into my native language. The titles as well.

          Do you know what a dune is called in Danish?

          Klit.

          I'm not joking.

  • Perhaps this says something about the obesity epidemic...

  • They're expecting AI to be able to tell the difference between an under-aged nude and an adult nude? Wow that's going to take a long time.
    • No, they're not trying to get the computer to do all the police work, they're just trying to use it to narrow it down to nude images so that the amount of work they have to do is much smaller. This is for searching people who are already suspects.

    • by Altrag ( 195300 )

      Generally speaking they're not going to be trying to distinguish a 17 year old from an 18 year old. They're going to be trying to distinguish a 10 year old from an 18 year old. And there's definitely some obvious physiological differences there (in most instances.)

      • You just think it's easy because you're human and take for granted how very subtle the visual cues are.
        • by Altrag ( 195300 )

          That's kind of my point. It's not easy, even for a human, to distinguish a 17 year old from an 18 year old based on pure visual cues. But other than extreme cases of precocious puberty, it's usually pretty easy for a human to tell a 10 year old from an 18 year old. Meaning that at least in principle, the latter should be doable by a smart enough AI (which is not to say such an AI is necessarily easy to build, of course. Just that it should be possible based on how easy it is for us humans to do the job.)

  • I mean, correct me if I am wrong, but doesn't the only way to see what images people have on their desktops mean that the Metropolitan Police are poking around and doing some sort of unauthorized remote access into people's machines?

    How else would they become aware of the types of screensavers people are using? Not sure but I find this more than slightly creepy.

    Is this more of that famous Nanny State® at work again? "Think Of The Children" gone wrong...
    • I mean, correct me if I am wrong, but doesn't the only way to see what images people have on their desktops mean that the Metropolitan Police are poking around and doing some sort of unauthorized remote access into people's machines?

      Where did you dredge up the word "unauthorized?" Why do you just presume that? Did you even know that child sex abuse is illegal, and that a warrant granted to search for evidence of the crime would be a legit authorization? Surely this is a tool to help the Metropolitan police poke around during authorized access.

      • by Altrag ( 195300 )

        He gets it from the TFS being kind of misleading and making it sound like the police are scanning everyone's computers.

        Which may of course be authorized if the government is shitty enough... and not implausible in the UK these days, as there's been a push there for a full-on surveillance state for 5 or so years now.

        • When you're not presented with a boundary, don't assume an absolute is implied.

          • by Altrag ( 195300 )

            And with the way governments around the world have been going in the past few years with regard to mass surveillance, failing to at least consider the absolute would be naive... sometimes even when you are presented with a boundary.

            • No, extremist hyperbole does not actually change the logical value of fake absolutes at all.

              It is only the shape your stupidity takes, it is not any sort of excuse.

    • I'm uncertain what is more creepy, the police remotely accessing your computer to check your screen saver, or people having child porn pictures as their screen saver.

  • is delicious! :)
  • I came across an extremely simple skin detection algorithm some time ago and made a GLSL implementation. Check here for a demo and explanation (click about). http://labs.thygate.com/amazin... [thygate.com] It is actually quite robust if the lighting conditions of the scene don't change too much.
    • I came across an extremely simple skin detection algorithm some time ago and made a GLSL implementation... It is actually quite robust if the lighting conditions of the scene don't change too much.

      How robust is it if the subject is clad from head to toe in black latex?

      As for deserts, google image search "sexy iceberg".

    • Allow me to further elaborate as to WHY I found this algorithm so interesting: it's so ridiculously simple. It considers a pixel to be skin if 50 < R - G < 250, where R and G are the red and green channels of an 8-bit RGB pixel. Sure, it's not meant for prime time; it's just a very interesting curiosity, and easy to perform in parallel.
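
      For reference, the same rule as a tiny Python port (the demo linked above is GLSL); the 50 < R - G < 250 bounds are my reading of the comparison operators that were stripped from the comment, so treat them as an assumption:

```python
# Sketch: per-pixel skin mask using the parent comment's R-G rule.
# "50 < R - G < 250" reconstructs the comment's stripped angle brackets;
# treat the bounds as an assumption, not a verified threshold.
from PIL import Image

def skin_mask(path):
    img = Image.open(path).convert("RGB")
    mask = Image.new("L", img.size)        # 8-bit greyscale mask
    mask.putdata([
        255 if 50 < (r - g) < 250 else 0   # white where the rule says "skin"
        for r, g, b in img.getdata()
    ])
    return mask

if __name__ == "__main__":
    skin_mask("photo.jpg").save("skin_mask.png")   # hypothetical input file
```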
  • What are they thinking?
    banana split [pinimg.com]
  • Damn, they're in there somewhere... all 70 of 'em.
  • It has gone beyond dreaming and is seeing mirages!
  • ...if you are into that.

    • Oh no, the Mojave Desert is very young compared to most; only 10,000 years ago there were lakes and marshes there. Only deserts millions of years old should be attractive to you, you pedo!

  • Maybe it's dyslexic, and can't tell the difference between small nudes and small dunes.

  • hot sandy mounds
