A Linux-Based "Breath Test" For Porn On PCs

Gwaihir the Windlord writes "A university in Western Australia has started beta testing a tool that's described as 'a random breath test' to scan computers for illicit images. According to this article it's a clean bootable Linux environment. Since it doesn't write to the hard drive, the evidence is acceptable in court, at least in Australia. They're also working on versions to search for financial documents in fraud squad cases, or to search for terrorist keywords. Other than skimming off the dumb ones, does anyone really expect this to make a difference?" The article offers no details on what means the software uses to identify suspicious files.
  • by Jane Q. Public ( 1010737 ) on Tuesday November 04, 2008 @02:05PM (#25629709)
    ... would be to get a hash value for individual files, and compare that to known hash values for known infringing files. And there are already tools that do this.
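    In concrete terms, a minimal sketch of that hash-matching approach might look like the following, assuming a plain-text list of known-bad digests (one hex digest per line); the file names and the choice of SHA-256 are illustrative assumptions, not anything the article specifies:

        import hashlib
        import os
        import sys

        def sha256_of(path, chunk_size=1 << 20):
            """Hash a file in fixed-size chunks so large files don't exhaust memory."""
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(chunk_size), b""):
                    h.update(chunk)
            return h.hexdigest()

        def scan(root, known_hashes):
            """Walk a directory tree and report files whose digest is in the known set."""
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        if sha256_of(path) in known_hashes:
                            print("MATCH:", path)
                    except OSError:
                        pass  # unreadable file; a real tool would log this

        if __name__ == "__main__":
            # usage: python scan.py <root_dir> <known_hashes.txt>
            with open(sys.argv[2]) as f:
                known = {line.strip().lower() for line in f if line.strip()}
            scan(sys.argv[1], known)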
  • Psych-Ops (Score:5, Interesting)

    by unlametheweak ( 1102159 ) on Tuesday November 04, 2008 @02:12PM (#25629821)

    The article offers no details on what means the software uses to identify suspicious files.

    I highly suspect that the police don't want people to know the details of how sophisticated their technology is because they don't want to embarrass themselves. Keeping an aura of mystery and FUD around themselves and their techniques is also a form of psych-ops; it's the chrome facade of a lemon.

  • by Hatta ( 162192 ) on Tuesday November 04, 2008 @02:12PM (#25629831) Journal

    And trivial ways to get around it. An encrypted file system is the obvious solution, but hell if they're just checking hashes you could use ImageMagick and a very small shell script to very slightly alter the image, giving you an entirely new hash.
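    To make the point concrete, here is a minimal sketch using Pillow rather than ImageMagick (an assumption; any re-encode or one-pixel tweak would do). Note that merely re-saving a JPEG re-encodes it, so the bytes - and hence the hash - change even before the pixel edit:

        import hashlib
        from PIL import Image

        def md5_of(path):
            with open(path, "rb") as f:
                return hashlib.md5(f.read()).hexdigest()

        # "photo.jpg" is a hypothetical input file
        img = Image.open("photo.jpg").convert("RGB")
        px = img.load()
        r, g, b = px[0, 0]
        px[0, 0] = ((r + 1) % 256, g, b)  # nudge one channel of one pixel
        img.save("photo_tweaked.jpg")

        print(md5_of("photo.jpg"))          # original digest
        print(md5_of("photo_tweaked.jpg"))  # completely different digest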

  • by Beryllium Sphere(tm) ( 193358 ) on Tuesday November 04, 2008 @02:23PM (#25630049) Journal

    A local forensics expert says the same thing of his practice. In fact, last time I heard him speak about it, he said he'd never encountered encryption in a case he handled.

    There's some sample bias going on there, because he refuses to handle some cases, and child pornography is one of the things he won't touch.

    BitLocker may make encryption more mainstream.

  • by TheRaven64 ( 641858 ) on Tuesday November 04, 2008 @02:31PM (#25630191) Journal
    Sounds dubious to me. In most jurisdictions I'm aware of, you are not allowed to connect a hard drive to a machine physically capable of writing to it if you want anything retrieved from it to be admissible in court, and you need a chain of custody showing this. Software write protection is not good enough; you have to physically disconnect the write pins from the cable (no idea how they do this with SATA - probably a device that intercepts and blocks write commands, certified through an expensive approval process to ensure it works).
  • by Facegarden ( 967477 ) on Tuesday November 04, 2008 @02:37PM (#25630307)

    'Human skin tones' is a pretty wide range though. Even just restricting it to 'white' people gives you a big range of colours if you consider the various shades of tan / sunburn - anything from deep red to pale white through dull brown. If you want to find naked black- or yellow-skinned people then it's an even bigger range. If something is blue or green you could probably guess it's not naked skin (unless the person is bruised, or wearing body paint), but without factoring in shape as well it's pretty difficult to tell if something is human coloured or not.

    Actually, human skin is pretty much all the same hue; it just varies in saturation. If you convert each image from RGB to HSV, you can look at just the hue component, and people all pretty much look the same. This is a common computer-vision technique for identifying skin.
    -Taylor
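    For what it's worth, a minimal sketch of that hue-window heuristic, assuming Pillow; the hue cutoff and flagging threshold are picked purely for illustration (Pillow scales H, S, and V to 0-255, and real skin detectors tune these bounds on labelled data):

        from PIL import Image

        def skin_fraction(path, hue_max=35, sat_min=40, val_min=60):
            """Fraction of pixels whose hue falls in a rough 'skin' window."""
            img = Image.open(path).convert("HSV")
            pixels = list(img.getdata())
            skin = sum(1 for h, s, v in pixels
                       if h <= hue_max and s >= sat_min and v >= val_min)
            return skin / len(pixels)

        # Crude flag: images with lots of skin-hued pixels go to human review.
        if skin_fraction("suspect.jpg") > 0.4:  # file name and threshold are illustrative
            print("flag for review")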

  • by TerranFury ( 726743 ) on Tuesday November 04, 2008 @02:40PM (#25630365)
    Once upon a time, a company did this, and sold their product to another corporation so that they could monitor employees' email. If I recall correctly, it ended in tears when somebody got sent baby pictures.
  • by sjf ( 3790 ) on Tuesday November 04, 2008 @03:25PM (#25631091)

    It's not the folks descended from criminals that worry me. It's the folks who are descended from the prison wardens who cause all the trouble.

  • by FictionPimp ( 712802 ) on Tuesday November 04, 2008 @03:30PM (#25631167) Homepage

    I have an encrypted disk that is full of encrypted disks. They are labeled backup_date, important_documents_date, etc. I have a special one named long_term_storage for 'special' files that I don't want the rest of the world to have access to but that don't belong in any category I've set up.

    So not only do you need my encryption password to boot my notebook, but then you need to know the password of the individual containers to see what is inside them. That is of course assuming I don't have any hidden containers or keyfiles.

  • Re:Illicit? (Score:2, Interesting)

    by Mister Liberty ( 769145 ) on Tuesday November 04, 2008 @03:33PM (#25631231)
    Don't you understand -- the test is meant to sniff out those computers that are porn-clean.
    Owners of said machines must be hiding something.

    -- punt!
  • by thepotoo ( 829391 ) <thepotoospam@@@yahoo...com> on Tuesday November 04, 2008 @03:43PM (#25631405)
    And someone on a previous Slashdot story pointed out that a good way around this is to reduce the image to a small size (say 255x255 pixels), convert it to black and white, and take an MD5 of the resulting image. This way you have to drastically change the content of the image to foil MD5 checks.

    Pretty hard to beat, unless you just encrypt everything.
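    A minimal sketch of that normalise-then-hash scheme, assuming Pillow; it hashes the raw pixel bytes of the grayscale thumbnail rather than the re-encoded file, so renames and metadata edits no longer matter:

        import hashlib
        from PIL import Image

        def normalized_md5(path, size=(255, 255)):
            """MD5 of a grayscale thumbnail's raw pixels, not of the file bytes."""
            img = Image.open(path).convert("L").resize(size)
            return hashlib.md5(img.tobytes()).hexdigest()

        # Copies that differ only in metadata or file name now match; lossy
        # recompression can still perturb pixels, which is the weakness the
        # reply further down points out.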

  • Re:Illicit? (Score:3, Interesting)

    by sanjosanjo ( 804469 ) <[sanjosanjo] [at] [gmail.com]> on Tuesday November 04, 2008 @03:58PM (#25631623)

    Last time I checked, porn was not illegal.

    While the summary says "porn", the article is referring to child pornography - which is illegal.

  • by _Sprocket_ ( 42527 ) on Tuesday November 04, 2008 @04:03PM (#25631691)

    One of the environments I worked in had a sniffer that grabbed all the images (and associated session information) it could see on the wire for that organization (or at least a subset - there was a LOT of traffic involved). It would then process those images and generate a "skin folder" of suspect imagery. We could then sift through that skin folder looking for illicit browsing, etc.

    Yeah - it caught porn. But it also contained a lot of imagery of furniture, Mars landscapes, and deserts (it had a field day when pictures of camel spiders in Sandland were the hot topic of emails) and other such not-skin-oriented imagery.

  • by HungryHobo ( 1314109 ) on Tuesday November 04, 2008 @04:20PM (#25631935)

    False negatives get less press but can still be funny.

    A girl I worked with was being driven home by her boyfriend, and they got stopped at a checkpoint. He was cold sober, but she had drunk enough alcohol to knock out a bull elephant.
    The officer taps on the window, the window rolls down: "Could you blow on this please?" "No problem." DING, green light.
    At this point my very drunk workmate leans across her boyfriend: "CAN I HAVE A GO! TEHEHEHE! You don't have to change the mouthpiece!" The police officer rolls her eyes but lets the mad drunk passenger blow into it, as there were no other cars waiting.
    You guessed it.
    *DING, green light*
    Cue some odd looks from the officer and a great deal of lost faith in the technology.

  • by mdmkolbe ( 944892 ) on Wednesday November 05, 2008 @12:50AM (#25636753)

    As you demonstrate, the MD5 technique does not work. However, there are other image "hashing" techniques that do work. For example, take the first three statistical moments of the histogram of the R, G, and B intensities. To compare two images, take a simple L1 distance between those moment vectors; if it's below some threshold, the images are considered the same.

    Disclaimer: The above algorithm works best for detecting differences between two video streams even when those video streams are distorted by color shifts. (I have personal experience with using it on production software.) For detecting similarities of images you may have to use slightly different techniques.
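    A minimal sketch of that moment-based comparison, assuming NumPy and Pillow; using mean, standard deviation, and skewness (rather than raw moments) keeps the three features on comparable scales, and the threshold is an assumption to be tuned:

        import numpy as np
        from PIL import Image

        def channel_moments(path, bins=256):
            """Mean, std dev, and skewness of each RGB channel's intensity histogram."""
            arr = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
            idx = np.arange(bins)
            feats = []
            for c in range(3):
                hist, _ = np.histogram(arr[..., c], bins=bins,
                                       range=(0, 256), density=True)
                mean = (hist * idx).sum()
                std = np.sqrt((hist * (idx - mean) ** 2).sum())
                skew = (hist * (idx - mean) ** 3).sum() / (std ** 3 + 1e-12)
                feats.extend([mean, std, skew])
            return np.array(feats)

        def probably_same(path_a, path_b, threshold=10.0):  # threshold illustrative
            """L1 distance between moment vectors, compared to a tuned threshold."""
            d = np.abs(channel_moments(path_a) - channel_moments(path_b)).sum()
            return d < threshold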

  • by Firrenzi ( 229219 ) on Wednesday November 05, 2008 @02:49AM (#25637309)

    It's a fair comment to say that images that are changed are going to have different hash values. But how many non-tech people who download images en masse that are of interest to law enforcement are going to rename them? Often, it's those who don't think about what they are doing that these tools are designed to catch; end users, as such.

    Tools like these, I believe, are for the majority of cases and the occasional big-ring crackdown. It's not so much that they can shut down kiddy pr0n as that they can tell everyone they are doing something about it while putting in minimal effort, thus justifying a government job. It's amazing how many people in government jobs will keep a cruisy job going if all they have to do is justify it every now and again. I see it around me every day in my job. Maximum output on paper, with minimal input in reality.

    Then again, I could be wrong; I've been known to lean on my government 'shovel' from time to time as well.
