Censorship

Software That Can Censor 'Sexual Images.' Or Not. 247

Halster writes: "Here's an interesting story on Newswire about censorware that detects excessive skintones in images, and implements blocking accordingly. What next?" What's next is (you'd hope) the realization that image-analysis heuristics are inherently limited, and not the best thing on which to pre-emptively base system-admin decisions. ( michael : That story is a company press release. For a much better evaluation of how this software works, see this Wired exposé detailing the fraudulent nature of image-filtering "artificial intelligence," or an older review from Businessweek on Eyeguard.)
This discussion has been archived. No new comments can be posted.

Software That Can Censor 'Sexual Images'

Comments Filter:
  • by David Wong ( 199703 ) on Tuesday June 20, 2000 @04:10AM (#990172) Homepage
    "...Not only does eyeguard alert the network administrator, but it also disables the computer and takes a snapshot of the suspect image.."

    My boss has installed this software, and is now forcing the entire office to surf for porn. These "snapshots" are sent directly to his hard drive, which is saving him the time of having to sift through thousands of non-porn pictures to get the ones he wants. Thanks to this software and the snapshot feature, my boss is able to accumulate pornographic images at 10X his previous efficiency.

    Eye-T, Mr. Wilkerson thanks you.

  • I would love to have a configurable discriminator for pornography.

    If they can tell what the people in the picture are doing by analyzing the picture, I wouldn't have to sort through all of those GIFs myself...

    (grin)
  • shit! Then I'll just have to download the new version of the picture too!

    See how this works?

    If it ain't broke, fix it 'til it is!
  • [quickly, since I'm in a hurry]:

    A buddy and I actually started "development" (we talked a lot) on "p0rnster", a webcrawler that would take links from the porn link sites, look for certain keywords, follow X links deep, and save all JPEGs bigger than Y KB.

    Never got around to it though.
  • by Morgaine ( 4316 ) on Tuesday June 20, 2000 @12:00PM (#990176)
    The entire premise of associating skin tones with pornography is flawed. It's trivial to create a work that would be widely regarded as pornographic despite not showing ANY normal skin tones at all, or even any skin at all.

    Pornography is not a property of images. It is a property of a culture, and of the value judgements that that culture makes about sex and nudity.

    Imbued as we are with American values acquired through film, we tend to forget the above, but in Europe we're fortunate enough to have a million beaches where nudity is nothing special, which brings home how relative these values are. Nothing else makes the point so effectively.
  • This is yet another tool for the moral "majority" to control what kind of material is downloaded from the net.

    How soon can we expect ISPs, most likely after being strong-armed by the government (just look what's happening in the UK), to start scanning data streams for porn images, unpopular political symbols or even faces? --Peter Dyck

  • No.

    This is well-calculated advertising. This page will trip the decency filters of every web fascist around. And it will be forwarded to HR. And they will buy it. That is the idea. I would say very well calculated, and I congratulate the inventor.

    On the topic of the filter itself: I bet that this crapware will fail at:

    • Letting Salvador Dali in. Sorry. I want my art ;-)
    • Keeping Naomi Campbell's naked butt out!!!
  • Yeah. I've often thought of the state of computers now as like the auto industry of the 50's. It's the "American Dream" to own a computer now, like it was for cars then. People want it so badly that they are willing to settle for crap. Once computers become more integrated into everyday life (not as an obsession device, but the way you use your fridge or car), then quality will start to improve. I'll give it 40 years; maybe they'll straighten it out.
  • Aside from the fact that it would be unable to distinguish between the pigmentation of various races, it would also fall completely flat on those with an interest in rubber & latex.
  • After reading this article I decided to make a background tile skin.jpg [cc.fl.us] made from my very own fleshtones; yes, you can have a little piece of bytor on your web page. I think sticking enough images like this onto random web pages should keep the censorware occupied a little while.
  • What is even more amusing is that the people writing the filtering software don't understand this either; I'll bet more of them are using /sex/ig than /\bsex\b/ig.
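    A minimal sketch (Python, purely illustrative) of the difference that word boundary makes - the bare substring pattern flags innocent words, the bounded one doesn't:

        import re

        urls = ["sussex.ac.uk", "sextuplet-facts.example",
                "essex-news.example", "sex.example"]

        naive = re.compile(r"sex", re.I)        # like /sex/ig
        bounded = re.compile(r"\bsex\b", re.I)  # like /\bsex\b/ig

        for u in urls:
            print(u, bool(naive.search(u)), bool(bounded.search(u)))
        # naive flags all four; bounded flags only "sex.example"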
  • Skin tones, eh? I would imagine that they're mainly, shall we say, "caucasian" skin tones.

    It'd be fun to see this turn into the biggest lawsuit since Crayola's whole "Flesh" colored crayon debacle.

    Anyone know if they're still making "Indian Red"?
  • This is crazy. What's next? Will other companies follow suit and create software that filters ads, sites, etc. about their competitors? Nobody would buy this kind of software, but what is to keep them from sticking it into some already-proprietary code? What if Microsoft decided to stick some anti-Netscape code into IE? This is a step in the wrong direction. I agree that kids shouldn't be able to access some things on the internet, but whose responsibility is this? Never mind the fact that they have parents for that reason. This is just another instance that proves that parents have lost control. I think they should take some responsibility and monitor their kids. Or maybe it is our responsibility. I don't know.
  • Based on the results from Wired, I think most of the vendor's test images were pornographic. As in: "Hey, we found a 100% success rate! Of our 1,000 test images, we included three icons from the desktop - and none of them were tagged pornographic."

    (After, of course, they removed the manila folder icon and replaced it with another...)

  • Since when do skin tones equal pornography?

    Since when-the-hell has naked human skin been pornographic?

    Why is this always the equation:

    naked = pornography = sex = bad

    There is an absolute difference between nudity and pornography and sex.

    nudity != pornography

    nudity != sex

    nudity != bad

    Deal with it!

    It's the utterly mindless tendency of unthinking, self-righteous people to equate the one with the other that needs to be remedied...

    t_t_b
    --

  • ... but the problem with open censorware lists is that the block list becomes a very valuable commodity. Kid manages to hack around the protection - heck, the number of teenage boys who are better with computers than their parents isn't low - and surfs straight through the blocklist, using it as a porn-only Yahoo! Or, for that matter, moves it to someone else's (unblocked) machine and surfs from there.

    What we have here is a fantastically complex problem. Cyberpatrol has pulled some _disgusting_ stunts - like blocking sites which criticised its methods - but that doesn't necessarily mean that all censorship software is bad.

    Put it this way: imagine I'm (say) 10-15 years older, and with primary school age kids. Chances are, they're going to want to play with this Internet thing, just like we watched TV at that age. Now, do I want to have to say they can only surf whenever I'm watching over their shoulders? Of course not - that's ridiculously laborious and clearly impractical. The far simpler approach is to do what you do with kids and TV - bar by program / channel / time. So, in this case, don't allow access to Usenet & Gnutella (for example) and only allow websites you've approved. Kid gets a better net experience while you're prepared to let them spend longer online because it doesn't take so much effort on your part.

    For this sector of the market - if nothing else - net censorship software is needed. You and I may well be able to get the desired result by configuring a spare box in the back bedroom as a proxy server, but we're not representative of the general population.

    Censorship software hasn't yet hit the sweet spot but it's definitely necessary, if only within limited areas. And, while I see the advantage to an open blocklist, I hope you see the problem, too.
  • About three weeks ago, I submitted a story just like this one about a very similar product from HeartSoft [heartsoft.com]. Of course, Slashdot rejected it. Duh!!!

    Anyway, the product is called Internet Safari [internet-safari.com], and it's a web browser (I guess some hacked-up version of IE) that includes an image-analysis filter. My guess is that it does the same thing as eyeguard.

  • Heaven forbid that our children see pornographic material at home rather than going out and having sex, getting STDs, and getting pregnant.

  • In reading the previous posts to this news, I have come to one basic conclusion: People think that just because this technology is out there, they must implement it, and forever be cursed with the restrictions associated therewith.
    Give me a break.
    Although the possibility exists that some law will be passed forcing every OS in America to include this software--and force it upon the end user--the chances of that range anywhere from slim to none.
    And despite what Microsoft may want you to believe, you still have control over what software you want to install. Or is that not the case? Did I miss something that everyone else got?
    The bottom line is: If you don't like it, don't install it!


    -={(.Y.)}=-
  • by Tayknight ( 93940 ) on Tuesday June 20, 2000 @04:12AM (#990192)
    I think this is one of the downfalls of the way many businesses are conducting themselves on the internet. This is obviously a company that was able to drive up the hype on its product. Now they are able to keep saying "we're working on it," and most people will say 'OK' and not really hold the company responsible. Imagine if a company that made a real physical object tried this. Cars that crashed or drove the wrong direction 90% of the time. Or a kitchen disposal that ground up your hand in addition to the kitchen waste. Consumers wouldn't allow such a product to remain. It's time we did the same for e-companies. Especially ones that claim to help children. None of these products work well. Just check out Peacefire.org [peacefire.org]. Get the lowdown.
  • The entire premise of associating skin tones with pornography is flawed. It's trivial to create a work that would be widely regarded as pornographic despite not showing ANY normal skin tones at all, or even any skin at all.

    This is the most relevant statement I've seen in this entire discussion.

  • by Karellen ( 104380 ) on Tuesday June 20, 2000 @04:12AM (#990194) Homepage
    From page 2...

    "There are over 10 million neural networks involved in the thing," Beecher says."

    Bloody hell! Imagine the computing power they've got to run 10 million neural _networks_.

    ...or maybe they mean 10 million neurons. Doh!
  • by pb ( 1020 )
    How did you find out about my She-Hulk collection!!!

    Oh well, you know this wouldn't affect Captain Kirk: he's got that fetish for blue women...

    So where's the web filter that filters out stupid e-censorship ideas?

    Oh. It filtered itself. Never mind...
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
    The problem with allowing child pornography and then prosecuting people who exploit children without their consent is that, if this were allowed, it would give politicians all sorts of reasons to make speeches about turning the internet into a police state so that they can track down everyone who might possibly be linked to the photo(s) in question. Then these politicians (isn't it odd how the most computer-illiterate people are the ones making all the laws about technology?) will pass laws so that everything you do or say is monitored.

    Imagine if you could not reply to this as "Anonymous Coward", and you disagreed with my opinion (and just suppose that my opinion was the one the cops liked): a little "*" would appear next to your name on the great list of citizens' official internet accounts, and everything you say or do would be monitored. I think it's better not to encourage any regulation of the internet... I think there's enough wacky porn out there without kiddie pron... err uh, at least... um... I've been told. :)

    I hope I didn't misunderstand your comment... but that's just the point that my mind received and processed. Hope that's not too off-topic.

  • THAT'S IT, my girlfriend and I are going to paint ourselves green and have sex against a red wall.

    (mpegs available upon request)

  • Come see www.nudepinkgirls.com! All our girls wear pink body paint, guaranteed to beat your company's software! Don't let your parents, your library, or your HR department jerk you around! Exercise your constitutional right to exercise your arm!

  • I could imagine heuristics that do a true analysis of the image to determine whether it is pornographic, better than just looking at the colors, but wouldn't that be an issue in and of itself, since it would take up processor time? People always ask, "What would you do with that much processor?" Everything that we're not doing now. In the future, with faster processors, better analysis than this can be done, and instead of "Net Nanny," parents will just load software that puts nice black blocks over everything their kids can't see. Real-time censorship, folks. And you wondered what a home supercomputer could be used for.
  • The day that AI becomes smart enough to filter porn is the day that it realizes that it likes it and refuses to filter it.
  • Scunthorpe :)
  • That is a good point. The article's use of the word "flesh" brought back images of the Crayola crayon color "Flesh" - namely, the pinkish skin tone prevalent among caucasians - that I used to use in kindergarten.

    Given the fact that there are many different skin tones in the world, how is it to distinguish? Are they so arrogant that they assume only images of blond-haired, blue-eyed people are being downloaded?
  • 1: What color is 'skin' exactly? From albinos to blacks, that's a lot of colors you have to check for. What if there's more than one person there, of different colors?

    2: What if I have a picture of a naked *gasp* pig?! That should be close enough to skin tone.


    ---
  • We would quickly see a round of green, purple and blue skin, with a lot of science-fiction-themed porn. It would be just like what started the whole hackerspeak thing... back in BBS days, some software implemented the ability to filter out swear words, so we started doing things like substituting an * for some letters, like "sh*t" and "F*CK!". This later evolved to using numbers to sub for letters to further get around these "features". And thus, 3l33t and d00dz came into our vocabulary.

    It would be interesting to see what other ways, besides coloured skin, people would use to get around an "excessive skintone" filter... a return to black and white pictures? Weirdly skewed colour maps? Use of "oil paint" filters to break up the skin-tone areas?

    ttyl
    Farrell
    (who has been on line far too long...)
  • Send yourself a picture of your own face. This will trigger the software, which will disable your computer. Then call the support team, who will fix the problem within 24hrs, and take the rest of the day off.


    ---
  • I just caught myself pronouncing Linus as "Lee-nus" like Linus Torvalds. It's not exactly wrong; the official site [snoopy.com] doesn't give pronunciations.
  • I'm thinking that the filter to effectively block www.hotebonybabes.com would also limit your access to www.amishwhipsandbuggies.org.

    As for the caucasian filter, well at least it would block the khaki portion of www.gap.com.
    --

  • by hypergeek ( 125182 ) on Tuesday June 20, 2000 @07:37AM (#990210)
    "Oh look, that girl is wearing a schoolgirl outfit - this is a porn pic, filter it!" -- so much for all the Brittney Spears fan pages..

    It's good to know that at least some good will come of this. :-)

  • "Black and White"

    Jay (=
  • by the_other_one ( 178565 ) on Tuesday June 20, 2000 @03:58AM (#990224) Homepage

    What I want to see is an image filter that will filter out the clothing.

  • I can see how this product works by detecting skin tones, but how will it distinguish between, say, a face and a naked body? Unless it has the outlines of human genitalia programmed into it and actually spent the time tracing each proxied image to check it for such outlines, this seems impossible. Also, what about dark-skinned individuals - how would this product be able to detect them? This seems like a good idea that would never work unless it had a massive amount of processing power behind it for tracing and comparing images.

  • by A Big Gnu Thrush ( 12795 ) on Tuesday June 20, 2000 @04:19AM (#990232)
    From the Wired article: "How do you tell the difference between a woman in a bikini in a sailboat which is not racy and a naked woman in a sailboat?" Touretzky asks. "The only difference is a couple of nipples and a patch of pubic hair. You're not going to be able to find that with a neural network."

    Maybe our definition of obscenity is the problem.

  • Hm... Yeah, perhaps. But this quote from the first link:
    Once installed on a single PC or across a network, the antiporn software known as eyeguard is activated each time an image is displayed
    Makes me speculate that eyeguard actually hooks into the operating system itself, so that it sits somewhere in the code that displays bitmaps on screen (on Win32, that'd be inside GDI, right?).

    If so, then most of the above techniques won't work very well, unless you can "counter-hook" those API calls, so that eyeguard sees an altered version of the image, then calls what it thinks is the original OS entry to display it, but actually ends up in your anti-blocker, which turns the image back to normal and displays it. I suspect that it would be fairly easy for eyeguard to protect itself from calling a "false" original entrypoint, though...

    Hm, this is pretty close to some serious cyberwarfare. The lengths some people go to to control each other... *Shrug* :(

    On the off chance that eyeguard does not work like the above, consider it a free business idea and start hacking. You might get rich! ;^)
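    To make the layering concrete, a toy sketch (Python, purely illustrative - real GDI hooking is a very different beast) of that hook/counter-hook chain: the censor wraps the display routine, and the anti-blocker wraps both sides of it, so the censor only ever analyzes a scrambled image.

        def os_display(image):
            # Stand-in for the real OS entry point that puts pixels on screen.
            print("displaying:", image)

        def looks_pornographic(image):
            # Stand-in for eyeguard's skin-tone heuristic.
            return "skin" in image

        def make_censor_hook(original_display):
            # "eyeguard": inspect every image, block or pass through to the OS.
            def censored_display(image):
                if looks_pornographic(image):
                    print("BLOCKED")
                else:
                    original_display(image)
            return censored_display

        def make_anti_blocker(censor_factory):
            # Counter-hook: scramble what the censor sees, and hand it a fake
            # "OS entry" that unscrambles before really displaying.
            scramble = lambda s: s[::-1]   # any cheap reversible transform
            def fake_os_entry(image):
                os_display(scramble(image))   # reversing twice restores the original
            censored = censor_factory(fake_os_entry)
            return lambda image: censored(scramble(image))

        display = make_anti_blocker(make_censor_hook)
        display("skin-tone picture")   # censor sees the reversed string and passes it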
  • I don't know how good their system is, but there is published research on this sort of thing. One of my former professors made a "naked people finder" that's based on finding cylinders in the picture and evaluating if they are skin tone, and if they make a reasonable human body. An interesting aspect is that bikini pictures are out, but maybe pictures that don't show caucasians are OK. For more info on finding naked people, see http://www.cs.hmc.edu/~fleck/naked.html

    Walt
  • This stuff was reviewed by Maximum PC a couple of months ago, and it works, BUT.... It can't tell the difference between skimpy swimwear and nudity, and it scores a big fat zero on hentai and other adult anime/cartoons. I think we need two internets, one for business and the politicians and one for normal people.
  • A more scientific version of this was reported a while back - Margaret Fleck and David Forsyth [berkeley.edu] did work at Berkeley and Iowa in about 1996 that finds naked people [berkeley.edu] or horses [berkeley.edu] using descriptions of shapes of bodies. Wired Article. [wired.com]
    This slashdot story doesn't appear to be related to it.
  • And let's not forget the Robot Porn!

    Gotta love metallic "flesh" tones ;)

  • This is my take:

    You may pursue happiness as long as it doesn't impinge on the same pursuit of any other person. Now, of course this is very vague. But IMHO, at the point you yell "fire" in a crowded theater, or threaten to assassinate the president, etc., you are infringing upon the rights of other people to otherwise feel safe. You may actually be causing harm. You are disturbing the peace. Sure, this is all very vague, but there can be obvious exceptions, like the above. Intimidating somebody with physical harm, e.g., I'd say is not "free speech".
  • by Paul Johnson ( 33553 ) on Tuesday June 20, 2000 @04:22AM (#990259) Homepage
    OK, it's pretty obvious that obscenity is in the mind of the beholder, not the computer. So computers can't spot this stuff.

    But I can imagine a program which tracks the average flesh-tone score for pictures over time. If the moving average goes over a certain threshold then a dialog box pops up on the sysadmin's screen telling him that Joe in cubicle 69 may be abusing company bandwidth, click here for a list of the suspicious URLs. Or, as it might be, sends an email to Junior's father. The key point is that this stuff can work as part of a monitoring system that uses human judgement for the final bit, rather than being a blocking solution.
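    Something like this, say - a minimal sketch (Python; the names and thresholds are mine, not anything from the article):

        from collections import defaultdict

        ALPHA = 0.1        # smoothing factor for the moving average
        THRESHOLD = 0.5    # average flesh-tone fraction that triggers review

        averages = defaultdict(float)
        recent_urls = defaultdict(list)

        def record_image(user, url, flesh_score):
            # flesh_score: fraction of the image's pixels classified as skin tone.
            averages[user] = (1 - ALPHA) * averages[user] + ALPHA * flesh_score
            if flesh_score > THRESHOLD:
                recent_urls[user].append(url)
            if averages[user] > THRESHOLD:
                alert_admin(user, recent_urls[user])

        def alert_admin(user, urls):
            # The key point: flag for a human to review, don't block automatically.
            print("Review needed for %s: %s" % (user, urls[-5:]))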

    Companies do have a legitimate need to monitor this stuff. Quite apart from the abuse of company resources, companies who allow employees to download and view sexually explicit materials can find themselves on the wrong end of a big discrimination lawsuit.

    Paul.

  • by sandler ( 9145 ) on Tuesday June 20, 2000 @04:27AM (#990264) Homepage
    There will always be people who feel the need to "do something" about "evil" on the internet, and as long as there are, there will be people to buy this kind of software. As long as we keep pointing out why the software is broken, people will keep coming out with software that's supposed to be better. If we want to fight censorware, we need to argue about why censorware is wrong in general, not why this or that specific software is broken.

    That having been said, I think the reason is that any censorware, present or future, puts the decision of what is and is not appropriate for me or my kids into the hands of people who don't know me and don't share my values.

    Open censorware (with open block lists) is a possible solution to this. This way, parents, who should be deciding what their children will see, can actually make real decisions, rather than have to abide by whatever decisions Mattel or whoever else makes for them.

    It's just as wrong for a company to insist that my kids shouldn't see a certain site as it is for anti-censorware advocates to insist that my kids should be able to see anything. The right thing to do is to give parents the choice to make that call.

  • by Paul Johnson ( 33553 ) on Tuesday June 20, 2000 @04:27AM (#990265) Homepage
    Not to mention Scunthorpe.

    See RISKS [ncl.ac.uk] for details.

    Paul.

  • by orpheus ( 14534 ) on Tuesday June 20, 2000 @04:28AM (#990266)
    Since the article specifically refers to the Aussie situation (mandatory porn filtering by ISPs), here's what the final report of the Australian Government (National Office for the Information Economy) has to say about the weaknesses of this approach in their review of blocking technologies, entitled Access Prevention Techniques for Internet Content Filtering [google.com] (Google cache):

    The quest to detect pornography is often more concerned with images than text and getting computers to recognise a pornographic image is equally, if not more, difficult than the task of distinguishing between erotica and other literature. There have been efforts to characterise pornographic pictures based on the amount of 'flesh tone' in the images and on the poses struck by the subjects. Computers have difficulty distinguishing between art and pornography, say between a Rubens painting and a Penthouse centrefold, and this approach is not generally regarded as being effective. Video and other streaming media further complicate the filtering task by supplying a constant flow of images to be examined for undesirable content.

    Furthermore, they complain:

    This approach is affected by the same issue as profile filtering in that an image - or a fair percentage of an image - needs to be loaded before it can be analysed, during which time it may be displayed on a user's screen.

    Of course, this second problem only applies to an Aussie-type ISP restriction. Geocities did this years ago (don't know if they still do): scanning their own HDDs (free user pages), deleting 'questionable graphics' (with or without human review) and waiting for the page authors to complain about any mistakes.
  • How many other people think that the developer wanted an excuse to go browse porno sites for "test images" so that they could create this thing?

    BOSS: So, what is this suggested project?

    DEVELOPER: Uh, we want to create a program which can determine if a picture is pornographic or not. I request the position of obtaining test pictures. I'll need the company VISA too.

    BOSS: Sorry, that'll be my job. I'll be glad to help with that portion.

    DEVELOPER: Yeah, but I have more... uh, experience, in, um, finding, er, them, uh, yeah.

    SYS. ADMIN: We'll need more bandwidth for this project too, and I'll extend the capacity of the news server for the alt.binaries.* and alt.sex.* newsgroups by 500%, OK?

    BOSS: Approved! Now get to work, I've got some... research to do for this project.

    DEVELOPER and SYS. ADMIN leave the office, and head back to their respective offices to obtain "test images." BOSS loosens his belt, and...

  • Open blacklists are a better solution for parents and home computers, but there are still many problems. Specifically, the religious right may try to force their views onto the open blacklist. Remember, religious people are the majority of those who will volunteer to check sites out. Would you really want your child's exposure to ideas influenced by the kind of person who would volunteer to scan the web for porn?

    Anyway, my point is that most parents should just say "I do not trust anyone who would make a censorware program, so I'm just going to trust my kid until I get a clear indication of a problem."

    Libraries have the same problem, except they have lots of people walking around, so they can just move the computers into an open area and ask people who look at porn to leave.

    A library which has a really serious problem with porn can share the Netscape cache directories, have a computer scan the images for fleshtones, and display possible hits to the librarians. If the librarian judges an image to be pornographic, she/he can walk over to the computers, figure out who downloaded it, and ask them to leave.

    The advantage of the above system is that you're not restricting anything legitimate (unless the librarian makes a bad call). Also, you should not tell the librarian which computer viewed the image, to prevent librarians from being able to spy on specific people while they browse.
  • ... but the problem with open censorware lists is that the block list becomes a very valuable commodity.

    This is only true if the software isn't open source. If the OSS community provided censorware that was done the Right Way (tm), then people and libraries could choose to use that instead of closed competitors. Junkbuster [junkbusters.com] could probably do this with small modifications. And it could be admined the same way as I use it for cookies - I have an opt-in list, and whenever I go to a site that requires cookies, I add it in. So too, a parent or librarian could add in sites as their kids needed. We could even maintain a list online, so parents who've put time in to compiling lists could post them on the web, and other parents could have a head start.

    The point is, once there is decent software for libraries and such to use, people will stop making and using broken or evil software.
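    For instance, a rough sketch of that opt-in mechanism (Python; the file format and helper names are my own invention, not Junkbuster's):

        ALLOWLIST = "allowed_sites.txt"   # one hostname per line, postable on the web

        def load_allowlist(path=ALLOWLIST):
            with open(path) as f:
                return {line.strip().lower() for line in f if line.strip()}

        def is_allowed(host, allowlist):
            # Opt-in: everything is blocked unless a parent/librarian added it.
            return host.lower() in allowlist

        def allow(host, path=ALLOWLIST):
            with open(path, "a") as f:
                f.write(host.lower() + "\n")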

  • They're probably racist enough to consider non-whites to be animals, and since pictures of naked animals are not considered pornographic, they don't care about that...

    --
    Here's my mirror [respublica.fr]

  • by jabber ( 13196 ) on Tuesday June 20, 2000 @04:28AM (#990290) Homepage
    Image recognition, refined enough to filter porn, will not be around for a VERY long time. I'm not that imaginative, and I can easily picture all sorts of "unnatural" positions which an automated system would have a hard time recognizing as porn. :)

    It will take an AI with the understanding of what "porn" means, with an appreciation for the human body's full range of motion, and with the comprehension of the latest fetishes - else National Geographic and CNN.com will find themselves filtered out of libraries and schools. After all, what is the difference between an image of a 'man riding a horse' and that of a 'man riding a horse'?

    But the research being put into this sort of image recognition has an even seedier and more sinister side. It can/will filter based on LOGO. That's right.

    Imagine Time-Warner/AOL being 'unable to connect' to sites which feature their competitors' logos. Imagine ISPs who show Reebok ad banners suddenly disabling links to pages that display the Nike "swoosh". Imagine your favorite web-site [slashdot.org] suddenly not letting you click through to any other site that does not proudly wear a "VA" on its 'sponsors' page.

    And all this technology is being developed... (oh, say it with me) "In the name of the children!". BS - all the children I know would get a kick out of looking at porn, and are being damaged more by advertising than by sexual content.

    Personally, I think we should assist in the development of this technology, and make sure that it only filters on Red Maple leaves on white backgrounds! Blame Canada!! Hooyah!
  • Hmmm, now, when it's filtering and looking for those skintones, is it just looking for skintones of 'white' people? if so, what about the skintones of other races?

    On the other hand, if they do include skintones from all races, then that's a lot of colours they're filtering.

  • This article was a hoot. I can really picture some slick talking marketing guy and his lines of reasoning:

    Yeah, there's this independent testing lab, verifying our nano-cool neuronal rocket science algorithms, working with 99.8% reliability. The name? No, that's really secret - never mind that the Rush Limbaugh Institute for Creative Certification is the independent lab...

    Nah, I've used it myself 30 or 60 days ago. I could only get to the dirty pickies at the XXXsmutshop after disabling our super software...

    Wot? An old version? Hey, we're into rocket science, advancing our secret algorithm on a daily basis. Since it's so advanced we don't need version control, therefore we don't have a version that actually worked anymore. But trust me, I'm in the DOTCOM business...

    Shheeeesh, that guy must have been straight out from twisted tongue marketing academy...

  • BAIR frequently blocks ads on the websites of PC Magazine, Wired News, CNN.com, and other news services.
    If nothing else, it's an ad blocking proxy!
  • Imagine if software like this were possible: when installed on a computer, it could hook into your OS, and every time you looked at Natalie Portman's hooters it would put a red rectangle over the most important places on her body - her head and her feet!
  • On the "Hello World" scale of uselessness (on which "Hello World" ranks a 5/10), this has to rate a 1.

    Why?

    • Short of having libraries which support EVERY graphics image format (stills AND movies), it's useless. Say that it supports GIF and JPEG. Two popular formats now, but aging. (In the case of GIF87, aged, dead, buried, rotted and recycled as UNISYS demands for money.)
    • There's the aspect that others have pointed out, that "flesh-tones" vary between peoples, and not everything that falls into that range is flesh. (Pine would probably trigger it.)
    • Heuristics -ARE- a very good approach, for many things, but this isn't one of them. If it's not linearly separable, then neither heuristics NOR neural nets will be able to produce accurate results.
    • With the advent of alpha channels, what's in an image, anyway? If two images can be blended on-the-fly, then one image can be split randomly on-the-fly, in such a way as to make any one image appear incoherent, but the combined image is whatever you started off with. (See the sketch after this list.)
    • Almost forgot. You not only need to support all image types, but also all compression schemes. No use being able to process GIFs, if the image you're fetching is also gzipped or bzipped.
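    On the split-image point above, a minimal sketch (Python with numpy; my own construction, using modular addition rather than real alpha blending) of two noise-like shares that recombine into the original:

        import numpy as np

        def split_image(img):
            # Each share alone is uniform noise - no skin tones to detect.
            share1 = np.random.randint(0, 256, img.shape, dtype=np.uint8)
            share2 = ((img.astype(np.int16) - share1) % 256).astype(np.uint8)
            return share1, share2

        def combine(share1, share2):
            return ((share1.astype(np.int16) + share2) % 256).astype(np.uint8)

        img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # stand-in picture
        s1, s2 = split_image(img)
        assert (combine(s1, s2) == img).all()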

    All in all, it comes back to what I've always said about these types of system. Give someone freedom to filter, ALL ON THEIR OWN, and they'll probably do so. Everyone, however "liberal", has something they just don't want to spend the time with. And that's OK! That's GOOD! But the definition of OK cannot come from outside, it has to come from inside.

    As for parents of kids, same sort of thing applies. When you pick a meal to cook, do you select out of the cook book(s), or cook everything in them, all at once? You select! Ergo, being able to select out of a range of databases (eg: your own personal filters, the school's database, the databases built by the various clubs & societies the kids belong to, etc, ad nauseam), makes MUCH more sense than blindly following one database built around the fiction of one-size-fits-all.

    Yes, it takes more time. But in the end, you will ALWAYS have a trade-off. The easy and generic routes are INVARIABLY harmful in the long term. You can become a cabbage-patch human, and live in Number 6's Village for all eternity, or you can put some effort in and live as a human being, instead. This doesn't mean being "rebellious" - if you rebel for the sake of defying what someone else says, your brain is as much a rotten cabbage as the obsessive conformist.

    Getting back to this censorware, its market is that of the obsessive conformist, and the most vocal critics (in the media) are the obsessive rebels. It's a symbiotic relationship made in hell. The more extreme one group gets, the more it feeds the other. Don't you think the makers knew it would be controversial? Of course they did! They are counting on it! The more attention it gets, the more free advertising, the more money they make and the more brownie-points they can give themselves.

    The media critics are the same. Without products like this, there's nothing to vent about, and therefore no reason for anyone to read their articles, and therefore no reason for anyone to keep them employed. They don't want their "enemies" to go away, because they're the ones who justify the pay-cheque.

    IMHO, whilst the Extreme Wing and the Press are "best of enemies", there's no place for sanity in the world. Who needs it, when you've a symbiotic, self-perpetuating feeding-frenzy?

  • Having the source examined by an appropriate expert would allow Wired to tell if it is a genuine product which is currently misbehaving, or if it's just a scam.

    Their reluctance to allow this review to take place suggests to me that it's just a scam.

  • So, everyone remembers "chroma-key" (aka bluescreen, and the thing that makes weathermen's ties invisible).

    I always wanted to see them use "porno-key", the system which replaces any pornographic material with a calm mountain lake.

    So, you'd have this giant, thrusting, penis-shaped calm mountain lake scene superimposed on an empty bedroom.

    But it'd eliminate the pornography.
  • Don't you want to filter the ones where she's *not* wearing the uniform? :)
  • "Rant" nothing. It's just common sense.

    We don't give 'the children' enough credit in seeking out information for themselves. They'll learn, whether we want them to or not - the only sensible thing to do is to tell them the truth.

    Just as we tell them to 'be careful when driving', and 'don't speed'; we should also be telling them useful, practical information. "When making a left turn at an intersection, keep your wheels pointed straight - so if you get rear-ended, you will not be pushed into oncoming traffic." is the sort of common-sense knowledge which they need, and which we have (from experience) to give them.

    Same goes for matters of sex. You may have different opinions than I but here's as good an example as any:
    Rule #1: Whatever you do, I'll still love you because you are my child.
    Rule #2: Don't hate or hit anyone just because they're different.
    Rule #3: If someone hurts you, it isn't your fault - carry mace, just in case.
    Rule #4: No means No!
    Rule #5: Use a condom, whomever you do.
    Rule #6: Your friends know as little as you do. If you don't know, ask me. If I don't know, we'll find out together.

    You're far from 'ranting'. It's common sense - education is the key to solving most, if not all, of the world's problems. The worst thing we can do is stay silent while some self-appointed, holier-than-thou glory seeker defines a moral baseline for all of society (not just their own kids, unfortunately), including ourselves and our children.
  • by hawk ( 1151 )
    Are you sure about '62? I thought it was rather recent.

    That, and I wasn't born until a couple of years later, so I couldn't read the labels on my crayons until the late 60's, and my crayolas certainly had "flesh" as a color . . .
  • _IN_THIS_CASE_ I'd agree - though with caveats. The more useful information would be the training sets, bearing in mind that this is neural net based.

    In a more general case - which seemed to be where this particular subthread was going and was certainly what I was talking about - where you have a program based on a blocklist, opening the source adds very little indeed.
  • No such trouble here. We made the boss's Netscape cache directory shareable and we only need to go look there...

    --
    Here's my mirror [respublica.fr]

  • What about CATHOLIC school uniforms? ;-)

    --
    grappler
  • by Pig Hogger ( 10379 ) <.moc.liamg. .ta. .reggoh.gip.> on Tuesday June 20, 2000 @06:13AM (#990325) Journal
    Finding cylinders only? How about double hemispheres???

    --
    Here's my mirror [respublica.fr]

  • by jabber ( 13196 ) on Tuesday June 20, 2000 @06:16AM (#990326) Homepage
    This whole thing reminds me of an anecdote about image recognition using neural networks in the early 90's.

    The military was developing a neural net to optically identify tanks hiding in brush and trees. They trained the net on photographs, and noticed exceptionally good performance. Then they tried a new set of photographs, and the success rate of the network suddenly fell flat.

    It turned out that in the first set of photos, all the 'with tanks' pictures were taken on a sunny day, and all the 'no tank' pictures were taken on an overcast day. In the second photo set, this was reversed, and the network continued to correctly tell sunny and cloudy apart...

    So I have to wonder what this new development holds in store. Will we be graced with filtering software which discriminates based on gaudy, tacky motel room backgrounds rather than drunk co-eds on crack taking it in the ear? Will it be intelligent enough to tell if someone trims their hair, and is therefore a professional, versus those fat and hairy amateurs?

    Hey, maybe it will finally be able to tell if the subject of the questionable photo is in fact "barely legal", or has the stretchmarks of a few litters of puppies to suggest the contrary.

    Your point on skin color is well taken, and opens up the question: what does ALL porn have in common that can possibly be quantified and filtered? The answer, I'm afraid, is nothing at all... One man's porn is another man's art.

    Aside: I, for one, didn't consider the Mapplethorpe photos to be obscene. They weren't artistic either. They were just 'for shock', to cause a furor, and get more attention through objection than through inherent value.

    My strong suspicion is that whoever is behind this effort is a lot smarter than they seem at first glance. They are trying to bleed the "Religious Right" of money by getting 'upstanding' Bible-thumpers to fund this research - all the while knowing that it can not possibly be successful.

    "Oh look, that girl is wearing a schoolgirl outfit - this is a porn pic, filter it!" -- so much for all the Brittney Spears fan pages..
  • I seem to recall a friend having his web pages turned off at Tripod because they tripped the "nudity" sensors. Apparently they did a sweep looking for excessive fleshtones, then had someone look at all the images that were flagged.

    Of course there are several ways to defeat the program:

    1. Don't be white. "Flesh tones" as used is insanely non-representative of the true range of flesh colors.
    2. Don't use color. Greyscale will be a LOT harder to deal with.
    3. Put a lot of face shots in. Lots of flesh tones but not anything pornographic.
    4. Retro-60's - purple on gold, etc... :-)

    The best way to get rid of a stupid system is to think around it.

  • Wow, this article is some serious trollbait -- I'll bite!

    Naked and petrified will live on, such as my creation for the /. trolls:
    Natalie Portman naked and petrified [angelfire.com].

    Someone else created:
    Signal 11 naked and petrified [geocities.com]
    and I wonder where they got the idea???

    Young boys are always going to be able to get pictures of naked, not necessarily petrified women -- they're beautiful things, are they not? Didn't you ever stay up late to watch a "naughty" movie like Revenge of the Nerds or Porky's knowing that it would be an opportunity to catch a glimpse at some "hooters"? It's sad, but kinda funny, innit?

    --

  • by ocelotbob ( 173602 ) <ocelot.ocelotbob@org> on Tuesday June 20, 2000 @04:36AM (#990329) Homepage
    I have the best solution to the censorware problem, plus it'll make a lot of people very happy. First, in the rich, upper-crust neighborhoods, you advertise realtime filtration of Bad Stuff to protect the whelplings. Next, you advertise in and around colleges things like "Make Money Viewing Porn". You pay these students about $6.00/hour. Now you put all these students in front of computer terminals, and hook them up to heart monitors. Any time someone subscribing to the service wants to view a page, it's first shown to one of the random college students. Now, if their heart rate rises once they see the page, you know that the page should be filtered.
  • by Old Man Kensey ( 5209 ) on Tuesday June 20, 2000 @04:37AM (#990331) Homepage
    I remember speculation about filtering web content by checking the skin-tone levels of images as far back as 1996. At the time everybody more or less decided it was too impractical and gave up on the idea. The standard points about non-flesh fleshtoned objects and large amounts of non-pornographic flesh were made then too.

    This is a particularly disgusting (to me at least) instance of the "for the children!" canard. Now instead of politicians using it to achieve their aims, which is bad enough, we've got a company using it to bilk panicked consumers out of their money.

    And of course, just as with the quality of our politicians, we Americans have only ourselves to thank for this. If people weren't so damn gullible, companies like this would never sell a dime of product (of course, in this case it's questionable whether what they have constitutes a "product", but the point stands...)

    What's needed is people willing to stand up and say "Yes, damn it, I do support porn on the Internet, and the easy availability of information on things like bomb-making and lock-picking, and if you don't like the speech I support, TOUGH SHIT. You don't get to pick and choose. If you want free speech, you got it. If you don't want it, go start your own damn country and LEAVE MINE ALONE."

    But what are the odds of that happening?

  • by jabber ( 13196 ) on Tuesday June 20, 2000 @04:37AM (#990332) Homepage
    All forms of naked women are to be filtered, except when their arms are missing, in which case it's Venus de Milo, and therefore a bona fide work of art.

    Clears the way for amputee fetishes, I think. :)

    Botticelli's Venus, the image of a naked woman coming out of the surf that has been used as the box art for Adobe Illustrator (IIRC), would of course be flagged. She has nipples and a 'patch of hair', as do most nudes painted during that time period....

    Hell, the Sistine Chapel ceiling is offensive: it shows Adam (naked youth) and God (old man) touching fingers.... There's a bunch of naked little boy cherubs flouncing around them to boot. What horrific kinkiness!!
  • Last week I attended a meeting in Santa Barbara and ran into one of the dudes coding for this place.

    One planned enhancement for the software is configurability for the amount of *exposed flesh* shown before the engine kicks in and blocks the image. The idea is to have 'sliders' -- client-side Java applets, I'm told -- on an admin/config page which would allow for a specific percentage of (for example) nipple. Once the network identifies the presence of nipple, the position of the configuration sliders determines whether this presence is, in fact, pornographic or not pornographic.

    My question to the dude I met was how does the program quantify 'pornography' in the first place? If the neural networks are scanning for flesh, then they must have some sort of way to contextualize and quantify porn. (Since the 'I know porn when i see it' definition can't possibly work in a programmatic environment.)

    His response was interesting: he claimed that while he couldn't explain exactly how it was accomplished, he mentioned that several state governments are looking to extrope's definitions of a 'porn' image in order to settle various state and local pornography cases throughout the country.

    He explained that it will be possible to dump out the specific 'porn' settings -- set by the sliders on the config page -- and generate a long list of what, according to the admin, constitutes porn: 63.5% exposed nipple, more than 72% bare (suntanned but not pale) flesh, the absence of either a shirt or pants [but the presence of black {but not white} underwear], the presence of various objects in the room in which the photograph was taken (a smoking cigarette in an ashtray, for example; or a bottle of Dewar's scotch that looks as though it could have been imbibed by the photographic subject; one black high-heeled pump turned on its side, pointing away from the camera but [an important distinction] *toward* the bed), and so on.

    The difficulty, I was told, was deriving an algorithm robust enough to exploit the neural networks but not tax them to their limit. (The employee told me that just a few hours ago he had successfully implemented the algorithm that determines whether the clothing on the subject in question was purchased from JC Penney's or from Victoria's Secret.)

    "It was tough," he explained. "Victoria's Secret uses significantly smaller weaves in their nylon undergarments (hence the higher price for lingerie from VC as opposed to JC Penney's). Try getting a program to recognize a bra from VC, and you've got the holy grail of censorware!"

  • I quote:

    Wired News tested BAIR by creating a Perl program to extract images randomly from an 87MB database of thousands of both pornographic and non-pornographic photographs. The program then assigned each of those images random numbers as file names.

    ...Do you suppose they could be convinced to open source that database? Quick, someone call ESR! ;)
  • by Anonymous Coward on Tuesday June 20, 2000 @03:59AM (#990348)
    What about us black folk?

    No, I am not trolling. This is seriously flawed. Not to mention stupid.
  • by Anonymous Coward on Tuesday June 20, 2000 @04:00AM (#990350)
    Only joking, but for those of us using corporate HTTP proxies where the sysadmins look for 'abuse' of the network, we're going to get some strange looks after visiting a page called www.newswire.com.au/0006/porn.htm. Couldn't they have called the page something a bit more discreet?

  • Orion slave girls
    Being green, can still be seen
    Pike says: Make it so!


    ------
  • It ranks right up there with sysadmins who add the string "sex" to the list of places you can't go.
    This happened at a place I once worked. All of a sudden, we couldn't (for instance) look up the words:
    sextuplet

    sexton
    etc, etc.
    They also added "young," "adult," and other words to the list...

    Keep in mind that this wasn't looking just in the domain name for the string, but in the entire URL.
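    A minimal sketch (Python, illustrative only) of why whole-URL substring matching misfires:

        BLOCKED_WORDS = ["sex", "young", "adult"]

        def is_blocked(url):
            # Matches the string anywhere in the URL, as described above.
            return any(word in url.lower() for word in BLOCKED_WORDS)

        for url in ["http://dictionary.example/define?word=sextuplet",
                    "http://www.example.org/parish/sexton.html",
                    "http://www.essex.gov.uk/",
                    "http://news.example.com/youngstown"]:
            print(url, "->", "BLOCKED" if is_blocked(url) else "ok")
        # All four perfectly innocent URLs get blocked.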
    At any rate, this sort of half-assed content filtering still doesn't replace mom or dad talking to Dick and Jane about the world and what's in it.

  • I saw software similar to this a while ago (sorry, can't remember the name or anything), but what it did to detect porn was to look for large "blobs" of a skin tone (white, black, tan, etc.) and then do some computations to determine what the blob was. It could "detect" a torso with legs and arms coming out, even if they were crossed. So if someone had a bikini on, the blob wouldn't be continuous, and therefore not something naked. So if you combined this algorithm with the other one, it might be pretty good.

    Disclaimer: This information might be wrong, it was a while ago that I saw it.
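    If I remember the idea right, the connectivity test would look something like this (a sketch in Python with numpy/scipy; same disclaimer - the structure is my guess):

        import numpy as np
        from scipy import ndimage

        def largest_blob_fraction(skin_mask):
            # skin_mask: boolean array marking pixels classified as skin tone.
            labels, n = ndimage.label(skin_mask)   # connected skin regions
            if n == 0:
                return 0.0
            sizes = ndimage.sum(skin_mask, labels, range(1, n + 1))
            return sizes.max() / skin_mask.size

        # A bikini splits the body into disconnected blobs, so even a high
        # total skin fraction leaves the largest single blob fairly small.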
  • by roman_mir ( 125474 ) on Tuesday June 20, 2000 @04:52AM (#990377) Homepage Journal
    These evil bastards must be stopped!
  • That's the beauty of this thing (from a rather draconian management point of view, anyway): it doesn't have to do a very good job, it just has to work well enough to intimidate the employees so that they don't dare visit any sites without a really good reason. Voilà! You've eliminated recreational use of the Internet at your company!

  • I think that I am going to start a comic strip about pink little bunnies. Yes, they will be very large, flesh toned, round fat little bunnies, with bright pink bellybuttons, big round ears with sort of black hair in between, maybe a couple of mohawks. Red and blonde hair on a few. Big loving eyes, with red pupils.

    Hrmm, this software probably works great already.

    Next, I think that I'll do a photo spread on fields of wheat.

    (Before you mod this down, read what I wrote, it's a joke about the heuristic).
  • I can tell you one thing for certain: If I had a program that could go through the web and find pages that are almost certainly pornographic content, I wouldn't be censoring them ;)
    Porn search engine, anyone?
  • disables the computer ((!) how stupid is that?!) then that worker is completely idled until the admin re-enables it

    I'm sure that such a feature would have to be optional. They could simply have the snapshot sent to management and the user wouldn't need to know anything about it.

    In a tight labor market, that might not be too wise...

    I would agree, but we're talking about companies that ARE going to use this software. There are plenty out there that will. I wouldn't want to work for such a company, but not everybody can afford to be picky about their employer all the time (including me, I just happen to work for a company that isn't quite so harsh about internet use. They do block sites, but it's done by humans, not a software program). Once you accept that some companies will use such software, unwise as that may be, then what I said is true. The software doesn't have to work very well. Just well enough to put the fear of management into the employees.

  • If you are the same color as your monitor you should get out more. That big yellow thing you see in pictures of the outside is called the sun. There is no reason to fear the sun, unless of course you spend a lot of time outside but staying off the 'net that long is just sick and unhealthy.

    Devil Ducky
  • Well, the whole thing was a joke. You, know, jest, that sort of thing.
  • The sorts of images that would not be dealt with correctly:

    1. People in swimsuits.
    2. People doing nasty things, but wearing "fetish attire."
    3. People doing nasty things with Members Of Other Species. (Animals, ICK!)
    4. Wrestling. (Including Sumo.)
    5. Sunsets. (Some of them have a lot of "flesh tones" in them.)
    6. Manipulated images with a slightly more blue color temperature.
    7. Medical images.
    8. Fine art.
    9. Bodybuilding pictures. (see: swimsuits)

    What an obvious, but still obviously stupid idea! I've been doing image analysis for over 20 years, and this idea did not deserve a moment's consideration, much less venture capital.
  • An Anonymous Coward (of course!) wrote:

    So, you're basically saying you have the right to ram your sexual content down the throat of anybody who dares log onto the net, that it's not their right to choose for themselves.

    No, I'm saying it is their right to choose. Nobody is forcing you to visit alt.sex.cute.little.bunnies. Your neighbor visiting http://www.hotteensluts.com/ doesn't affect you in any way, unless he's doing something like taping printouts to your car window.

    "Moral standards" (or at least, your moral standards) restrict my right to choose whether I want to view that kind of thing or not. It's also a common principle that morality means nothing if you have no choice.

    The 'free speech' referenced in the US Constitution was never about jiggling tits. You don't have a leg to stand on.

    Sorry, it was, and I do. Any power not specifically granted to the government in the Constitution is forbidden to it. Any right not specifically forbidden to the people is granted to them. The Bill of Rights specifically grants freedom of speech and does not restrict that freedom in any way.

    "Acceptable free speech" is an oxymoron. Speech is free or it is not. Once you start imposing limitations on what I can print or say or depict, you don't have free speech any more.

  • If they truly believed that all images are of "blond-haired, blue-eyed people", they're going about it all wrong. They could just block aryanporn.com.

    Devil Ducky
  • I'm pretty sure software can't recognize porn, whatever the definition is.

    But as an enterprise that forbids its employees to watch porn, why not move a copy of each image file that comes over the network into a separate directory, also storing which computer requested it and at what time?

    I think it's sufficient to tell the employees that this is done and the 'image cache' browsed by a person on a regular basis.
    • Color index remapping: write a trivial program that swaps color indexes around. Even a simple color inversion would do.
    • Rename .jpg files to something else.
    • FFT the images.
    • uuencode/decode into ASCII files.
    • Encryption/steganography.
    In other words, the software only catches stupid people viewing porn. It drives the smart people who view these things into inventing all sorts technologically interesting stuff. Thanks!!
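    The first bullet is about as simple as countermeasures get - a sketch (Python with Pillow, illustrative only):

        from PIL import Image, ImageOps

        def scramble(path_in, path_out):
            # Sender: invert every channel; flesh tones come out blue-green,
            # so a skin-tone detector sees nothing of interest.
            img = Image.open(path_in).convert("RGB")
            ImageOps.invert(img).save(path_out)

        # Receiver: inversion is its own inverse, so running scramble()
        # again on the downloaded file restores the original picture.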
  • by DeadSea ( 69598 ) on Tuesday June 20, 2000 @04:09AM (#990413) Homepage Journal
    As part of a project for a multimedia class that I took as a Senior in college, we had to write software to count the number of people in an mpeg video. It was a very open ended project and we weren't expected to be able to get the right answer all the time.

    The professor suggested that we start with skin tones. He pointed us to research that tried to pick out the parts of the spectrum considered "skin tone". There were some simple algorithms that were suggested. We did this and it worked decently well, but there were a lot of things that looked like skin to it. Especially light colored woodwork.

    An algorithm like this may be able to filter a lot of stuff off the web. But it will filter a lot of other stuff too. I can also think of 100 ways to fool it, the easiest being to put images through a color filter before posting them, or to post them in black and white. Other people have pointed out that it will filter portraits and other shots of humans that aren't porn.
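    A minimal sketch (Python with numpy; the RGB box is an illustrative guess, not the published ranges the professor pointed to) of the skin-tone-fraction approach described above:

        import numpy as np

        def skin_fraction(rgb):
            # Fraction of pixels falling inside a crude "skin tone" RGB box.
            # Light-colored woodwork lands in the same box - hence the false hits.
            r = rgb[..., 0].astype(int)
            g = rgb[..., 1].astype(int)
            b = rgb[..., 2].astype(int)
            mask = (r > 95) & (g > 40) & (b > 20) & (r > g) & (g > b)
            return mask.mean()

        # e.g. flag an image (or a video frame) when skin_fraction(img) > 0.4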

  • Yeah. I've often thought of the state of computers now like the auto industry of the 50's.

    Ah, so you too were paying attention when we heard that in Freshman Immigration Course two semesters ago. ;)
  • by GregWebb ( 26123 ) on Tuesday June 20, 2000 @11:34AM (#990415)
    Sorry to be awkward here, but I really can't see how opening the _source_ to the blocking is even slightly relevant. The algorithms used for blocking access, and the interface - all that we're talking about here, after all - wouldn't help the user's confidence that the blocking was sound, unless we're talking seriously nasty code acting against the instructions in the block file. Indeed, it may weaken it slightly by allowing the users to see how to exploit its problems...

    The only thing that makes _any_ sense to open up, to give confidence that the blocking is fair, is the blocking list itself. Now, if we're blocking access for young kids exclusively, that's fine. Once you include teenage boys in the equation, though - as the demographic where blocking software and computer skills are most likely to cross over - it becomes a liability: the second the kid finds out how to bypass the blocking software (which _will_ happen eventually), they have a list of sites that their parents don't want them to use sitting right in front of them...

    Like I said, it's tricky. Blockers want to know that the blocks are sensible, blockees want to know where all the dodgy material is - and one leads straight to the other. Quite how we fix it I really don't know, but it's far too simplistic to simply say that opening everything solves all the problems. It doesn't, and it creates two new ones.
  • Now censor this [carol-carter.com].
    --
