Software That Can Censor 'Sexual Images.' Or Not.
Halster writes: "Here's an interesting story on Newswire about censorware that detects excessive skintones in images, and implements blocking accordingly. What next?" What's next is (you'd hope) the realization that image-analysis heuristics are inherently limited, and not the best thing on which to pre-emptively base system-admin decisions. ( michael : That story is a company press release. For a much better evaluation of how this software works, see this Wired exposé detailing the fraudulent nature of image-filtering "artificial intelligence," or an older review from Businessweek on Eyeguard.)
I'm on a Porn Collection Task Force (Score:5)
My boss has installed this software, and is now forcing the entire office to surf for porn. These "snapshots" are sent directly to his hard drive, which is saving him the time of having to sift through thousands of non-porn pictures to get the ones he wants. Thanks to this software and the snapshot feature, my boss is able to accumulate pornographic images at 10X his previous efficiency.
Eye-T, Mr. Wilkerson thanks you.
If only it worked as well as they claim (Score:1)
If they can tell what the people in the picture are doing by analyzing the picture, I wouldn't have to sort through all of those GIFS myself...
(grin)
Re:MD4 smut DB (Score:1)
See how this works?
If it ain't broke, fix it 'til it is!
Re:If I had this.... (Score:1)
A buddy and I actually started "development" (we talked a lot) on "p0rnster", a webcrawler that would take links from the porn link sites, look for certain keywords, follow X links through, and save all JPEGs over Y KB.
Never got around to it though.
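For what it's worth, the core of such a crawler is only a few lines. A minimal sketch in Python, with a fake `fetch` standing in for real HTTP (the depth limit plays the role of X and the size cutoff the role of Y; all names and the toy site are made up for illustration):

```python
import re

# Hypothetical crawler core: `fetch(url)` is a stand-in for a real
# HTTP GET (e.g. urllib); here it just reads from a dict of pages.
def crawl(start, fetch, max_depth=2, min_kb=50):
    seen, hits = set(), []
    frontier = [(start, 0)]
    while frontier:
        url, depth = frontier.pop()
        if url in seen or depth > max_depth:
            continue
        seen.add(url)
        body = fetch(url)
        if url.lower().endswith(".jpg"):
            if len(body) >= min_kb * 1024:   # keep only big JPEGs
                hits.append(url)
            continue
        # naive href extraction from HTML pages
        for link in re.findall(r'href="([^"]+)"', body):
            frontier.append((link, depth + 1))
    return hits

# toy "web": one index page linking to a big and a small image
site = {
    "http://x/index.html": '<a href="http://x/big.jpg">b</a>'
                           '<a href="http://x/small.jpg">s</a>',
    "http://x/big.jpg": "x" * (60 * 1024),
    "http://x/small.jpg": "x" * 1024,
}
print(crawl("http://x/index.html", site.get))  # only big.jpg survives
```

Swapping `site.get` for a real HTTP fetch (plus politeness delays and a robots.txt check) is the part nobody ever gets around to.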
Pornography -- property of a culture (Score:3)
Pornography is not a property of images. It is a property of a culture, and of the value judgements that that culture makes about sex and nudity.
Imbued as we are with American values acquired through film, we tend to forget the above, but in Europe we're fortunate enough to have a million beaches where nudity is nothing special to bring back home the relativity of values. Nothing else makes the point so effectively.
Yet another brick in the wall (Score:1)
How soon can we expect ISPs, most likely after being strong-armed by the government (just look what's happening in the UK), to start scanning data streams for porn images, unpopular political symbols or even faces? --Peter Dyck
Re:Can someone mirror the article? (Score:2)
This is well-calculated advertising. This page will hit the decency filters of any web fascist around. And it will be forwarded to HR. And they will buy it. That is the idea. Very well calculated, I would say, and I congratulate the inventor.
On the topic of the filter itself. I bet that this crapware will fail on:
Re:New internet and unaccountable businesses (Score:1)
Not a hope of working (Score:1)
Hey Kids! Make your own blocked porn page! (Score:1)
Re:Oh, yea...this is a *great* idea.... (Score:1)
Can you say racism lawsuit? (Score:2)
It'd be fun to see this turn into the biggest lawsuit since Crayola's whole "Flesh" colored crayon debacle.
Anyone know if they're still making "Indian Red"?
Crazy (Score:1)
Re:I think this was built for an excuse to find po (Score:1)
(After, of course, they removed the manilla folder icon and replaced it with another...)
Re:this will be unpopular but... (Score:1)
Since when-the-hell has naked human skin been pornographic?
Why is this always the equation:
naked = pornography = sex = bad
There is an absolute difference between nudity and pornography and sex.
nudity != pornography
nudity != sex
nudity != bad
Deal with it!
It's the utterly mindless tendency of unthinking, self-righteous people to equate the one with the other that needs to be remedied...
t_t_b
--
Re:It would be simple to get around... (Score:1)
Re:What to fight (Score:2)
What we have here is a fantastically complex problem. Cyberpatrol has pulled some _disgusting_ stunts - like blocking sites which criticised its methods - but that doesn't necessarily mean that all censorship software is bad.
Put it this way: imagine I'm (say) 10-15 years older, and with primary school age kids. Chances are, they're going to want to play with this Internet thing, just like we watched TV at that age. Now, do I want to have to say they can only surf whenever I'm watching over their shoulders? Of course not - that's ridiculously laborious and clearly impractical. The far simpler approach is to do what you do with kids and TV - bar by program / channel / time. So, in this case, don't allow access to Usenet & Gnutella (for example) and only allow websites you've approved. Kid gets a better net experience while you're prepared to let them spend longer online because it doesn't take so much effort on your part.
For this sector of the market - if nothing else - net censorship software is needed. You and I may well be able to get the desired result by configuring a spare box in the back bedroom as a proxy server, but we're not representative of the general population.
Censorship software hasn't yet hit the sweet spot but it's definitely necessary, if only within limited areas. And, while I see the advantage to an open blocklist, I hope you see the problem, too.
Similar product from HeartSoft (Score:2)
Anyway, the product is called Internet Safari [internet-safari.com], and it's a web browser (I guess some hacked-up version of IE) that includes an image-analysis filter. My guess is that it does the same thing as eyeguard.
good grief (Score:2)
Heaven forbid that our children see pornographic material at home rather than going out and having sex, getting STDs, and getting pregnant.
NetNanny on steroids... (Score:1)
Give me a break.
Although the possibility exists that some law will be passed forcing every OS in America to include this software--and force it upon the end user--the chances of that range anywhere from slim to none.
And despite what Microsoft may want you to believe, you still have control over what software you want to install. Or is that not the case? Did I miss something that everyone else got?
The bottom line is: If you don't like it, don't install it!
-={(.Y.)}=-
New internet and unaccountable businesses (Score:3)
+1 insightful (Score:1)
This is the most relevant statement I've seen in this entire discussion.
_How_ many neural networks???? (Score:3)
"There are over 10 million neural networks involved in the thing," Beecher says.
Bloody hell! Imagine the computing power they've got to run 10 million neural _networks_.
...or maybe they mean 10 million neurons. Doh!
Re:Hrrrmmm (Score:1)
Oh well, you know this wouldn't affect Captain Kirk: he's got that fetish for blue women...
So where's the web filter that filters out stupid e-censorship ideas?
Oh. It filtered itself. Never mind...
---
pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
Re:internet has everything (Score:1)
The problem with allowing child pornography and then prosecuting people who exploit children without their consent is that if this were allowed, it would give politicians all sorts of reasons to make speeches about turning the internet into a police state so that they can track down everyone who might possibly be linked to the photo(s) in question. Then these politicians (isn't it odd how the most computer-illiterate people are the ones making all the laws about technology?) will pass laws so that everything you do or say is monitored.
Imagine if you could not reply to this as "Anonymous Coward", and you disagreed with my opinion (and just suppose that my opinion was the one the cops liked): a little "*" would appear next to your name on the great list of citizens' official internet accounts, and everything you say or do would be monitored. I think it's better not to encourage any regulation of the internet... I think there's enough wacky porn out there without kiddie pron... err uh, at least... um... I've been told. :)
I hope I didn't misunderstand your comment... but that's just the point my mind received and processed. Hope that's not too off topic.
THAT'S IT... (Score:1)
(mpegs available upon request)
Measure? Counter-measure! (Score:2)
Heuristics (Score:2)
The day will come... (Score:1)
Don't forget... (Score:1)
Re:Only for white-folk? - Of course!!!! (Score:2)
Given the fact that there are many different skin tones in the world, how is it supposed to distinguish? Are they so arrogant that they assume only images of blond-haired, blue-eyed people are being downloaded?
This will be fun (Score:1)
2: What if I have a picture of a naked *gasp* pig?! That should be close enough to skin tone.
---
It would be simple to get around... (Score:2)
It would be interesting to see what other ways, besides coloured skin, people would use to get around an "excessive skintone" filter... a return to black-and-white pictures? Weirdly skewed colour maps? Use of "oil paint" filters to break up the skin-tone areas?
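The "weirdly skewed colour maps" idea is probably the easiest to sketch. Assuming a hypothetical filter that tests hue/saturation ranges, a fixed hue rotation moves skin out of the detector's box while staying reversible on the viewer's side. The `looks_like_skin` ranges below are invented for illustration, not taken from any real product:

```python
import colorsys

# Toy "skin tone" test: hue near orange, moderate saturation.
# These ranges are made up; real filters would differ.
def looks_like_skin(r, g, b):
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return 0.01 <= h <= 0.12 and 0.2 <= s <= 0.7 and v > 0.4

# Circumvention: rotate every pixel's hue by a fixed offset before
# publishing; a complementary viewer rotates it back.
def shift_hue(pixel, offset):
    r, g, b = (c / 255 for c in pixel)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    r2, g2, b2 = colorsys.hsv_to_rgb((h + offset) % 1.0, s, v)
    return tuple(round(c * 255) for c in (r2, g2, b2))

skin = (224, 172, 150)                         # a lightish skin tone
print(looks_like_skin(*skin))                  # True
print(looks_like_skin(*shift_hue(skin, 0.5)))  # False: now blue-green
```

Applying `shift_hue` per pixel with the opposite offset recovers the original image (modulo rounding), which is exactly why colour-based heuristics are so brittle.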
ttyl
Farrell
(who has been on line far too long...)
To take a day off work... (Score:1)
---
Coming back to PEANUTS after GNU... (Score:2)
Re:Can you say racism lawsuit? (Score:2)
As for the caucasian filter, well at least it would block the khaki portion of www.gap.com.
--
Re:Of tanks and sunny days (Score:3)
It's good to know that at least some good will come of this. :-)
Three words about "skin tone filtering"... (Score:2)
Jay (=
hmm (Score:5)
What I want to see is an image filter that will filter out the clothing.
How will this distinguish things? (Score:2)
My favorite quote... (Score:5)
Maybe our definition of obscenity is the problem.
Re:The different ways to circumvent this: (Score:2)
Once installed on a single PC or across a network, the antiporn software known as eyeguard is activated each time an image is displayed
Makes me speculate that eyeguard actually hooks into the operating system itself, so that it sits somewhere in the code that displays bitmaps on screen (on Win32, that'd be inside GDI, right?). If so, then most of the above techniques won't work very well, unless you can "counter-hook" those API calls, so that eyeguard sees an altered version of the image, then calls the original OS entry to display that, but actually ends up in your anti-blocker which turns the image back to normal and displays it. I suspect that it would be fairly easy for eyeguard to protect itself from calling a "false" original entrypoint, though... Hm, this is pretty close to some serious cyberwarfare. The lengths some people go to control each other... *Shrug*
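As a language-neutral illustration of that hook/counter-hook layering (not how eyeguard actually works - just the shape of the attack), here's a toy Python model where the "image" is a string, the OS blit call is a function, and rot13 is the scramble:

```python
# Toy model of the hook / counter-hook chain described above.
# `display` stands in for the OS blit routine (on Win32, a GDI entry
# point); "images" are just strings, and rot13 is the scramble.

def display(image):                      # the real OS routine
    return f"shown:{image}"

def censor_hook(original):
    # the filter wraps what it thinks is the OS entry point
    def hooked(image):
        if "skin" in image:              # crude content check
            image = "BLOCKED"
        return original(image)
    return hooked

rot = str.maketrans("abcdefghijklmnopqrstuvwxyz",
                    "nopqrstuvwxyzabcdefghijklm")   # rot13

def unscramble_hook(original):
    # the anti-blocker sits *between* the filter and the real routine:
    # it undoes the scramble just before the actual display call
    def hooked(image):
        return original(image.translate(rot))
    return hooked

# the filter believes it wraps the OS, but really wraps the anti-blocker
chain = censor_hook(unscramble_hook(display))

print(chain("skin pic"))    # sent raw: the filter catches it
print(chain("fxva cvp"))    # rot13("skin pic") slips past, shown intact
```

In real cyberwarfare terms the interesting part is exactly what the comment says: whoever controls which "original entrypoint" the other side ends up calling wins.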
Re:No Close-ups (Score:2)
Walt
censorware that detects excessive skin tones (Score:2)
Berkeley-Iowa Naked People Finder (Score:2)
This slashdot story doesn't appear to be related to it.
Re:How many flaws in this stupid idea? (Score:2)
Gotta love metallic "flesh" tones ;)
Re:Nothing new, just a new scam (Score:2)
You may pursue happiness as long as it doesn't impinge on the same pursuit of any other person. Now, of course this is very vague. But IMHO, at the point you yell "fire" in a crowded theater, or threaten to assassinate the president, etc., you are infringing upon the rights of other people to otherwise feel safe. You may actually be causing harm. You are disturbing the peace. Sure, this is all very vague, but there can be obvious exceptions, like the above. Intimidating somebody with physical harm, e.g., I'd say is not "free speech".
Useful to alert admins, not to block (Score:5)
But I can imagine a program which tracks the average flesh-tone score for pictures over time. If the moving average goes over a certain threshold, then a dialog box pops up on the sysadmin's screen telling him that Joe in cubicle 69 may be abusing company bandwidth - click here for a list of the suspicious URLs. Or, as it might be, it sends an email to Junior's father. The key point is that this stuff can work as part of a monitoring system that uses human judgement for the final bit, rather than being a blocking solution.
Companies do have a legitimate need to monitor this stuff. Quite apart from the abuse of company resources, companies who allow employees to download and view sexually explicit materials can find themselves on the wrong end of a big discrimination lawsuit.
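A minimal sketch of that monitoring idea, assuming some upstream component already assigns each image a flesh-tone score in [0, 1] (class name, window size, and threshold are all invented):

```python
from collections import deque

# Keep a moving average of per-image flesh scores and, when it crosses
# a threshold, report the suspicious URLs to a human - don't block.
class FleshMonitor:
    def __init__(self, window=5, threshold=0.6):
        self.scores = deque(maxlen=window)   # sliding window of (url, score)
        self.threshold = threshold

    def record(self, url, score):
        # score: fraction of flesh-tone pixels, supplied by the analyser
        self.scores.append((url, score))
        avg = sum(s for _, s in self.scores) / len(self.scores)
        if avg > self.threshold:
            # in practice: pop a dialog / send mail with these URLs
            return [u for u, s in self.scores if s > self.threshold]
        return []

m = FleshMonitor(window=3, threshold=0.5)
m.record("a.jpg", 0.1)            # normal browsing: no alert
m.record("b.jpg", 0.7)            # one spike alone: still no alert
print(m.record("c.jpg", 0.9))     # average crosses 0.5 -> report b, c
```

The averaging is what keeps a single false positive (one beach photo) from triggering anything; only a sustained pattern reaches the human.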
Paul.
What to fight (Score:3)
That having been said, I think the reason is that any censorware, present or future, puts the decision of what is and is not appropriate for me or my kids into the hands of people who don't know me and don't share my values.
Open censorware (with open block lists) is a possible solution to this. This way, parents, who should be deciding what their children will see, can actually make real decisions, rather than have to abide by whatever decisions Mattel or whoever else makes for them.
It's just as wrong for a company to insist that my kids shouldn't see a certain site as it is for anti-censorware advocates to insist that my kids should be able to see anything. The right thing to do is to give parents the choice to make that call.
Re:Oh, yea...this is a *great* idea.... (Score:3)
See RISKS [ncl.ac.uk] for details.
Paul.
Re:How will this distinguish things? (Score:3)
Access Prevention Techniques for Internet Content Filtering (Google cache):
The quest to detect pornography is often more concerned with images than text and getting computers to recognise a pornographic image is equally, if not more, difficult than the task of distinguishing between erotica and other literature. There have been efforts to characterise pornographic pictures based on the amount of 'flesh tone' in the images and on the poses struck by the subjects. Computers have difficulty distinguishing between art and pornography, say between a Rubens painting and a Penthouse centrefold, and this approach is not generally regarded as being effective. Video and other streaming media further complicate the filtering task by supplying a constant flow of images to be examined for undesirable content.
Furthermore, they complain:
This approach is affected by the same issue as profile filtering in that an image - or a fair percentage of an image - needs to be loaded before it can be analysed, during which time it may be displayed on a user's screen.
Of course, this second problem only applies to an Aussie-type ISP restriction. Geocities did this years ago (don't know if they still do): scanning their own HDDs (free user pages), deleting 'questionable graphics' (with or without human review), and waiting for the page authors to complain about any mistakes.
I think this was built for an excuse to find porn (Score:3)
BOSS: So, what is this suggested project?
DEVELOPER: Uh, we want to create a program which can determine if a picture is pornographic or not. I request the position of obtaining test pictures. I'll need the company VISA too.
BOSS: Sorry, that'll be my job. I'll be glad to help with that portion.
DEVELOPER: Yeah, but I have more... uh, experience, in, um, finding, er, them, uh, yeah.
SYS. ADMIN: We'll need more bandwidth for this project too, and I'll extend the capability of the web server on the alt.binaries.* and alt.sex.* newsgroups by 500%, OK?
BOSS: Approved! Now get to work, I've got some... research to do for this project.
DEVELOPER and SYS. ADMIN leave the office and head back to their respective offices to obtain "test images." BOSS loosens his belt, and...
Re:What to fight (Score:2)
Anyway, my point is that most parents should just say "I do not trust anyone who would make a censorware program, so I'm just going to trust my kid until I get a clear indication of a problem."
Libraries have the same problem, except they have lots of people walking around, so they can just move the computers into an open area and ask people who look at porn to leave.
A library which has a really serious problem with porn can share the Netscape cache directories, have a computer scan images for fleshtones, and display possible matches to the librarians. If the librarian judges an image to be pornographic, she/he can walk over to the computers, figure out who downloaded it, and ask them to leave.
The advantage of the above system is that you're not restricting anything legitimate (unless the librarian makes a bad call). Also, you should not tell the librarian which computer viewed the image, to prevent librarians from being able to spy on specific people while they browse.
Re:What to fight (Score:2)
This is only true if the software isn't open source. If the OSS community provided censorware that was done the Right Way (tm), then people and libraries could choose to use that instead of closed competitors. Junkbuster [junkbusters.com] could probably do this with small modifications. And it could be admined the same way as I use it for cookies - I have an opt-in list, and whenever I go to a site that requires cookies, I add it in. So too, a parent or librarian could add in sites as their kids needed. We could even maintain a list online, so parents who've put time in to compiling lists could post them on the web, and other parents could have a head start.
The point is, once there is decent software for libraries and such to use, people will stop making and using broken or evil software.
Re:Only for white-folk? (Score:2)
--
Here's my mirror [respublica.fr]
The real dark side of this (Score:5)
It will take an AI with an understanding of what "porn" means, with an appreciation for the human body's full range of motion, and with a comprehension of the latest fetishes - or else National Geographic and CNN.com will find themselves filtered out of libraries and schools. After all, what is the difference between an image of a 'man riding a horse' and that of a 'man riding a horse'?
But the research being put into this sort of image recognition has an even seedier and more sinister side. It can/will filter based on LOGO. That's right.
Imagine Time-Warner/AOL being 'unable to connect' to sites which feature their competitors' logos... Imagine ISPs who show Reebok ad banners suddenly disabling links to pages that display the Nike "swoosh". Imagine your favorite web-site [slashdot.org] suddenly not letting you click through to any other site that does not proudly wear a "VA" on its 'sponsors' page.
And all this technology is being developed... (oh, say it with me) "In the name of the children!". BS - all the children I know would get a kick out of looking at porn, and are being damaged more by advertising than by sexual content.
Personally, I think we should assist in the development of this technology, and make sure that it only filters on Red Maple leaves on white backgrounds! Blame Canada!! Hooyah!
Skintones. (Score:2)
On the other hand, if they do include skintones from all races, then that's a lot of colours they're filtering.
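It is a lot of colours. A toy illustration of the dilemma: a bounding box in RGB space wide enough to cover a range of hand-picked sample tones (invented values, not from any real dataset) inevitably swallows plenty of non-skin colours too:

```python
# Skin tones cluster in a narrow hue band but span a huge brightness
# range, so a filter covering all of them accepts an enormous slice of
# colour space. The sample tones are rough, hand-picked RGB values.
TONES = [
    (255, 224, 196),  # very light
    (224, 172, 150),
    (198, 134, 102),
    (141, 85, 36),
    (92, 51, 23),     # very dark
]

def in_skin_box(r, g, b):
    # crude bounding-box test, widened until every tone above passes
    return (r > 80 and g > 40 and b > 15
            and r > g > b and (r - b) > 15)

assert all(in_skin_box(*t) for t in TONES)
# ...but the same box also swallows lots of non-skin colours:
print(in_skin_box(194, 178, 128))   # khaki -> True
print(in_skin_box(210, 105, 30))    # orange-brown -> True
```

Which is exactly the khaki-at-gap.com joke elsewhere in this thread: widen the box to be fair across skin tones and half of fashion photography lights up.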
Every day a good laugh (Score:2)
Yeah, there's this independent testing lab, verifying our nano-cool neuronal rocket-science algorithms, working with 99.8% reliability. The name? No, it's really a secret that the Rush Limbaugh Institute for Creative Certification is the independent lab...
Nah, I've used it myself 30 or 60 days ago. I could only get to the dirty pickies at the XXXsmutshop after disabling our super software...
Wot? An old version? Hey, we're into rocket science, advancing our secret algorithm on a daily basis. Since it's so advanced we don't need version control, therefore we don't have a version that actually worked anymore. But trust me, I'm in the DOTCOM business...
Shheeeesh, that guy must have come straight out of twisted-tongue marketing academy...
hey, it's not all bad... (Score:2)
ha! (Score:2)
What a useless bit of code! (Score:3)
Why?
All in all, it comes back to what I've always said about these types of systems. Give someone the freedom to filter, ALL ON THEIR OWN, and they'll probably do so. Everyone, however "liberal", has something they just don't want to spend time with. And that's OK! That's GOOD! But the definition of OK cannot come from outside; it has to come from inside.
As for parents of kids, the same sort of thing applies. When you pick a meal to cook, do you select out of the cook book(s), or cook everything in them, all at once? You select! Ergo, being able to select out of a range of databases (eg: your own personal filters, the school's database, the databases built by the various clubs & societies the kids belong to, etc, ad nauseam) makes MUCH more sense than blindly following one database built around the fiction of one-size-fits-all.
Yes, it takes more time. But in the end, you will ALWAYS have a trade-off. The easy and generic routes are INVARIABLY harmful in the long term. You can become a cabbage-patch human, and live in Number 6's Village for all eternity, or you can put some effort in and live as a human being, instead. This doesn't mean being "rebellious" - if you rebel for the sake of defying what someone else says, your brain is as much a rotten cabbage as the obsessive conformist.
Getting back to this censorware, its market is that of the obsessive conformist, and the most vocal critics (in the media) are the obsessive rebels. It's a symbiotic relationship made in hell. The more extreme one group gets, the more it feeds the other. Don't you think the makers knew it would be controversial? Of course they did! They are counting on it! The more attention it gets, the more free advertising, the more money they make and the more brownie-points they can give themselves.
The media critics are the same. Without products like this, there's nothing to vent about, and therefore no reason for anyone to read their articles, and therefore no reason for anyone to keep them employed. They don't want their "enemies" to go away, because they're the ones who justify the pay-cheque.
IMHO, whilst the Extreme Wing and the Press are "best of enemies", there's no place for sanity in the world. Who needs it, when you've a symbiotic, self-perpetuating feeding-frenzy?
Re:What to fight (Score:2)
Their reluctance to allow this review to take place suggests to me that it's just a scam.
Heh. Sorta like chroma-key. (Score:2)
I always wanted to see them use "porno-key", the system which replaces any pornographic material with a calm mountain lake.
So, you'd have this giant, thrusting, penis-shaped calm mountain lake scene superimposed on an empty bedroom.
But it'd eliminate the pornography.
But if she's wearing it, why filter it? (Score:2)
Re:" One man's porn is another man's art. " (Score:2)
We don't give 'the children' enough credit in seeking out information for themselves. They'll learn, whether we want them to or not - the only sensible thing to do is to tell them the truth.
Just as we tell them to 'be careful when driving', and 'don't speed'; we should also be telling them useful, practical information. "When making a left turn at an intersection, keep your wheels pointed straight - so if you get rear-ended, you will not be pushed into oncoming traffic." is the sort of common-sense knowledge which they need, and which we have (from experience) to give them.
Same goes for matters of sex. You may have different opinions than I but here's as good an example as any:
Rule #1: Whatever you do, I'll still love you because you are my child.
Rule #2: Don't hate or hit anyone just because they're different.
Rule #3: If someone hurts you, it isn't your fault - carry mace, just in case.
Rule #4: No means No!
Rule #5: Use a condom, whomever you do.
Rule #6: Your friends know as little as you do. If you don't know, ask me. If I don't know, we'll find out together.
You're far from 'ranting'. It's common sense - education is the key to solving most, if not all, of the world's problems. The worst thing we can do is stay silent while some self-appointed, holier-than-thou glory seeker defines a moral baseline for all of society (not just their own kids, unfortunately), including ourselves and our children.
'62 ? (Score:2)
That, and I wasn't born until a couple of years later, so I couldn't read the labels on my crayons until the late 60's, and my crayolas certainly had "flesh" as a color . . .
Re:What to fight (Score:2)
In a more general case - which seemed to be where this particular subthread was going and was certainly what I was talking about - where you have a program based on a blocklist, opening the source adds very little indeed.
Re:I'm on a Porn Collection Task Force (Score:2)
--
Here's my mirror [respublica.fr]
Schoolgirl outfit... (Score:2)
--
grappler
Re:No Close-ups (Score:3)
--
Here's my mirror [respublica.fr]
Of tanks and sunny days (Score:4)
The military was developing a neural net to optically identify tanks hiding in brush and trees. They trained the net on photographs, and noticed exceptionally good performance. Then they tried a new set of photographs, and the success rate of the network suddenly fell flat.
It turned out that in the first set of photos, all the 'with tanks' pictures were taken on a sunny day, and all the 'no tank' pictures were taken on an overcast day. In the second photo set, this was reversed, and the network continued to correctly tell sunny and cloudy apart...
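That failure mode is easy to reproduce with nothing more than a brightness threshold. In the sketch below, "images" are just mean-brightness numbers, and the confounded training set makes "bright = tank" the best available rule:

```python
# Minimal reproduction of the tank anecdote: the training set confounds
# the label ("tank") with a nuisance feature (brightness), so the best
# simple rule learned is "bright = tank" - which collapses on a test
# set where the lighting is reversed. All numbers are invented.
train = [(0.9, 1), (0.8, 1), (0.85, 1),   # tanks, all shot on sunny days
         (0.2, 0), (0.3, 0), (0.25, 0)]   # no tanks, all overcast
test  = [(0.2, 1), (0.3, 1),              # tanks, but now overcast
         (0.9, 0), (0.8, 0)]              # no tanks, sunny

def fit_threshold(data):
    # pick the brightness cutoff with the fewest training errors
    cuts = sorted(b for b, _ in data)
    return min(cuts, key=lambda t: sum((b >= t) != bool(y) for b, y in data))

t = fit_threshold(train)
train_acc = sum((b >= t) == bool(y) for b, y in train) / len(train)
test_acc  = sum((b >= t) == bool(y) for b, y in test) / len(test)
print(train_acc, test_acc)   # perfect on training, 0.0 on the flipped test
```

Replace "brightness" with "percentage of flesh-coloured pixels" and you have a fair guess at what this censorware's neural nets actually learned.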
So I have to wonder what this new development holds in store. Will we be graced with filtering software which discriminates between gaudy, tacky motel room backgrounds rather than drunk co-eds on crack taking it in the ear? Will it be intelligent enough to tell if someone trims their hair, and is therefore a professional, versus those fat and hairy amateurs?
Hey, maybe it will finally be able to tell if the subject of the questionable photo is in fact "barely legal", or has the stretchmarks of a few litters of puppies to suggest the contrary.
Your point on skin color is well taken, and opens up the question: what does ALL porn have in common that can possibly be quantified and filtered? The answer, I'm afraid, is nothing at all... One man's porn is another man's art.
Aside: I, for one, didn't consider the Mapplethorpe photos to be obscene. They weren't artistic either. They were just 'for shock', to cause a furor, and get more attention through objection than through inherent value.
My strong suspicion is that whoever is behind this effort is a lot smarter than they seem at first glance. They are trying to bleed the "Religious Right" of money by getting 'upstanding' Bible-thumpers to fund this research - all the while knowing that it can not possibly be successful.
"Oh look, that girl is wearing a schoolgirl outfit - this is a porn pic, filter it!" -- so much for all the Britney Spears fan pages..
Tripod has been doing this for some time... (Score:2)
Of course there are several ways to defeat the program:
The best way to get rid of a stupid system is to think around it.
some nudity will slip through... (Score:2)
Wow, this article is some serious trollbait -- I'll bite!
Naked and petrified will live on, such as my creation for the /. trolls:
Natalie Portman naked and petrified [angelfire.com].
Someone else created:
Signal 11 naked and petrified [geocities.com]
and I wonder where they got the idea???
Young boys are always going to be able to get pictures of naked, not necessarily petrified women -- they're beautiful things, are they not? Didn't you ever stay up late to watch a "naughty" movie like Revenge of the Nerds or Porky's knowing that it would be an opportunity to catch a glimpse at some "hooters"? It's sad, but kinda funny, innit?
--
Best filtration software... (Score:4)
Nothing new, just a new scam (Score:5)
This is a particularly disgusting (to me at least) instance of the "for the children!" canard. Now instead of politicians using it to achieve their aims, which is bad enough, we've got a company using it to bilk panicked consumers out of their money.
And of course, just as with the quality of our politicians, we Americans have only ourselves to thank for this. If people weren't so damn gullible, companies like this would never sell a dime of product (of course in this case it's questionable whether what they have constitutes a "product", but the point stands...)
What's needed is people willing to stand up and say "Yes, damn it, I do support porn on the Internet, and the easy availability of information on things like bomb-making and lock-picking, and if you don't like the speech I support, TOUGH SHIT. You don't get to pick and choose. If you want free speech, you got it. If you don't want it, go start your own damn country and LEAVE MINE ALONE."
But what are the odds of that happening?
Heuristic analysis (Score:5)
Clears the way for amputee fetishes, I think.
Botticelli's Venus, the image of a naked woman coming out of the surf that has been used as the box art for Adobe Illustrator (IIRC), would of course be flagged. She has nipples and a 'patch of hair', as do most nudes painted during that time period....
Hell, the Sistine Chapel ceiling is offensive: it shows Adam (naked youth) and God (old man) touching fingers.... There's a bunch of naked little boy cherubs flouncing around them to boot. What horrific kinkiness!!
Planned Enhancements for the Program (Score:2)
One planned enhancement for the software is configurability for the amount of *exposed flesh* shown before the engine kicks in and blocks the image. The idea is to have 'sliders' -- client-side Java applets, I'm told -- on an admin/config page which would allow for a specific percentage of (for example) nipple. Once the network identifies the presence of nipple, the position of the configuration sliders determines whether this presence is, in fact, pornographic or not pornographic.
My question to the dude I met was how does the program quantify 'pornography' in the first place? If the neural networks are scanning for flesh, then they must have some sort of way to contextualize and quantify porn. (Since the 'I know porn when i see it' definition can't possibly work in a programmatic environment.)
His response was interesting: he claimed that while he couldn't explain exactly how it was accomplished, he mentioned that several state governments are looking to Exotrope's definitions of a 'porn' image in order to settle various state and local pornography cases throughout the country.
He explained that it will be possible to dump out the specific 'porn' settings -- set by the sliders on the config page -- and generate a long list of what, according to the admin, constitutes porn: 63.5% exposed nipple, more than 72% bare (suntanned but not pale) flesh, the absence of either a shirt or pants [but the presence of black {but not white} underwear], the presence of various objects in the room in which the photograph was taken (a smoking cigarette in an ashtray, for example; or a bottle of Dewar's scotch that looks as though it could have been imbibed by the photographic subject; one black high-heeled pump turned on its side, pointing away from the camera but [an important distinction] *toward* the bed), and so on.
The difficulty, I was told, was deriving an algorithm robust enough to exploit the neural networks but not tax them to their limit. (The employee told me that just a few hours ago they had successfully implemented the algorithm for determining whether the clothing on the subject in question was purchased from JC Penney's or from Victoria's Secret.)
"It was tough," he explained. "Victoria's Secret uses significantly smaller weaves in their nylon undergarments (hence the higher price for lingerie from VS as opposed to JC Penney's). Try getting a program to recognize a bra from VS, and you've got the holy grail of censorware!"
Open Source (Score:2)
Wired News tested BAIR by creating a Perl program to extract images randomly from an 87MB database of thousands of both pornographic and non-pornographic photographs. The program then assigned each of those images random numbers as file names.
...Do you suppose they could be convinced to open source that database? Quick, someone call ESR!
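Reproducing the harness they describe doesn't require Perl. A rough Python equivalent, which samples files and renames them to random digits so the filter under test can't cheat on suggestive filenames (all paths and names are illustrative):

```python
import os
import random
import shutil

# Wired-style harness sketch: draw a random sample from an image
# directory and copy each file under a random numeric name, keeping a
# mapping back to the originals so results can be scored afterwards.
def anonymise_sample(src_dir, dst_dir, n, seed=0):
    rng = random.Random(seed)                # fixed seed: reproducible runs
    files = sorted(f for f in os.listdir(src_dir) if f.endswith(".jpg"))
    mapping = {}
    for f in rng.sample(files, min(n, len(files))):
        anon = f"{rng.randrange(10**8):08d}.jpg"   # e.g. 04976351.jpg
        shutil.copy(os.path.join(src_dir, f), os.path.join(dst_dir, anon))
        mapping[anon] = f                    # keep the key for scoring
    return mapping
```

Point the filter at `dst_dir`, then use the returned mapping to tally which originals it blocked; that's the whole "87MB database" methodology in miniature.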
Only for white-folk? (Score:5)
No, I am not trolling. This is seriously flawed. Not to mention stupid.
Can someone mirror the article? (Score:3)
Porn again Trek (Score:2)
Orion slave girls
Being green, can still be seen
Pike says: Make it so!
------
Oh, yea...this is a *great* idea.... (Score:2)
This happened at a place I once worked. All of a sudden, we couldn't (for instance) look up certain words: they also added "young," "adult," and other words to the list...
keep in mind that this wasn't looking just in the domain name for the string, but in the entire URL.
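That kind of whole-URL substring matching is trivial to reproduce; a minimal sketch (the banned-word list is illustrative, not the actual one) shows why it catches so much innocent traffic:

```python
# Naive substring blocking over the entire URL, as described above:
# any banned word appearing anywhere in the URL trips the filter.
BANNED = {"sex", "young", "adult"}

def is_blocked(url):
    """Return True if any banned word appears anywhere in the URL."""
    lower = url.lower()
    return any(word in lower for word in BANNED)

# Collateral damage: perfectly innocent URLs get caught.
# is_blocked("http://www.sussex.ac.uk/")           -> True ("sussex")
# is_blocked("http://example.com/middlesex.html")  -> True
```

Because the match runs over the full URL rather than just the hostname, any path or query string containing an unlucky substring is blocked too.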
At any rate, this sort of half-assed content filtering still doesn't replace mom or dad talking to Dick and Jane about the world and what's in it.
Censor this... (Score:2)
http://equip.zdnet.com/digitalimaging/cameras/fea
A different way to detect..... (Score:2)
Disclaimer: This information might be wrong, it was a while ago that I saw it.
Re:hmm (Score:3)
Doesn't have to work well... (Score:2)
That's the beauty of this thing (from a rather draconian management point of view, anyway): it doesn't have to do a very good job, it just has to work well enough to intimidate the employees so they don't dare visit any sites without a really good reason. Voila! You've eliminated recreational use of the Internet at your company!
My New Comic Strip (Score:2)
Hrmm, this software probably works great already.
Next, I think that I'll do a photo spread on fields of wheat.
(Before you mod this down, read what I wrote, it's a joke about the heuristic).
If I had this.... (Score:2)
Porn search engine, anyone?
Re:Doesn't have to work well... (Score:2)
disables the computer ((!) how stupid is that?!) then that worker is completely idled until the admin re-enables it
I'm sure that such a feature would have to be optional. They could simply have the snapshot sent to management and the user wouldn't need to know anything about it.
In a tight labor market, that might not be too wise...
I would agree, but we're talking about companies that ARE going to use this software. There are plenty out there that will. I wouldn't want to work for such a company, but not everybody can afford to be picky about their employer all the time (including me, I just happen to work for a company that isn't quite so harsh about internet use. They do block sites, but it's done by humans, not a software program). Once you accept that some companies will use such software, unwise as that may be, then what I said is true. The software doesn't have to work very well. Just well enough to put the fear of management into the employees.
Re:skin tones? (Score:2)
Devil Ducky
Re:Can you say racism lawsuit? (Score:2)
How many flaws in this stupid idea? (Score:3)
1. People in swimsuits.
2. People doing nasty things, but wearing "fetish attire."
3. People doing nasty things with Members Of Other Species. (Animals, ICK!)
4. Wrestling (Including Sumo.)
5. Sunsets. (Some of them have a lot of "flesh tones" in them.)
6. Manipulated images with a slightly more blue color temperature.
7. Medical images.
8. Fine art.
9. Bodybuilding pictures. (see: swimsuits)
What an obvious, but still obviously stupid idea! I've been doing image analysis for over 20 years, and this idea did not deserve a moment's consideration, much less venture capital.
Re:Nothing new, just a new scam (Score:2)
So, you're basically saying you have the right to ram your sexual content down the throat of anybody who dares log onto the net, that it's not their right to choose for themselves.
No, I'm saying it is their right to choose. Nobody is forcing you to visit alt.sex.cute.little.bunnies. Your neighbor visiting http://www.hotteensluts.com/ doesn't affect you in any way, unless he's doing something like taping printouts to your car window.
"Moral standards" (or at least, your moral standards) restrict my right to choose whether I want to view that kind of thing or not. It's also a common principle that morality means nothing if you have no choice.
The 'free speech' referenced in the US Constitution was never about jiggling tits. You don't have a leg to stand on.
Sorry, it was, and I do. Any power not specifically granted to the government in the Constitution is forbidden to it. Any right not specifically denied to the people is retained by them. The Bill of Rights specifically guarantees freedom of speech and does not restrict that freedom in any way.
"Acceptable free speech" is an oxymoron. Speech is free or it is not. Once you start imposing limitations on what I can print or say or depict, you don't have free speech any more.
All wrong (Score:2)
>blond-haired, blue-eyed people
They're going about it all wrong. They could just block aryanporn.com.
Devil Ducky
Caching files, having persons look at them? (Score:2)
But for an enterprise that forbids its employees to view porn, why not move a copy of each image file that comes over the network into a separate directory, also storing which computer requested it and at what time?
I think it's sufficient to tell the employees that this is done and the 'image cache' browsed by a person on a regular basis.
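A sketch of that audit hook, assuming a proxy that hands you the image bytes and the requesting client's address (the function name, directory, and log format here are all hypothetical):

```python
import os
import time

def audit_image(image_bytes, client_addr, url, cache_dir="/var/cache/image-audit"):
    """Store a copy of an image that crossed the proxy, and log which
    client requested it and when, for later human review."""
    os.makedirs(cache_dir, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    name = f"{stamp}-{client_addr.replace('.', '_')}.img"
    # Save the image itself...
    with open(os.path.join(cache_dir, name), "wb") as fh:
        fh.write(image_bytes)
    # ...and append a line recording who fetched what, and when.
    with open(os.path.join(cache_dir, "requests.log"), "a") as log:
        log.write(f"{stamp}\t{client_addr}\t{url}\n")
    return name
```

The point, as the post says, is less the archive itself than the announcement that it exists and that a human browses it.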
The different ways to circumvent this: (Score:2)
Detecting Skin tones (Score:4)
The professor suggested that we start with skin tones. He pointed us to research that tried to pick out the parts of the spectrum considered "skin tone". There were some simple algorithms that were suggested. We did this and it worked decently well, but there were a lot of things that looked like skin to it. Especially light colored woodwork.
An algorithm like this may be able to filter a lot of stuff off the web. But it will filter a lot of other stuff too. I can also think of 100 ways to fool it. The easiest is to put images through a color filter before posting them, or to post them in black and white. Other people have pointed out that it will filter portraits and other shots of humans that aren't porn.
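A per-pixel rule like the one the post describes can be sketched in a few lines. The thresholds below are one of several published RGB rules of thumb, not the professor's actual algorithm, and the woodwork example shows exactly the false positive mentioned above:

```python
def is_skin(r, g, b):
    """Crude RGB skin-tone test: reddish pixels where red
    moderately dominates green and blue."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (r - min(g, b)) > 15)

def skin_fraction(pixels):
    """Fraction of (r, g, b) pixels classified as skin."""
    if not pixels:
        return 0.0
    hits = sum(1 for p in pixels if is_skin(*p))
    return hits / len(pixels)

# Light-colored woodwork easily passes the same test -- the false
# positive described above:
oak = [(190, 150, 100)] * 100
print(skin_fraction(oak))  # 1.0 under this rule
```

And both evasions from the post work against it: a blue color shift pushes pixels out of the accepted range, and grayscale (r == g == b) fails the r > g test outright.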
Re:New internet and unaccountable businesses (Score:2)
Ah, so you too were paying attention when we heard that in Freshman Immigration Course two semesters ago.
Re:What to fight (Score:3)
The only thing that makes _any_ sense to open up to give confidence that the blocking is fair is the blocking list itself. Now, if we're blocking access for young kids exclusively, that's fine. Once you include teenage boys in the equation, though -- as the demographic where blocking software and computer skills are most likely to cross over -- it becomes a liability: the second the kid finds out how to bypass the blocking software (which _will_ happen eventually), they have a list of sites their parents don't want them to use sitting right in front of them...
Like I said, it's tricky. Blockers want to know that the blocks are sensible, blockees want to know where all the dodgy material is - and one leads straight to the other. Quite how we fix it I really don't know, but it's far too simplistic to simply say that opening everything solves all the problems. It doesn't, and it creates two new ones.
What can I say? (Score:2)
--