Tech Sites Including Microsoft's Bing Criticized Over Child Pornography Policies (cnet.com) 73
"Microsoft's Bing search engine reportedly still served up child porn, nearly a year after the tech giant said it was addressing the issue," reports CNET:
The news comes as part of a Saturday report in The New York Times that looks at what the newspaper says is a failure by tech companies to adequately address child pornography on their platforms.... [A] former Microsoft executive told the Times that it now looks as if the company is failing to use its own tools. The Times' Saturday report notes that 10 years ago, Microsoft helped create software called PhotoDNA that "can use computers to recognize photos, even altered ones, and compare them against databases of known illegal images." But, the Times said, Bing and other search engines that use Bing's results are serving up imagery that doesn't pass muster with PhotoDNA....
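PhotoDNA's actual algorithm is proprietary, but the general idea of recognizing "even altered" photos is perceptual hashing. Here's a toy average-hash sketch in Python as an illustration only (my own stand-in, not Microsoft's code): downscale, threshold at the mean brightness, and treat hashes within a few differing bits as the same image.

```python
# Toy perceptual hash: a hypothetical stand-in for PhotoDNA (which is proprietary).
# Downscale a grayscale image to 8x8, threshold at the mean, pack into a 64-bit int.
# Small edits flip only a few bits, so matching uses Hamming distance, not equality.

def average_hash(pixels, size=8):
    """pixels: 2D list of grayscale values (0-255). Returns a 64-bit int hash."""
    h, w = len(pixels), len(pixels[0])
    # Downscale by averaging rectangular blocks.
    small = []
    for r in range(size):
        row = []
        for c in range(size):
            block = [pixels[r * h // size + dr][c * w // size + dc]
                     for dr in range(h // size) for dc in range(w // size)]
            row.append(sum(block) / len(block))
        small.append(row)
    mean = sum(sum(r) for r in small) / (size * size)
    bits = 0
    for row in small:
        for v in row:
            bits = (bits << 1) | (1 if v >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(h1, h2, threshold=5):
    """Hashes within `threshold` differing bits count as the same image."""
    return hamming(h1, h2) <= threshold
```

A slightly edited copy of an image still matches, while an unrelated image does not; real systems like PhotoDNA are far more robust to cropping and recompression than this sketch.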
The Bing news is part of a larger story from the Times about how various tech companies are dealing with child porn on their platforms.
The Times criticizes a tech industry which they say is looking the other way: Amazon, whose cloud storage services handle millions of uploads and downloads every second, does not even look for the imagery. Apple does not scan its cloud storage, according to federal authorities, and encrypts its messaging app, making detection virtually impossible. Dropbox, Google and Microsoft's consumer products scan for illegal images, but only when someone shares them, not when they are uploaded. And other companies, including Snapchat and Yahoo, look for photos but not videos, even though illicit video content has been exploding for years. (When asked about its video scanning, a Dropbox spokeswoman in July said it was not a "top priority." On Thursday, the company said it had begun scanning some videos last month.)
The largest social network in the world, Facebook, thoroughly scans its platforms, accounting for over 90 percent of the imagery flagged by tech companies last year, but the company is not using all available databases to detect the material. And Facebook has announced that the main source of the imagery, Facebook Messenger, will eventually be encrypted, vastly limiting detection.
Wait for it... (Score:1)
Re: (Score:2)
"I have never run into CP (or teen porn) in all the years I've been alive. Where does this stuff actually exist? This is a serious question. Is there some secret tor-based P2P software people use, or are we literally talking about people trading DVDs of stuff they made? "
We can't tell you, because you're a cop.
Re: Wait for it... (Score:2)
Re: (Score:2)
As the co-webmaster of my high school's website in 1999, we started to see people saying that any picture of a person under 21 should be considered "porn", and we disagreed. Our kind of school website is no longer accessible to Google and Bing. I know there are schools that still publish student pictures, but they stay within the school community... childless people are locked out.
Part of this is that the "we call it porn!" crowd wants to shut down Nickelodeon and Disney because that gi
Re: (Score:2)
There's a "dark web" that Google and Bing can't find... such as sites reachable only by IP address. Remember, anything goes at The Pirate Bay whenever that whack-a-mole site pops back up.
Re: (Score:2)
This article is claiming that Bing can find it.
Re: (Score:2)
A guy from my high school showed a friend some child porn images on his computer. The friend was disgusted and told the FBI. When they busted him, they supposedly found all sorts of still images and videos, including DVDs.
Many people who start viewing this type of material are "groomed" by others in a similar way.
Re: (Score:2)
How? Ya know, sexuality is kinda set, it's not something you can really be "groomed" for. If you don't like fucking other guys in the butt, it's not really something you can acquire a taste for.
Re: (Score:1)
Re: (Score:2)
If you think of the children all the time, you're probably a pedo.
They all highlight the reason (Score:3)
Encryption. If only they didn't use encryption. Apple encrypts everything, designed for evil child pornographers of course. How much pornography is 'really' out there, what percentage of total images? Is it worth violating everyone's privacy?
Re: (Score:1)
Re: (Score:1)
Child porn is the wedge that is intended to be used to force a capability for surveillance to see everything.
Re: They all highlight the reason (Score:2)
Re: (Score:2)
Why, does "terrorism" not work anymore?
Because even earlier, nobody really gave a fuck anymore about organized crime and drug dealers, the other two [wikipedia.org] boogeymen in the fold.
Please don't make AI detect this (Score:2)
Please don't force tech companies to make AI that can detect child porn. That could only be used for evil. Someone will go "Hey, let's use a GAN [wikipedia.org] for this task!"
Microsoft already made the tool, they don't use it (Score:2)
Microsoft already made the tool. They just aren't using it for search.
Sometimes when people make unreasonable proposals they bring up child porn as an argument. There are also reasonable things that platforms can do. Let's do the reasonable things and not do the unreasonable things.
The summary mentions Facebook. For their own purposes, Facebook developed and uses software that identifies what is in images, including whose face it is. Facebook already knows for certain if the image is in the database of k
Re: Microsoft already made the tool, they don't us (Score:2)
How could they know it's child porn that isn't in the database?
Re: Microsoft already made the tool, they don't u (Score:1)
Re: (Score:2)
Except - there isn't actually a database of CP pictures like that. That would be illegal. It also wouldn't be very useful, since exact matching breaks as soon as an image is altered. What exists is a database of image signatures that pictures can be matched against even with small changes.
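A quick demonstration of why exact matching is useless here: a cryptographic hash like SHA-256 changes completely when a single byte of the file changes, so a re-saved or trivially edited copy would never match an exact-hash database (the byte strings below are stand-ins for image data):

```python
# Why exact matching fails: change one byte and a cryptographic hash is
# completely different, so exact-match databases miss altered copies.
import hashlib

original = b"\x00" * 1024            # stand-in for the original image bytes
altered = b"\x01" + b"\x00" * 1023   # same "image", one byte changed

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(altered).hexdigest()
assert h1 != h2  # the two digests share essentially nothing
```

That's why signature schemes use perceptual fingerprints with a distance tolerance instead of exact file hashes.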
Re: Microsoft already made the tool, they don't u (Score:1)
Re: (Score:3)
Who was the famous musician who claimed he was buying child porn in order to stop child porn?
IIRC, it was Pete Townshend (of The Who)
Re: (Score:2)
Throwing the pictures away doesn't sound very smart, because when you come up with a better signature method, you'd have to go out and take all the pictures again.
Re: (Score:2)
Retaining a catalogue of the pictures is something the police can do, once, and share the signatures with others.
That way the images do not get disseminated and don't get stored on corporate servers.
Of course, you are at this point having to trust that the police are giving you the signatures of only child pornography, but that's a different conversation.
Re: (Score:2)
That might be a good way to detect nudity, but nudity is not necessarily pornography. If I'm not mistaken, the child porn laws in the US are not quite insane enough to criminalize photos of kids in the bath tub for example. I've never heard of an algorithm that can determine whether an image is sexual in nature.
Three ways immediately come to mind (Score:2)
People who do this kind of thing for a living have probably come up with more clever ways than my first thought. My initial thought is that if a group of people are exchanging make pictures of kids, sending them around the group, and a few of the pics are marked/known child porn, the rest probably are too.
In addition there may be a textual message included with the pic. Those messages would tend to use certain words, "lolita" or whatever code words the sickos use.
You can then apply basically the Page Rank
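For anyone curious about the shape of that idea, here's a hypothetical guilt-by-association sketch in Python. The graph, damping factor, and scoring scheme are all my own illustrative assumptions, not anyone's real system: images co-shared with known-flagged images inherit suspicion, PageRank-style.

```python
# Hypothetical sketch: propagate suspicion scores over an image-sharing graph.
# Images repeatedly co-shared with known-flagged images end up with high scores
# and can be queued for human review. Purely illustrative.

def propagate(shares, flagged, rounds=20, damping=0.85):
    """shares: dict mapping each image to the set of images co-shared with it.
    flagged: images already matched against a known database.
    Returns a dict of suspicion scores."""
    score = {img: (1.0 if img in flagged else 0.0) for img in shares}
    for _ in range(rounds):
        nxt = {}
        for img, neighbors in shares.items():
            # Pull score from neighbors, split across each neighbor's links.
            pulled = sum(score[n] / len(shares[n]) for n in neighbors)
            base = 1.0 if img in flagged else 0.0
            nxt[img] = (1 - damping) * base + damping * pulled
        score = nxt
    return score
```

An unflagged image that circulates in the same group as flagged ones scores high, while images with no connection to flagged material score zero.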
*naked. I should use preview to avoid typos (Score:2)
I had a typo. It should be:
My initial thought is that if a group of people are exchanging naked pictures of kids, sending several naked kid pics around the group, and a few of the pics are marked/known child porn, the rest probably are too.
Stop making technology solve social problems (Score:5, Insightful)
STOP MAKING TECH COMPANIES RESPONSIBLE FOR THE WORLD'S SOCIAL PROBLEMS!
It is not the responsibility of Amazon, Microsoft, Google, or Facebook to make automated systems that patrol all human activity for bad behavior. I would rather throw the writer in jail for searching for the inappropriate material, than the developers who aren't searching for it. That is not their job! If this is anyone's job, it is law enforcement, not tech companies.
And as a reminder: In the US, even cartoon drawing can be considered child porn [wikipedia.org]. Good luck making an algorithm that can read a Japanese Manga and determine if it qualifies as child porn in a US courtroom. The requirement is that it be "found to be obscene or lacking in serious value." I don't want those companies deciding for me what has "serious value."
Re: (Score:2)
I would rather throw the writer in jail for searching for the inappropriate material,
You would rather imprison the journalist? You must have got up early to press your brown shirt. Aren't you late for a Bund meeting?
Re:Stop making technology solve social problems (Score:5, Insightful)
Being a journalist should not be a "Get out of jail free" card. While yes, there should be some allowances for borderline criminal (but non-harmful) stuff in order to expose actual criminal activity, we are being told again and again that every single search, every photo, every CARTOON DRAWING OF A MINOR is harmful to all children in the entire world. It's one or the other, you can't have both.
Re: (Score:1)
we are being told again and again that every single search, every photo, every CARTOON DRAWING OF A MINOR is harmful to all children in the entire world.
Except you're mischaracterizing the argument, which is that some search engines are making it easy for the producers (who harm children) and the consumers (who fund them, or reward them in other ways) to get together.
It's one or the other, you can't have both.
Yeah, that's what I implied. You can't both support freedom and support jailing journalists for blowing the whistle.
Re: (Score:2)
which is that some search engines are making it easy for the producers (who harm children) and the consumers (who fund them, or reward them in other ways) to get together.
Is that the game you want to play? How about:
* Some color laser printers are making it easy for the producers (who harm children) and the consumers (who fund them)
* Some coal producers are making it easy for power producers (who release CO2) and the consumers (who fund them)
* Some knife manufacturers are making it easy for murderers to kill people
* Guns, etc.
The journalist is not blowing a whistle. They are pointing out the obvious, which is that search engines can be used to search for porn, chil
Re: (Score:1)
IT IS NOT THE SEARCH ENGINE'S RESPONSIBILITY!
It's everyone's responsibility.
It was not Xerox's responsibility when they invented the printer, nor Samuel Colt's responsibility when he invented the gun.
Samuel Colt no more invented the gun (or even the revolver) than Henry Ford invented the car.
Re: (Score:2)
I see the kiddie porn lovers have modpoints...
Re: (Score:2)
Re: (Score:2)
Because the journalist is now in possession of child porn!
Re: (Score:2)
Because the journalist is now in possession of child porn!
That would be a reason to jail the journalist, but if you read the Times article they were very careful to avoid ever seeing or storing any of the data. They created a headless browser that downloaded the images and piped them through an algorithm to match them against a database of known child porn images without ever displaying or storing the data.
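For those wondering what that pipeline might look like, here is a minimal in-memory sketch. The fetch hook, the hash choice, and the blocklist contents are my own assumptions for illustration; the Times didn't publish its code, and the real matcher would use perceptual signatures rather than exact hashes:

```python
# Sketch of a "never see, never store" checking pipeline: stream each image
# through a matcher in memory, compare its fingerprint to a blocklist, and keep
# only the boolean result. The bytes are never written to disk or rendered.
import hashlib

KNOWN_HASHES = {  # stand-in blocklist; real systems use perceptual signatures
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def check_url(fetch, url):
    """fetch: callable url -> bytes (e.g. a headless browser's network hook)."""
    data = fetch(url)                       # held only in memory
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_HASHES           # match/no-match is all that's kept

# Hypothetical stand-in for the network, so the sketch runs offline.
fake_web = {"http://example.test/a": b"known-bad-image-bytes",
            "http://example.test/b": b"harmless-cat-photo"}
```

The design point is that nothing downstream of `fetch` ever persists or displays the data, only the match result.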
Re: (Score:2)
Re: (Score:2)
Sounds pretty crap to me, but this is the first I recall hearing of it. There's so much noise in the news that it's hard to keep track of what's going on.
Re: (Score:2)
Kiddie porn is a business. People charge to video themselves shagging the kids. Anyone who knowingly profits from this business is going to have a bad time in terms of public relations and will have law enforcement issues. Microsoft has the tools to stop its services being used for kiddie porn but is choosing not to use them, because it's the thin end of a very thick anti-privacy wedge.
The Times report will make that position much harder to sustain.
Re: (Score:2)
If this discussion was going to stop at kiddie porn, that would be fine. But it has already moved on to political fact checking, filtering out references to Hong Kong, Tiananmen Square, Nazi symbols, the right to be forgotten...
Anyone who knowingly profits from this business is going to have a bad time in terms of public relations
No, this is big businesses being singled-out for being big. I don't see an article telling Akamai not to cache it, or Google Chrome to filter it, or Network Solutions to refuse the DNS lookup, or CenturyLink to filter the packets. There's plenty of companies that made money off that sear
Re: (Score:2)
Re: (Score:2)
This isn't about AI making decisions. It's about using known images to identify their spread; the same or similar images both count here. And it would flag content as inappropriate for human review. Could we do what you're talking about? Sure, but this isn't that.
Care to try again and make a lucid point?
Re: (Score:3)
The key here is what we can do to minimize the abuse of children. I think it is certain that children have been sexually assaulted for a very long time. The catholic priest did not just start molesting boys when Google started searching the internet, or when the first BBS started hosting pictures. The boycott
Re: (Score:2)
It is not the responsibility of Amazon, Microsoft, Google, or Facebook to make automated systems that patrol all human activity for bad behavior.
Actually it is very much the responsibility of companies to obey the law, whether that law be not selling products that kill people, not mishandling personal data, or not sharing child pornography.
Anti-encryption propaganda (Score:5, Insightful)
They are now using children to argue against encryption. First it was "the terrorists", now it's "child abusers".
The same catch phrases are repeated again and again until the public relents and another freedom is eroded:
- Won't anyone think of the children?
- If you have nothing to hide...
- If only we could unlock the phone of that known terrorist/arsonist/gang member..
I would take my freedom over securing myself against a 0.00001% chance of random harm.
So ... (Score:2)
Bing and Co didn't check all the pictures and videos on all the internets before allowing it into their 'phonebook'?
Call me shocked.
Re: (Score:2)
'All the pictures on all the Internets' assumes that the sites they are crawling can't serve up custom content. Here's the generic porn for the boomers, here's some MLP for the zoomers and Barney the Dinosaur for the robots.
Re: (Score:2)
They are making thumbnails, which means they are downloading the image, and can use PhotoDNA to decide if it should be blocked. The tool exists and they have the data. Do you understand the issue now?
Govt wants access to everything (Score:4, Interesting)
Already in windows the boundary between local and cloud files has been blurred. I discovered that without my asking everything on my desktop had been put in the cloud for "my convenience". Now they want companies to deep scan data on the cloud.
I think we will just have to wait for people to learn how bad it is for the government to know everything about everyone. Then maybe future civilizations won't make the same mistake
I actually don't have anything to hide, but other people have things that I would like them to be able to keep hidden.
Re: (Score:1)
The answer is to not get any sort of cloud account for storage.
Local storage is shockingly inexpensive these days.
Re: (Score:2)
Yes, but if you use Windows and Outlook (which are actually quite user friendly) it gets set up for free, and sometimes folders get put in the cloud without your knowing it.
Re: (Score:2)
No they do not get put "in the cloud" without you knowing it.
Re: (Score:2)
I was surprised to discover that my desktop was in the cloud. Next person I asked who used W10 didn't know that either
So *I* was surprised, but maybe most people know that?
Re: (Score:2)
"Already in windows the boundary between local and cloud files has been blurred."
No it has not.
Re: (Score:2)
maybe future civilizations won't make the same mistake
History shows that we don't learn from history.
Not their problem (Score:2)
They are not in the law enforcement business, and they have no obligation, nor any business, scanning people's files just because.
Imagine if you are a renter. Should your landlord be permitted to come into the house you are renting and search for illegal shit, just because they own the property and hey, you MIGHT have some illegal shit there, even though they don't specifically suspect you of anything? No fucking way!
It is no different here. This idea that companies should
Re: (Score:2)
They are deciding whether it goes in the search results or not. The search is already happening, it's a search engine. That's what it does.
Re: (Score:2)
Imagine if you are a renter. Should your landlord be permitted to come into the house you are renting and search for illegal shit, just because they own the property and hey, you MIGHT have some illegal shit there, and even though they do not specifically suspect them of anything, it is a good idea to check anyway? No fucking away!
A better question is if you are a renter, should law enforcement be permitted to conduct a warrantless search?
https://www.forbes.com/sites/i... [forbes.com]
spam (Score:1)
Still waiting for MS to deliver on their promise of eliminating Spam.
ai toolkit for child abuse detection (Score:1)