Face Search Engine PimEyes Blocks Searches of Children's Faces (nytimes.com)
PimEyes, a search engine that relies on facial recognition to help people scan billions of images to find photos of themselves on the internet, announced that it has banned searches of minors as part of the company's "no harm policy." The New York Times reports: PimEyes, a subscription-based service that uses facial recognition technology to find online photos of a person, has a database of nearly three billion faces and enables about 118,000 searches per day, according to [PimEyes CEO Giorgi Gobronidze]. The service is advertised as a way for people to search for their own face to find any unknown photos on the internet, but there are no technical measures in place to ensure that users are searching only for themselves. Parents have used PimEyes to find photos of their children on the internet that they had not known about. But the service could also be used nefariously by a stranger. It had previously banned more than 200 accounts for inappropriate searches of children's faces, Mr. Gobronidze said.
"Images of children might be used by the individuals with twisted moral compass and values, such as pedophiles, child predators," Mr. Gobronidze said. PimEyes will still allow searches of minors' faces by human rights organizations that work on children's rights issues, he added. Mr. Gobronidze said that blocking searches of children's faces had been on "the road map" since he acquired the site in 2021, but the protection was fully deployed only this month after the publication of a New York Times article on A.I.-based threats to children. Still, the block isn't airtight. PimEyes is using age detection A.I. to identify photos of minors. Mr. Gobronidze said that it worked well for children under the age of 14 but that it had "accuracy issues" with teenagers.
It also may be unable to identify children as such if they're not photographed from a certain angle. To test the blocking system, The Times uploaded a photo of Mary-Kate and Ashley Olsen from their days as child stars to PimEyes. It blocked the search for the twin who was looking straight at the camera, but the search went through for the other, who was photographed in profile. The search turned up dozens of other photos of the twin as a child, with links to where they appeared online. Mr. Gobronidze said PimEyes was still perfecting its detection system.
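The gating described above (estimate the age of the query face, block the search if the subject appears to be a minor) can be sketched as a pre-search filter. This is a hypothetical illustration only: PimEyes has not published its implementation, `estimate_age` stands in for whatever age-estimation model is used, and the threshold and margin values are invented for the example.

```python
# Hypothetical pre-search gate: block queries whose subject appears to be a minor.
# estimate_age() is a placeholder for a real age-estimation model; the constants
# are illustrative, not PimEyes's actual values.

ADULT_AGE = 18
SAFETY_MARGIN = 4  # widen the blocked band, since teen estimates are noisy


def estimate_age(face_image) -> float:
    """Placeholder for an age-estimation model (e.g. a CNN regressor)."""
    raise NotImplementedError


def allow_search(face_image, estimator=estimate_age) -> bool:
    """Permit the search only if the estimated age clears the adult
    threshold plus a safety margin, erring toward blocking borderline
    teens rather than letting them through."""
    return estimator(face_image) >= ADULT_AGE + SAFETY_MARGIN
```

Note that a gate like this is only as good as the estimator behind it: a profile shot that confuses the model (as in the Olsen-twins test above) slips straight through.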
Search is blocked, what about indexing? (Score:1)
Re: (Score:2)
What about women? Are they not also vulnerable and deserving of protection?
Ban searches for females! The internet is full of stalkers and rapists.
Actually, not just the internet. We should cover their faces whenever in public.
"no harm policy" should mean no harm to anyone. Better shut down the whole friggin internet.
Hmmm (Score:4, Insightful)
Nuke it from orbit? (Re:Hmmm) (Score:2)
"no harm" would be to delete their database and burn the building down
Should we nuke the site from orbit just to be sure?
If I understand correctly, PimEyes bought a database of photos that lacked any metadata indicating whether the subject of the photo was an adult or a child. Now, out of concern for protecting minors from abuse and exploitation, they are attempting to train an algorithm to discern the age of a photo's subject from appearance alone. That's an impossible task, given that there is no immediate, drastic change in appearance upon reaching 18 years of age.
Re: (Score:2)
I've initiated a GDPR complaint against them. They did not obtain opt-in permission to process my personal data in this manner, and have no reasonable "legitimate interest" argument.
How do they know the photo is of a child? (Score:2)
I watched a YouTube video lately where the topic of discussion was a supposed "loophole" in the trade of child pron. They were quite upset about some website that produced videos of adults with a youthful appearance doing things that children should not be doing. I didn't know what to think, because at no time was any real and actual minor put at risk of abuse or exploitation. As repulsive as such videos may be to the general public, there were no children shown at any time. For a variety of reasons I ma
Re:How do they know the photo is of a child? (Score:4, Insightful)
A legal gray area? No. They are doing something absolutely legal and not at all immoral. The moral panic has to stop. If we want to actually protect children the moral panic must stop. AI generated CP is already ruining the market for purchased CP. It's destroying the credibility of abuse merchants and trust between would-be buyer and seller. The financial value of actual abuse is falling. The only victim of young looking adults and the like is power-hungry politicians whose excuse is disappearing.
Re: How do they know the photo is of a child? (Score:3)
The illegal act required for production is referred to as Child Sexual Exploitation by law enforcement because it involves real, living children being sexually assaulted, with Child Sexual Abuse Material
Re: (Score:2)
AI generated CP is already ruining the market for purchased CP.
Citation? I mean this is the kind of thing I want to be true, but do you have anything other than hope and dreams to back this up?
Re: (Score:2)
It's not the paedophiles I'm worried about, it's the fact that most of the women I see in AI generated images have the face of a 12 year old and the body of Jessica Rabbit. Do people actually find that attractive? Is a 12 year old with huge boobs and a Barbie doll figure an appropriate fantasy that definitely won't create problematic connections in the minds of the guys jacking off to it?
It's not great for women and girls either. Airbrushing and photoshopping magazines was bad enough, but at least those mod
Re: (Score:1)
True, it's not a grey area: Young looking adults doing porn is ILLEGAL. This is despite the fact there are no children involved.
It has already been settled in court with at least one person going to prison for porn with only adult participants. The young looking actress even went to bat for the guy in court and it didn't work.
Re: (Score:2)
That's true in Australia, but I haven't heard that it was illegal elsewhere.
I have to assume that this was because the lawmakers didn't want to accidentally watch two adults having sex.
Re: (Score:2)
Seriously (Score:2)
What's their criteria? (Score:2)
Did they try it with a grownup Peter Dinklage?
Only children? (Score:3)
Re: (Score:2)
Lots of money using it on adults. Not so much kids. They need to look like a good corporate citizen and demonstrate efforts to self-regulate before they're a household name.
What exactly are they protecting? (Score:2)
Re: (Score:2)
One potentially valid concern is that people could use the tool to target children they've seen in their community. Take pictures of desired victims, run them all through the search engine, and see which are easy targets based on the information returned.
Of course, that same argument applies to adults, we just don't care about them.
Re: (Score:3)
When businesses do things there is often a confluence of reasoning.
They can head off potential bad press with good press and if someone ends up murdered they can use this as proof that they’re very concerned about how their tool is used and offer this up as evidence.
They know full well they’re building a tool that will definitely be used for nefarious purposes but there is no profitable use case for children. It could be used for finding missing children but the money in that has gotta be prett
The key question that needs answering (Score:2)
How do you appear as a child for that algo?
Re: (Score:2)
They're probably training it on pics of kids. This result = bad. I'm sure that looking young will be enough to get excluded.
Less moral outrage, please (Score:2)
I can understand a stalker using this to find a child, but how, exactly, will a pedophile use an image that doesn't contain nudity?
It was only a year or two ago that some newspaper revealed that YouTube videos of children were being recommended to pedophiles. Nobody demanded children be deleted from YouTube.
I realize the author was testing the efficacy of the service, but does blocking a photo taken 20 years ago save anybody?
The screeching hordes of pearl clutchers (Score:2)
Pimeyes pretending. (Score:2)
Trying to frame their product in the most innocent light possible when they know full well that they’re training it mostly on people using it for stalking and doxing so they can sell it off to law enforcement, intelligence, and oppressive government regimes once it gets good.