Face Search Engine PimEyes Blocks Searches of Children's Faces (nytimes.com)

PimEyes, a search engine that relies on facial recognition to help people scan billions of images to find photos of themselves on the internet, announced that it has banned searches of minors as part of the company's "no harm policy." The New York Times reports: PimEyes, a subscription-based service that uses facial recognition technology to find online photos of a person, has a database of nearly three billion faces and enables about 118,000 searches per day, according to [PimEyes CEO Giorgi Gobronidze]. The service is advertised as a way for people to search for their own face to find any unknown photos on the internet, but there are no technical measures in place to ensure that users are searching only for themselves. Parents have used PimEyes to find photos of their children on the internet that they had not known about. But the service could also be used nefariously by a stranger. It had previously banned more than 200 accounts for inappropriate searches of children's faces, Mr. Gobronidze said.

"Images of children might be used by the individuals with twisted moral compass and values, such as pedophiles, child predators," Mr. Gobronidze said. PimEyes will still allow searches of minors' faces by human rights organizations that work on children's rights issues, he added. Mr. Gobronidze said that blocking searches of children's faces had been on "the road map" since he acquired the site in 2021, but the protection was fully deployed only this month after the publication of a New York Times article on A.I.-based threats to children. Still, the block isn't airtight. PimEyes is using age detection A.I. to identify photos of minors. Mr. Gobronidze said that it worked well for children under the age of 14 but that it had "accuracy issues" with teenagers.

It also may be unable to identify children as such if they're not photographed from a certain angle. To test the blocking system, The Times uploaded a photo of Mary-Kate and Ashley Olsen from their days as child stars to PimEyes. It blocked the search for the twin who was looking straight at the camera, but the search went through for the other, who is photographed in profile. The search turned up dozens of other photos of the twin as a child, with links to where they appeared online. Mr. Gobronidze said PimEyes was still perfecting its detection system.
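As described, the block amounts to running an age estimator over the query photo and refusing the search when the subject appears to be a minor, with a carve-out for authorized children's-rights organizations and a blind spot when no usable frontal face is detected. A minimal sketch of that gating logic (all names, thresholds, and the decision policy are hypothetical illustrations, not PimEyes' actual implementation):

```python
def should_block_search(predicted_age, confidence, *, is_authorized_org=False,
                        min_adult_age=18, confidence_floor=0.6):
    """Decide whether to block a face search based on an age estimate.

    predicted_age of None means no usable (frontal) face was detected,
    so the gate is skipped entirely -- the same loophole the Times hit
    with the profile shot of one Olsen twin.
    """
    if is_authorized_org:
        # Carve-out: e.g. human rights organizations working on children's issues.
        return False
    if predicted_age is None:
        # Detector found no usable face; the age check never runs.
        return False
    if confidence < confidence_floor:
        # Unreliable estimate (the article notes "accuracy issues" with
        # teenagers): err on the side of blocking.
        return True
    return predicted_age < min_adult_age
```

A few illustrative calls: `should_block_search(10, 0.9)` blocks, `should_block_search(25, 0.95)` allows, and `should_block_search(None, 0.0)` allows, reproducing the profile-photo loophole described above.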

  • So they can still gather biometric data about kids, just not allow anyone to search it? Seems like these companies could do much more.
    • by quenda ( 644621 )

      What about women? Are they not also vulnerable and deserving of protection?
      Ban searches for females! The internet is full of stalkers and rapists.
      Actually, not just the internet. We should cover their faces whenever in public.
      "no harm policy" should mean no harm to anyone. Better shut down the whole friggin internet.

  • Hmmm (Score:4, Insightful)

    by paul_engr ( 6280294 ) on Wednesday October 25, 2023 @12:43AM (#63951571)
    "no harm" would be to delete their database and burn the building down
    • "no harm" would be to delete their database and burn the building down

      Should we nuke the site from orbit just to be sure?

      If I understand correctly, PimEyes bought a database of photos that lacked any kind of metadata indicating whether the subject of the photo was an adult or a child. Now, with concerns about protecting minors from abuse and exploitation, they are attempting to train an algorithm to discern the age of photo subjects based on appearance alone. That's an impossible task, given that there's no immediate and drastic change in appearance upon reaching 18 years of age.


    • by AmiMoJo ( 196126 )

      I've initiated a GDPR complaint against them. They did not obtain opt-in permission to process my personal data in this manner, and have no reasonable "legitimate interest" argument.

  • I watched a YouTube video lately where the topic of discussion was a supposed "loophole" in the trade of child pron. They were quite upset about some website that produced videos of adults with a youthful appearance doing things that children should not be doing. I didn't know what to think, because at no time was any real and actual minor put at risk of abuse or exploitation. As repulsive as such videos may be to the general public, there were no children shown at any time. For a variety of reasons I ma…

    • by Iamthecheese ( 1264298 ) on Wednesday October 25, 2023 @01:09AM (#63951599)
      >Adults exploiting their youthful appearance for profit from products and services that are in a legal grey area

      A legal gray area? No. They are doing something absolutely legal and not at all immoral. The moral panic has to stop. If we want to actually protect children the moral panic must stop. AI generated CP is already ruining the market for purchased CP. It's destroying the credibility of abuse merchants and trust between would-be buyer and seller. The financial value of actual abuse is falling. The only victim of young looking adults and the like is power-hungry politicians whose excuse is disappearing.
      • If I had mod points, I would mod this reply up. It hits the nail on the head. Just like the war on drugs, prohibition creates unethical black markets. If there exists a way to serve a market legitimately without causing any real-world harm and we prohibit it, we are actually causing real-world harm by doing so.

        The illegal act required for production is referred to as Child Sexual Exploitation by law enforcement because it involves real, living children being sexually assaulted, with Child Sexual Abuse Material…
      • AI generated CP is already ruining the market for purchased CP.

        Citation? I mean this is the kind of thing I want to be true, but do you have anything other than hope and dreams to back this up?

      • by AmiMoJo ( 196126 )

        It's not the paedophiles I'm worried about, it's the fact that most of the women I see in AI generated images have the face of a 12 year old and the body of Jessica Rabbit. Do people actually find that attractive? Is a 12 year old with huge boobs and a Barbie doll figure an appropriate fantasy that definitely won't create problematic connections in the minds of the guys jacking off to it?

        It's not great for women and girls either. Airbrushing and photoshopping magazines was bad enough, but at least those mod…

      • by Anonymous Coward

        True, it's not a grey area: Young looking adults doing porn is ILLEGAL. This is despite the fact there are no children involved.

        It has already been settled in court with at least one person going to prison for porn with only adult participants. The young looking actress even went to bat for the guy in court and it didn't work.

        • That's true in Australia, but I haven't heard that it was illegal elsewhere.

          I have to assume that this was because the lawmakers didn't want to accidentally watch two adults having sex.

    • If efforts against it were focused on rescuing the children from abuse and prosecuting the creators, rather than searching through people's hard drives for files you're not supposed to have (which always seems to end up including a broader range of things for some reason), then there would be no problem. Actually tackling the problem, rather than just doing something easier to pretend to tackle it, very rarely runs into problems determining whether something is a child or an adult. Or a figment of someone's imagination.
  • How is such a thing even legal, for children or for adults?
  • Did they try it with a grownup Peter Dinklage?

  • by LainTouko ( 926420 ) on Wednesday October 25, 2023 @04:12AM (#63951833)
    The idea that providing such a service is harmful to children without being harmful to adults is utterly incoherent.
    • Lots of money using it on adults. Not so much kids. They need to look like a good corporate citizen and demonstrate efforts to self-regulate before they're a household name.

  • Given that anyone can go to the mall and look at as many real-life children as they like in person, why are they concerned about someone looking at a picture of random children located a thousand miles away? Or is this just an induced moral panic to get their business in the news?
    • One potentially valid concern is that people could use the tool to target children they've seen in their community. Take pictures of desired victims, run them all through the search engine, and see which are easy targets based on the information returned.

      Of course, that same argument applies to adults, we just don't care about them.

      • When businesses do things there is often a confluence of reasoning.
        They can head off potential bad press with good press and if someone ends up murdered they can use this as proof that they’re very concerned about how their tool is used and offer this up as evidence.

        They know full well they’re building a tool that will definitely be used for nefarious purposes but there is no profitable use case for children. It could be used for finding missing children but the money in that has gotta be prett…

  • How do you appear as a child for that algo?

    • They're probably training it on pics of kids. This result = bad. I’m sure that looking young will be enough to get excluded.

  • ... such as pedophiles, child predators

    I can understand a stalker using this to find a child, but how exactly will a pedophile use an image that doesn't contain nudity?

    It was only a year or two ago that some newspaper revealed that YouTube was recommending videos of children to pedophiles. Nobody demanded children be deleted from YouTube.

    ... their days as child stars ...

    I realize the author was testing the efficacy of the service but does blocking a photo taken 20 years ago, save anybody?

  • I wish companies riding on the coat-tails of the hysterical lot's moral panic would stop. It only encourages them to screech more. This isn't about protecting the children. This is about trying to make their sleazy, illegally scraped database of faces, built from the ground up as stalkerware, look legitimate.
  • Trying to frame their product in the most innocent light possible when they know full well that they’re training it mostly on people using it for stalking and doxing so they can sell it off to law enforcement, intelligence, and oppressive government regimes once it gets good.
