
Legal Chatbot Firm DoNotPay Adds Anti-Facial Recognition Filters To Its Suite of Handy Tools (theverge.com)

Legal services startup DoNotPay is best known for its army of "robot lawyers" -- automated bots that tackle tedious online tasks like canceling TV subscriptions and requesting refunds from airlines. Now, the company has unveiled a new tool it says will help shield users' photos from reverse image searches and facial recognition AI. The Verge reports: It's called Photo Ninja and it's one of dozens of DoNotPay widgets that subscribers can access for $36 a year. Photo Ninja operates like any image filter. Upload a picture you want to shield, and the software adds a layer of pixel-level perturbations that are barely noticeable to humans, but dramatically alter the image in the eyes of roving machines. The end result, DoNotPay CEO Joshua Browder tells The Verge, is that any image shielded with Photo Ninja yields zero results when run through search tools like Google image search or TinEye.

The tool also fools popular facial recognition software from Microsoft and Amazon with a 99 percent success rate. This, combined with the anti-reverse-image search function, makes Photo Ninja handy in a range of scenarios. You might be uploading a selfie to social media, for example, or a dating app. Running the image through Photo Ninja first will prevent people from connecting this image to other information about you on the web. Browder is careful to stress, though, that Photo Ninja isn't guaranteed to beat every facial recognition tool out there.
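
DoNotPay hasn't published how Photo Ninja computes its "pixel-level perturbations," but the general idea can be illustrated with a toy sketch: nudge every pixel by an amount too small for people to notice, which already changes the file enough to break exact and near-duplicate matching. The snippet below (Pillow + NumPy, with placeholder file names) is only that illustration; it is not DoNotPay's algorithm, and purely random noise would not reliably fool Microsoft's or Amazon's recognizers. Published cloaking tools such as Fawkes instead optimize the perturbation against a face-embedding model.

# Toy illustration of a "pixel-level perturbation": add low-amplitude random
# noise that is barely visible to humans. This is NOT Photo Ninja's method.
import numpy as np
from PIL import Image

def perturb(src: str, dst: str, epsilon: int = 3) -> None:
    """Shift every channel of every pixel by at most +/- epsilon."""
    img = np.asarray(Image.open(src).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape)
    Image.fromarray(np.clip(img + noise, 0, 255).astype(np.uint8)).save(dst)

# perturb("selfie.jpg", "selfie_shielded.png")   # hypothetical file names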


Comments Filter:
  • duct tape over the camera lens?

  • seems like it would be helpful.

    Because I'm cheap I like to use things like exiftool to change the geotags to places like Antarctica or Somalia.

    Or just remove all the exif data completely (a quick sketch of both is at the end of this comment).

    You could probably do image manipulations like flips and add some copyright watermarking so if someone "steals" your image, you may be able to sue them for using that image. Who knows...
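
    A minimal sketch of the EXIF idea above, assuming Pillow is installed and that the file names are placeholders: re-saving just the pixel data drops the geotags along with everything else, and the command-line equivalent with exiftool is "exiftool -all= photo.jpg".

    # Strip all EXIF metadata (geotags included) by re-saving only the pixels.
    # Placeholder file names; command-line equivalent: exiftool -all= photo.jpg
    from PIL import Image

    def strip_exif(src: str, dst: str) -> None:
        with Image.open(src) as img:
            clean = Image.new(img.mode, img.size)
            clean.putdata(list(img.getdata()))   # pixels only, metadata left behind
            clean.save(dst)

    # strip_exif("photo.jpg", "photo_clean.jpg")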

    • None of your suggestions are remotely effective at avoiding facial recognition.

    • I don't get why people would pay for this feature. Open-source tools that do the same thing already exist, and there are plenty of ways to mess with facial recognition. I haven't found any details on how they manage it, but I bet they use technology similar to what's behind deepfakes: instead of replacing faces or audio, they obfuscate them somehow. I'm guessing they've targeted the most common facial recognition algorithms, some of which can be fooled by shifting colors in a way the human eye can't easily detect.

  • All I Could See (Score:4, Informative)

    by Barny ( 103770 ) on Tuesday April 27, 2021 @06:42PM (#61321876) Journal

    All I could see that it did was flip the image horizontally. I took a screencap of both the original and the "modified" version.

    Original: found by google image search
    Modified: not found by google image search
    Modified (flipped horizontally): found by google image search

    So the shtick of "pixel editing" means exactly jack and shit. They're just flipping the image horizontally.
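
    If that's right, the "protection" is trivial to undo. A rough way to reproduce the check with Pillow (placeholder file names): mirror the shielded copy and see whether it matches the original pixel for pixel.

    # Reproduce the test: un-mirror the "shielded" image and compare it to the
    # original. If the difference is empty, the tool did nothing but flip it.
    from PIL import Image, ImageOps, ImageChops

    original = Image.open("original.png").convert("RGB")            # placeholder
    shielded = Image.open("photo_ninja_output.png").convert("RGB")  # placeholder

    diff = ImageChops.difference(original, ImageOps.mirror(shielded))
    print("identical after un-mirroring" if diff.getbbox() is None
          else "still differs, so it's more than a flip")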

    • by crow ( 16139 )

      And this means if you do it to all your online photos, then any search on one will find all the others. That's not useful.

      • Re:All I Could See (Score:4, Informative)

        by Barny ( 103770 ) on Tuesday April 27, 2021 @07:00PM (#61321930) Journal

        And it will just prompt reverse image searchers to run a second search with the image flipped.

        Mine doesn't do that (yet), though I do perform a second search with any transparent areas painted with a white background, and a third with it painted black.

        I wonder if their "pixel editing" is just setting all "almost white" pixels to transparent, which is why it doesn't show up with a screenshot tool.
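
        For anyone who wants to automate those variants, a sketch with Pillow (placeholder file names): mirror the query image and flatten any transparency onto white and onto black before re-searching.

        # Build the extra reverse-search variants described above: a mirrored
        # copy, plus the image composited onto white and black backgrounds so
        # transparent pixels can't hide it from an exact or near match.
        from PIL import Image, ImageOps

        def search_variants(path: str) -> None:
            img = Image.open(path).convert("RGBA")
            ImageOps.mirror(img).convert("RGB").save("variant_mirrored.png")
            for name, color in (("white", (255, 255, 255)), ("black", (0, 0, 0))):
                bg = Image.new("RGB", img.size, color)
                bg.paste(img, mask=img.split()[3])   # alpha channel as paste mask
                bg.save(f"variant_on_{name}.png")

        # search_variants("query.png")   # placeholder file name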

    • by AmiMoJo ( 196126 )

      I was able to reproduce your results, but I think the test that The Verge did was flawed. The problem is that the original image is one that's appeared online hundreds of times already, so TinEye/Google doesn't need to do any facial recognition. All it needs to do is match that particular image to one of the hundreds of copies it has seen online and extract some text from the webpages they appear on.

      We need an image that isn't already online to test with.

  • by pz ( 113803 ) on Tuesday April 27, 2021 @08:29PM (#61322146) Journal

    Pity they chose that name, because there is excellent digital negative development software called Photo Ninja that already exists as a commercial product with a registered trademark: https://www.picturecode.com/ [picturecode.com] The package is a competitor to Adobe Lightroom, but with superior noise reduction. Actually, the noise reduction and color balancing tools in it are pretty amazing.

    But, in any case, I'd expect more from a legal firm than to use another company's already trademarked name.

  • by alexo ( 9335 ) on Tuesday April 27, 2021 @09:28PM (#61322312) Journal

    Using the "search by image" add-on, I clipped the actual image in the "modified" example and sent it to Yandex.
    The results: https://is.gd/5SJ8oy [is.gd]

  • This might make it a lot harder to spot scammers using pictures from porn sites on dating sites....
  • by tlhIngan ( 30335 ) <slashdot&worf,net> on Wednesday April 28, 2021 @05:11AM (#61323098)

    Because you know scammers are going to misuse and abuse this feature. They already steal images for profile photos, and one common trick is to reverse-image-search those photos to find out who the real people are, or to confirm it's a scam, since a fake profile's photo is either stolen or a stock image. Well, not anymore, I guess.

    It was fun while it lasted, I suppose.

  • Any frequently-used masking algorithm that fools bots but not people won't fool bots for very long.

    If the bot is "AI" then it will be re-trained on new data and "learn."

    If it's a conventional recognizer, it will be replaced with a newer one that sees through it.

    Now, here's where I do see "masking algorithms" being useful:

    If I have a picture that I want to upload for a single use, without any information that identifies me attached, I could put it through several different masks including one that might leav

  • Finally there's a new tool promising to make your pictures undetectable to facial recognition. Why are there fewer companies fighting to protect users' privacy than evil ones violating your rights?

"Being against torture ought to be sort of a multipartisan thing." -- Karl Lehenbauer, as amended by Jeff Daiell, a Libertarian

Working...