ChatGPT Models Are Surprisingly Good At Geoguessing (techcrunch.com)

An anonymous reader quotes a report from TechCrunch: There's a somewhat concerning new trend going viral: People are using ChatGPT to figure out the location shown in pictures. This week, OpenAI released its newest AI models, o3 and o4-mini, both of which can uniquely "reason" through uploaded images. In practice, the models can crop, rotate, and zoom in on photos -- even blurry and distorted ones -- to thoroughly analyze them. These image-analyzing capabilities, paired with the models' ability to search the web, make for a potent location-finding tool. Users on X quickly discovered that o3, in particular, is quite good at deducing cities, landmarks, and even restaurants and bars from subtle visual clues.
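
A hedged sketch of the experiment described here, using the OpenAI Python SDK's documented image-input format for chat completions; the model name, file name, and prompt are assumptions rather than details from the report:

    import base64
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Encode a local photo as a data URL (the file name is hypothetical)
    with open("street_photo.jpg", "rb") as f:
        b64 = base64.b64encode(f.read()).decode()

    resp = client.chat.completions.create(
        model="o3",  # assumption: the reasoning model discussed above
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Pretend you're playing GeoGuessr. Where was this photo taken?"},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }],
    )
    print(resp.choices[0].message.content)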

In many cases, the models don't appear to be drawing on "memories" of past ChatGPT conversations, or on EXIF data, the metadata attached to photos that reveals details such as where the photo was taken. X is filled with examples of users giving ChatGPT restaurant menus, neighborhood snaps, facades, and self-portraits, and instructing o3 to imagine it's playing "GeoGuessr," an online game that challenges players to guess locations from Google Street View images. It's an obvious potential privacy issue. There's nothing preventing a bad actor from screenshotting, say, a person's Instagram Story and using ChatGPT to try to doxx them.
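
For context, EXIF GPS tags are trivially machine-readable even without any AI. A minimal sketch, assuming a recent version of the Pillow library (the file name is hypothetical):

    from PIL import Image
    from PIL.ExifTags import GPSTAGS

    img = Image.open("photo.jpg")
    gps_ifd = img.getexif().get_ifd(0x8825)  # 0x8825 is the GPSInfo IFD

    if gps_ifd:
        gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_ifd.items()}

        # Coordinates are stored as (degrees, minutes, seconds) rationals
        def to_degrees(dms, ref):
            deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
            return -deg if ref in ("S", "W") else deg

        lat = to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"])
        lon = to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"])
        print(f"Photo was taken near {lat:.5f}, {lon:.5f}")
    else:
        print("No GPS metadata found")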


Comments Filter:
  • by RitchCraft ( 6454710 ) on Thursday April 17, 2025 @04:05PM (#65313337)

    Now I need to carry a green screen around with me.

  • "There's nothing preventing a bad actor from screenshotting, say, a person's Instagram Story and using ChatGPT to try to doxx them"

    This was already a thing before ChatGPT and other AI assistant tools. The only difference is that now there's a bit more automation behind it. Luckily, AI is still prone to hallucinations, so we're somewhat safe for now.

    • It's always been possible to reason about photos in surprising ways, AI or not.

      One thing that always makes me cringe is Ukrainian soldiers uploading photos of themselves or of military hardware to social media. While I'm sure most of them know to strip out EXIF data, there are always other clues.

      A big one is the position of the sun and the shadows it casts, which can be used to infer latitude. It's an old television trick, when setting up lighting for a shot, that the angle of light gives subtle location cues.
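
      A rough sketch of that shadow-to-latitude arithmetic, assuming the photo was taken near local solar noon in the northern hemisphere and that the date (hence the solar declination) is known; both are assumptions:

          import math

          def estimate_latitude(object_height_m, shadow_length_m, solar_declination_deg):
              # At solar noon with the sun due south (northern hemisphere),
              # elevation = 90 - latitude + declination, so solve for latitude
              elevation = math.degrees(math.atan2(object_height_m, shadow_length_m))
              return 90.0 - elevation + solar_declination_deg

          # A 1.8 m person casting a 2.4 m shadow at noon on an equinox (declination ~0)
          print(f"{estimate_latitude(1.8, 2.4, 0.0):.1f} degrees north")  # ~53.1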

  • by Roger W Moore ( 538166 ) on Thursday April 17, 2025 @04:31PM (#65313405) Journal
    This is hardly surprising: Google Maps has had a feature for at least a couple of years where, if you scan your phone camera around, it can pinpoint exactly where you are and which direction you are facing.
  • Recently I fed one of these things a cell phone picture. Way in the distance, unknown to me at the time, was the logo of the shopping center I was at. The thing located the city and state, and the shopping center, and somehow figured out roughly where I was within it. No, I didn't snap a photo of any store signs. Apparently in the distance was a trash can bearing the shopping center's logo, and there was a small map with a "you are here" indicator on it. It magnified the photo and, I assume, read those details.

  • by Local ID10T ( 790134 ) <ID10T.L.USER@gmail.com> on Thursday April 17, 2025 @05:00PM (#65313457) Homepage

    If you want your life to be private STOP SHARING IT with the world.

    If a person could look at a picture and say "hey! I recognize that place..." then a computer can do it faster and more reliably. It is not an invasion of privacy to recognize things. You shared it.

    What you share is no longer private. If you want privacy, keep it to yourself.

    You can live without the validation of other people liking your pics and posts, or you can accept that you have no privacy when you tell the world all your intimate details.

    • Sadly, to stay private you also have to stop anyone near you from taking pictures.

      I got rid of my Facebook profile a long time ago. However, extended family members still send each other photos on it, and I'm in some of them. They also have my name and phone number in their contact lists (maybe even address, birthday, and email too). All they have to say is "PleaseThink looks fat in this pic" and Facebook should be able to associate the contact details ripped from their phones with what I look like.

      Then my face shows up in Facebook's systems, tied to my name, whether I have an account or not.

    • by vux984 ( 928602 )

      "If a person could look at a picture and say "hey! I recognize that place..." then a computer can do it faster and more reliably."

      It's a common mistake to think that is the same thing. It's not.

      Doing something automatically, at massive scale, and storing that information changes the nature of the thing.

      If I saw you at the beach, that's fine, right? If I took a picture of my kids as you walked by in the background, no big deal, right? But if I followed you home, constantly taking pictures and storing them in a database, that would be something else entirely.

  • It's just automated, advanced reverse-image searching and OCR.
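
    For the OCR half, a toy illustration assuming the pytesseract bindings and a local Tesseract install ("storefront.jpg" is a hypothetical image):

      from PIL import Image
      import pytesseract

      # Extract any visible text, e.g. a shop name or street sign,
      # which could then be fed to an ordinary web search
      text = pytesseract.image_to_string(Image.open("storefront.jpg"))
      print(text)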

  • All the stupid "enhance image" moments in police TV dramas may actually become real.

    • All the stupid "enhance image" moments in police TV dramas may actually become real.

      And by police TV dramas you mean Red Dwarf [youtube.com].

  • I think we may be confusing privacy with anonymity. If you are in a shopping center or other public space, you have no expectation of "privacy".

    The internet is a public space. If you post a picture or anything else on the web, you have no expectation of privacy.

    But people have come to expect they can remain anonymous on the internet. And that is what AI threatens. It can take a variety of clues and attribute them to a specific location and ultimately to a specific person.

    There used to be a concept of "se

  • That's not too surprising, since those images probably were in the training set.

    Soon a specialized model will be able to name who is appearing in a random picture. Not just celebrities, even randos. Shit is going to hit the fan for everyone who ever had a nude picture leaked (or uploaded it themselves, maybe with a partially covered face).
