This week, OpenAI released its latest AI models, o3 and o4-mini, which can reason over user-uploaded photos. In practice, they crop, rotate, and zoom in on images, even blurry and distorted ones, and analyze them in detail. These advanced analysis capabilities make ChatGPT a powerful location-identification tool.

Image source: ilgmyzin / unsplash.com

As users of the social network X have discovered, the OpenAI o3 reasoning model performs well at identifying cities, landmarks, and even food establishments shown in photographs based on barely noticeable visual cues. In many cases, the AI manages without analyzing past conversations with the user and without geolocation data in the photo's metadata. Users show ChatGPT restaurant menus, pictures of their surroundings, building facades, and selfies, and invite it to play GeoGuessr, the game where you guess locations from Google Maps panoramas.

Image source: x.com/izyuuumi

There are obvious privacy concerns: a hypothetical attacker could take a screenshot of a user's Instagram Stories and work out where they are, enabling stalking, harassment, or other abuse. In fact, this ChatGPT capability existed even before the release of o3 and o4-mini, TechCrunch noted: its journalists tried the same trick with a version of the chatbot based on the GPT-4o model, which gave mostly the same results and responded faster. The more powerful o3 model answered correctly more often, but it sometimes failed as well.

Image source: x.com/swax

OpenAI didn't mention this use case in its safety reports for the o3 and o4-mini models. But it did provide a detailed comment to TechCrunch: “OpenAI o3 and o4-mini bring visual reasoning to ChatGPT, making it more useful in areas like accessibility, research, or location-based emergency response. We’ve worked to train our models to reject requests for private or sensitive information, added safeguards to prevent the model from identifying individuals in images, and we actively monitor and take action against abuse of our privacy policies.”
