Visual Intelligence was one of the most interesting features Apple introduced at the iPhone 16 presentation. It lets users point the phone's camera at the world around them and have artificial intelligence identify a dog's breed, copy event details from a poster, pull up a cafe's menu, or find something useful nearby.
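Apple has not published how Visual Intelligence works internally, and it is not exposed as an API, but the kind of on-device identification it performs resembles what the public Vision framework already offers developers. A minimal Swift sketch of that sort of on-device image classification (the function name is illustrative; this is not Apple's actual Visual Intelligence pipeline):

```swift
import Vision
import UIKit

// Classify the dominant objects in a photo using Apple's on-device
// Vision framework. Visual Intelligence itself is not available to
// third-party developers; this only illustrates the general technique.
func classify(image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNClassifyImageRequest { request, _ in
        guard let observations = request.results as? [VNClassificationObservation] else { return }
        // Print the top five labels with their confidence scores.
        for observation in observations.prefix(5) {
            print("\(observation.identifier): \(observation.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```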
The feature could also pave the way for more ambitious products: it is exactly the kind of capability Apple would need for AR glasses, for example. At the presentation, the company showed a diner learning more about a restaurant by pointing the iPhone camera at it. Smart glasses could do the same, with the user simply asking them a question instead of raising a phone.
Meta has already shown that AI-assisted glasses can be a useful tool for identifying objects. Apple could do something similar by tapping the data already on the iPhone, making Visual Intelligence even more convenient. The company already sells a camera-equipped headset, the Vision Pro, but most owners use it only at home, where they already know their surroundings well. According to unofficial reports, Apple is indeed developing augmented reality glasses, but insiders say the device is not planned for release until 2027, and even employees reportedly doubt that date.
Whenever the device does arrive, the software for it is already in development: Visual Intelligence may be the first step towards such a gadget. Apple has done this before, spending years building augmented reality technologies into the iPhone before releasing the Vision Pro, a bulky headset that could eventually evolve into compact glasses. Such devices may well become the next battlefield for the tech giants: Qualcomm, by its own admission, is already working on mixed reality glasses together with Samsung and Google. For now, though, Visual Intelligence remains an iPhone feature.