Apple’s new “Visual Intelligence” feature was one of the most impressive things shown off at its iPhone 16 event on Monday. The tool lets users scan the world around them with their iPhone’s camera to identify a dog’s breed, copy event details from a poster, or learn about just about anything in view.
It’s a useful feature that fits right in with the iPhone’s new camera button, but it could also be the foundation for something bigger in the future — exactly the kind of functionality Apple needs for future tech like AR glasses.
It’s not hard to imagine how Visual Intelligence could be useful on a device that understands everything it sees. Consider the idea of learning more about a restaurant, as Apple demonstrated with Visual Intelligence on the iPhone. Instead of taking your phone out of your pocket to look up information about a new place, you could simply look at the restaurant, ask a question, and the glasses would tell you more.
Meta has already proven that computer glasses work
Meta has already proven that computer glasses with an AI assistant can be a genuinely useful tool for identifying things, and it’s not hard to imagine Apple doing something similar with a very high level of fit and finish on its theoretical glasses. Apple would almost certainly be able to connect the glasses to the apps on your iPhone and your personal context, making Visual Intelligence even more useful.
Sure, Apple already has a camera-equipped headset, the Vision Pro, but most people don’t walk around wearing a headset outside the home, and you probably already know what’s in your house. It’s been reported before that Apple wants to develop true AR glasses, and that seems like the ultimate destination for this kind of technology.
The problem is, Apple-made AR glasses may still be a long way off. Bloomberg’s Mark Gurman reported in June that while the glasses Apple is developing are rumored to have a 2027 release date, “no one I’ve spoken to inside Apple believes the glasses will be ready within a few years.”
But whenever such glasses arrive, they’ll need software, and you can see Apple laying the groundwork here. Visual Intelligence could be Apple’s first step toward a killer app for computer glasses. Starting now could give Apple years to refine the feature before it makes it into the glasses.
It’s not unprecedented for Apple to take this approach. The company iterated on its AR tech for the iPhone for years before it launched the Vision Pro. And while the Vision Pro is certainly closer to a VR headset than an AR device, it’s clearly a first step toward what could become AR glasses. As Apple improves its hardware, it can also keep working on software features like Visual Intelligence on the iPhone, and when the time is right, it could pack all of its best ideas into a pair of glasses.