Earlier this week, during the WWDC keynote, Apple showed off its new iOS 26. For the first time since iOS 7 in 2013, Apple is revamping the operating system’s look and feel, introducing a very Windows Aero-esque design language called “Liquid Glass” (RIP, Windows Vista). Since that was the flashy new thing at the keynote, it’s been the week’s hot topic.
However, we also saw teasers of other new features that aren’t getting the same level of attention. Within the segment on iOS, for example, Billy Sorrentino showed off a new capability of Apple’s AI-powered Visual Intelligence, called, pretty simply, Image Search. The way it works is that you take a screenshot of anything on your iPhone’s screen. Once you have the screenshot, you can hit the Image Search button in the lower right. Using AI, Visual Intelligence will scan the screenshot and search for things it sees, or create calendar events from dates and times it finds in the image.