Google on Tuesday announced that Google Lens, a service capable of recognizing real-world landmarks and objects, is coming to the iPhone via an update to Google Photos. The announcement accompanies the feature's launch on all Android devices that run Google's photo service.
Originally announced at I/O last year, Lens uses Google's deep learning technology and neural networks to make sense of the world. Point Lens at a book, for example, and it will identify the title and offer options to purchase it.
Rolling out today, Android users can try Google Lens to do things like create a contact from a business card or get more info about a famous landmark. To start, make sure you have the latest version of the Google Photos app for Android: https://t.co/KCChxQG6Qm
Coming soon to iOS pic.twitter.com/FmX1ipvN62
— Google Photos (@googlephotos) March 5, 2018
Other Lens features include the ability to connect to a Wi-Fi network automatically by scanning the network name and password, along with improved OCR accuracy that makes it easy to digitize business-card information.
The Pixel and Pixel 2 still offer the deepest Lens integration, as it's available directly in Google Assistant. Similar integration will reportedly roll out to devices from manufacturers such as Samsung, Huawei, and LG.
Google didn’t say when Lens would be available for iOS, but with a wider rollout happening for Android, it’ll probably arrive very soon.