The idea behind Google Lens was forward-thinking: point at a picture and extract information from it. Its implementation, however, was not. Google is trying to change that with major updates that it promises will transform the experience.

Right out of the gate, instead of being confined to Google Photos, Google Lens is migrating to the native camera apps of smartphones everywhere, instantly making the feature far more useful. Having to open Google Photos to use it mostly served to make people forget the feature existed.

The manufacturers whose devices will support Google Lens natively include LG, Sony, Motorola, OnePlus, Xiaomi, Asus, Nokia and Google.

Google announced three major updates for Google Lens: smart text selection, style match, and real-time results. Smart text selection lets you copy text straight from an image; in the demo, a photographed menu was converted into plain text that could be copied. Highlighting individual words also brings up search results with useful information about the topic.

Style match looks up items similar in style to the one you are scanning, which is especially useful for furniture and clothing. Real-time results scans the environment around you and, using machine learning, surfaces results for each item the camera comes across. The impressive part is that these results aren't static: scanning a poster for an artist can bring up one of their music videos.

Google did not announce when the updates to Google Lens will begin to roll out.