Google on Wednesday unveiled Google Lens, a powerful new technology capable of recognizing objects in the real world. Want to know what kind of flower is growing in your garden? Just point Lens at it and it'll tell you what type of flower it is, which Google CEO Sundar Pichai joked will be helpful for people with allergies. And he's right.

Google Lens “won’t just see what you see, but will also understand what you see to help you take action.” So, if you want to know more about a restaurant, simply point Lens at the storefront and you’ll get information about it, such as reviews, hours, and more.

The new feature is built into Google Assistant, making it available on millions of devices (so long as you’re running Android 7.0 Nougat).

The underlying technology is similar to what powers Google Translate. Google showed off one very convenient way to use the Lens feature: point your camera at the label on a Wi-Fi router and it will capture the network details for you, getting you up and running in no time.

What’s cool is that when you use Lens, Google Assistant will remember what you scanned and let you ask follow-up questions. So, if you take a picture of a menu using Lens and it lists “dumplings,” you can ask Assistant to show you pictures of dumplings.

In addition to living inside Assistant, Google Lens is also coming to Google Photos, making it easier to organize and understand the pictures you’ve taken. According to Google, its image-recognition technology has gotten so good that it now beats humans at identifying what’s in your photos.

Google also said Lens will be capable of reading business cards, scanning receipts, and more. All of these features will be available on the iPhone, too, because Assistant is now available for iOS. So, yeah, Lens for everybody.

The Google Lens feature in Assistant should be available in the coming months.