Google Lens is an AI-powered technology that uses your smartphone camera and deep machine learning to not only detect an object in front of the camera lens, but understand what that object is and offer actions such as scanning, translation, shopping, and more.
Google Lens was one of Google's biggest announcements back in 2017, but it launched as a Pixel-exclusive feature. Since then, Google Lens has come to the majority of Android devices - and if your phone doesn't have it built in, the app is available to download from Google Play.
What is Google Lens?
Google Lens enables you to point your phone at something, such as a specific flower, and then ask Google Assistant what the object you're pointing at is. You'll not only be told the answer, but you'll get suggestions based on the object, like nearby florists, in the case of a flower.
Other examples of what Google Lens can do include being able to take a picture of the SSID sticker on the back of a Wi-Fi router, after which your phone will automatically connect to the Wi-Fi network without you needing to do anything else. Yep, no more crawling under the cupboard to read out the password while typing it into your phone. Now, with Google Lens, you can literally point and shoot.
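Many router stickers print the network details as a QR code, and such codes typically use the de facto WIFI: scheme popularized by the ZXing barcode library. As a rough illustration of what a scanner has to decode - a sketch, not Google's actual implementation - a minimal parser for that scheme looks like this:

```python
# Minimal parser for the de facto "WIFI:" QR-code scheme (popularized by
# ZXing), which many router stickers use. Illustrative sketch only - the
# real scheme also defines escaping for characters like ';' and ',',
# which this parser ignores for brevity.

def parse_wifi_qr(payload: str) -> dict:
    """Parse a payload like 'WIFI:T:WPA;S:MyNetwork;P:secret123;;'."""
    if not payload.startswith("WIFI:"):
        raise ValueError("not a WIFI: payload")
    fields = {}
    # Fields are key:value pairs separated by semicolons.
    for part in payload[len("WIFI:"):].split(";"):
        if ":" in part:
            key, _, value = part.partition(":")
            fields[key] = value
    return {
        "ssid": fields.get("S", ""),
        "password": fields.get("P", ""),
        "security": fields.get("T", "nopass"),
    }

print(parse_wifi_qr("WIFI:T:WPA;S:MyNetwork;P:secret123;;"))
# -> {'ssid': 'MyNetwork', 'password': 'secret123', 'security': 'WPA'}
```

Once the payload is parsed, the phone can hand the SSID and password straight to the operating system's Wi-Fi manager - which is why the Lens flow needs no typing at all.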
Google Lens will recognise restaurants, clubs, cafes, and bars, too, presenting you with a pop-up window showing reviews, address details, and opening times. It's the ability to recognise everyday objects that's impressive. It will recognise a hand and suggest the thumbs-up emoji, which is a bit of fun, but point it at a drink and it will try to figure out what it is.
We tested this functionality with a glass of white wine. It didn't suggest white wine to us, but it did suggest a whole range of other alcoholic drinks, letting you then tap through to see what they are, how to make them, and so on. That shows that, while Lens is fast and clever, it's not always accurate.
What can Google Lens do?
Aside from the scenarios described above, Google Lens offers the following features:
- Translate: You can point your phone at text and, with Google Translate plugged in, live translate the text in front of your very eyes. Magic.
- Smart Text Selection: You can point your phone's camera at text, then highlight that text within Google Lens, and copy it to use on your phone. So, for instance, imagine pointing your phone at a Wi-Fi password and being able to copy/paste it into a Wi-Fi login screen.
- Smart Text Search: When you highlight text in Google Lens, you can also search that text with Google Assistant. This is handy if you need to look up the definition of a word, for instance.
- Shopping: If you see a dress you like while shopping, Google Lens can identify that piece and similar articles of clothing, surfacing relevant reviews and shopping options. This works for household decor and more, too.
- Search around you: If you point your camera at your surroundings, Google Lens will detect and identify what it sees. For us, that meant identifying plant species and cat breeds, and highlighting reviews of DVDs from our entertainment stand.
How does Google Lens work?
Google Lens app
Google has a standalone app on Android for Google Lens if you want to get straight into the features. You can access Google Lens through a whole range of other methods, as detailed below.
The experience is similar whichever approach you take; tapping the Lens icon in Google Assistant takes you through to the same view you get directly in the Lens app.
Within Google Assistant, you'll see a Google Lens icon in the bottom right-hand corner. You can tap it and point your smartphone camera at, for instance, the show times outside a cinema or a gig venue's information board.
You'll then be presented with a number of suggestions in the viewfinder, such as hearing songs from the artist picked up from the information board, getting tickets for the event through Ticketmaster, or adding the event to your calendar. Using Lens to get information without having to write it down is handy; you'll be able to call numbers, for example, without having to remember them or manually type them.
Within Google Photos, Google Lens can identify buildings or landmarks, for instance, presenting users with directions and opening hours for them. It will also be able to present information on a famous work of art. Maybe it will solve the debate of whether the Mona Lisa is smiling or not.
When browsing your pictures in Google Photos, you'll see the Google Lens icon at the bottom of the window. Tapping the icon will see the scanning dots appear on your picture, after which Google will serve up suggestions.
On some Android phones, including the Pixel 4, Google Lens has been added directly to the device's own camera app. It might be in the 'More' section, but its location will differ depending on manufacturer and user interface.
Which devices offer Google Lens?
If you're an Android device user, you can access the app. However, there are some exceptions, such as phones without Google services - the Huawei Mate 30 Pro, for example - so it's worth checking on Google Play to see if you can get it.
What's next for Google Lens?
Given that Google Lens represents the cutting edge of camera technology and Google's implementation of it, there are constantly small updates and rumoured features for the app on the horizon.
For example, it was recently revealed that Google Lens may soon get some educational features, including a camera mode that could help students with maths problems if they're struggling with homework. This may be joined by the ability to use the Lens app to translate written words without having to be online. We'll be keeping an eye out to see if these features go live.
This article was originally published in 2017 and has been updated to reflect changing information