Google Lens is an AI-powered technology that uses your smartphone camera and deep machine learning to not only detect an object but understand what it detects and offer actions based on what it sees.
Here is everything you need to know about Google Lens.
What can Google Lens do?
It enables users to point their phone at something, such as a specific flower, and ask Google Assistant what the object is. You'll not only be told the answer, but you'll also get suggestions based on the object, like nearby florists in the case of a flower.
Another example of what Google Lens can do: take a picture of the SSID sticker on the back of a router, and your phone will automatically connect to that Wi-Fi network without you needing to do anything else. Yep, no more crawling under the cupboard trying to read out the password whilst typing it into your phone, only to realise you forgot to turn caps lock on.
Google Lens will also recognise restaurants, clubs, cafes and bars, presenting you with a pop-up window showing reviews, address details and opening times. We've seen this in action and it seems to work really well.
It's the ability to recognise everyday objects that's impressive. Google Lens will recognise a hand and suggest the thumbs-up emoji, which is a bit of fun, but point it at a drink and it will try to figure out what it is.
We tried this with a glass of white wine. It didn't suggest white wine, but it did suggest a whole range of other alcoholic drinks, letting you then tap through to see what they are, how to make them and so on. It's fast and very clever - even if it failed to see that it was just wine. What we like is that it recognises the type of drink and suggests things that are similar.
What apps will Google Lens work with?
At launch, Google Lens will be built into Google Assistant and Google Photos. Other Google apps will eventually follow, although we're yet to see which ones.
Within Google Assistant, users will be able to tap the Google Lens icon in the bottom right-hand corner and point their smartphone camera at, for example, the show times outside a cinema or a gig venue's information board.
You'll then be presented with a number of suggestions in the viewfinder, such as hearing songs from the artist picked up from the information board, getting tickets for the event through Ticketmaster or adding the event to your calendar. Using Lens to get information without having to write it down is really handy - you'll be able to call phone numbers, for example, without having to remember them and manually type them in.
Within Google Photos, Google Lens will be able to identify buildings and landmarks, for example, presenting users with directions and opening hours for them. It will also be able to present information on a famous work of art. Maybe it will settle the debate over whether the Mona Lisa is smiling or not.
We've seen this put to work with a book cover. There's actually something similar within the Amazon Shopping app, which will scan a cover and suggest buying options. Here Google is using similar techniques to provide you with more information about something you took a picture of. We've also seen Bixby Vision do similar things, so it's not hugely unique, but it could be very handy.
When will Google Lens arrive?
Google Lens, as part of Google Assistant, should be rolling out to Android users. It was shown off at the launch of the Pixel 2 and Pixel 2 XL - available from late October - but there's no word on when Google Lens will come to other Android users. The current position is that Pixel users will get an early preview of it; beyond that, Google isn't giving any sort of timeline.