The opening keynote of Google I/O 2022 gave us a number of big announcements, but it was the closing tease that got a lot of people excited - a prototype of AR live translation glasses.

The idea behind the glasses is that you can wear them and take advantage of Google's translation technology. We've all experienced Google Translate, but Google is on a mission to make it live and instant - and able to drive conversations between people.

The demo showcased prototype glasses that use AR to overlay subtitles in your field of vision, so you can read along as someone speaks to you and instantly know what they are saying, regardless of the language they are speaking.

"Kind of like, subtitles for the world," says Max Spear, product manager, speaking on the introductory video.

The example showed a mother and child, the mother speaking Mandarin, the child English - and immediately the language barrier is gone and they can communicate seamlessly.

This builds on the work that Google has been putting into live translation and transcription, allowing spoken language to be turned into written text, rather than just translating written text.

The opportunities offered by such a product don't just benefit those speaking different languages; it could also assist those who are deaf or hard of hearing, again using technology to remove a huge barrier.

In the past, people have scoffed at AR glasses. The reception has been rather poor, with concerns about privacy and questions over the need for a constant stream of information - but with a use case like live translation, the immediate benefits such a device would bring are easy to see.

As Google says, this is just a prototype, but we're hoping it's one set of AR glasses that makes it into the real world.

Hit play on the video above to see them in action.