One thing that people learned from the pandemic is that there is no substitute for being together in a room with someone.

While video conferencing apps such as Zoom, Google Meet, and Microsoft Teams try to help you virtually connect with those you love and work with, the experience still feels rather limited. So, in an attempt to push the boundaries of remote collaboration, Google has come up with Project Starline. And it's planning to start testing this project with partners such as T-Mobile later this year.


What is Google Project Starline?

  • Early video calling system that renders in 3D, no glasses required

Google CEO Sundar Pichai took the stage at Google I/O 2021 to announce that his company kicked off a project "several years ago" that builds on different areas of computer science and relies on custom-built hardware and highly specialised equipment. Called Project Starline, it's an early, natural-feeling telepresence system that makes it seem as if the other person on a video call is sitting right in front of you. Each participant sits in a booth outfitted with cameras and infrared projectors that capture a realistic depiction of them, while spatial audio makes their voice sound as though it's coming directly from them.

Using head tracking and a 65-inch, 8K glasses-free display, the system allows you to video call someone and experience them in hyper-realistic 3D. 

How does Google Project Starline work?

There are three components to Project Starline:

  • Cameras and depth sensors: Specialised equipment that captures a person from multiple perspectives
  • Computer science advances: Custom software, including novel compression and streaming algorithms
  • Light field display: Custom hardware that renders a realistic representation of someone in 3D

Project Starline uses high-resolution cameras and custom depth sensors to capture a user's shape and appearance from multiple perspectives, and then all of that is fused by software to create an extremely detailed, real-time 3D model. Google said it's applying research in computer vision, machine learning, spatial audio, and real-time compression. The effect is the feeling of a person sitting across from you.
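Google hasn't detailed this pipeline publicly, but conceptually the capture step resembles standard multi-view RGB-D fusion. Here's a minimal Python sketch of the idea (the pinhole-camera assumptions and every name in it are our own illustration, not Google's code): back-project each camera's depth map into 3D points, then merge the resulting clouds in a shared coordinate frame.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (metres) into 3D points using pinhole intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def fuse_views(views):
    """Merge point clouds from several calibrated cameras into one world-space cloud.

    Each view is (depth_map, (fx, fy, cx, cy), pose) where pose is a 4x4
    camera-to-world transform.
    """
    clouds = []
    for depth, intrinsics, pose in views:
        pts = depth_to_points(depth, *intrinsics)
        # Move points from this camera's frame into the shared world frame.
        pts_h = np.concatenate([pts, np.ones((len(pts), 1))], axis=1)
        clouds.append((pts_h @ pose.T)[:, :3])
    return np.concatenate(clouds)
```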

The resulting data is huge: many gigabits per second.

So, to send this 3D imagery over existing networks, Google developed novel compression and streaming algorithms that reduce the data by a factor of more than 100. Google also developed a light field display that shows you the realistic representation of someone sitting right in front of you in three dimensions. 
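To put that factor in perspective, here's a rough back-of-the-envelope calculation; the 10 Gbit/s raw figure below is a hypothetical stand-in for "many gigabits per second", not a number Google has published.

```python
# Illustrative only: how a >100x reduction makes the stream network-friendly.
raw_bitrate_gbps = 10            # hypothetical uncompressed multi-camera RGB-D stream
compression_factor = 100         # "a factor of more than 100", per Google
compressed_mbps = raw_bitrate_gbps * 1000 / compression_factor
print(f"Compressed stream: ~{compressed_mbps:.0f} Mbit/s")  # ~100 Mbit/s, feasible on office broadband
```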

As you move your head and body, Google's system adjusts the images you see in the light field display to match your perspective. It creates a sense of volume and depth without the need for additional glasses or headsets. "You can talk naturally, gesture and make eye contact," CEO Sundar Pichai said at I/O 2021. "It's as close as we can get to the feeling of sitting across from someone."
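The display hardware itself is custom, but the head-tracking principle is familiar view-dependent rendering: every frame, the renderer rebuilds its virtual camera from the tracked head position so the parallax matches your movement. A minimal sketch, assuming a standard look-at camera (all names and numbers below are illustrative, not Google's):

```python
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a view matrix so the scene is rendered from the viewer's eye position."""
    f = target - eye
    f /= np.linalg.norm(f)
    s = np.cross(f, up)
    s /= np.linalg.norm(s)
    u = np.cross(s, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

# Each frame: take the tracked head position and re-render the remote person's
# 3D model from that exact viewpoint, so depth cues shift as the viewer moves.
head_position = np.array([0.05, 1.2, 0.6])   # hypothetical tracked eye position (metres)
remote_person = np.array([0.0, 1.2, -0.8])   # where the remote participant appears to sit
view_matrix = look_at(head_position, remote_person)
```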


When will Project Starline be available?

  • It's already in use in a few of Google's own offices
  • It's been demoed with over 100 companies
  • Salesforce, T-Mobile, WeWork, and others will test it in Q4 2022

Google said it has already spent thousands of hours testing Project Starline in its own offices. At first, there were no plans to release the product commercially to consumers; Google only said there had been excitement from enterprise partners and that it planned to expand access to partners in healthcare and media. As of October 2022, Ars Technica reports that Google plans to start installing Project Starline prototypes in some of its corporate partners' offices for testing later this year; partners include Salesforce, T-Mobile, and WeWork. Google also said it has been conducting demos with more than 100 companies across healthcare, media, and retail.

Want to know more?

Check out Google's blog post about Starline for more details.