Google announced Project Soli in 2015 during a session at its I/O Developer Conference. Since then, Google's ATAP (Advanced Technology and Projects) division has been developing the technology, which can be used in wearables, phones, computers, cars and IoT devices.

This is everything you need to know about Google's Soli chip, including what it is, how it works and what it does in Google's devices.

What is Google's Soli chip?

Google's Soli is a purpose-built chip for tracking motion on a microscopic scale. It uses miniature radar for real-time motion tracking of the human hand, and it can track sub-millimetre motion at high speed with great accuracy.


The Soli chip measures just 8 x 10mm and incorporates the sensor and antenna array into a single device, meaning it can be used in even the smallest wearables. It has no moving parts, consumes very little energy, isn't affected by light conditions and works through most materials, making it a pretty exciting bit of technology.

In tandem with the chip, Google ATAP is developing a language for interacting with devices using gestures. Devices equipped with a Soli chip can then use a universal set of gestures. Google calls these Virtual Tool Gestures and they involve things like pressing an invisible button between your thumb and index finger or turning a dial by rubbing your thumb and index finger together.

The idea is that these gestures feel physical and responsive thanks to the feedback from fingers touching each other, even though the gesture itself is virtual. 

How does Google's Soli chip work?

Ready for some science? Let's hope so.

The Google Soli chip uses radar: it emits electromagnetic waves, and objects within the beam reflect some of that energy back to the antenna, according to Google ATAP. Properties of the reflected signal, such as the time delay and frequency changes, give the device information about the interaction.

Soli senses "subtle changes in the received signal over time. By processing these ... Soli can distinguish complex finger movements and hand shapes within its field."
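To make that concrete: the two basic quantities a radar recovers are distance, from the round-trip time delay, and radial velocity, from the Doppler shift in the reflected frequency. The sketch below works through that arithmetic in Python. The 60GHz carrier matches the band Soli operates in, but the helper names and the sample delay and Doppler figures are illustrative, not real Soli output.

```python
# Sketch of the two radar measurements described above: range from the
# round-trip time delay, radial velocity from the Doppler frequency shift.

C = 3.0e8                    # speed of light, m/s
F_CARRIER = 60e9             # Soli operates in the 60GHz band
WAVELENGTH = C / F_CARRIER   # ~5mm

def range_from_delay(delay_s: float) -> float:
    """Distance to the target; the wave travels out and back, hence / 2."""
    return C * delay_s / 2

def velocity_from_doppler(doppler_hz: float) -> float:
    """A target moving at v shifts the reflection by 2v/wavelength in Hz,
    so invert that to recover the radial velocity."""
    return doppler_hz * WAVELENGTH / 2

# A hand 30cm away returns an echo after ~2 nanoseconds; a finger moving
# at 0.1m/s towards the sensor shifts the 60GHz carrier by ~40Hz.
print(range_from_delay(2e-9))         # 0.3 (metres)
print(velocity_from_doppler(40.0))    # 0.1 (metres per second)
```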


Gestures are recognised by interpreting that information in several layers: raw radar data is processed into signal features, machine learning models identify the probable gesture, and the best match is then mapped to a pre-defined interaction.
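As a rough illustration of how those layers could fit together, here's a hedged Python sketch. The feature extraction, the classifier and the gesture-to-action table are all hypothetical stand-ins, not Soli's actual pipeline or any real API.

```python
# Hypothetical gesture pipeline: raw radar frames -> features -> probable
# gestures -> pre-defined interaction. All names here are illustrative.

from typing import Callable, Dict, List

ACTIONS: Dict[str, Callable[[], None]] = {
    "button_press": lambda: print("toggle setting"),
    "dial_turn": lambda: print("adjust volume"),
}

def extract_features(frames: List[List[float]]) -> List[float]:
    # Real systems would build range-Doppler maps; here, per-frame averages.
    return [sum(frame) / len(frame) for frame in frames]

def classify(features: List[float]) -> Dict[str, float]:
    # Stand-in for a trained machine learning model scoring each gesture.
    return {"button_press": 0.9, "dial_turn": 0.1}

def handle(frames: List[List[float]], threshold: float = 0.8) -> None:
    probabilities = classify(extract_features(frames))
    gesture, confidence = max(probabilities.items(), key=lambda kv: kv[1])
    if confidence >= threshold:      # only act on a confident match
        ACTIONS[gesture]()

handle([[0.1, 0.2], [0.3, 0.1]])     # prints "toggle setting" with the stub
```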

How will Soli change the way we use devices?

Soli has the potential to change the way we use all devices. Wearables are probably the most obvious and natural place to apply the technology, because those devices usually have such small displays and there's an obvious need for richer, more functional input options on them.


The Apple Watch, for instance, has the physical Digital Crown that gives users additional ways to navigate the watchOS interface. A smartwatch that incorporated the Soli chip wouldn't need a Digital Crown, though, because you'd be able to wave your fingers to get things done, such as mimicking turning a volume dial to decrease the volume, or pressing an invisible button to turn something on or off.

In the event, the Google Pixel 4 smartphone was the first device to incorporate the Soli chip. It allowed gesture controls: users could wave a hand to skip songs, snooze alarms and silence phone calls. Google said these capabilities would expand over time and that this was just the beginning of what the chip would offer, but it then dropped the technology from the Pixel 5, which might spell the end of Soli in smartphones.

What does Motion Sense offer on the Pixel 4?

Motion Sense is designed to help you use your phone without having to touch it. While Google Assistant has offered voice interaction for a number of years, Motion Sense, powered by the Soli chip, gives you a range of interactions that you can control with your hand.

Google basically categorises these as presence, reach and gestures. Motion Sense can detect when your hand is approaching the phone and wake it up, for example. This is used to trigger the face unlock sensors, meaning you can unlock the phone faster than with systems that need you to pick the phone up first.

The most-cited example is swiping through music, but there's cancelling timers and alarms too, or silencing the ringer. These are small things right now, but they could be useful if you're driving, for example, or when your phone is on a nightstand. Again, there's more in the pipeline for Soli - but the hardware has led to a larger forehead on the Pixel devices. The question is whether the functionality will outweigh the desire for more refined device design.
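For illustration only, here's a minimal sketch of how the three Motion Sense categories described above could map to phone behaviour; the event names and responses are assumptions drawn from the descriptions in this article, not Pixel 4 internals.

```python
# Hypothetical mapping from Motion Sense detections to phone behaviour.

def on_motion_sense(event: str) -> str:
    if event == "presence":       # someone is near the phone
        return "keep ambient info on screen"
    if event == "reach":          # a hand approaching the device
        return "wake the screen and arm the face unlock sensors"
    if event == "gesture":        # an explicit air swipe
        return "skip track, snooze alarm or silence the ringer"
    return "ignore"

print(on_motion_sense("reach"))
```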

Want to see Soli in action?

The video below not only shows how the Soli chip works when applied to a variety of devices and different scenarios, but it also goes into greater detail about why Google first developed the technology.

Is Soli ready for developers?

Google ATAP is looking for developers to evolve, test and build Soli applications, although currently only the Pixel 4 takes advantage of the Soli chip.

Developers can sign up to the mailing list for the latest updates regarding the technology.