Just because Google's keynote is over doesn't mean the Google I/O developer conference is finished serving up new and interesting information about what the Mountain View-based company envisions for the future.

Take Project Soli, for instance.

It debuted at one of the conference's sessions, and everyone is talking about it. Google ATAP - a skunkworks division at the company that builds things like 3D-sensing, spatially-aware tablets - led the session and introduced the project as a new technology that could change wearables forever.

Project Soli is a sensor small enough to fit in even the tiniest wearables. It can accurately detect your hand movements in real time, making it a lot like Leap Motion and other gesture-tracking controllers. But instead of using cameras, Project Soli uses radar technology that fits within a tiny chip.

Google ATAP on YouTube: https://www.youtube.com/watch?v=0QNiZfSsPc0

Google ATAP has basically realised that our hands are the best way to interact with devices. We have such fine control with our fingers; just think about how fast and seamlessly yours can transition from, let's say, typing on a keyboard to untangling a bunch of wires. Project Soli wants to apply that capability to gesture control.

Project Soli's founder, Ivan Poupyrev, demoed the sensor on Friday. Because Project Soli can recognise fine gestures, rather than the large ones required by most other motion-based controller systems, it could replace current interfaces and let you gesture-control wearables without ever touching a display.

In fact, Project Soli has the potential to change the way we use all devices - not just wearables. Wearables are probably the most obvious and natural place to apply the technology first, because those types of devices usually have such small displays (and there's an obvious need for richer, more functional input options on them).

Google ATAP on YouTube: https://www.youtube.com/watch?v=0QNiZfSsPc0

The Apple Watch, for instance, has the physical Digital Crown that gives users additional ways to navigate Apple's Watch OS. But Project Soli makes that approach seem archaic. A smartwatch with Project Soli wouldn't need a Digital Crown, because you could just wave your fingers in the air to get things done.

You could mimic turning a volume dial to turn down the volume. You could mimic pressing a button to turn something on or off. You could mimic turning a page to flip through an eBook. The possibilities are endless.

The video below not only shows how Project Soli works when applied to a variety of devices and different scenarios, but it also goes into greater detail about why Google first dreamed up the technology.

Google ATAP plans to release an API for Project Soli to developers at some point, so that they'll be able to build applications and hardware that take advantage of the technology. Keep in mind that Project Soli was only just unveiled.

Google ATAP on YouTube: https://www.youtube.com/watch?v=0QNiZfSsPc0

Everything is still new, but the idea is getting everyone at Google I/O excited.

Pocket-lint has a Google ATAP hub with all the latest news from the division.