Gesture recognition currently relies on a camera watching hand movements. Scientists at the University of Washington want to change that, so your phone can "see" gestures even when it's in your pocket.

The AllSee system uses wireless signals already in the air, like TV signals, as both a power source and a means of seeing a user's gestures from a pocket. Imagine swiping in the air to clear a call that's vibrating in your pocket. Well, maybe not swiping; something that makes you look less crazy.

An ultra-low-power receiver detects changes in signal amplitude caused by the user's hand movements. Different gestures alter the amplitude in distinct ways, giving each one a signature the system can recognise.
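The idea can be sketched in a few lines. This is a toy illustration, not AllSee's actual classifier: it assumes (purely for demonstration) that a hand moving towards the receiver raises the reflected amplitude and moving away lowers it, and the `threshold` value is invented.

```python
def classify_gesture(amplitudes, threshold=0.1):
    """Classify a gesture from a sampled amplitude envelope.

    amplitudes: list of received-signal amplitudes over time.
    threshold: minimum net change treated as deliberate
               (hypothetical value, for illustration only).
    Returns 'push', 'pull', or 'none'.
    """
    delta = amplitudes[-1] - amplitudes[0]
    if delta > threshold:
        return "push"   # amplitude rose: assume hand moved towards receiver
    if delta < -threshold:
        return "pull"   # amplitude fell: assume hand moved away
    return "none"       # change too small to count as a gesture
```

A rising envelope such as `[0.2, 0.3, 0.5]` would classify as "push", a falling one as "pull", and small jitter as "none". The real system distinguishes many more gestures, but the principle is the same: each movement leaves its own pattern in the amplitude.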

The AllSee prototype, attached to a phone, accurately detected hand gestures from more than two feet away, more than 90 per cent of the time. That includes gestures such as pushing and pulling to zoom in and out, or raising and lowering a hand to increase and decrease volume. And it all worked with a response time 1,000 times faster than the blink of an eye.

Another great AllSee advance is power efficiency: the system can stay always on without draining the battery. The team has also created a wake-up gesture to make sure unintentional movements don't have you calling your loved ones from the loo.
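That wake-up idea can be sketched as a simple gate: ignore every recognised gesture until a specific wake-up sequence has been seen. This is an assumed design for illustration, not AllSee's implementation, and the wake-up pattern here is made up.

```python
WAKE_SEQUENCE = ["pull", "push"]   # hypothetical wake-up gesture pattern


class GestureGate:
    """Pass gestures through only after the wake-up sequence is seen."""

    def __init__(self):
        self.recent = []    # sliding window of the last few gestures
        self.awake = False

    def feed(self, gesture):
        """Feed one recognised gesture; return it only if awake."""
        if self.awake:
            return gesture  # commands pass through once woken
        self.recent = (self.recent + [gesture])[-len(WAKE_SEQUENCE):]
        if self.recent == WAKE_SEQUENCE:
            self.awake = True   # deliberate wake-up pattern detected
        return None         # stray movements are ignored until then
```

With this gate, an accidental "push" in a pocket does nothing; only after the deliberate pull-then-push sequence do subsequent gestures trigger actions.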

Future applications include in-home sensors that would let everything in the house be controlled by gestures, no matter where the user is. Beyond that, robots could be called over with a gesture, but we're looking a fair way into the future there.

We're very excited about this technology but don't expect to see it in our kit for at least a few years.