The original iPhone caused a shift in smartphone design that still resonates today. Technology has moved on so much, however, that a lot of what the iPhone set out to do is now outdated.

Why, for example, do we even need to touch screens at all? It only makes for greasy finger marks. Xbox Kinect proved you didn't need a controller for a games console and now, from the look of it, eyeSight is about to prove that you don't need a finger for your phone at all.

EyeSight has been developed by an Israeli company and is essentially a piece of technology that lets you code motion control into anything with a camera. While here we are looking mainly at mobile, it is possible to use eyeSight in anything from a webcam or smart TV to a set-top box.

Crucially, EyeSight doesn't require the hardware to be altered in any way. That means that, if someone were to build such an app, you could download it right now and start controlling your phone using gestures alone, recognised by your smartphone's front-facing camera. Nothing else would need to change.

The tech is also very light on the processor, so it doesn't have a detrimental effect on the device's performance. You could, in theory - as Korean manufacturer Pantech has done - simply write it into Android itself. The Vega LTE smartphone, for example, works with gestures simply by your hovering a hand over it and then swiping in different directions.

Pocket-lint was shown this on both the iPhone and the Pantech handsets, and it seems every bit as accurate and smooth as the Kinect. It also manages to recognise even slight gestures, works in low light and requires no set-up at all - not all things that Microsoft's Xbox add-on can boast. In essence, eyeSight can recognise very simple, small hand gestures and turn them into actions.

EyeSight can recognise and interpret different movements of your hand, distinguishing them right down to the level of an individual finger. Hold up a finger, for example, and you could start using a cursor; make a fist, and motion control will stop. You can even make gestures such as pinching in thin air to send apps full screen, as well as use 3D space to make push gestures. There is no support for more than one finger at a time as yet, but it's apparently in development, along with intelligent hand tracking.

Any action can be coded to occur when you make any given movement, so the Minority Report dream takes another step closer - and on any device you like, too.
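We haven't seen eyeSight's actual SDK, so every name in the sketch below is invented, but "any action for any movement" boils down to something as simple as a dispatch table: the recognition layer reports a gesture, and a lookup decides what happens next. A minimal Java sketch, purely for illustration:

```java
import java.util.EnumMap;
import java.util.Map;

// Hypothetical sketch only: eyeSight's real API is not public to us,
// so all of these names are invented for illustration.
public class GestureDispatcher {

    // The kinds of movement the engine can reportedly distinguish.
    enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, RAISED_FINGER, FIST, PINCH, PUSH }

    private final Map<Gesture, Runnable> bindings = new EnumMap<>(Gesture.class);

    // Bind any action to any movement.
    public void bind(Gesture gesture, Runnable action) {
        bindings.put(gesture, action);
    }

    // Called by the (hypothetical) camera/recognition layer whenever it
    // spots a gesture in the video feed.
    public void onGestureDetected(Gesture gesture) {
        Runnable action = bindings.get(gesture);
        if (action != null) {
            action.run();
        }
    }

    public static void main(String[] args) {
        GestureDispatcher dispatcher = new GestureDispatcher();
        dispatcher.bind(Gesture.FIST,  () -> System.out.println("motion control off"));
        dispatcher.bind(Gesture.PINCH, () -> System.out.println("app goes full screen"));

        // Simulate the recognition layer firing an event.
        dispatcher.onGestureDetected(Gesture.PINCH);
    }
}
```

In practice the gesture events would, of course, come from frames off the front-facing camera rather than a hard-coded call, but the binding step itself needn't be any more complicated than this.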

Right now, though, eyeSight ships only in the Pantech Vega LTE handset. It's a Korea-only device, unfortunately, but the reaction has been as good as the ad campaign has been amusing.

To prove that it's not just a case of obscure handsets, Pocket-lint was also shown a demo of eyeSight on an Apple iPhone. The demo used applications that, sadly, we can't name, but a final build is close and should be submitted to Apple soon. This could mean motion control on the iPhone in a matter of weeks - unless the Cupertino company decides to throw one of its famous spanners in the works, of course.

As for eyeSight on the computer, so far it's a development build that works with either a webcam or the computer's own built-in camera. The method and use were the same as on the smartphone, only on a bigger scale, and, while that might seem obvious, it's the idea of using it with Windows 8 and its Metro, or Modern, UI that we found very exciting.

Perhaps best of all is eyeSight's latest design, which integrates the technology into your TV by pairing it with an Android platform on a set-top box. The nTobeBoxSet, as it's known, is set to launch at the beginning of next year and - put together by Korean manufacturer Innodigital - it could definitely bring something different to the living room. It runs on Android, so it incorporates a web browser and Skype, making it rather like a gesture-controlled version of Google TV.

But while all the theory and demonstrations were fascinating, what we couldn't - and still can't - help wondering is whether this is actually useful or just fun. Is this the touchscreen or is this Siri?

Kinect has brought a lot to gaming but many would argue that you can’t beat a traditional controller-based experience. So, is eyeSight just another gimmick?

In smartphones, it's more than likely just a parlour trick. There might be a use case for the mucky-hands moment depicted in the Pantech advert, but it's not a huge one. Gesture control is possibly better placed in the tablet space, with many slates used in the kitchen as recipe aids, but, still, it doesn't look as if it would change your life there either.

As your screen gets bigger, though, gestures seem to come into their own. Minority Report has already shown us how the keyboard and mouse could very easily become things of the past, and there are plenty of possibilities back at home on the sofa in front of the television. You can pick, pause, search and fast forward whatever you're watching with far greater ease than with a remote - and that's without even considering the hassle when the remote is lost or runs out of batteries.

Then again, why bother waving your arms about when voice control is even easier? It might not be something that appeals in public or in an office environment, but Kinect has already shown that voice can work in the privacy of your own home.

So, does any of this justify the need for something like eyeSight to exist? Well, just about, but perhaps the very fact that it’s new and different will be enough. If the Kinect can become the fastest-selling gadget of all time, then surely there’s room for a little more fun?
