Microsoft has envisioned a new form of 3D display that lets you feel objects on screen: you push the screen itself and it pushes back, creating a sense of resistance.

At a demo of the very early prototype for Pocket-lint at the company's headquarters in Redmond, near Seattle, we were told the concept had been created to let users feel the properties of different materials, like wood or concrete, on screen through the use of haptic feedback.

Haptic feedback is most commonly used in phones to create a sense of physical contact where there isn't any - like the buzz you feel when you press a key on the virtual keyboard. The same technology is used here, but to more interesting effect, like pushing a block or feeling the contours of a ball.

"This project combines the two disciplines into a 3D display experience that adds the sense of touch. The standard display monitor has been retrofitted with resistance transducers that sense the amount of pressure a person is placing on the screen," explains Steve Clayton, chief storyteller at Microsoft.

"The transducers work in conjunction with a one-dimensional robot that looks a bit like those articulated television mounts that extend and pivot the television away from the wall. The robot responds to the sensory data generated by the pressure of your fingertip and moves forward or backward accordingly."

We got to play with two very different demos on the rather unusual-looking workstation, both of which were really about pushing the screen and enjoying the way it pushed back.

The first was a simple game whereby we had to push away concrete, wooden and sponge boxes before feeling the curvature of a coffee cup and a ball.

The boxes really show what's possible, although it is worth noting that the early prototype is still very rudimentary. For it to be of real use in the workplace the haptic feedback needs to be a lot more precise, but then that's not the point here - this is a proof of concept, rather than something that is ever expected to ship in its current form.

The second demo showed how doctors could use the technology to quickly move through a number of layers just by pushing forward with their finger.

In this case we were able to move through a stack of scans of a human brain. At any point we could place a second finger on the screen to stop it moving further, then make notes for others to see at a later date.
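Microsoft didn't go into how the scan demo is wired up, but the interaction boils down to two rules: pressure beyond a threshold pushes you one slice deeper into the image stack, and a second touch freezes the current slice so it can be annotated. Here's a rough, assumption-heavy sketch of those rules - the pressure units, threshold and slice count are all made up, and this is not the demo's actual code.

```python
# A hedged sketch of the scan-browsing interaction: pressing advances through
# the stack of brain scans, a second finger locks the current slice for
# annotation. Units, threshold and frame handling are assumptions.

def scan_browser_step(slice_index, pressure_n, touch_count,
                      num_slices, threshold_n=1.0):
    """One input frame of the scan-browsing interaction.

    A second finger on the screen locks the current slice so it can be
    annotated; otherwise pressing harder than threshold_n pushes one slice
    deeper into the stack. Returns (slice_index, locked)."""
    locked = touch_count >= 2
    if not locked and pressure_n > threshold_n:
        slice_index = min(slice_index + 1, num_slices - 1)
    return slice_index, locked

# e.g. a few frames of one-finger pressing, then a second finger to lock:
slice_index = 0
for pressure, fingers in [(2.0, 1), (2.0, 1), (0.5, 1), (2.0, 2)]:
    slice_index, locked = scan_browser_step(slice_index, pressure, fingers, num_slices=20)
    print(slice_index, locked)
```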

It is very early days for the technology, and there are - quite literally - a lot of moving parts, but it's an interesting idea nonetheless and shows that Microsoft is looking at a number of different ways to interact with the computers of tomorrow.

Imagine, however, that the screen were projected rather than physical, and that the haptic feedback was delivered via a glove you wear, and you can start to see the makings of that famous scene in Minority Report very clearly.