Control Google Glass with your mind

Deciding that finger swipes are a limiting factor in how people use Google Glass, one London-based design agency came up with an alternative: mind control.

Called mindRDR, the app, combined with a NeuroSky EEG biosensor, allows you to take a picture and post it to Twitter simply by thinking about it.

The source code, which the design agency is sharing via GitHub for all Google Glass users to experiment with, is a proof of concept to show what is possible and what a mind-controlled future might look like.

"We wanted to start with something simple to prove that it would work," explained Chloe Kirton, the Creative Director at This Place, the agency behind the idea. "You could easily get this working with more expensive sensors that track multiple brain inputs, but we wanted to start with the idea that could you get Glass to respond to concentration or relaxation and what could you do with that."

Down the line, Kirton confirmed, there is plenty of scope to develop the software further, using as controls all 18 of the senses that can currently be mapped from the brain. That's like having a whopping 18 menu options or control buttons. It could, technically, mean the entire Glass experience being controlled by thought alone.

Today, however, the experience is a little cumbersome but intriguing, as we found out when we had a go ourselves.

To start with, you have to place the neuro-scanner on your head. Kirton explains that it is old off-the-shelf tech, chosen for this experiment so it could be bought cheaply by others keen to have a play (it costs around £70), but that in the future you could easily build something into the glasses themselves.

Once the brain scanner is on (yes, you look like even more of a glasshole), you use Google Glass as you normally would. The agency's dedicated app is called mindRDR, and once you load it you are in the thought zone, so to speak.

The interface is simple (what is the interface for your brain, anyway?): a line on the screen goes up when you are concentrating and down when you are relaxing.

Concentrate enough and you get to snap a picture before being asked to concentrate once more to confirm you want it sent to Twitter.
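The two-step flow described above can be sketched as a simple threshold loop. This is a minimal illustration of the idea, not This Place's actual code; `read_concentration`, `take_picture` and `post_to_twitter` are hypothetical placeholders, not real mindRDR or NeuroSky calls:

```python
# Sketch of a mindRDR-style trigger: sustained concentration takes a
# photo, a second burst of concentration confirms sharing it.
# All callables passed in are hypothetical placeholders.

THRESHOLD = 80  # concentration level (0-100) needed to trigger an action


def wait_for_concentration(read_concentration, threshold=THRESHOLD):
    """Block until the concentration reading crosses the threshold."""
    while True:
        level = read_concentration()  # e.g. an attention value from 0-100
        if level >= threshold:
            return level


def capture_and_share(read_concentration, take_picture, post_to_twitter):
    # First sustained concentration snaps the picture...
    wait_for_concentration(read_concentration)
    photo = take_picture()
    # ...and a second one confirms posting it.
    wait_for_concentration(read_concentration)
    post_to_twitter(photo)
    return photo
```

In a real build, `read_concentration` would poll the headset over its serial link, and the threshold would be tuned per wearer, since concentration readings vary from person to person.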

The result is an ability to take photos and share them on the likes of Twitter or Facebook, all without saying a word or touching a button.

In practice it does take some getting used to. Our first snap was quick and easy to achieve, but a second go, taking a picture of ourselves while looking in a mirror, proved difficult. It is something that clearly comes with practice, as Kirton, demoing it to us, had no trouble at all focusing on the task at hand.

This is all very early prototype stuff, of course, and not something we expect Google to add to the Glass interface any time soon, or This Place to commercialise and offer for sale. However, as we proved by sending a picture in a tweet while doing nothing other than thinking about it, it is possible.

Where things start to get awe-inspiring is imagining linking this into the connected smart home of the future, allowing you to control everything just by thinking about it. Still, it's a step in a very exciting direction.

READ: Want to control your phone with your mind? There's an app for that under Darpa development
