
Control A Robot With Your Brain?

By Neuroskeptic | September 15, 2012 8:06 PM


A paper just out makes the dramatic claim that you can control a robot using thought alone, Avatar style, thanks to a 'mind reading' MRI scanner. But does it really work?

Dutch neuroscientists Patrik Andersson and colleagues bought a robot - an off-the-shelf toy called the 'Spykee' - which is equipped with Wi-Fi and a video camera. The controlling human lay in the scanner, and real-time fMRI was used to record brain activity. The video feed from the robot was shown on a screen in the scanner, completing the human-robot loop.


Participants controlled the robot with their brain. Specifically, they had to focus their attention on one of three arrows - forward, left, and right - shown on the screen.

During an initial training phase they focused on each arrow in turn, to provide examples of the resulting brain activity: these were then fed into a machine learning algorithm that learned to recognize the pattern of BOLD activation for each command. Then, in the second phase, they could control the robot just by thinking about the correct arrow - the scanner 'decoded' their brain activity and sent the appropriate commands to the bot over Wi-Fi.
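The general shape of that train-then-decode loop can be sketched in a few lines. This is not the paper's actual pipeline (the authors' classifier, features, and preprocessing are not described here); it is a minimal illustration using synthetic "voxel" data and an ordinary linear classifier, with all names and parameters chosen for the example.

```python
# Hedged sketch of a train-then-decode brain-computer interface loop.
# All data here is synthetic: we simulate a distinct mean BOLD pattern
# per command, train a classifier on noisy samples of each pattern,
# then "decode" a fresh scan volume into a robot command.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_voxels = 200
commands = ["forward", "left", "right"]

# Training phase: collect labelled activation patterns for each command.
means = {c: rng.normal(0.0, 1.0, n_voxels) for c in commands}
X_train = np.vstack([means[c] + rng.normal(0.0, 0.5, (30, n_voxels))
                     for c in commands])
y_train = np.repeat(commands, 30)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Control phase: decode a new volume and (in the real system) send
# the predicted command to the robot over the network.
new_volume = means["left"] + rng.normal(0.0, 0.5, n_voxels)
predicted = clf.predict(new_volume.reshape(1, -1))[0]
print(predicted)
```

In the real experiment the feature vectors would come from the scanner in real time rather than from a random-number generator, and each prediction would be translated into a forward/left/right motor command for the robot.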


None of the elements of this process is new - real-time fMRI has been around for a few years, as has machine learning to decode brain activation - but this is the first time they've been put together in this way.

And it's pretty awesome. The participants were able to guide their 'avatar' around a room to visit a number of target locations. They weren't perfectly accurate, and it took 10 or 15 minutes to navigate a few meters of ground... but it worked.

However... were they really using their minds, or just their eyes?

This is my main concern about this paper: participants were told to keep their eyes focused on the middle of the screen and just mentally focus on the arrows to give commands. If they did indeed keep their eyes entirely stationary, then the patterns of brain activation would indeed represent pure 'thoughts'.

But if they were moving their eyes slightly (even unconsciously), the interpretation would be rather different. Moving their eyes would change the pattern of light hitting their retina, and this would be expected to change brain activation in the visual system of the brain.

So, maybe the fancy fMRI decoding system wasn't reading their mind, it was just acting as an elaborate means of tracking eye movements - which would be much less interesting. If you want to control a robot with your eyes, there are cheaper ways.

Andersson et al. acknowledge this issue, and they argue, for various reasons, that this probably wasn't what happened here - but they didn't measure eye movements directly, so it does remain a worry. Eye-tracking devices suitable for fMRI are widely available, but this study used an ultra-powerful 7 Tesla scanner which, the authors say, made eye tracking impossible. So there's more work to be done here.


Andersson P, Pluim JP, Viergever MA, & Ramsey NF (2012). Navigation of a Telepresence Robot via Covert Visuospatial Attention and Real-Time fMRI. Brain Topography. PMID: 22965825
