I received an e-mail alert about the video on this page:
and my interest was immediately sparked. The idea of using brain waves to type, or at least to communicate with the shipboard computer, was a staple of science fiction in my youth, and as more and more of the technology from that genre becomes commonplace (personal communicator = mobile phone, Babel fish = Google Translate…) it was interesting to see that this one is starting to fall.
It turns out not to be the case.
In this experimental setup, a volunteer simulating a quadriplegic person unable to communicate stares at a screen on which a pattern of flashing letters is displayed. When the letter they are staring at flashes, the flash is picked up by a set of electrodes monitoring their brainwave activity, and this is presented as reading the letters out of the person’s brain.
It’s not. It is just detecting which flashing light a person is looking at. In order for this to work, the person must be able to move their eyeballs to fixate on a particular letter, and must be able to see.
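To see why the trick works at all, it helps to sketch the detection scheme. Each letter flashes many times; the flashes of the letter you are fixating on evoke a slightly larger brain response than the others, and averaging the recordings time-locked to each letter's flashes lets that difference climb out of the noise. A toy simulation of that idea, with made-up signal and noise values (the real system records actual EEG, of course):

```python
import random

random.seed(0)

LETTERS = "ABCDEF"   # a toy six-letter "keyboard"
ATTENDED = "D"       # the letter the simulated user is fixating on
N_FLASHES = 30       # how many times each letter flashes

def eeg_response(letter):
    """Simulated EEG amplitude shortly after one flash of `letter`.

    Flashes of the attended letter evoke a larger response (signal = 1.0,
    an arbitrary made-up value); every measurement rides on random noise.
    """
    signal = 1.0 if letter == ATTENDED else 0.0
    noise = random.gauss(0.0, 0.5)
    return signal + noise

# Average the responses time-locked to each letter's flashes;
# the attended letter stands out once the noise averages away.
averages = {
    letter: sum(eeg_response(letter) for _ in range(N_FLASHES)) / N_FLASHES
    for letter in LETTERS
}
decoded = max(averages, key=averages.get)
print(decoded)
```

Note what the averaging step depends on: the user must hold their gaze on one letter across many flashes, which is exactly why this is gaze detection rather than mind reading.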
However, we already have technology that can watch your eyeballs to see what you are looking at, and it’s been around for a while.
Canon has used this in cameras since the early 1990s to select which part of the image should be used for autofocus.
More details are here.
and if you want to play with a very similar application, download Camera Mouse here:
This doesn’t do eyeball tracking; instead it tracks head movement with an ordinary webcam, and will allow you to control a computer even if your arms and legs don’t work, provided you can move your head in a consistent way.
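The core idea is simple enough to sketch: track some feature on the face, map its position in the camera image to screen coordinates, and smooth the result so tracking jitter doesn't send the pointer flying. A toy version of just the mapping step, with made-up gain and smoothing values (the feature tracking itself is omitted, so this is only an illustration of the principle, not Camera Mouse's actual algorithm):

```python
# Map a tracked feature's camera coordinates to screen coordinates,
# with exponential smoothing to suppress jitter. ALPHA is a made-up value.

SCREEN_W, SCREEN_H = 1920, 1080   # pointer space
CAM_W, CAM_H = 640, 480           # webcam image space
ALPHA = 0.3                       # smoothing: lower = steadier but laggier

class PointerMapper:
    def __init__(self):
        # start the pointer at the centre of the screen
        self.x, self.y = SCREEN_W / 2, SCREEN_H / 2

    def update(self, feat_x, feat_y):
        """feat_x, feat_y: tracked feature position in camera pixels."""
        # Mirror horizontally so moving your head right moves the pointer right.
        target_x = (1 - feat_x / CAM_W) * SCREEN_W
        target_y = (feat_y / CAM_H) * SCREEN_H
        # Exponential smoothing: move only part-way toward the target each frame.
        self.x += ALPHA * (target_x - self.x)
        self.y += ALPHA * (target_y - self.y)
        return round(self.x), round(self.y)

mapper = PointerMapper()
for fx, fy in [(320, 240), (300, 240), (280, 250)]:  # simulated tracker output
    pos = mapper.update(fx, fy)
print(pos)
```

The smoothing factor is the interesting design choice: too high and every tracking wobble twitches the pointer, too low and the pointer lags frustratingly behind deliberate head movements.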