
Tuesday, 5 February 2013

The AI Lab: Brain-Computer Interfaces - The Future of Collaborative Mind-Control Systems Is Shaping Up

Alfred Omachar



One of the most challenging advances in human-machine interfaces is the use of a brain-computer interface (BCI) to communicate a user's intention to a computer directly, bypassing classical hand-operated input devices such as the keyboard, mouse and touch-pad.

However, recent research in BCI has shown impressive capabilities for controlling mobile robots, virtual avatars and even humanoid robots. For example, one study demonstrated control of a humanoid robot with a BCI, where users were able to select an object in the robot's environment – seen through the robot's cameras – and place it in a desired area of that environment – seen through an overhead camera. Similarly, BCIs have also helped people with disabilities to control, for example, a wheelchair, a robotic prosthesis or a computer cursor.

So how do BCIs work (in a nutshell)?

A BCI system records the brain's electrical activity as electroencephalography (EEG) signals. These signals can be acquired invasively, from inside the brain, or non-invasively, from the scalp. A non-invasive BCI picks up signals that are present at microvolt levels on the scalp and boosts them with an EEG amplifier. The amplified signals are then digitised so that the computer can work with them. Finally, machine learning algorithms are used to build software that learns to recognise the patterns a user generates when he or she thinks of a certain concept, for example “up” or “down”.
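
To make that pipeline concrete, here is a minimal, hedged sketch in Python: synthetic signals stand in for real EEG recordings, log band power around 10 Hz stands in for a proper feature set, and a linear discriminant classifier plays the role of the pattern-recognition stage. None of the names, parameters or numbers come from any particular BCI toolkit or study; they are purely illustrative.

import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
fs = 250                              # sampling rate in Hz (assumed)
n_trials, n_samples = 100, fs * 2     # one hundred 2-second trials

def synth_trial(label):
    # Stand-in for a scalp recording: background noise plus a 10 Hz rhythm
    # whose strength depends on the imagined concept ("up" vs "down").
    t = np.arange(n_samples) / fs
    amp = 2.0 if label == 1 else 0.5
    return amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1.0, n_samples)

labels = rng.integers(0, 2, n_trials)                 # 0 = "down", 1 = "up"
trials = np.stack([synth_trial(y) for y in labels])

def band_power(x, lo=8, hi=12):
    # Feature extraction: log band power in the 8-12 Hz range.
    f, p = welch(x, fs=fs, nperseg=fs)
    return np.log(p[(f >= lo) & (f <= hi)].mean())

features = np.array([[band_power(tr)] for tr in trials])

# The "machine learning" stage: learn to map the feature to the intended concept.
clf = LinearDiscriminantAnalysis().fit(features[:80], labels[:80])
print("held-out accuracy:", clf.score(features[80:], labels[80:]))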

A Promising Future for Collaborative BCIs

Now researchers are discovering that they can get even better results in some tasks by combining the signals from multiple BCI users. For instance, a team at the University of Essex developed a simulator in which pairs of BCI users had to steer a spacecraft towards the centre of a planet by thinking about one of eight directions in which they could fly. Brain signals representing the users' chosen direction were merged in real time, and the spacecraft followed the resulting path.
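
The exact fusion rule used by the Essex team is not described here, so the sketch below assumes one simple possibility: each user's decoder outputs a confidence value for each of the eight flyable directions, and the two sets of confidences are averaged before the spacecraft picks a heading. The direction labels and function name are invented for illustration.

import numpy as np

DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]   # the eight flyable headings (assumed)

def merge_decisions(probs_user_a, probs_user_b):
    # Average the two users' per-direction confidences and return the
    # heading the simulated spacecraft should follow.
    combined = (np.asarray(probs_user_a) + np.asarray(probs_user_b)) / 2.0
    return DIRECTIONS[int(np.argmax(combined))]

# Example: user A is fairly confident about "E"; user B is noisier but leans the same way.
a = [0.02, 0.05, 0.60, 0.10, 0.05, 0.05, 0.08, 0.05]
b = [0.10, 0.15, 0.30, 0.20, 0.05, 0.05, 0.10, 0.05]
print(merge_decisions(a, b))    # -> E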

According to the results of this study, two-brain navigation outperformed single-brain navigation: simulated flights were 67% on target when controlled by a single user but 90% on target when controlled by two users. In addition, random noise in the combined EEG signal was significantly reduced, and dual-brain navigation could compensate for a lapse in attention by either of the two users. In fact, NASA's Jet Propulsion Laboratory in Pasadena, California, has been following this work while investigating the potential of BCIs for controlling planetary rovers, among other space applications. For now, though, the idea of remotely controlling a planetary rover with a BCI remains speculative, as most work in the field is still at the research stage.
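
The noise-reduction point can be seen in miniature with a toy calculation: averaging two independent noisy copies of the same underlying signal cuts the noise standard deviation by roughly a factor of √2, which is one reason combining two users' EEG can help. The numbers below are synthetic and are not taken from the study.

import numpy as np

rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, 2 * np.pi, 1000))      # the shared underlying "intention"
user_a = signal + rng.normal(0, 1.0, 1000)            # each user's noisy measurement of it
user_b = signal + rng.normal(0, 1.0, 1000)
combined = (user_a + user_b) / 2

print("single-user noise std:", np.std(user_a - signal))    # about 1.0
print("combined noise std:  ", np.std(combined - signal))   # about 0.71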