G E E K   P A G E    Issue 2.11 - November 1996

Jacking In

By Chris Chinnock

EEG signals may be the next user interface



Simple electroencephalogram (EEG) signals have been used for years by medical personnel to determine the general health of the human brain. But developers are now expanding the detection and processing of these brain signals to produce commercial devices that control real-world objects. Users can play music, move computer screen cursors, interact with games, turn on appliances and even guide wheelchairs - all by controlling their brainwave patterns.

Traditional EEG signals consist of complex waveforms that are divided into specific frequency bands. For example, the delta band extends from near 0 to about 4Hz. Theta runs from 4 to 8Hz, alpha from 8 to 13Hz and beta from 13 to about 32Hz. These bands are associated with different aspects of human behaviour.
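The band boundaries quoted above amount to a simple lookup. A minimal sketch in Python - the function name is invented, and the cutoffs are just the ranges given here:

```python
def eeg_band(freq_hz):
    # Map an EEG frequency (Hz) to its conventional band name,
    # using the boundaries quoted in the article.
    if freq_hz < 0:
        raise ValueError("frequency must be non-negative")
    if freq_hz < 4:
        return "delta"
    if freq_hz < 8:
        return "theta"
    if freq_hz < 13:
        return "alpha"
    if freq_hz <= 32:
        return "beta"
    return "above beta"
```

So a strong 10Hz component would register as alpha - the relaxed, meditative range - while 20Hz falls into beta.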

Most efforts in brain actuation focus on the alpha and beta bands. Strong alpha wave patterns indicate deep meditative or relaxed mental states, while the beta range corresponds to higher-energy thought processes.

To sense brain-wave signals, researchers put electrodes on the subject's head - placement optimises detection of certain brain-wave signals. For example, if a subject is watching a steadily flickering light, putting electrodes at the rear of the head, near the occipital lobe, gives a strong signal corresponding to what the user is observing.

In measuring a subject's brain-waves, it's best to place the electrodes on a shaved area of the scalp, directly over the place where the signals to be measured will be strongest. The forehead is a convenient place for electrode placement, but this technique suffers from the need to separate specific EEG signals from a complex mix of brain-wave and facial muscle activity. Raising your eyebrow can swamp the EEG signal unless the characteristic signature of that movement is filtered out.

Learning how to extract the desired signals from the jumble of noise and competing patterns is one of the main challenges for developers. In pioneering work at the Wright-Patterson Air Force Base in Ohio, researchers developed a technique that uses a slowly flashing light to help extract the desired signal. They discovered that a light flashing at, say, 13.25 times a second produces a 13.25Hz, synchronous EEG brain response. The light thus evokes a stable reference point which helps detect changes in the EEG amplitude.

In their system, Wright-Patterson researchers place two electrodes on either side of the back of the head and attach ground and reference electrodes. A differential EEG signal, produced by subtracting a person's left hemisphere signal from that of the right hemisphere, is sent to a lock-in amplifier. This device aligns the peak of the reference light source signal with the peak of the differential EEG brain-wave signal. This magnifies the signal, making it easier to detect over the background noise. The strength of this signal is then displayed to the user, who uses the feedback to help control the amplitude - and, in turn, an action of some kind.
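The lock-in idea can be sketched in software: multiply the measured signal by reference waveforms at the known flicker frequency and average, so that only the component synchronous with the light survives while off-frequency activity cancels out. This is a simplified stand-in for the hardware lock-in amplifier, run here on a synthetic "differential EEG" with made-up amplitudes:

```python
import math

def lock_in_amplitude(signal, fs, ref_hz):
    # Correlate the signal with in-phase and quadrature references at
    # the flicker frequency, then average: components synchronous with
    # the reference survive, uncorrelated noise averages towards zero.
    n = len(signal)
    i = sum(s * math.cos(2 * math.pi * ref_hz * k / fs)
            for k, s in enumerate(signal)) / n
    q = sum(s * math.sin(2 * math.pi * ref_hz * k / fs)
            for k, s in enumerate(signal)) / n
    return 2 * math.hypot(i, q)  # recover the peak amplitude

# Synthetic differential EEG: a 13.25Hz evoked response (amplitude 0.8)
# buried alongside an unrelated 7Hz theta component.
fs = 256                      # samples per second
n = fs * 4                    # four seconds of data
sig = [0.8 * math.sin(2 * math.pi * 13.25 * k / fs)
       + 0.5 * math.sin(2 * math.pi * 7.0 * k / fs)
       for k in range(n)]
amp = lock_in_amplitude(sig, fs, 13.25)  # close to 0.8
```

The recovered amplitude tracks only the 13.25Hz response - which is exactly the quantity the Wright-Patterson system feeds back to the user.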

But how does one learn to control a 13.25Hz brain-wave signal? McMillan, one of the Wright-Patterson researchers, explains: "New users find that if you try too hard or force the task, it won't work. In the early stages of training, they often use various forms of imagery related to what they want to achieve. Once they get trained, however, they simply concentrate on controlling the device."

Other developers are trying to use more of the EEG spectrum to detect brain-wave patterns to use as control signals. For example, New York-based IBVA Technologies relies on Fast Fourier Transform (FFT) techniques to analyse brain-wave signals. FFTs are very common in many types of electronic analysis and image processing. In the IBVA system, EEG signals are transmitted via radio signal from the head to a computer. There, continuous, real-time FFT processing analyses the EEG response and computes the amplitude for individual frequencies. This allows users to see the amplitude of their complex EEG waveforms over the entire 0 to 60Hz band.
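The decomposition IBVA performs can be imitated with a naive discrete Fourier transform - one bin per hertz across the 0 to 60Hz band. A real system would use a true FFT for speed; the signal below is synthetic, and the function name is illustrative:

```python
import math

def amplitude_spectrum(signal, fs, max_hz=60):
    # Naive DFT: compute the amplitude at each integer frequency up to
    # max_hz. A stand-in for the real-time FFT in the IBVA system.
    n = len(signal)
    amps = {}
    for f in range(max_hz + 1):
        re = sum(s * math.cos(2 * math.pi * f * k / fs)
                 for k, s in enumerate(signal))
        im = sum(s * math.sin(2 * math.pi * f * k / fs)
                 for k, s in enumerate(signal))
        amps[f] = 2 * math.hypot(re, im) / n
    return amps

fs = 128
# One second of a pure 10Hz "alpha" wave, amplitude 1.5.
sig = [1.5 * math.sin(2 * math.pi * 10 * k / fs) for k in range(fs)]
spec = amplitude_spectrum(sig, fs)  # spec[10] is close to 1.5
```

The resulting dictionary is the amplitude-versus-frequency picture the user watches on screen: a tall spike at 10Hz and next to nothing elsewhere.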

At Cyberlink Mind Systems, Andrew Junker breaks down the EEG spectrum into ten of what he calls "brainfingers". Each finger is essentially a narrowband electrical filter superimposed onto the EEG spectrum. But Junker extends his detection capability all the way up to 3,000Hz.

Junker places three brainfingers in the theta band, three in alpha and four in beta. An eleventh and a twelfth channel detect eye and facial muscle movements above and below the EEG bands. These are principally used as on/off switches to trigger events such as a hard return or a mouse click. All of these brain-body signals are decoded by a computer to produce an amplitude-versus-frequency display. Selected brainfinger signatures can then be connected to various devices to play music and games or even guide a sailing boat.
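Those muscle-movement channels behave like debounced buttons: the amplitude must rise above one threshold to fire and fall below a lower one before it can fire again, so a single raised eyebrow does not register as several clicks. A hypothetical switch of this kind (thresholds invented for illustration):

```python
class ThresholdSwitch:
    # Hysteresis switch: fires once when the channel amplitude crosses
    # `on`, then stays quiet until the amplitude drops back below `off`.
    def __init__(self, on=0.7, off=0.3):
        self.on, self.off = on, off
        self.armed = True

    def update(self, amplitude):
        if self.armed and amplitude > self.on:
            self.armed = False
            return True   # fire one event, e.g. a mouse click
        if not self.armed and amplitude < self.off:
            self.armed = True  # re-arm once the movement subsides
        return False

sw = ThresholdSwitch()
levels = [0.1, 0.8, 0.9, 0.2, 0.75]          # simulated channel readings
clicks = [sw.update(x) for x in levels]      # fires on 0.8 and 0.75 only
```

Two distinct movements, two clicks - the sustained 0.9 reading in between does not retrigger the switch.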

As far as practical uses go, the key to harnessing brain actuation techniques is to assemble a suite of EEG signatures that users can control simultaneously. Most people can learn to manipulate several biosignal signatures one at a time. The tricky part is learning to control multiple EEG patterns at once.

For example, if one signature moves a cursor left and right and a second moves it up and down, exercising both signatures at once lets the user move diagonally - a far more useful motion. According to developers of brain-actuated technology, such coordination comes fairly naturally to most people, but it is still a learned skill.
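Mapping two such signatures onto one cursor can be sketched directly: each signature drives one axis, so raising both at once produces the diagonal move. The signal values and gain here are hypothetical, normalised amplitudes:

```python
def move_cursor(pos, horiz_sig, vert_sig, gain=5.0):
    # Each EEG signature drives one axis; exercising both at once
    # moves the cursor diagonally.
    x, y = pos
    return (x + gain * horiz_sig, y + gain * vert_sig)

pos = (0.0, 0.0)
pos = move_cursor(pos, 1.0, 0.0)   # horizontal signature only: move right
pos = move_cursor(pos, 0.0, 1.0)   # vertical signature only: move up
pos = move_cursor(pos, 1.0, 1.0)   # both at once: one diagonal step
```

Three updates, and the cursor ends up at (10.0, 10.0) - the last step covering in one move what the first two took separately.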

But what seems likely is that these brain-body signals will eventually become integrated into future human-machine interfaces, reshaping the way we interact with the world. Such interfaces may become especially important for the disabled. Within a few years, specially tailored interfaces could allow disabled people to interact through their computers in ways never thought possible.

Chris Chinnock writes about emerging technologies.