Breanna Olson, a dancer diagnosed with ALS, used brain activity recorded by an EEG headset to control a mixed‑reality avatar live on stage at the OBA Theatre in Amsterdam in December.
The headset was developed by Japanese firm Dentsu Lab in collaboration with NTT as part of a project called Waves of Will. Organisers described the December event as "the first of its kind."
The system used electroencephalography (EEG), which measures electrical activity at the scalp, to detect the motor-related signals produced when Olson imagined specific dance movements. A brain–computer interface translated those signals into computer instructions so the avatar could perform the selected moves in real time.
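The control loop described above can be sketched in a few lines of Python. This is purely illustrative: the template values, movement labels, and command format are assumptions for the sketch, not details of the Dentsu Lab/NTT system.

```python
import math

# Hypothetical per-movement "template" feature vectors (e.g. band power
# over motor-cortex EEG channels), learned in a calibration session.
TEMPLATES = {
    "raise_arms": [0.8, 0.2, 0.1],
    "spin":       [0.1, 0.9, 0.3],
    "bow":        [0.2, 0.1, 0.8],
}

def classify(features):
    """Nearest-centroid decoder: return the movement whose template
    is closest to the incoming EEG feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda move: dist(TEMPLATES[move], features))

def to_avatar_command(move):
    """Map a decoded movement label to an instruction for the
    mixed-reality avatar (hypothetical message format)."""
    return {"type": "play_animation", "clip": move}

# A frame of decoded EEG features arrives from the headset:
frame = [0.75, 0.25, 0.15]   # illustrative feature vector
command = to_avatar_command(classify(frame))
print(command)  # {'type': 'play_animation', 'clip': 'raise_arms'}
```

In practice a real decoder would filter and epoch the raw EEG, extract band-power or spatial-filter features, and use a trained classifier, but the pipeline shape is the same: sense, decode, command.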
In an interview with the BBC, Olson called the experience "exhilarating" and "magical" and said it helped restore a sense of expression she had lost to the disease. "This is a new way of expression," she said.
Dentsu Lab chief creative officer Naoki Tanaka told the BBC the project aims to make brainwave interfaces more accessible. Mariko Nakamura of NTT said the team sees potential to adapt the same control signals for devices such as wheelchairs or remote controls.
The performance is part of a wider effort to test whether non-invasive sensing and real‑time decoding of imagined movement can help people with progressive motor conditions participate in creative and social activities. Organisers and Olson emphasised this was a demonstration of capability rather than a clinical trial.
Olson, who lives in Tacoma, Washington, said she hopes the work will give others with ALS more ways to take part in activities they value.
Tags: EEG, brain-computer interface, ALS, mixed-reality avatar
Topics: Brain–computer interfaces, EEG & neuro-sensing headsets, Wearable neurotech