The encephalophone, an experimental hands-free musical instrument that uses brain waves to generate sounds, may help in the treatment of neurological disorders.
Reclining on a comfy chair, neurologist Thomas Deuel jams with a jazz band without lifting a finger. Wearing a blue cap covered with electrodes tracking his brain signals, he uses his mind to create music through a synthesizer.
Deuel’s mind-bending invention, the encephalophone, straddles science and the arts. It’s both a musical instrument and a biofeedback device designed to help people with neurological disorders, including patients recovering from a stroke or spinal cord injury.
A gifted multi-instrumentalist on trumpet, guitar and piano, Deuel studied jazz at the New England Conservatory. The encephalophone combines his love of music with his understanding of brain physiology to help patients with motor disabilities heal through the power of music.
“At first I wanted to invent a new musical instrument,” said Deuel, a neuroscientist at the University of Washington, where he is the director of the Art + Brain Lab.
“However, in the course of project development, I started thinking: How can I use what I am doing for treatment purposes?”
Imaginary Air Guitar
The encephalophone’s electrode cap transforms alpha brain waves into musical notes. Connected to an electroencephalogram (EEG), the device applies brain-signal processing and digital music algorithms to two kinds of signals to make music.
One signal type, posterior dominant rhythm (PDR), is produced by the brain’s visual cortex when users open and close their eyes to control the music. Mu rhythms, the second type of signal from the brain’s motor cortex, happen when users imagine making movements to control the music.
Deuel says that playing the encephalophone takes very little thought. Once connected to the apparatus, all users have to do is imagine, for example, that they are lifting their left foot, and the synthesizer will produce a higher note. Imagining lifting the right eyebrow produces a lower note.
The EEG then captures the signals from the brain’s motor cortex, and the encephalophone translates these thoughts into synthesized music.
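The core idea of mapping an imagined-movement signal onto a musical scale can be sketched in a few lines. This is a hypothetical illustration only, not Deuel’s actual algorithm: the scale, the mu-rhythm suppression measure and the thresholds are all assumptions made for the example.

```python
# Hypothetical sketch: map relative suppression of the mu rhythm (which occurs
# when a movement is imagined) onto a one-octave scale. Names and numbers are
# illustrative, not the encephalophone's real signal chain.

C_MAJOR_MIDI = [60, 62, 64, 65, 67, 69, 71, 72]  # C4..C5

def signal_to_note(mu_power: float, baseline: float) -> int:
    """Map mu-rhythm suppression to a MIDI note number.

    Stronger imagined movement -> more suppression -> higher note.
    """
    # Event-related desynchronization, clamped to 0 (none) .. 1 (full)
    erd = max(0.0, min(1.0, 1.0 - mu_power / baseline))
    index = min(int(erd * len(C_MAJOR_MIDI)), len(C_MAJOR_MIDI) - 1)
    return C_MAJOR_MIDI[index]
```

With no suppression relative to baseline the sketch yields the lowest note of the scale; full suppression yields the highest.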
“It is not especially complicated,” said Deuel. “The big thing is you are not moving, but then it is an unconventional instrument.”
For patients with neurological conditions such as multiple sclerosis (MS), stroke or amyotrophic lateral sclerosis (ALS), motor impairments can make it difficult or even impossible to play music.
“The situation is even more difficult for patients who are active musicians, as losing their musical ability can mean losing a treasured part of their identity,” said Deuel.
The encephalophone could serve as a valuable tool in rehabilitation programs. Deuel believes that by helping patients learn to create music using different parts of the brain, the encephalophone could become a beneficial therapeutic biofeedback device.
Making Mind Music
Deuel is not the first to come up with the idea of converting brain signals into sound. During the creative heyday of the mid-1960s, the composer and experimental musician Alvin Lucier used his brain’s alpha waves to set percussion instruments vibrating. Fast-forward to 2007, when other researchers used brainwave systems to stage underwater brainwave concerts.
The encephalophone is currently at the testing stage, but Deuel says its future already looks promising.
“At the moment, we cannot show you a Beethoven score and expect you to play it correctly every time,” he said. “However, you do have real control over the music; nothing about it is accidental.”
He envisions a future where EEG devices are cheaper and smaller, making it easy to purchase the encephalophone in any electronics store. Then music from people’s minds could be transferred in real time to smartphones or YouTube channels, allowing anyone to be a composer.
It may be a while, though, before thought-driven instruments hit the market. In the meantime, other artists have explored creating music using the body alone, no instrument required.
KAGURA’s augmented reality (AR) digital music software, linked to an Intel RealSense camera, recognizes human movements and gestures so that users can perform music without even knowing how to play an instrument.
By tracking movement in 3D space, the camera transfers information about the location of the user’s hands to a processor that translates it into the language of sound.
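The translation from hand position to sound can be illustrated with a minimal sketch. This is not KAGURA’s actual code; the coordinate range and frequency bounds are assumptions chosen for the example.

```python
# Illustrative sketch: convert a tracked hand's vertical position into a pitch,
# the basic idea behind camera-driven instruments. Ranges are assumptions.

def hand_height_to_frequency(y: float, y_min: float = 0.0, y_max: float = 2.0,
                             f_low: float = 220.0, f_high: float = 880.0) -> float:
    """Map vertical hand position (meters) to a frequency in Hz."""
    # Normalize position to 0..1, clamped to the tracked range
    t = max(0.0, min(1.0, (y - y_min) / (y_max - y_min)))
    # Interpolate exponentially so equal movements give equal musical intervals
    return f_low * (f_high / f_low) ** t
```

Raising the hand from the bottom to the top of the tracked range sweeps the pitch across two octaves, with the midpoint landing exactly one octave above the lowest note.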
From the mind to the body to the heart, technology is bringing the healing sounds of music to people everywhere.
Editor’s note: This article was first published on the Polish iQ by Intel site.