A brief overview of Brain-Computer Interfaces in relation to music. Human brainwaves were first measured by Hans Berger in 1924. His results were initially ignored, until Adrian and Matthews backed them up in their 1934 paper ‘The Berger Rhythm: Potential Changes from the Occipital Lobes in Man’, published in the journal Brain. The first piece of music to use EEG was composed by Alvin Lucier (working with Edmond Dewan) in 1965: ‘Music for Solo Performer’.
Over the years a number of composers have used brainwaves to control music. Richard Teitelbaum, a member of the Italian electronic music group Musica Elettronica Viva, used biological signals (EEG and EKG) to control electronic synthesizers.
In the early 1970s David Rosenboom founded the Laboratory of Experimental Aesthetics at York University in Toronto, where the relationship between aesthetic experience and brainwave activity was explored and turned into musical realisations. Many artists and musicians visited and worked there during that time, including John Cage, David Behrman, La Monte Young and Marian Zazeela. Rosenboom produced his album ‘Brainwave Music’ and published the results of his experiments in ‘Biofeedback and the Arts’ in 1976. He later wrote a second book called ‘Extended Musical Interface with the Human Nervous System’.
Jacques Vidal, a computer science researcher at UCLA, began developing the first direct brain-computer interface using a batch-processing IBM computer. He published a paper in 1973 called ‘Toward Direct Brain-Computer Communication [pdf]’. A paper from 1998 called ‘Cyberspace Bionics’ provides an overview of the technology and its impact on human life.
In the 1970s Pierre Droste, Andrew Culver and Charles de Mestral formed a Montreal group called Sonde, performing a number of improvisational brainwave concerts.
Between 1990 and 1992, Benjamin Knapp and Hugh Lusted developed the BioMuse, an 8-channel ‘biocontroller’ that analyses muscle movement (EMG), eye movement (EOG), heart activity (EKG) or brainwave signals (EEG) using non-invasive transdermal electrodes.
Atau Tanaka has used the BioMuse for compositions and performances. Together with Zbigniew Karkowski and Edwin van der Heide, he established Sensorband, a sensor-instrument ensemble. In 1996, Scientific American published an article by Knapp and Lusted about the BioMuse called ‘Controlling Computers with Neural Signals [pdf]’.
The IBVA system provides ‘interactive control from brainwave activity’ allowing the user to trigger audio, images, software and other hardware devices.
The IBVA inhales brainwaves but exhales a brain-computer interface. Your brainwaves can control everything from sounds that go ping to almost any electronically addressable device.
CalArts student Adam Overton used SuperCollider, a custom EEG/EKG device, and a respiration harness and sensor for his project Sitting.Breathing.Beating.[Not]Thinking. The equipment analyses breaths, heartbeats and brainwaves whilst Adam meditates. The software plays its own data files, which produces the noisy, chaotic textures; the signals and movements are then used to manipulate the sound.
The EyeTap Personal Imaging Lab, set up in 1998 by Steve Mann, is a ‘computer vision and intelligent image processing research lab focused on the area of personal imaging, mediated reality and wearable computers’. In 2003 they started projects combining music and brainwaves. James Fung’s Regenerative Brain Wave Music Project explores ‘new physiological interfaces for musical instruments’. At DECONcert1, 48 people were wired up with EEG sensors, which were then used to control the sound. Playing the resulting music back in realtime created a biofeedback loop, with the audience reacting to the sound they were creating. At REGEN3, jazz musicians played music ‘driven and altered by the brainwaves of the audience’. An interesting idea to unite the performer and the audience.
Andrew Brouse used Max/MSP and OpenSoundControl to create the InterHarmonium. The project aims to generate music from human brainwaves and then transmit it to another location over the internet. You can find a more detailed explanation in the project thesis ‘The InterHarmonium: An Investigation into Networked Musical Applications and Brainwaves’.
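The network side of a project like this is easy to sketch. Below is a minimal, stdlib-only Python illustration (not Brouse’s actual Max/MSP patch) of how an OpenSoundControl message carrying a single float could be packed per the OSC 1.0 format and sent over UDP; the address pattern /eeg/alpha and the destination host and port are hypothetical placeholders.

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Pad with NULs to a multiple of 4 bytes, as the OSC 1.0 spec requires."""
    return data + b"\x00" * (-len(data) % 4)

def build_osc_message(address: str, value: float) -> bytes:
    """Pack a minimal OSC message with one 32-bit float argument."""
    addr = osc_pad(address.encode("ascii") + b"\x00")  # null-terminated, padded
    tags = osc_pad(b",f\x00")                          # type tag string: one float
    return addr + tags + struct.pack(">f", value)      # big-endian float32

# Hypothetical usage: send an EEG-derived alpha amplitude to a remote patch.
msg = build_osc_message("/eeg/alpha", 0.42)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 9000))  # host and port are placeholders
sock.close()
```

Because OSC rides on plain UDP datagrams, the same bytes that drive a local Max/MSP patch can be addressed to a machine anywhere on the internet.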
Brouse, working with Eduardo Miranda, who runs the Neuromusic department at the University of Plymouth, developed the BCMI-Piano. Matlab and Simulink are used to perform power spectrum and Hjorth analyses of the EEG signals in realtime to control the music.
In order to have greater control over this system, we are developing methods to train subjects to achieve specific EEG patterns in order to play the BCMI-Piano system. We have initial evidence that this can be made possible using a technique commonly known as biofeedback.
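For illustration, here is a minimal Python sketch of the two analyses mentioned above: band power from a naive DFT, and the three Hjorth parameters (activity, mobility, complexity). This is not the actual Matlab/Simulink implementation; the 256 Hz sample rate and the 10 Hz ‘alpha-like’ test signal are made up for the demo.

```python
import cmath
import math

def variance(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

def diff(x):
    """First difference, a discrete stand-in for the derivative."""
    return [b - a for a, b in zip(x, x[1:])]

def hjorth(signal):
    """Hjorth parameters: activity = var(x), mobility = sqrt(var(x')/var(x)),
    complexity = mobility(x') / mobility(x)."""
    dx = diff(signal)
    ddx = diff(dx)
    activity = variance(signal)
    mobility = math.sqrt(variance(dx) / activity)
    complexity = math.sqrt(variance(ddx) / variance(dx)) / mobility
    return activity, mobility, complexity

def band_power(x, fs, f_lo, f_hi):
    """Naive DFT power in the band [f_lo, f_hi] Hz (O(n^2), fine for a demo)."""
    n = len(x)
    power = 0.0
    for k in range(n // 2 + 1):
        if f_lo <= k * fs / n <= f_hi:
            coef = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            power += abs(coef) ** 2 / n
    return power

# Made-up test signal: a 10 Hz "alpha-like" sine, one second at 256 Hz.
sig = [math.sin(2 * math.pi * 10 * t / 256) for t in range(256)]
activity, mobility, complexity = hjorth(sig)  # complexity is ~1 for a pure sine
alpha = band_power(sig, 256, 8.0, 12.0)       # nearly all of the power
beta = band_power(sig, 256, 13.0, 30.0)       # nearly none
```

In a biofeedback setting, scalar features like these, computed over a sliding window, are what a subject would learn to push up or down to play the system.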
I didn’t intend to write so much about this subject! Developments in this field are sure to continue advancing rapidly as researchers, companies and hobbyists explore new ways of interaction, whether for music, gaming or general use. Perhaps the key issues to tackle, as Miranda and Brouse point out, are the ‘task of interpreting the meaning of the EEG’ and, secondly, how to make the equipment more comfortable and portable. It will be interesting to see how these projects develop.