Brain Computer Music Interfaces

A brief overview of Brain Computer Interfaces in relation to music. Human brainwaves were first measured by Hans Berger in 1924. The results were initially ignored, until Adrian and Matthews corroborated Berger’s findings in their 1934 paper ‘The Berger Rhythm: Potential Changes from the Occipital Lobes in Man’, published in the journal Brain. The first piece of music to use EEG was Alvin Lucier’s ‘Music for Solo Performer’, composed in 1965 in collaboration with Edmond Dewan.

Over the years a number of composers have used brainwaves to control music. Richard Teitelbaum, a member of Italian electronic music group Musica Elettronica Viva, used biological signals (EEG and EKG) to control electronic synths.

In the early 1970s David Rosenboom founded the Laboratory of Experimental Aesthetics at York University in Toronto, where he and his collaborators explored the relationship between aesthetic experience and biofeedback, producing musical realisations of brainwave data. Many artists and musicians visited and worked there during that time, including John Cage, David Behrman, La Monte Young and Marian Zazeela. Rosenboom produced his album Brainwave Music and published the results of his experiments in ‘Biofeedback and the Arts’ in 1976. He later wrote a second book called ‘Extended Musical Interface with the Human Nervous System’.

Roger Lafosse and Pierre Henry used a live performance system called Corticalart on a number of recordings.

Jacques Vidal, a computer science researcher at UCLA, began developing the first direct brain-computer interface using a batch-processing IBM computer. He published a paper in 1973 called ‘Toward Direct Brain-Computer Communication [pdf]’. A later paper from 1998 called ‘Cyberspace Bionics’ provides an overview of the technology and its impact on human life.

In the 1970s Pierre Droste, Andrew Culver and Charles de Mestral formed a Montreal group called Sonde, performing a number of improvisational brainwave concerts.

Between 1990 and 1992, Benjamin Knapp and Hugh Lusted developed the BioMuse, an 8-channel ‘biocontroller’ that analyses muscle movement (EMG), eye movement (EOG), heart activity (EKG) or brainwave signals (EEG) using non-invasive transdermal electrodes.

Atau Tanaka has used the BioMuse for compositions and performances. Together with Zbigniew Karkowski and Edwin van der Heide, he established Sensorband, a sensor-instrument ensemble. In 1996, Scientific American published an article written by Knapp and Lusted about the BioMuse called ‘Controlling Computers with Neural Signals [pdf]’.

The IBVA system provides ‘interactive control from brainwave activity’, allowing the user to trigger audio, images, software and other hardware devices.

The IBVA inhales brainwaves but exhales a brain-computer interface. Your brainwaves can control everything from sounds that go ping to almost any electronically addressable device.

Head of IBVA Luciana Haill uses the system to control pitch and velocity, like ‘playing a Theremin with your brain’. Other notable musicians who use the IBVA system are Paras Kaul and Towa Tei.

CalArts student Adam Overton used SuperCollider, a custom EEG/EKG device, and a respiration harness and sensor for his project Sitting.Breathing.Beating.[Not]Thinking. The equipment analyses his breathing, heartbeat and brainwaves whilst he meditates. The software plays back its own data files, producing noisy, chaotic textures; the biological signals and movements are then used to manipulate the sound.

The EyeTap Personal Imaging Lab is a ‘computer vision and intelligent image processing research lab focused on the area of personal imaging, mediated reality and wearable computers’, set up in 1998 by Steve Mann. In 2003 they started projects combining music and brainwaves. James Fung’s Regenerative Brain Wave Music Project explores ‘new physiological interfaces for musical instruments’. At DECONcert1, 48 people were wired up with EEG sensors which were then used to control the sound. By playing the resulting music back in real time, a biofeedback loop was created, with the audience reacting to the sound they were creating. At REGEN3, jazz musicians played music ‘driven and altered by the brainwaves of the audience’. It’s an interesting idea: uniting the performer and the audience.

Andrew Brouse used Max/MSP and OpenSoundControl to create the InterHarmonium. The project aims to generate music from human brainwaves and then transmit it to another location via the internet. You can find a more detailed explanation in the project thesis, ‘The InterHarmonium: An Investigation into Networked Musical Applications and Brainwaves’.
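The core idea, deriving control data from brainwaves on one machine and sending it across the network to be turned into sound on another, is easy to sketch. The InterHarmonium itself was built in Max/MSP; the snippet below is only a rough Python analogue using the python-osc library, with an invented host, port and OSC address, and a random number standing in for a real EEG feature.

```python
import time
import random
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical remote machine running the synthesis patch.
# The real InterHarmonium used Max/MSP; the host, port and OSC
# address here are invented for illustration.
client = SimpleUDPClient("192.168.1.20", 9000)

while True:
    # Stand-in for a real EEG feature, e.g. alpha-band (8-12 Hz) power.
    alpha_power = random.uniform(0.0, 1.0)
    # Send the feature as an OSC message; the receiving patch would
    # map it to musical parameters such as pitch, density or amplitude.
    client.send_message("/eeg/alpha", alpha_power)
    time.sleep(0.1)  # roughly ten control messages per second
```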

Brouse, working with Eduardo Miranda, who runs the Neuromusic research group at the University of Plymouth, developed the BCMI-Piano. Matlab and Simulink are used to perform power spectrum and Hjorth analyses of EEG signals in real time to control music.

In order to have greater control over this system, we are developing methods to train subjects to achieve specific EEG patterns in order to play the BCMI-Piano system. We have initial evidence that this can be made possible using a technique commonly known as biofeedback.

The BCMI-Piano is mentioned in a paper by Brouse and Miranda titled ‘Toward Direct Brain-Computer Musical Interfaces’, presented at NIME.
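The Hjorth analysis mentioned above reduces a window of EEG samples to three time-domain descriptors: activity (the signal variance), mobility and complexity. The BCMI-Piano computes these in Matlab/Simulink; the following NumPy sketch shows the same calculation under the standard definitions, with a synthetic alpha-range signal standing in for real EEG data.

```python
import numpy as np

def hjorth_parameters(signal):
    """Return the Hjorth activity, mobility and complexity of one window."""
    d1 = np.diff(signal)                      # first difference (derivative)
    d2 = np.diff(d1)                          # second difference
    var0, var1, var2 = np.var(signal), np.var(d1), np.var(d2)
    activity = var0                           # power of the signal
    mobility = np.sqrt(var1 / var0)           # dominant-frequency estimate
    complexity = np.sqrt(var2 / var1) / mobility  # deviation from a pure sine
    return activity, mobility, complexity

# Synthetic one-second 'EEG' window sampled at 256 Hz: a 10 Hz
# (alpha-range) sine wave plus noise, standing in for real data.
t = np.arange(0, 1, 1 / 256)
window = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
print(hjorth_parameters(window))
```

A pure sine wave gives a complexity near 1; the noisier and more broadband the signal, the higher the value, which is what makes these descriptors useful as a compact control signal.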

A number of homebrew BCI devices are currently in progress; one such example is by Mick Grierson. It is still in the early stages, but there is a short video demo on YouTube.

I didn’t intend to write so much about this subject! Developments in this field are sure to continue advancing rapidly as researchers, companies and hobbyists explore new ways of interaction, whether for music, gaming or general use. Perhaps the key issues to tackle, as Miranda and Brouse point out, are firstly the ‘task of interpreting the meaning of the EEG’ and secondly how to create equipment that is more comfortable and portable. It will be interesting to see how these projects develop.

Augmenting the Mouse with Pressure Sensitive Input

With a uni-pressure or dual-pressure augmented mouse, users gain additional functional control, and two or more pressure sensors can be used in tandem. As always, I’d like to see this principle applied in a music environment, whether in current or future applications or perhaps even hardware instruments.

livePic

A drawing system allowing users to draw and then interact with their own drawing. livePic uses a brush and palette: users can change the colour of the brush by touching the colour of their choice on the palette. Multiple users can draw and interact together.

Collection of Music Interface Projects

This post presents a collection of music interfaces that were originally posted in the theoreticalplayground forums.

The RGB Player consists of a rotating surface upon which a user can place coloured objects. A scanner detects these objects and generates sound depending on their distance from the centre of the rotating surface.

The AudioCube project uses four cubes, each referring to a different element of the music: drums, bass, lead and strings. The sides of each cube determine the sound of the element, and their relative positions on the surface control the spatial localisation.

The MusicTable provides the musician with cards that can be arranged on the table surface. The specific arrangement of the cards determines the sounds produced.

The ScanJam consists of two scanners and a computer. Each scanner represents one bar of music, and objects placed on the scanners produce sound depending on each object’s colour, shape and vertical placement.

The synthstick was inspired by the theremin and the stylophone. A strip of VHS videotape with some conductive plastic film above it acts as a ribbon controller, and the pitch can be adjusted by making contact at different points along the strip.
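A ribbon controller of this kind boils down to a simple mapping: the contact point gives a position along the strip, which is translated into a pitch. Here is a minimal sketch, assuming an exponential mapping (so equal distances along the ribbon give equal musical intervals) and an illustrative two-octave range rather than the synthstick’s actual calibration.

```python
def ribbon_to_frequency(position, f_low=110.0, f_high=440.0):
    """Map a normalised contact position (0.0 to 1.0 along the strip)
    to a frequency, exponentially, so that equal distances along the
    ribbon correspond to equal musical intervals."""
    return f_low * (f_high / f_low) ** position

# Touching the strip a quarter, half and three quarters of the way along:
for pos in (0.25, 0.5, 0.75):
    print(f"{pos:.2f} -> {ribbon_to_frequency(pos):.1f} Hz")
```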

The blockjam interface consists of 25 tangible blocks that can be arranged to create musical sequences, enabling multiple users to play and collaborate. Each block can be controlled using gestural and clickable input, and each has a visual display.

The sonic banana, developed by Eric Singer, is a rubber tube with four bend sensors inside. As the tube is twisted and shaped by the performer, the data is sent to Max/MSP to create both arpeggiator- and harmonica-based musical sequences.

The oroboro is a collaborative musical interface. It is operated by two musicians, each using two paddle mechanisms: one is a hand-orientation sensor and the other a ‘haptic mirror’ informing each player what the other musician is doing.

The ISS cube is a tangible music interface which tracks the position of objects on a tabletop surface. Different types of sounds can be played by combining and positioning the objects in different ways.

The Continuum offers continuous control in three dimensions for each finger placed on the surface.

The Lemur is a multitouch musical interface developed by JazzMutant.

Paul Hertz developed Orai/Kalos as an interface to control audio and visual events.

The Circular Optical Object Locator is a collaborative music interface. The position of objects on a rotating platter determine the music that is produced.

The Jamodrum is a collaborative music interface inspired by drum circles. The musicians are able to create music and also effect visual changes on the tabletop. The tabletop display is often used to present interactive games to encourage collaboration and interaction between the musicians.

Cubed, a project by Douglas Stanley, is a music interface in the form of a Rubik’s Cube. Each side of the cube uses a different instrument to play notes, which are determined by the colours on the face. An online Shockwave implementation can be found here.

The Manual Input Sessions interface generates sound in response to hand gestures.

Instant City combines a number of fields to create an interactive computer game, musical interface and light sculpture. One or more players create buildings on a table surface using blocks, which in turn determine the generated sound output.

Soundgarten – a tangible interface

The Soundgarten is a tangible interface for creating sound environments, aimed at children between the ages of 3 and 7. It pursues three goals:

  • to produce tools for early musical education and training of acoustic perception
  • to encourage collaborative action
  • to develop new approaches in the field of Human Computer Interaction (HCI)

The different sounds are represented by mushroom-shaped items, and it is possible to manipulate the sound by plugging flower- and leaf-shaped items into the mushrooms.

Audiopad

Developed by James Patten and Ben Recht, the Audiopad tracks the position of objects on a tabletop and converts their movement into sound. The display provides visual feedback for both the performer and the audience. The system is based on an earlier device made by James Patten called the SenseTable.