RGB

The intention behind RGB is to encourage audience participation in the musical composition process during live performance. The system was developed by Tomas Dvorak, Alessandro Capozzo and Matous Godik. Flashlights are used to interlace red, green and blue colours, which can be combined to produce further colours. The position and colour of the lights help to determine the resulting sound output.
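The exact mapping used by RGB is not documented here, but a purely illustrative sketch of the idea might map a detected light's hue to pitch and its horizontal position to stereo pan. The thresholds, ranges and function names below are assumptions, not the system's own design.

```python
import colorsys

def light_to_sound(r, g, b, x, screen_width=640):
    """Hypothetical mapping: r, g, b in 0-255; x is the light's horizontal pixel position."""
    hue, _, _ = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    midi_note = 48 + int(hue * 24)      # spread the hue over a two-octave range
    pan = x / screen_width              # 0.0 = hard left, 1.0 = hard right
    return midi_note, pan

# A reddish light near the centre of the frame.
print(light_to_sound(255, 40, 40, 320))
```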

Virtual Air Guitar

The Virtual Air Guitar is a gesture-based input device developed at the Helsinki University of Technology. A mapping system translates the user's gestures, captured by a webcam, into sound output.

The hands-free method of interaction allows the guitar to be played just like an air guitar! The webcam tracks bright orange objects, which is why the user must wear orange gloves to control the guitar. The system also includes two foot pedals, one for starting the music and one for changing performance modes.
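The colour-tracking step can be sketched roughly as follows. This is not the Virtual Air Guitar's actual tracker, and the "bright orange" thresholds are assumptions; the idea is simply to find orange pixels in a webcam frame and treat their centroid as the hand position.

```python
import numpy as np

def find_orange_hand(frame_rgb):
    """frame_rgb: HxWx3 uint8 array. Returns (x, y) centroid of orange pixels, or None."""
    r = frame_rgb[:, :, 0].astype(int)
    g = frame_rgb[:, :, 1].astype(int)
    b = frame_rgb[:, :, 2].astype(int)
    # Bright orange: strong red, moderate green, little blue (assumed thresholds).
    mask = (r > 180) & (g > 80) & (g < 170) & (b < 90)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()
```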

The guitar sound is synthesised using the Karplus-Strong technique to model a Fender Stratocaster, combined with a physical model of a tube amplifier.
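The basic Karplus-Strong algorithm is simple enough to sketch. The snippet below is a minimal plucked-string generator for illustration only; the Virtual Air Guitar's own model adds the Stratocaster body and tube-amplifier stages on top of this core idea.

```python
import numpy as np

def karplus_strong(frequency, duration, sample_rate=44100, damping=0.996):
    """Minimal Karplus-Strong plucked-string synthesis (illustrative sketch)."""
    n_samples = int(duration * sample_rate)
    delay = int(sample_rate / frequency)          # delay-line length sets the pitch
    line = np.random.uniform(-1.0, 1.0, delay)    # burst of noise = the "pluck"
    out = np.zeros(n_samples)
    for i in range(n_samples):
        pos = i % delay
        out[i] = line[pos]
        # Averaging two successive samples is a simple low-pass filter that
        # damps the high frequencies, mimicking the decay of a real string.
        line[pos] = damping * 0.5 * (line[pos] + line[(i + 1) % delay])
    return out

tone = karplus_strong(440.0, 2.0)   # two seconds of a 440 Hz pluck
```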

Intelligent MIDI Sequencing with Hamster Control

Levy Lorenzo developed a MIDI device that uses the movement of hamsters to control the sound parameters of a musical sequence. The measurements are fed into a Markov chain process to provide an additional layer of abstraction. Each sound is controlled by two hamsters, one responsible for the rhythmic qualities and the other for the note sequence.
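A first-order Markov chain over notes can be sketched as follows. The transition table here is invented for illustration; in the actual system the probabilities would be driven by the hamsters' measured movements rather than hard-coded values.

```python
import random

# Hypothetical transition probabilities between MIDI note numbers.
transitions = {
    60: {62: 0.5, 64: 0.3, 67: 0.2},
    62: {60: 0.4, 64: 0.6},
    64: {62: 0.5, 67: 0.5},
    67: {60: 0.7, 64: 0.3},
}

def next_note(current):
    """Pick the next note according to the current note's transition probabilities."""
    candidates = list(transitions[current].keys())
    weights = list(transitions[current].values())
    return random.choices(candidates, weights=weights)[0]

note = 60
sequence = []
for _ in range(16):
    sequence.append(note)
    note = next_note(note)
print(sequence)
```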

Sonasphere

Sonasphere is an audiovisual software application by Nao Tokui which provides the user with a 3D canvas onto which they can add objects. The objects can be sound samples, effects or mixers, and they can be connected together to modify the sound. Object parameters can be assigned to the axes of the 3D canvas, so a filter object could have its cutoff frequency assigned to the x axis and its resonance to the y axis. The distance between a connected sound object and filter object controls the volume. In addition, gravitational forces act between the objects to create a more dynamic and chaotic environment.
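The spatial mapping can be illustrated with a small sketch. The ranges, function names and linear fall-off below are assumptions made for the example, not Sonasphere's actual implementation: an object's x/y position sets filter parameters, and the distance between two connected objects sets the volume of that connection.

```python
import math

def position_to_filter(x, y, space_size=10.0):
    """Map x/y coordinates (0..space_size) to cutoff (Hz) and resonance (0-1)."""
    cutoff = 20.0 + (x / space_size) * (20000.0 - 20.0)
    resonance = y / space_size
    return cutoff, resonance

def distance_to_volume(a, b, max_distance=10.0):
    """Closer objects are louder; volume falls off linearly with distance."""
    d = math.dist(a, b)
    return max(0.0, 1.0 - d / max_distance)

cutoff, res = position_to_filter(2.5, 7.0)
vol = distance_to_volume((0, 0, 0), (3, 4, 0))
print(cutoff, res, vol)
```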

Beatbox

The Beatbox is a physical programmable drum machine consisting of five tappers. The user taps a rhythm on a tapper, which then replays that rhythm until it is stopped or a new rhythm is played. The tappers can be placed on different materials and objects to create a custom drum kit.
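The record-then-loop behaviour can be sketched in a few lines. This is only an approximation of the idea using keyboard input in place of a physical tapper: capture the times of the user's taps, then replay the same inter-tap intervals until interrupted.

```python
import time

def record_taps(n_taps=4):
    """Wait for the user to press Enter n_taps times; return the gaps between presses."""
    times = []
    for i in range(n_taps):
        input(f"Tap {i + 1}: press Enter")
        times.append(time.monotonic())
    return [t2 - t1 for t1, t2 in zip(times, times[1:])]

def loop_rhythm(intervals):
    """Replay the recorded rhythm until interrupted (Ctrl-C)."""
    try:
        while True:
            for gap in intervals:
                print("tap!")       # stand-in for striking the surface
                time.sleep(gap)
    except KeyboardInterrupt:
        pass

# loop_rhythm(record_taps())
```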

Sonic Scanner

The Sonic Scanner enables the user to create sounds by scanning different pictures. The instrument offers four modes: waveform, spectrum, rhythm and sampler. The waveform mode uses the brightness levels of the picture to produce corresponding audio frequencies. The spectrum mode uses an FFT to translate the optical spectrum into an audio spectrum. The rhythm mode works in a similar way to the waveform mode but at a slower rate, producing more rhythmical textures. The sampler mode enables the user to record a sound and then manipulate it using the Sonic Scanner.
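One way to interpret the waveform mode is sketched below (this is my reading of the idea, not the Sonic Scanner's actual code, and it assumes the Pillow imaging library): read a scanned image as greyscale and treat the brightness of each pixel, row by row, as an audio sample.

```python
import numpy as np
from PIL import Image

def image_to_waveform(path, sample_rate=44100):
    """Convert pixel brightness (0-255) into audio samples in the range -1..1."""
    img = Image.open(path).convert("L")             # greyscale scan of the picture
    pixels = np.asarray(img, dtype=np.float64).flatten()
    samples = (pixels / 127.5) - 1.0
    return samples, sample_rate

# samples, sr = image_to_waveform("scan.png")
```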

Barcode Scanners

A barcode is a graphical representation of data found on the majority of purchasable products. Using a barcode scanner, the data can be decoded and used to identify the product. In the 1990s Epoch developed the Barcode Battler, a device that could read barcodes and create a character based on each one. It was then possible to battle characters against one another.

Other companies have also released their own variants including the Skannerz.

This post is taken from the theoretical playground forum in 2004, where I asked whether people had used barcode scanners to make music. At the time a number of musicians were identified, including barcodemusic, who describe their process of making music. A more recent search found a post on makezine.com called "beats from a barcode", featuring a project by barcode beats.
