Swept RF Tagging

The Swept RF Tagging device, developed by Kai-yuh Hsiao, ‘detects proximity of magnetically resonant tagged objects in field’. The system has been optimised to function as a musical instrument. The following video shows John Zorn using the system in April 2004.

Squeezables

The Squeezables are a set of balls that can be squeezed, stretched and moved in order to produce music. The project was developed by Seum-Lim Gan and Gil Weinberg.

The balls are positioned on a table surface, and each ball contains a sensor block with five force-sensing resistors. The sensor block is also connected to a variable-resistor slider located underneath the table surface, which measures how far the ball is pulled away from the surface.
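As a rough illustration of how such sensor data might be turned into sound, the Python sketch below averages the five force-sensing-resistor readings into a squeeze value, uses the slider reading as a pull value, and maps both to synthesis parameters. The read_adc() helper, the 0.0–1.0 value ranges and the particular mapping are assumptions made for the example, not details of the actual Squeezables software.

```python
import random

# Hypothetical acquisition layer: in the real instrument the sensor block
# streams readings to the host; here read_adc() just simulates a 0.0-1.0 value.
FSR_CHANNELS = range(5)   # five force-sensing resistors in the sensor block
SLIDER_CHANNEL = 5        # variable-resistor slider measuring pull distance

def read_adc(channel: int) -> float:
    return random.random()   # placeholder for a real analogue read

def read_ball():
    squeeze = sum(read_adc(ch) for ch in FSR_CHANNELS) / 5.0  # average grip pressure
    pull = read_adc(SLIDER_CHANNEL)                           # how far the ball is lifted
    return squeeze, pull

def map_to_synth(squeeze: float, pull: float) -> dict:
    # Example mapping: harder squeezing brightens the timbre, pulling raises the pitch.
    return {
        "cutoff_hz": 200.0 + squeeze * 4800.0,
        "pitch_hz": 110.0 * (2.0 ** (pull * 2.0)),   # up to two octaves of travel
    }

print(map_to_synth(*read_ball()))
```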

loopqoob

Murat Konar developed loopqoob, an interactive performance system which uses sensor-equipped cubes to produce sound. A unique music loop is mapped to each side, so the orientation of the cubes determines the music played.

A similar project is Neel Joshi's music_blocks, which consists of four wooden blocks, each containing a two-axis photointerrupter tilt sensor and a speaker. Three notes and silence are mapped to the sides of each block, so positioning the blocks in certain ways creates different sound output. The system is controlled via Max/MSP.
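The orientation-to-sound idea shared by loopqoob and music_blocks can be sketched as follows. This is a hypothetical Python example, not code from either system: a gravity vector from a tilt sensor decides which face of the cube points up, and that face indexes into a table of loops or notes. The three-axis reading and the loop names are assumptions made for illustration.

```python
# Illustrative only: which face of the cube is up selects what plays.
LOOPS = ["drums.wav", "bass.wav", "lead.wav", "pad.wav", "vocal.wav", "silence"]

def face_up(ax: float, ay: float, az: float) -> int:
    """Return the index (0-5) of the face currently pointing up,
    given a gravity vector from the block's tilt sensor."""
    candidates = [(abs(ax), 0 if ax > 0 else 1),
                  (abs(ay), 2 if ay > 0 else 3),
                  (abs(az), 4 if az > 0 else 5)]
    return max(candidates)[1]   # the dominant axis decides which face is up

def select_loop(ax: float, ay: float, az: float) -> str:
    return LOOPS[face_up(ax, ay, az)]

print(select_loop(0.02, -0.05, 0.98))   # block lying flat -> prints "vocal.wav"
```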

Pebble Box

Sile O’Modhrain has produced an interesting instrument called the Pebble Box, which uses collision theory and physical-systems principles. Users interact with objects, and sound is produced according to the movement between them.

Composition on the Table

Composition on the Table is a project by Toshio Iwai from 1999 which consists of four tables that present the user with different tangible interfaces, including switches, dials, turntables and sliding boards. Interacting with these produces sound and also updates computer visualisations projected onto the table surfaces. One of the objectives of the project is to create a Mixed Reality environment in which multiple users can collaborate to create sounds and visuals.

Collection of Music Interface Projects

The following post presents a collection of music interfaces that were originally posted in the theoreticalplayground forums.

The RGB Player consists of a rotating surface upon which a user can place coloured objects. A scanner detects these objects and generates sound depending on their distance from the centre of the rotating surface.
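One way to picture this mapping is sketched below in Python, assuming, purely for illustration, that the platter's rotation acts as a loop timeline while an object's distance from the centre picks its pitch. The surface radius, loop length and pitch range are invented for the example and are not details of the project.

```python
import math

# Illustrative sketch: angle on the platter -> when in the loop the object
# sounds; distance from the centre -> what pitch it sounds at.
LOOP_SECONDS = 4.0     # one full rotation of the platter (assumed)
SURFACE_RADIUS = 0.3   # metres (assumed)

def object_event(x: float, y: float):
    """Turn one detected object (coordinates relative to the platter centre)
    into a (trigger_time, pitch_hz) event for the current loop."""
    angle = math.atan2(y, x) % (2 * math.pi)
    radius = min(math.hypot(x, y) / SURFACE_RADIUS, 1.0)
    trigger_time = LOOP_SECONDS * angle / (2 * math.pi)
    pitch_hz = 110.0 * (2.0 ** (radius * 2.0))   # further out -> higher note
    return trigger_time, pitch_hz

print(object_event(0.0, 0.15))   # quarter turn, halfway out -> (1.0 s, 220 Hz)
```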

The AudioCube project uses four cubes, each referring to a different element of the music: drums, bass, lead and strings. The sides of the cubes determine the sound of each element, and their relative positions on the surface control the spatial localisation.

The MusicTable provides the musician with cards that can be arranged on the table surface. The specific arrangement of the cards determines the sounds produced.

The ScanJam consists of two scanners and a computer. Each scanner represents one bar of music, and objects placed on the scanners produce sound depending on each object's colour, shape and vertical placement.

The synthstick was inspired by the theremin and the stylophone. A strip of VHS videotape with some conductive plastic film above it acts as a ribbon control, and the pitch can be adjusted by making contact at different points along the strip.
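Assuming the strip is wired as a simple voltage divider, the contact point sets the measured voltage, which in turn sets a continuous pitch. The Python sketch below illustrates that principle; the supply voltage, pitch range and read-out values are assumptions, not measurements of the actual synthstick.

```python
# Ribbon-controller principle: contact position -> voltage -> continuous pitch.
SUPPLY_VOLTS = 5.0   # assumed supply across the resistive strip
LOW_HZ = 110.0       # pitch at one end of the strip (assumed)
RANGE_OCTAVES = 3.0  # pitch travel across the full length (assumed)

def position_from_voltage(volts: float) -> float:
    """Normalise the divider voltage to a 0.0-1.0 position along the strip."""
    return min(max(volts / SUPPLY_VOLTS, 0.0), 1.0)

def pitch_from_position(position: float) -> float:
    """Continuous, theremin-like pitch: no frets, just contact position."""
    return LOW_HZ * (2.0 ** (position * RANGE_OCTAVES))

print(pitch_from_position(position_from_voltage(2.5)))   # mid-strip -> ~311 Hz
```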

The blockjam interface consists of 25 tangible blocks that can be arranged to create musical sequences, and it enables multiple users to play and collaborate. Each block accepts gestural and clickable input and also has a visual display.

The sonic banana was developed by Eric Singer. It is a rubber tube with four bend sensors inside. As the tube is twisted and shaped by the performer, the sensor data is sent to Max/MSP to create both arpeggiator- and harmonica-based musical sequences.

The oroboro is a collaborative musical interface. It is operated by two musicians, each using two paddle mechanisms: one is a hand-orientation sensor and the other is a ‘haptic mirror’ that informs the user of what the other musician is doing.

The ISS cube is a tangible music interface which tracks the position of objects on a tabletop surface. Different types of sounds can be played by combining and positioning the objects in different ways.

The Continuum offers continuous control in three dimensions for each finger placed on the surface.

The lemur is a multitouch musical interface developed by Jazz Mutant.

Paul Hertz developed Orai/Kalos as an interface to control audio and visual events.

The Circular Optical Object Locator is a collaborative music interface. The position of objects on a rotating platter determines the music that is produced.

The Jamodrum is a collaborative music interface inspired by drum circles. The musicians are able to create music and also effect visual changes on the tabletop. The tabletop display is often used to present interactive games to encourage collaboration and interaction between the musicians.

Cubed, a project by Douglas Stanley, is a music interface in the form of a Rubik’s Cube. Each side of the cube uses a different instrument to play notes, which are determined by the colours on the face. An online Shockwave implementation can be found here.

The Manual Input Sessions interface generates sound in response to hand gestures.

Instant City combines a number of fields to create an interactive computer game, musical interface and light sculpture. One or more players are able to create buildings on a table surface using blocks, which also determine the generated sound output.

Tai-Chi

Developed by researchers from Switzerland, Italy, Germany, France and the UK, Tai-Chi (Tangible Acoustic Interfaces for Computer-Human Interaction) is a system that can transform any real object into a touch-sensitive computer interface. A computer reads values from piezoelectric sensors attached to the object. A recent paper from Design 2006 discusses the system in further detail:

Principally, there are two kinds of stimulation of physical objects: passive and active modes. In the passive mode any change in the acoustic properties of an object, due to its vibration as a consequence of interaction (knocking, tapping etc.), is detected and then used to estimate the location of the interaction. With regard to the active mode, the absorption of acoustic energy at the contact point of an object surface must be ascertained.

Currently there are three passive methods under investigation for tangible acoustic interfaces: time delay of arrival (TDOA), time reversal and acoustic holography.
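Of the three, TDOA is the simplest to illustrate: a tap reaches each piezoelectric sensor at a slightly different time, and those differences constrain where on the surface the tap happened. The following Python toy grid-search sketch is not the project's implementation; it assumes four sensors at known positions, a fixed wave speed and a small rectangular surface.

```python
import numpy as np

# Toy TDOA localisation: find the point whose predicted arrival-time
# differences best match the measured ones. All values below are assumed.
SENSORS = np.array([[0.0, 0.0], [0.6, 0.0], [0.6, 0.4], [0.0, 0.4]])  # metres
WAVE_SPEED = 1500.0  # m/s, depends on the object's material

def locate_tap(arrival_times, grid_step=0.005):
    """Grid-search the surface for the point whose predicted time differences
    (relative to sensor 0) best match the measured arrival times."""
    measured = np.asarray(arrival_times) - arrival_times[0]
    best, best_err = None, np.inf
    for x in np.arange(0.0, 0.6, grid_step):
        for y in np.arange(0.0, 0.4, grid_step):
            dists = np.linalg.norm(SENSORS - [x, y], axis=1)
            predicted = (dists - dists[0]) / WAVE_SPEED
            err = np.sum((predicted - measured) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Simulate a tap at (0.25, 0.15) and recover its position from the timings.
tap = np.array([0.25, 0.15])
times = np.linalg.norm(SENSORS - tap, axis=1) / WAVE_SPEED
print(locate_tap(times))
```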

The paper discusses the implementation of Acoustic Holography using the Rayleigh-Sommerfeld algorithm. The following is an audiovisual installation called Sound Rose using the Tai-Chi system.

Soundgarten – a tangible interface

The Soundgarten is a tangible interface for creating sound environments, aimed at children between the ages of 3 and 7. It aims to explore three goals:

  • to produce tools for early musical education and training of acoustic perception
  • to encourage collaborative action
  • to develop new approaches in the field of Human Computer Interaction (HCI)

The different sounds are represented by mushroom-shaped items, and the sound can be manipulated by plugging flower- and leaf-shaped items into the mushrooms.

audioshaker

The Audio Shaker was developed by Tom Jenkins. The device has a cylindrical form with a removable lid. When the lid is removed, sounds can be captured; with the lid on, the sounds can be played back and manipulated depending on how the user shakes the device. Removing the lid and tipping the device like a jug pours the sounds out.
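The interaction model above can be pictured as a small state machine driven by the lid switch and motion sensing. The Python sketch below is purely illustrative; the sensor inputs, thresholds and buffer handling are assumptions rather than details of the actual device.

```python
# Illustrative mode dispatch for an Audio Shaker-style device.
class AudioShaker:
    def __init__(self):
        self.buffer = []   # captured sound grains

    def update(self, lid_on: bool, shake: float, tilt_deg: float, mic_grain=None):
        if not lid_on and tilt_deg > 60.0:
            # Lid off and tipped like a jug: pour the captured sounds out.
            poured, self.buffer = self.buffer, []
            return ("pour", poured)
        if not lid_on:
            # Lid off and upright: capture incoming sound.
            if mic_grain is not None:
                self.buffer.append(mic_grain)
            return ("record", None)
        # Lid on: play back, with shaking manipulating the material.
        playback_rate = 1.0 + shake * 0.5
        return ("play", playback_rate)

shaker = AudioShaker()
print(shaker.update(lid_on=False, shake=0.0, tilt_deg=0.0, mic_grain="grain_1"))
print(shaker.update(lid_on=True, shake=0.8, tilt_deg=0.0))
```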

musicBottles

musicBottles is a musical interface developed in 1999 that consists of three corked bottles. Uncorking a bottle plays the sound of the violin, cello or piano from Edouard Lalo’s Piano Trio in C Minor, Op. 7.