Jens Wunderling developed loopArena, a touchscreen generative music interface that lets the user control up to nine MIDI instruments.
An interesting project by Peter Berkelman and Ralph Hollis from 1998. The project is no longer active, but you can check out the current projects of the Microdynamic Systems Laboratory. The Psychophysics of Haptic Interaction project studies the ways in which users interact with real and virtual haptic worlds.
Here is a simple overview of the Magnetic Levitation Haptic Interface and a more detailed video.
The Computational Design Lab at Carnegie Mellon University has an interesting archive of current and previous projects. One that particularly caught my eye was Bach Blocks, a tangible physical instrument that encourages children to create original music. Rhythms and harmonies are created by placing the blocks in different patterns: the colour of a block identifies its pitch, and its size determines the beat length.
A drawing system that allows users to draw and then interact with their own drawings. livePic uses a brush and palette; users change the colour of the brush by touching the colour of their choice on the palette. Multiple users can draw and interact together.
Developed by researchers from Switzerland, Italy, Germany, France and the UK, Tai-Chi (Tangible Acoustic Interfaces for Computer-Human Interaction) is a system that can transform any real object into a touch-sensitive computer interface. A computer reads values from piezoelectric sensors attached to the object. A recent paper from Design 2006 discusses the system in further detail:
Principally, there are two kinds of stimulation of physical objects: passive and active modes. In the passive mode any change in the acoustic properties of an object, due to its vibration as a consequence of interaction (knocking, tapping etc.), is detected and then used to estimate the location of the interaction. With regard to the active mode, the absorption of acoustic energy at the contact point of an object surface must be ascertained.
Currently there are three passive methods under investigation for tangible acoustic interfaces: time delay of arrival (TDOA), time reversal and acoustic holography.
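To make the TDOA idea concrete, here is a minimal one-dimensional sketch. This is purely illustrative and not the Tai-Chi implementation: it assumes two sensors fixed at the ends of a bar, a known wave speed in the material, and an already-measured arrival-time difference.

```python
def locate_tap_1d(delay_s: float, bar_length_m: float, wave_speed_m_s: float) -> float:
    """Locate a tap on a bar from the time delay of arrival (TDOA).

    Sensor A sits at x = 0, sensor B at x = bar_length_m, and
    delay_s = t_A - t_B is the difference in arrival times.
    The wave travels x metres to A and (L - x) metres to B, so
    x - (L - x) = v * delay, which gives x = (L + v * delay) / 2.
    """
    return (bar_length_m + wave_speed_m_s * delay_s) / 2

# Example: a 1 m bar, waves travelling at 3000 m/s, and a tap whose
# sound reaches sensor A 100 microseconds later than sensor B.
print(locate_tap_1d(1e-4, 1.0, 3000.0))  # → 0.65
```

In two dimensions the same distance-difference equation defines a hyperbola per sensor pair, and the tap position is found where the hyperbolas intersect; in practice the delay itself is usually estimated by cross-correlating the sensor signals.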
The Soundgarten is a tangible interface for creating sound environments, aimed at children between the ages of 3 and 7. It aims to explore three goals:
- to produce tools for early musical education and training of acoustic perception
- to encourage collaborative action
- to develop new approaches in the field of Human Computer Interaction (HCI)
The different sounds are represented by mushroom-shaped items, and the sound can be manipulated by plugging flower- and leaf-shaped items into the mushrooms.
musicBottles is a musical interface developed in 1999, consisting of three corked bottles. Uncorking a bottle plays the violin, cello, or piano part of Edouard Lalo’s Piano Trio in C Minor, Op. 7.
The I/O Brush is a digital paintbrush that can detect the colour, texture and movement of objects. The brush is equipped with a small CCD video camera, force sensors and a ring of white LEDs. When the user presses the brush against an object, the force sensors trigger the LEDs, giving the video camera enough light to record for as long as the brush is held against the object. The hope is that people using it will develop their own unique ‘digital inks’ which they can use to draw on the canvas. The canvas itself is a backlit touchscreen display that reads from the brush and plays back the recorded video.
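The force-gated capture described above can be sketched as a tiny control loop. Everything here is an assumption for illustration (the threshold, the function names, the frame representation); it only shows the logic of "LEDs on and frames stored only while the brush is pressed":

```python
FORCE_THRESHOLD = 0.2  # arbitrary units; an assumed trigger level, not the real one

def brush_step(force: float, frame: str, recording: list) -> bool:
    """One step of a hypothetical I/O Brush loop.

    The force reading gates both the LED ring and the camera capture:
    while the tip is pressed (force above threshold) the LEDs are lit
    and the current camera frame is stored; otherwise nothing is kept.
    Returns whether the LEDs are on after this step.
    """
    pressed = force > FORCE_THRESHOLD
    if pressed:
        recording.append(frame)  # capture only while touching the object
    return pressed  # LEDs follow the pressed state

# Simulated stroke: press against an object for two frames, then lift.
recording = []
for force, frame in [(0.5, "frame1"), (0.6, "frame2"), (0.0, "frame3")]:
    leds_on = brush_step(force, frame, recording)
print(recording)  # → ['frame1', 'frame2']
```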
The midiGun is a MIDI controller in the shape of a gun, designed to control Ableton LIVE and Traktor DJ. It provides the user with “16 different controllers which can be cascaded in 16 switchable sets”, for a total of 256 independent sound and effect controls.