The Wekinator

The Wekinator is a software application developed by Rebecca Fiebrink that uses machine learning to help build interactive systems. The GUI lets the user ‘create machine learning models from scratch, without any programming’.

A number of example applications are suggested, including musical instruments, video games and other systems for gesture analysis and feedback. The Wekinator supports OSC, so any device that can output OSC can be used as a controller, and anything that can receive OSC can be controlled; a list of hardware and software that support OSC can be found on the opensoundcontrol website. As an example, a user may want to use a Wiimote accelerometer to control the volume of a sound. They would first perform the specific gesture a number of times and associate those examples with the desired volume values. The Wekinator learns a mapping from these training examples and can then generalise, producing output parameter values for new inputs.
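As a rough sketch of how a controller talks to the Wekinator: by default it listens for input feature vectors over UDP on port 6448 at the OSC address /wek/inputs. The snippet below (the port, address and accelerometer values here are illustrative defaults, not taken from the original post) hand-encodes a minimal OSC message in pure Python and sends it, standing in for whatever device would normally stream the data.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, floats) -> bytes:
    # Encode an OSC message whose arguments are all float32 (big-endian):
    # padded address, padded type-tag string (",fff..."), then the values
    msg = osc_pad(address.encode())
    msg += osc_pad(("," + "f" * len(floats)).encode())
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# Wekinator's default input port is 6448, address /wek/inputs;
# the three floats stand in for Wiimote accelerometer axes.
packet = osc_message("/wek/inputs", [0.1, 0.5, 0.9])
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 6448))
```

In practice one would use an OSC library rather than hand-encoding packets, but the byte layout above is what any OSC-speaking controller or synth exchanges with the Wekinator.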

The following diagram, sourced from the ChucK/Wekinator integration instructions, provides an overview of the system.

[Diagram: Wekinator system overview]

Human Instrument

Recent work by Daito Manabe and Masaki Teruoka combines myoelectric sensors with electric muscle stimulation to create a musical instrument that is both played by and responsive to the human body.


Sound Builders

The Sound Builders series highlights musicians who develop their own musical instruments. The first episode followed Peaking Lights as they prepared for their live tour. Subsequent episodes have included interviews with Diego Stocco, Eric Singer, Steve Mann, Liz Phillips, Reed Ghazala, Felix Thorn, Ranjit Bhatnagar and Ken Butler.

Emonator & The MATRIX

The Emonator was developed jointly by Dan Overholt and Paul Nemirovsky. The project later split in two: Paul Nemirovsky continues to work under the name Emonator, whilst Dan Overholt now develops The MATRIX. Why the split? The MATRIX focuses more on interface design and the development of musical synthesis, whereas the Emonator is used as a gestural controller for the Emonic Environment.

The Emonator offers a three-dimensional interface: a square array of pushable rods that measure the movement of the hand.


Squeezables

The Squeezables are a set of balls that can be squeezed, stretched and moved in order to produce music. The project was developed by Seum-Lim Gan and Gil Weinberg.

The balls are positioned on a table surface, and each ball contains a sensor block with five force-sensing resistors. The sensor block is also connected to a variable resistor slider located underneath the table surface, which measures how far a ball is pulled away from the surface.

Pebble Box

Sile O’Modhrain has produced an interesting instrument called the Pebble Box, which draws on collision theory and the principles of physical systems. Users interact with objects, and sound is produced according to the movement between them.


Chessynthesis

Maniax Memori has designed Chessynthesis, in which the audio output is controlled as a chess game progresses. The final version will use reacTIVision alongside Processing. This video shows a visual representation of what to expect; the final release will use a real chess board.

Collection of Music Interface Projects

This post presents a collection of music interfaces that were originally posted in the theoreticalplayground forums.

The RGB Player consists of a rotating surface upon which a user can place coloured objects. A scanner detects these objects and generates sound depending on their distance from the centre of the rotating surface.

The AudioCube project uses four cubes, each representing a different element of music: drums, bass, lead and strings. The sides of the cubes determine the sound of each element, and their relative position on the surface controls the spatial localisation.

The MusicTable provides the musician with cards that can be arranged on the table surface. The specific arrangement of the cards determines the sounds produced.

The ScanJam consists of two scanners and a computer. Each scanner represents one bar of music, and objects placed on the scanners produce sound depending on each object's colour, shape and vertical placement.

The synthstick was inspired by the theremin and the Stylophone. A strip of VHS videotape with some conductive plastic film above it acts as a ribbon controller, and the pitch can be adjusted by making contact at different points along the strip.

The blockjam interface consists of 25 tangible blocks that can be arranged to create musical sequences, enabling multiple users to play and collaborate. Each block accepts gestural and clickable input and has its own visual display.

The sonic banana was developed by Eric Singer. It is a rubber tube with four bend sensors inside. As the performer twists and shapes the tube, the sensor data is sent to Max/MSP to create both arpeggiator- and harmonica-based musical sequences.

The oroboro is a collaborative musical interface operated by two musicians, each using two paddle mechanisms: one a hand-orientation sensor, the other a ‘haptic mirror’ that informs the user what the other musician is doing.

The ISS cube is a tangible music interface which tracks the position of objects on a tabletop surface. Different types of sounds can be played by combining and positioning the objects in different ways.

The Continuum offers continuous control in three dimensions for each finger placed on the surface.

The Lemur is a multitouch musical interface developed by JazzMutant.

Paul Hertz developed Orai/Kalos as an interface to control audio and visual events.

The Circular Optical Object Locator is a collaborative music interface. The position of objects on a rotating platter determines the music that is produced.

The Jamodrum is a collaborative music interface inspired by drum circles. The musicians are able to create music and also effect visual changes on the tabletop. The tabletop display is often used to present interactive games to encourage collaboration and interaction between the musicians.

Cubed, a project by Douglas Stanley, is a music interface in the form of a Rubik’s Cube. Each side of the cube uses a different instrument to play notes, which are determined by the colours on that face. An online Shockwave implementation can be found here.

The Manual Input Sessions interface generates sound in response to hand gestures.

Instant City combines a number of fields to create an interactive computer game, musical interface and light sculpture. One or more players create buildings on a table surface using blocks, and the arrangement of the blocks also determines the generated sound output.

Datasound – making music using digital storage

Datasound is a music interface that allows a user to mix and scratch digital data signals from pictures, floppy discs or drawings. The Datasound system offers a number of potential sound sources, including a 5.25″ floppy disk turntable, a hard drive, a neon light and a flatbed scanner.

perScrutinizer – interactive sound installation

The perScrutinizer is an interactive sound installation providing multiple ways to produce and listen to sound. The system consists of a cage environment in which remote-controlled moving objects reside. These objects can be driven with a touchpad input device to make them move, collide and scrape, producing a variety of sounds. The user is given a stethoscope, which they can position anywhere on the cage, to listen to the sounds.
