The Wekinator

The Wekinator is a software application developed by Rebecca Fiebrink that uses machine learning to help build interactive systems. Its GUI gives the user a way to ‘create machine learning models from scratch, without any programming’.

A number of example applications are suggested, including musical instruments, video games and other systems for gesture analysis and feedback. The Wekinator supports OSC, so any device that can output OSC can be used as a controller, and anything that can receive OSC can be controlled. A list of hardware and software that support OSC can be found on the opensoundcontrol website. As an example, a user may want to use a Wiimote accelerometer to control the volume of a sound. The first step is to perform the gesture a number of times while recording the corresponding volume values as training examples. The Wekinator then trains a model on these examples and, once trained, generalises to new inputs, producing output parameter values for gestures it has not seen before.
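
To make the OSC plumbing concrete, here is a minimal sketch of a controller feeding input features to the Wekinator. It assumes the python-osc package and the Wekinator’s commonly documented defaults (listening on port 6448 at the /wek/inputs address); adjust both to your own setup.

    # Send one frame of controller data to the Wekinator as an OSC input
    # vector. Host, port and address are assumptions based on the
    # Wekinator's documented defaults.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 6448)

    def send_inputs(ax, ay, az):
        # The Wekinator expects a flat list of floats, one per input feature.
        client.send_message("/wek/inputs", [float(ax), float(ay), float(az)])

    send_inputs(0.1, -0.4, 0.9)  # e.g. a single Wiimote accelerometer reading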

The following diagram, sourced from the ChucK/Wekinator integration instructions, provides an overview of the system.

[Diagram: Wekinator system overview]

Emonator & The MATRIX

The Emonator was developed jointly by Dan Overholt and Paul Nemirovsky. The project later split in two: Paul Nemirovsky continues to work on it under the name Emonator, while Dan Overholt now develops The MATRIX. Why the split? The MATRIX focuses more on interface design and the development of musical synthesis, whereas the Emonator is used as a gestural controller for the Emonic Environment.

Both devices offer a three-dimensional interface: a square array of pushable rods that measures the movement of the hand.

FlexiGesture

The FlexiGesture dates from 2003 and is a device created to explore the relationship between input gesture and output sound.

For the majority of traditional acoustic instruments, the mapping between input gesture and output sound is fixed. Digital technology allows input and output to be separated, and with that separation comes the potential to define new mapping systems.
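
As a toy illustration of that separation (not the FlexiGesture’s actual code), the mapping can be treated as a replaceable function between gesture features and synthesis parameters, so new mappings can be swapped in without touching the sensing or synthesis layers:

    # Hypothetical mapping layer: gesture features in, synth parameters out.
    def linear_map(tilt, pressure):
        return {"pitch": 220.0 + 440.0 * tilt, "amplitude": pressure}

    def inverted_map(tilt, pressure):
        # An alternative mapping over the same inputs and outputs.
        return {"pitch": 660.0 - 440.0 * tilt, "amplitude": 1.0 - pressure}

    mapping = linear_map        # redefine the instrument by swapping this
    params = mapping(0.5, 0.8)  # {'pitch': 440.0, 'amplitude': 0.8}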

Collection of Music Interface Projects

This post presents a collection of music interfaces that were originally shared in the theoreticalplayground forums.

The RGB Player comprises a rotating surface upon which a user can place coloured objects. A scanner detects these objects and generates sound based on each object’s distance from the centre of the rotating surface.

The AudioCube project uses four cubes, each representing a different element of music: drums, bass, lead and strings. The side of each cube determines the sound of its element, and the cubes’ relative positions on the surface control the spatial localisation.

The MusicTable provides the musician with cards that can be arranged on the table surface. The specific arrangement of the cards determines the sounds produced.

The ScanJam consists of two scanners and a computer. Each scanner represents one bar of music, and objects placed on the scanners produce sound according to their colour, shape and vertical placement.

The synthstick was inspired by the theremin and the stylophone. A strip of VHS videotape with conductive plastic film above it acts as a ribbon controller: the pitch can be adjusted by making contact at different points along the strip.

The blockjam interface consists of 25 tangible blocks that can be arranged to create musical sequences, enabling multiple users to play and collaborate. Each block accepts gestural and clickable input and has its own visual display.

The sonic banana was developed by Eric Singer. It is a rubber tube with four bend sensors inside. As the tube is twisted and shaped by the performer, the sensor data is sent to MaxMSP to create arpeggiator- and harmonica-based musical sequences.

The oroboro is a collaborative musical interface operated by two musicians, each using two paddle mechanisms: one senses hand orientation and the other is a ‘haptic mirror’ informing the user what the other musician is doing.

The ISS cube is a tangible music interface which tracks the position of objects on a tabletop surface. Different types of sounds can be played by combining and positioning the objects in different ways.

The Continuum offers continuous control in three dimensions for each finger placed on the surface.

The lemur is a multitouch musical interface developed by Jazz Mutant.

Paul Hertz developed Orai/Kalos as an interface to control audio and visual events.

The Circular Optical Object Locator is a collaborative music interface. The positions of objects on a rotating platter determine the music that is produced.

The Jamodrum is a collaborative music interface inspired by drum circles. The musicians are able to create music and also effect visual changes on the tabletop. The tabletop display is often used to present interactive games to encourage collaboration and interaction between the musicians.

Cubed is a project by Douglas Stanley: a music interface in the form of a Rubik’s Cube. Each side of the cube uses a different instrument, playing notes determined by the colours on that face. An online Shockwave implementation can be found here.

The Manual Input Sessions interface generates sound in response to hand gestures.

Instant City combines a number of fields to create an interactive computer game, musical interface and light sculpture. One or more players build structures on a table surface using blocks, and the arrangement of the blocks also determines the generated sound output.

Wiimote Control

A couple of examples of what we can expect from the Wiimote. First, a drum machine that uses two software programs from bobsomers: the user plays drums by flicking the Wiimote and pressing a button. The latest version of the GlovePIE program supports multiple Wiimotes as well!

This second video shows a Wiimote controlling parameters on a Nord Lead by mapping the controller messages. We’re looking forward to future developments and projects!
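
For a sense of what that mapping involves, here is a rough sketch (our reconstruction, not the code from the video) that rescales one accelerometer axis to a MIDI control-change message using the mido package; the port name and CC number 74 are placeholder assumptions:

    import mido

    out = mido.open_output("Nord Lead")  # substitute your actual MIDI port name

    def accel_to_cc(value, lo=-1.0, hi=1.0):
        # Clamp the sensor reading and rescale it to MIDI's 0-127 range.
        value = max(lo, min(hi, value))
        return int((value - lo) / (hi - lo) * 127)

    out.send(mido.Message("control_change", control=74, value=accel_to_cc(0.3)))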

perScrutinizer – interactive sound installation

The perScrutinizer is an interactive sound installation providing multiple ways to produce and listen to sound. The system consists of a cage environment in which remote-controlled moving objects reside. These objects can be driven with a touchpad input device, making them move, collide and scrape to produce a variety of sounds. The user is given a stethoscope, which they can position anywhere on the cage, to listen to the results.

Virtual Air Guitar

The Virtual Air Guitar is a gesture-based input device developed at the Helsinki University of Technology. A mapping system translates the user’s gestures, captured by a webcam, into the sound output.

The hands-free method of interaction means the guitar is played just like an air guitar! The webcam tracks bright orange objects, which is why the user must wear orange gloves to control the guitar. The system also includes two foot pedals, one for starting the music and one for changing performance modes.
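
The colour-tracking step might look something like the following sketch (our reconstruction, not the project’s code): threshold each webcam frame for bright orange in HSV space and take the centroid of the resulting blob as the hand position. It assumes opencv-python and numpy, and the HSV bounds would need tuning:

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # Keep only pixels in an assumed "bright orange" HSV range.
        mask = cv2.inRange(hsv, np.array([5, 150, 150]), np.array([20, 255, 255]))
        m = cv2.moments(mask)
        if m["m00"] > 0:  # at least one orange pixel found
            x, y = m["m10"] / m["m00"], m["m01"] / m["m00"]
            print(f"glove centroid at ({x:.0f}, {y:.0f})")
    cap.release()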

The sound of the guitar is synthesised using the Karplus-Strong technique, modelling a Fender Stratocaster combined with a physical model of a tube amplifier.
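
The basic Karplus-Strong algorithm itself is short: a delay line filled with noise is repeatedly averaged with its neighbouring sample, which damps high frequencies the way a decaying string does. The following is a minimal sketch of that core idea, not the project’s extended Stratocaster and tube-amplifier model:

    import numpy as np

    def pluck(freq, duration=1.0, sr=44100):
        n = int(sr / freq)                 # delay-line length sets the pitch
        buf = np.random.uniform(-1, 1, n)  # burst of noise acts as the "pluck"
        out = np.empty(int(sr * duration))
        for i in range(len(out)):
            out[i] = buf[i % n]
            # Average adjacent samples: a low-pass filter that decays the note.
            buf[i % n] = 0.5 * (buf[i % n] + buf[(i + 1) % n])
        return out

    samples = pluck(110.0)  # 110 Hz, roughly a guitar's open A string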