Electric Stimulus To Face

Daito Manabe uses small electrical pulses to stimulate his facial muscles.

via todayandtomorrow.net

Circuit Bent SONY XV-T33F

Clip from a videobending session using a Sony XV-T33F. Andrew Coleman has blogged about how he did it on his Animals On Wheels website.

CreDio

CreDio is a ‘novel musical instrument that combines digital and mechanical functions to control sound synthesis algorithms.’

ConDio

Taking inspiration from James Patten's Audiopad, the ConDio project was started in late 2005. A camera tracks coloured objects on a table, each representing a unique sound, effect or function. The sound varies depending on the distance between the objects.
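The distance-to-sound mapping described above could be sketched roughly as follows. This is a minimal illustration, not ConDio's actual code: the object positions, the 500-pixel range and the volume mapping are all assumptions for the example.

```python
import math

# Hypothetical tracked objects: (colour, x, y) centroids from the camera,
# each colour standing in for one sound, effect or function.
objects = [("red", 120.0, 80.0), ("blue", 300.0, 220.0), ("green", 60.0, 400.0)]

def distance(a, b):
    """Euclidean distance between two tracked centroids."""
    return math.hypot(a[1] - b[1], a[2] - b[2])

def distance_to_level(d, max_d=500.0):
    """Map a pixel distance to a 0..1 mix level: closer objects sound louder."""
    return max(0.0, 1.0 - d / max_d)

# Derive a mix level for every pair of objects from their separation.
levels = {
    (a[0], b[0]): round(distance_to_level(distance(a, b)), 2)
    for i, a in enumerate(objects)
    for b in objects[i + 1:]
}
print(levels)
```

A real implementation would recompute these levels every camera frame and feed them to the synthesis engine.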

A Shrewd Sketch Interpretation and Simulation Tool

ASSIST allows a user to sketch a mechanical system and run the simulation.

i_AM

Created by Ramon Schreuder, i_AM is an interactive audiovisual media installation. The project is the result of research and collaboration with industrial designers, DJs, music producers, VJs, animators and software developers. Users interact with objects to create audio and visuals in real time. The system uses the open-source reacTIVision software developed for the reacTable project.

Audio D-Touch Drum Machine

A tangible drum machine developed by Enrico Costanza at MIT's Media Laboratory. Audio d-touch uses wooden blocks to sequence the music: a computer maps the location of the blocks via a webcam, and the position of each block controls a digital audio synthesis process.
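The block-to-sequencer mapping might look something like this sketch. It is only an illustration of the idea, not Audio d-touch's implementation: the frame width, step count and block positions are assumed values.

```python
# Hypothetical block detections: (sound, x, y) pixel positions from the webcam,
# with the horizontal position selecting the sequencer step.
TABLE_WIDTH = 640   # assumed webcam frame width in pixels
STEPS = 16          # assumed steps in one sequencer loop

blocks = [("kick", 15, 100), ("snare", 330, 200), ("hat", 610, 50)]

def x_to_step(x, width=TABLE_WIDTH, steps=STEPS):
    """Quantise a horizontal pixel position to a sequencer step index."""
    return min(steps - 1, int(x / width * steps))

# Build an empty pattern and place each block's sound on its step.
pattern = [[] for _ in range(STEPS)]
for name, x, _y in blocks:
    pattern[x_to_step(x)].append(name)

print([(i, hits) for i, hits in enumerate(pattern) if hits])
```

Each webcam frame would refresh the pattern, so physically sliding a block across the table moves its hit to a different step of the loop.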

Smart Laser Scanner for Human-Computer Interface

Researchers at the University of Tokyo's Ishikawa-Namiki Laboratory are developing a laser active tracking system for human-computer interaction. The system uses a laser diode (visible or invisible light), steering mirrors and a non-imaging photodetector to track the laser in real time in a three-dimensional environment, without any image processing.

Multi-Touch Interaction Research

In February 2006, Jeff Han presented a new multi-touch interface screen at TED Talks – see the video here. Here's a new video of Jeff Han and Phil Davidson demonstrating an updated version of the interface – [source].

Tele-Kinesthetic Interaction

The Tokyo Institute of Technology developed a tele-kinesthetic interaction environment, using MyKinSynthesizer and SPIDARmotion, that allows a user to remotely interact with a physical object by moving and straining their hands. MyKinSynthesizer approximates the hand motions by synthesizing EMG signals, and SPIDARmotion displays the 3D motion of a hanging ball within a cubic frame. It was featured as an emerging technology at SIGGRAPH 2006.
