Clip from a videobending session using a Sony XV-T33F. Andrew Coleman has blogged about how he did it on his animals on wheels website.
CreDio is a ‘novel musical instrument that combines digital and mechanical functions to control sound synthesis algorithms.’
ASSIST allows a user to sketch a mechanical system and then run a simulation of it.
Created by Ramon Schreuder, i_AM is an interactive audiovisual media installation. The project was the result of research and collaboration with industrial designers, DJs, music producers, VJs, animators and software developers. Users interact with physical objects to create audio and visuals in real time. The system uses the open-source reacTIVision software developed for the reacTable project.
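reacTIVision tracks fiducial markers and broadcasts their state as TUIO messages over OSC (UDP port 3333 by default). As a rough illustration of the data an application like i_AM would consume, here is a minimal sketch that parses the arguments of a TUIO 1.1 `/tuio/2Dobj` "set" message; the class and function names are my own, not from the project.

```python
from dataclasses import dataclass

@dataclass
class TuioObject:
    """One tangible object (fiducial marker) tracked by reacTIVision."""
    session_id: int
    fiducial_id: int
    x: float            # normalized position, 0.0-1.0
    y: float
    angle: float        # rotation in radians
    vel_x: float        # movement velocity
    vel_y: float
    vel_rot: float      # rotation velocity
    accel_motion: float
    accel_rot: float

def parse_2dobj_set(args):
    """Parse the argument list of a /tuio/2Dobj 'set' message.

    TUIO 1.1 defines the profile as: set s i x y a X Y A m r
    """
    if args[0] != "set":
        raise ValueError("not a 'set' message")
    s, i = int(args[1]), int(args[2])
    x, y, a, vx, vy, vr, m, r = map(float, args[3:11])
    return TuioObject(s, i, x, y, a, vx, vy, vr, m, r)
```

A real client would register this as a handler for incoming OSC bundles; the parsing itself is the same.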
Researchers at the University of Tokyo's Ishikawa-Namiki Laboratory are developing a Laser Active Tracking system for human-computer interaction. The system uses a laser diode (visible or invisible light), steering mirrors and a non-imaging photodetector to track the laser spot in real time in a three-dimensional environment, without any image processing.
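The lab's actual control loop isn't described here, but the core idea of imageless tracking can be illustrated with a toy sketch: probe small mirror deflections, read back a single photodetector intensity for each, and keep the deflection with the strongest return. Everything below (the names, the Gaussian reflectance model) is illustrative, not the lab's implementation.

```python
import math

def reflectance(mx, my, tx, ty, width=0.05):
    """Simulated photodetector return: strongest when the laser's
    aim point (mx, my) coincides with the target at (tx, ty)."""
    d2 = (mx - tx) ** 2 + (my - ty) ** 2
    return math.exp(-d2 / (2 * width ** 2))

def track(mx, my, tx, ty, step=0.01, iters=200):
    """Hill-climb on intensity: try four small mirror deflections
    (plus staying put) and keep whichever gives the strongest
    return. No camera image is ever formed or processed."""
    for _ in range(iters):
        candidates = [(mx + dx, my + dy)
                      for dx, dy in ((step, 0), (-step, 0),
                                     (0, step), (0, -step), (0, 0))]
        mx, my = max(candidates,
                     key=lambda p: reflectance(p[0], p[1], tx, ty))
    return mx, my
```

Because only scalar intensity readings are needed, a loop like this can run far faster than any frame-based vision pipeline, which is the point of the approach.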
Tokyo Institute of Technology has developed a tele-kinesthetic interaction environment, using MyKinSynthesizer and SPIDARmotion, that allows a user to remotely interact with a physical object by moving and straining their hands. MyKinSynthesizer approximates the hand motions by synthesizing EMG signals, and SPIDARmotion displays the 3D motion of a hanging ball within a cubic frame. It was featured as an emerging technology at SIGGRAPH 2006.
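EMG-driven interfaces commonly estimate muscle activation by full-wave rectifying the raw signal and smoothing it into an envelope. Whether MyKinSynthesizer uses exactly this pipeline isn't stated above, so the following is only a generic sketch of that standard first step, with names of my own choosing.

```python
def emg_envelope(samples, window=5):
    """Full-wave rectify an EMG trace and smooth it with a centered
    moving average, yielding a rough muscle-activation envelope."""
    rectified = [abs(s) for s in samples]
    half = window // 2
    env = []
    for i in range(len(rectified)):
        lo, hi = max(0, i - half), min(len(rectified), i + half + 1)
        env.append(sum(rectified[lo:hi]) / (hi - lo))
    return env
```

The resulting envelope is what a downstream stage would map onto motion or force commands for a device like SPIDARmotion.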