A short demonstration of the MLGI (Multi-Laser Gestural Interface) designed by Meason Wiley at California Institute of the Arts.
Strap it on a dolphin, and you’d have a lethal weapon.
The Multi-Laser Gestural Interface is an open-source, modular "free-gesture" controller that uses beams of laser light along with photoresistors to create a physical, fluid musical instrument.
With the MLGI, Wiley is attempting to bring a physical interactivity to electronic music performance. By removing the performer from behind the laptop, the audience becomes aware of the performer’s interaction with the controller, which creates an instant visual connection between the sound and the performer.
The MLGI was created using Dan Overholt's multi-I/O CREATE USB Interface (CUI), which, together with the ChucK programming language, can be made to send MIDI or OSC over USB.
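As a rough illustration of the OSC side of that pipeline, the sketch below hand-packs a minimal OSC message (padded address string, `,f` type tag, big-endian float) of the kind a controller patch might emit per sensor. The address `/mlgi/beam/1` and the value are hypothetical, not taken from Wiley's actual code.

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Pack a minimal OSC message carrying one float argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# Hypothetical example: report beam 1 interrupted at 42% intensity
msg = osc_message("/mlgi/beam/1", 0.42)
```

In practice a library (e.g. ChucK's built-in `OscSend`) handles this framing; the point is only that each beam's sensor reading can travel as one small, self-describing packet.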
For more information about this open-source + modular controller, visit www.cyclespersecond.net.
holy shit
Kind of nice, though I suspect it might be a bit useless in a smoky club, based on the guess that the resistors are picking up on the light reflected off the palms of his hands. Shame, because the beams of light going up would conversely look awesome in a smoke-filled room 😛
wow. this is very cool work!!
The controller works just fine in a smoky room. Smoke doesn't really reflect the laser light in any significant way. The lasers output at 635 nm, and the sensors have bandpass filters that will only accept red light near that wavelength. There is always a little ambient light distortion, but all of that is scaled out in the code.
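The "scaled out in the code" step might look something like the following: measure an ambient baseline with no hand in the beam, then map each raw reading onto 0..1 relative to that baseline. The 10-bit ADC range (0..1023) and the sample values are assumptions for illustration, not details from the MLGI source.

```python
def calibrate(samples):
    """Baseline = mean sensor reading with no hand in the beam (ambient light only)."""
    return sum(samples) / len(samples)

def scaled(raw, baseline, full_scale=1023):
    """Map a raw ADC reading to 0..1, with the ambient baseline subtracted and clamped out."""
    span = full_scale - baseline
    if span <= 0:
        return 0.0
    return max(0.0, min(1.0, (raw - baseline) / span))

# Hypothetical readings: dim ambient glow, then a palm reflecting the beam
baseline = calibrate([12, 15, 13, 14])
value = scaled(520, baseline)
```

Because the baseline is subtracted before scaling, small ambient fluctuations land at or below zero and never reach the synthesis parameters.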