Interaction design student Andreas Refsgaard created Eye Conductor, an application that is essentially a MIDI interface allowing people with physical disabilities to play notes and build beats through eye movements and facial gestures.
Using a $99 eye tracker and a regular webcam, Eye Conductor detects the gaze and selected facial movements of users, enabling them to play any instrument, build beats, sequence melodies or trigger musical effects.
The system is open, designed for inclusion, and can be customized to fit the physical abilities of whoever is using it.
How It Works
Eye Conductor translates eye gaze into musical notes or beats in a drum sequencer. Raising your eyebrows can transpose all played notes up one full octave, while opening your mouth can add a delay, reverb, or filter effect to the instrument being played.
Thresholds for facial gestures can be adjusted and saved to fit the unique abilities of different users.
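As a rough sketch of what an adjustable, saveable gesture threshold could look like in Processing (the environment the project is built in, as noted below), consider the following; the gesture names, default values, and profile file name are placeholders for illustration, not taken from Eye Conductor's actual code.

    // Minimal sketch: keep per-user gesture thresholds in a JSONObject,
    // test incoming tracker values against them, and save the profile to disk.
    // All names and numbers here are illustrative placeholders.
    JSONObject thresholds;

    void setup() {
      thresholds = new JSONObject();
      thresholds.setFloat("eyebrowRaise", 8.0f);  // hypothetical default, in tracker units
      thresholds.setFloat("mouthOpen", 4.0f);
    }

    void draw() { }  // an empty draw() keeps the sketch running and listening for key events

    // True when a tracked gesture value crosses that user's stored threshold.
    boolean gestureTriggered(String gesture, float value) {
      return value > thresholds.getFloat(gesture);
    }

    void keyPressed() {
      if (key == 's') {
        // saveJSONObject() is Processing's built-in JSON writer;
        // reloading with loadJSONObject() restores the same profile later.
        saveJSONObject(thresholds, "data/user-profile.json");
      }
    }

Saving a profile per user is what lets the same gesture have very different sensitivities for different players.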
Eye Conductor is programmed in Processing, and early prototypes used FaceOSC by Kyle McDonald for face tracking.
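For readers curious about the plumbing, here is a minimal sketch of that kind of pipeline: FaceOSC streams gesture values over OSC, a Processing sketch receives them with the oscP5 library, and a mouth-open gesture past a threshold triggers a MIDI note through The MidiBus library. The port and OSC address follow FaceOSC's defaults as far as I know; the MIDI device name, note number, and threshold are placeholder assumptions, and this is not Refsgaard's actual code.

    import oscP5.*;       // OSC receiving (oscP5 library)
    import themidibus.*;  // MIDI output (The MidiBus library)

    OscP5 osc;
    MidiBus midi;
    float mouthOpenThreshold = 4.0f;  // would be calibrated per user, as described above
    boolean notePlaying = false;

    void setup() {
      osc = new OscP5(this, 8338);            // 8338 is FaceOSC's default output port
      MidiBus.list();                         // print available MIDI devices to the console
      midi = new MidiBus(this, -1, "Bus 1");  // no MIDI input; "Bus 1" is a placeholder output name
    }

    void draw() { }  // nothing to draw; OSC messages arrive via oscEvent()

    // Called by oscP5 for every incoming OSC message from FaceOSC.
    void oscEvent(OscMessage msg) {
      if (msg.checkAddrPattern("/gesture/mouth/height")) {
        float mouthHeight = msg.get(0).floatValue();
        if (mouthHeight > mouthOpenThreshold && !notePlaying) {
          midi.sendNoteOn(0, 60, 100);   // channel 0, middle C, velocity 100
          notePlaying = true;
        } else if (mouthHeight <= mouthOpenThreshold && notePlaying) {
          midi.sendNoteOff(0, 60, 0);
          notePlaying = false;
        }
      }
    }

In the full application the same pattern would presumably fan out to eyebrow, jaw, and gaze data, with the triggered action (transpose, delay, reverb, filter) depending on which gesture crossed its threshold.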
Refsgaard explains his ideas behind this project:
For a lot of people with physical disabilities, the lack of fine motor skills excludes them from producing music on traditional instruments.
With Eye Conductor I wanted to push the boundaries of the interaction design by exploring how eye and face tracking technologies could be used for creative purposes. I believe that the ability to express oneself artistically should be available to all, regardless of physical disabilities or challenges. Therefore, I wanted to create a solution that operated in the same domain as traditional instruments. Something that gives people a lot of freedom, but also requires them to practice, just like a regular instrument.
User Research
The project relied heavily on user research, and I visited several schools and housing communities for people with physical disabilities, as well as families with children in wheelchairs in their private homes.
The people I met were extremely diverse in terms of physical abilities, but music seemed to be a unifying interest for them all.
At Jonstrupvang, a home for people with different disabilities, “Music Thursdays” was the activity that gathered the most people every week, even though almost half of the participants were physically unable to produce any sound themselves.
About half of the people I tested early-stage prototypes with were unable to speak, but as soon as I showed them my interactive prototypes, we immediately connected.
More information is available at Refsgaard’s site.
This is simply amazing. I could certainly see using phonemes in place of loops and setting the threshold of the facial expressions quite high; I would imagine a kind of speech processor. Given enough time, differently-abled people could learn to talk through a device like this.
Quite superb.
So unbelievably boss mode.
I hope the eye-tracking becomes part of a broader accessible gesture-to-MIDI package, so that those with differing abilities can access more or fewer control points.
I have one hand. The hardest problem I have is when you have to press two buttons to access a parameter; sometimes these buttons, depending on the synth, are nowhere near each other. I end up using my nose or, if I can find it, a pencil in my teeth. One-button-per-function synths are awesome.
Folks interested in this stuff should check this out too: http://deeplistening.org/site/content/aumipadhome