Do beautiful eyes make beautiful music?
That’s what Patrick of cappel-nord.de set out to discover when he created eyeSequencer.
He explains how this works:
I wrote a little framework/application in Processing to help me analyse the image. “Scanners” circulate in the eye and analyse RGB and brightness values. These values are used to create OSC messages, which are then sent to SuperCollider. There I could generate sound, but for this example I decided to send MIDI control messages and notes to Ableton Live.
This is a work in progress and I see much room for improvement. I only spent about an hour on the SuperCollider/Ableton Live part, so this could sound a lot nicer 😉
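To make the pipeline a bit more concrete, here is a minimal sketch of the scanner idea in Processing, assuming the oscP5 library for OSC output (Patrick doesn’t name the library he used) and a SuperCollider language client listening on its default port 57120. The image filename, OSC address, and message layout are all hypothetical; this is just an illustration of one scanner orbiting the eye, sampling the pixel under it, and sending the values on as OSC.

```java
// Minimal "scanner" sketch (assumption, not the original eyeSequencer code).
import oscP5.*;
import netP5.*;

PImage eye;                  // hypothetical input image "eye.jpg"
OscP5 osc;
NetAddress supercollider;
float angle = 0;             // current position of the scanner on its orbit

void setup() {
  size(400, 400);
  eye = loadImage("eye.jpg");                          // assumed filename
  osc = new OscP5(this, 12000);                        // local listening port (arbitrary)
  supercollider = new NetAddress("127.0.0.1", 57120);  // sclang's default port
}

void draw() {
  image(eye, 0, 0, width, height);

  // Move the scanner along a circle around the pupil.
  float r = width * 0.3;
  int x = int(width / 2 + cos(angle) * r);
  int y = int(height / 2 + sin(angle) * r);
  angle += 0.02;

  // Sample the pixel under the scanner and read its colour and brightness.
  color c = get(x, y);
  float bright = brightness(c);

  // Pack RGB and brightness into one OSC message for SuperCollider.
  OscMessage msg = new OscMessage("/eye/scanner");     // address is an assumption
  msg.add(red(c));
  msg.add(green(c));
  msg.add(blue(c));
  msg.add(bright);
  osc.send(msg, supercollider);

  // Draw the scanner so you can see where it is sampling.
  noFill();
  stroke(255, 0, 0);
  ellipse(x, y, 10, 10);
}
```

On the SuperCollider side, an OSCdef listening on the same address could map those incoming values either to synth parameters or, as Patrick did here, to MIDIOut control messages and notes routed to Ableton Live.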
via califaudio