Developer Ryan McGee has released a new sound design app for iOS, VOSIS, that synthesizes sound from greyscale pixel data taken from photos or live video input.
Description:
VOSIS is an interactive image sonification interface that creates complex wavetables by raster scanning greyscale image pixel data.
Using a multi-touch screen to play image regions of unique frequency content, rather than a linear scale of frequencies, makes it a distinctive performance tool for experimental and visual music. A number of image filters controlled by multi-touch gestures add variation to the sound palette. On a mobile device, parameters controlled by the accelerometer add another layer of expressivity to the resulting audio-visual montages.
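To make the raster-scan idea concrete, here is a minimal sketch (in Python, with illustrative function and parameter names that are not taken from VOSIS itself) of how greyscale pixel values from an image region can be read row by row into a wavetable and looped at an audible frequency:

```python
# Sketch only: raster-scan a greyscale image region into a wavetable,
# then loop that wavetable at a chosen frequency to produce a tone.
import numpy as np
from PIL import Image

def image_to_wavetable(path, region=(0, 0, 64, 64)):
    """Raster-scan a greyscale image region into a wavetable in [-1, 1]."""
    img = Image.open(path).convert("L").crop(region)  # greyscale crop
    pixels = np.asarray(img, dtype=np.float32)        # rows x cols, 0..255
    table = pixels.flatten() / 127.5 - 1.0            # scan rows left to right
    return table

def render_tone(table, freq=110.0, sr=44100, seconds=2.0):
    """Loop the wavetable `freq` times per second to generate audio samples."""
    n = int(sr * seconds)
    phase = (np.arange(n) * freq * len(table) / sr) % len(table)
    return table[phase.astype(int)]
```

In this framing, touching a different image region swaps in a different wavetable, which is why areas with different visual texture produce different timbres.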
Here’s a demo of VOSIS in action:
VOSIS is available for the iPad for $1.99 in the App Store, and is Creative Commons licensed.
If you’ve used VOSIS, leave a comment and let us know what you think of it!
Man, this thing cannot quit sounding creepy. Love it! At first, I dismissed it as a Virtual ANS type app, but its paradigm is actually quite different.
Love this – but I wish it had MIDI control so you could play it from a keyboard, with recorded vector movements to evolve the sound!
Could you create tonal sweeps by feeding it animated video on a loop?
Love this app….thank you.
I like the ability to actually play the sound. Pretty cool. I wonder what its range is? All the examples sounded pretty much the same.
It crashes my iPad 2 from time to time. Seems to happen when I am fingering it vigorously!
DOH!
cool! it’d be neat to project the display onto a TV and then use that as a video source