New Apple Vision Pro App, MIDI Widgets, Lets You Add Virtual Controls To Your Hardware Synths

Developer Geert Bevin (Moog Music, Roger Linn Design LinnStrument, MPE Standard) shared this video demo of MIDI Widgets, a new application for Apple Vision Pro that lets you add virtual controls to your hardware MIDI instruments.

MIDI Widgets is the first spatial MIDI designer. It lets you create virtual faders, buttons and knobs that you can position anywhere in your physical spaces. The MIDI control widgets will stay anchored in space, right where you put them.

In the video, Bevin demonstrates adding custom virtual MIDI controls for the Subsequent 37. While the Subsequent 37 is already a knobby synth, it’s easy to imagine using MIDI Widgets to expand the capabilities of synths like the Yamaha DX7 and the Sequential Six-Trak, which have great-sounding synth engines that are limited by clunky patch editing.

Features:

  • Designed specifically for Apple Vision Pro
  • Create as many faders, buttons and knobs as you need
  • Freely position and orient individual controllers
  • Spatial snapping makes it easy to build out perfectly aligned larger control surfaces
  • Controllers automatically remain anchored to their location, even across restarts
  • Controllers can be grouped in scenes and are easily recalled for different purposes
  • Freely re-position and re-orient entire scenes
  • Scenes can send MIDI messages to multiple configurable MIDI ports
  • Support for Bluetooth LE MIDI devices
  • Built-in virtual MIDI to send MIDI to other Vision Pro apps on the same device
  • Per controller MIDI channel selection
  • Per controller MIDI Control Change, Channel Pressure, Program Change or Pitch Bend messages
  • Per controller 7-bit and 14-bit Control Change resolution (see the sketch after this list)
  • Controllers can optionally reset to a neutral position upon release
  • Faders and knobs can be unidirectional or bidirectional
  • Every part of a control can be assigned a different color for recognizability
  • Colors can easily be copied and pasted, individually or as complete color schemes
  • Controllers can have optional labels that can be placed on any of the four edges
  • A floating tool window follows your left or right hand and allows editing anywhere
  • Multiple controllers can be used simultaneously, allowing for dual hand interaction
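
That last set of features covers quite a bit of MIDI ground. The 7-bit versus 14-bit option, for example, determines whether a fader sends a single Control Change or a coarse/fine pair (a coarse CC in the 0–31 range plus the matching fine CC 32 numbers higher). As a rough illustration only – this is not code from the app, and the type name is hypothetical – here’s a minimal Swift sketch of how a receiving synth or host would combine such a pair into one high-resolution value:

```swift
/// Hypothetical helper (not part of MIDI Widgets): combines a standard
/// 14-bit MIDI CC pair, coarse CC n (0–31) plus fine CC n+32, into a
/// single value in the range 0...16383.
struct FourteenBitCC {
    let coarseCC: UInt8   // e.g. 11 (Expression)
    var msb: UInt8 = 0    // coarse byte: bits 7–13
    var lsb: UInt8 = 0    // fine byte: bits 0–6

    /// Feed every incoming Control Change; returns the combined value
    /// whenever the controller number belongs to this pair.
    mutating func receive(controller: UInt8, value: UInt8) -> UInt16? {
        switch controller {
        case coarseCC:      msb = value & 0x7F
        case coarseCC + 32: lsb = value & 0x7F
        default:            return nil
        }
        return (UInt16(msb) << 7) | UInt16(lsb)
    }
}

var expression = FourteenBitCC(coarseCC: 11)
_ = expression.receive(controller: 11, value: 100)        // coarse arrives first
print(expression.receive(controller: 43, value: 5) ?? 0)  // (100 << 7) + 5 = 12805
```

Receivers that only understand 7-bit CCs can simply ignore the fine message, which is why the 14-bit option stays compatible with ordinary hardware.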

Pricing and Availability:

MIDI Widgets For Vision Pro is available now for $9.99 USD.

19 thoughts on “New Apple Vision Pro App, MIDI Widgets, Lets You Add Virtual Controls To Your Hardware Synths”

    1. I might be a synth geek – but this was one of the first demonstrations of the Apple Vision Pro that wowed me. As prices for the Vision Pro drop, I could see this sort of thing becoming a great way to do sound design.

    2. You only need to show params for one operator

      They are identical after that. A few global and routing params and some way to switch which operator you are currently controlling

      Not too difficult

      See how easy Dexed makes a DX7 look

  1. It is like many other beauties. Looks nice, but how do you use it practically? In a live environment? When you move your fingers, hands?

    For musicians who love their hardware and haptic controls, it will not work.

    1. It’s not for everyone; that doesn’t mean it’s for no one.

      Still, it will be a niche product that will do some amazing things for a certain subset of people. Good for them.

    2. Might be fun to control synths that are behind you while you’re in the DAW, or to play parameters with gestures. Price of entry too damn high for the novelty though.

    3. Think of all the great synths (Kurzweil K2000, Elektron Analog Four, Waldorf Blofeld, etc.) that have tons of power but don’t have knob-per-function interfaces.

      They all benefit from dedicated patch editors, which you use when you’re focused on creating new sounds, not when you’re focused on performing.

      I do not see why anyone would want to use something like this as a performance tool, unless you are doing a performance in VR or something like that.

    1. You can create your own knob/fader/button-per-function MIDI controller, per instrument, per room.

      Yes, this can be done with hardware, but hardware takes up space, has its own limitations and often cannot scale. You can create arbitrary controller schemes that appeal to your senses (faders vs. knobs, or both).

      You could trade in all of your hardware MIDI controllers for the Apple Vision Pro. The sight-to-object selection fidelity looks really good in the demo, so I could see this working well for this use case.

      The Vision Pro is expensive, but someday either the Vision Pro or the next thing will be cheaper, or this will be ported to cheaper AR products.

  2. IMO, it’s still a solution looking for a problem, more than not. It will eventually take hold for its novelty as much as its practicality. It’s an adjunct approach to laboring over a piano or guitar. After years of chatter about performance gestures becoming more of a thing, this is the opening salvo for the Star Trek interface idea.

    The goggle unit price will have to drop a lot, but as usual, a lot more people will thrash around AT it than play it in a distinctive manner. I’m skeptical, because I want it to sound as good as it looks, for all of the fanfare. My sci-fi gland says it won’t start taking off big time until around the 4th generation. For now, I’ll remain a MIDI Fart with a Thunderbolt chaser until the dust settles more.

  3. Brilliant. Now it just needs a control surface to interface with it…

    To be serious, manufacturers have been trying out gestural control ideas for decades (e.g. the Roland D-Beam), but they don’t stick because there’s no tactile feedback. People want the resistance of a pitch wheel or the dimensional constraints of a rotary or linear control.

    The Theremin is a worthy exception, but let’s face it, those instruments are kind of art objects for solo performers and hark back to the nostalgia of the electrical age. I have a hard time seeing people lining up to witness a performance by someone in VR, but perhaps someone will invent a ‘rapt audience’ app so you can thrill a crowd in the comfort of your living room.

    1. I’m glad you mentioned the tactile aspect of playing and mixing. I am with you on how unsatisfying it would be to just twiddle the imaginary knobs in your cyber-world. I’m sure they will develop some way to experience haptic feedback, but it isn’t the same.

  4. My first thought: “Wow, that’s an expensive MIDI controller!” Really (no pun intended), for a hardware synth, nothing beats a real, physical set of knobs for controlling a filter! I can see how this would be much more relevant for a software instrument – although, gosh, that’s a lot of work! How long would it take to program all these controls for one of the Arturia instruments, for example? For a live performance, how comfortable will audiences become with some goggled dude (or dudette) pinching his/her/their fingers in the air?
