This is a demonstration video showing real-time 3D acoustic rendering using the Phoenix game engine.
Developer David Rosen discusses this in depth in his paper Real-time 3D Acoustics Rendering:
Abstract
It is common for real-time systems to render 3D sounds by simply panning the sound from left to right and adjusting the volume based on distance. This is a good start, but ignores many effects that contribute to our perception of sounds and the acoustic environment. First, we must take into account the time it takes for the sound to travel from the source to each receiver (e.g. the left and right ear). Second, ears are not omnidirectional microphones; sounds can be occluded by the head and by the ear itself, and different frequency bands can be affected in different ways by this occlusion. Finally, sound sources can be occluded by objects in the environment, and the sound can reflect off of surfaces, or reverberate within spaces. In this paper we describe an efficient way to simulate all of these phenomena in real time.
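To make the first couple of effects from the abstract concrete, here is a minimal sketch (not taken from Rosen's paper) of computing a per-ear propagation delay and a simple inverse-distance gain for a point source; the positions, ear spacing, and rolloff model are illustrative assumptions, and real systems would also apply the frequency-dependent filtering and occlusion the abstract describes.

```cpp
// Minimal sketch: per-ear travel time and inverse-distance attenuation
// for a single point source heard by a two-ear listener. Values and the
// rolloff model are assumptions for illustration, not the paper's method.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static double distance(const Vec3& a, const Vec3& b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

int main() {
    const double speedOfSound = 343.0;    // m/s in air at ~20 C
    const double refDistance  = 1.0;      // distance at which gain is 1.0

    Vec3 source   = { 5.0, 0.0, 2.0 };    // source position in metres
    Vec3 leftEar  = { -0.09, 0.0, 0.0 };  // ears roughly 18 cm apart
    Vec3 rightEar = {  0.09, 0.0, 0.0 };

    // Each ear hears the sound after a slightly different travel time
    // (the interaural time difference the abstract mentions).
    double dL = distance(source, leftEar);
    double dR = distance(source, rightEar);
    double delayL = dL / speedOfSound;
    double delayR = dR / speedOfSound;

    // Simple inverse-distance rolloff; this sketch ignores head shadowing,
    // per-band filtering, occlusion, and reverberation.
    double gainL = refDistance / std::max(dL, refDistance);
    double gainR = refDistance / std::max(dR, refDistance);

    std::printf("left ear:  delay %.4f s, gain %.3f\n", delayL, gainL);
    std::printf("right ear: delay %.4f s, gain %.3f\n", delayR, gainR);
    return 0;
}
```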
If you’re interested in this stuff, give the paper a read. There are explosions involved.
Rosen is a game developer, but this technology also seems like a promising area of development for audio/visual synthesis.
Can you imagine entering a 3D world of ambient sound-objects, where your perspective and the locations of the objects affect what you hear?
via Califaudio