On all platforms, Resonance Audio offers control not only over where audio comes from, but also over how it propagates. You can narrow a sound's spread, for example, or make it so that sounds are louder when you're facing their source. It'll automatically create near-field effects as you approach a source, which can be critical in VR apps where positional audio is everything. The technology works with multiple engines (including Unreal Engine and FMOD), but there's a particular advantage if you're building with Unity: you can precalculate reverberation effects for a given environment (say, an echo-filled hall) so that they won't chew up too much processing power.
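To give a feel for the "louder when you're facing it" behavior, here's a toy sketch of a cardioid-family directivity pattern, the standard way such source shaping is modeled. This is purely illustrative, not Resonance Audio's actual implementation; the function and parameter names (`directivity_gain`, `alpha`, `sharpness`) are assumptions for this sketch.

```python
import math

def directivity_gain(angle_rad: float, alpha: float = 0.5,
                     sharpness: float = 1.0) -> float:
    """Gain in [0, 1] given the angle between the source's forward
    axis and the direction toward the listener.

    alpha = 0.0 -> omnidirectional (same gain in every direction)
    alpha = 0.5 -> cardioid (full gain in front, silent directly behind)
    alpha = 1.0 -> figure-eight
    sharpness   -> higher values narrow the frontal lobe ("spread")
    Standard cardioid-family formula; an assumption here, not the
    library's exact math.
    """
    pattern = abs((1.0 - alpha) + alpha * math.cos(angle_rad))
    return pattern ** sharpness

# Facing a cardioid source: full volume. Directly behind it: silence.
print(directivity_gain(0.0))           # 1.0
print(directivity_gain(math.pi, 0.5))  # 0.0
```

Raising `sharpness` narrows the spread in the same spirit as the engine's spread control: gains off-axis fall away faster while the on-axis gain stays at 1.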

It's up to developers to make use of Resonance Audio, and there's no guarantee they will. Some may decide that existing options are better for their needs, or even create their own. However, Google isn't shy about its plans here: between this and the Poly object library, it wants developers to have as painless an experience as possible when creating 360-degree and VR content. The easier it is to create an immersive experience, the more likely it is that Google will see uptake for in-house technologies like 360-degree YouTube videos and Daydream VR.