



The Bose AR platform, with the Bose AR Frames and the QuietComfort 35 headphones, was one of the most exciting announcements at SXSW. The Bose AR Frames wearables feature spatialized audio and serve as a Bluetooth headset with an accelerometer, gyroscope, and magnetometer that, when paired with the GPS on your phone, can detect where you are and where you're looking. Those are enough of the key ingredients to start creating an augmented layer of spatialized audio that iOS, Android, and Unity app developers can begin to target. A number of head and body gestures can also be detected, including push-ups, squats, the "Sup?" nod, a head shake, a double tap, looking up, looking down, spinning around, and rolling the head around, along with many other potential gestures that could be trained through machine learning.
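To give a rough sense of how head gestures can fall out of that sensor data, here's a minimal sketch of detecting an upward "Sup?" nod from a stream of pitch angles. This is purely illustrative: the function name, thresholds, and input format are my own assumptions, not the Bose AR SDK, which delivers sensor and gesture events through its own platform APIs.

```python
# Hypothetical sketch: spotting a "Sup?" nod in gyroscope/IMU pitch samples.
# The thresholds and sample format are assumptions for illustration only.

def detect_nod(pitch_deg, up_threshold=10.0, return_threshold=3.0):
    """Return True if the head pitches up past up_threshold and then
    settles back near level (within return_threshold degrees)."""
    tilted_up = False
    for pitch in pitch_deg:
        if not tilted_up and pitch > up_threshold:
            tilted_up = True          # head has tilted up far enough
        elif tilted_up and abs(pitch) < return_threshold:
            return True               # head came back to level: a nod
    return False

# A quick upward nod: level -> up -> back to level
print(detect_nod([0.0, 4.0, 12.0, 15.0, 9.0, 2.0, 0.0]))  # True
# Small jitter while holding still
print(detect_nod([0.0, 1.0, 0.5, -0.5, 1.0]))             # False
```

In practice a platform SDK would surface these as discrete gesture events rather than raw angles, but the underlying idea is the same: thresholding motion patterns in the IMU stream, or training a classifier on them.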

I had a chance to talk with Michael Ludden of Bose AR developer relations about the evolution of their AR platform, what types of apps were being launched at SXSW this year, and how Bose is pushing an audio-first layer of augmentation that pulls people out of their screens so that we can be heads up, hands free, and more IN the world around us.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

https://dts.podtrac.com/redirect.mp3/d1icj85yqthyoq.cloudfront.net/wp-content/uploads/2019/04/Voices-of-VR-753-Michael-Ludden.mp3

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality