Apple has patented a new sound-positioning system that may become a key part of its augmented reality Apple Glasses. In fact, it may be crucial in turning the glasses into a must-have device.

As Patently Apple reports, the patent recently filed with the U.S. Patent and Trademark Office first describes the troubles of multiparty audio conference calls, which “typically include audio signals from individual callers in different locations, such that it may be difficult to recognize who is speaking by voice alone.”

Apple’s patent describes a system in which each person’s audio signal is processed using 3D positioning metadata attached to it, so people wearing a pair of earbuds would actually hear the sound coming from that point in space. If they turn their heads toward that person, they hear them louder and clearer, just as in the real world.

Audio signals from the individual callers are processed by simulating a virtual audio environment (e.g., virtual room) whose acoustic properties resemble those of the real environment in which the listener is situated. Metadata included in the audio signals may be used to automatically cluster callers and reposition groups of participants in the virtual audio environment. Using a head-tracked sound reproduction system, the audio signals are dynamically filtered with binaural room impulse responses to account for and preserve the rendered location of the callers in the virtual audio environment. To increase speech intelligibility, the listener can rotate to the position of the active speaker, similar to a natural conversation.

3D audio positioning is hardly new, but Apple seems to be focused on making it work in augmented reality environments by mapping conference participants to positions in the listener's actual room.
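The patent does not disclose its actual filters, but the head-relative idea it describes can be sketched in a few lines. The toy function below (the name, angle convention, and constant-power pan law are my illustration, not Apple's method) shows how a voice's apparent position shifts as the listener turns their head; a real system would use head-tracked binaural room impulse responses rather than simple level panning.

```python
import math

def head_relative_gains(speaker_azimuth_deg, head_yaw_deg):
    """Toy constant-power panner: left/right ear gains for a voice at
    `speaker_azimuth_deg`, heard by a listener facing `head_yaw_deg`.

    0 degrees = straight ahead, positive = to the listener's right.
    """
    # Where the speaker sits relative to the listener's current gaze.
    rel = speaker_azimuth_deg - head_yaw_deg
    # Map to a pan position in [-1, 1]; sources beyond +/-90 deg clamp.
    pan = max(-1.0, min(1.0, rel / 90.0))
    # Constant-power pan law: left^2 + right^2 == 1 at every position.
    angle = (pan + 1.0) * math.pi / 4.0
    return math.cos(angle), math.sin(angle)

# A voice 45 degrees to the right sounds louder in the right ear...
off_axis = head_relative_gains(45, 0)
# ...until the listener turns toward it, which recenters the voice.
centered = head_relative_gains(45, 45)
```

Turning toward the active speaker equalizes the two ear gains, which is the crude analogue of the patent's claim that rotating toward a speaker increases intelligibility.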

AR’s killer app?

We know that Tim Cook is a big fan of augmented reality, which he considers social and inclusive while disparaging virtual reality headsets as isolating tech. And what is more social than having work colleagues or loved ones materialize in a room with you and speak to you naturally, rather than using FaceTime or any other videoconferencing software?

To put it in more relatable terms: imagine the Jedi Council scenes from the Star Wars prequels, where some of the Jedi were physically present while others appeared as holograms, as if they were in the actual room. Apple doesn't have any holographic tech, but the Apple Glasses could make virtual people appear right in front of you in much the same way.

Imagine meeting with your colleagues, friends, or relatives like that, either one on one or in a group, with audio that actually behaves as if you were in a real room with real people. Something like this would break down the barriers of videoconferencing software and make the AR Glasses the ultimate human communication device.

Of course, the same technology could be applied to any other AR app you can think of, including games. It seems like the logical final piece of Apple's AR puzzle.

A long wait

We will have to wait quite a bit to see something like this in action, though.

The Apple Glasses have reportedly been pushed back to 2023 after being rumored to launch sometime in 2020. There will apparently be two models: a mixed AR/VR headset to be released in 2022, followed by the AR glasses a year later. Reports from inside the Cupertino company say the 2023 device will be lighter, like an actual pair of glasses, and reliable rumors say the glasses will use the iPhone as their graphics and sound processing unit.

This AR sound patent adds to a long trail of evidence, including multiple AR-related acquisitions: from Akonia Holographics, a Colorado-based startup dedicated to AR displays, to the one that started it all, Metaio, the German company whose augmented reality SDK appears to be the basis for ARKit, Apple's augmented reality developer framework that debuted in iOS 11 in 2017.