Over the weekend Patently Apple discovered a HomePod-related patent describing the inner workings of the speaker array in depth. While mildly interesting on its own, what was buried deep within the patent filing was an eye-opener. In fact, it was surprisingly and refreshingly cool.

While Apple's patent FIG. 22, shown below, illustrates the HomePod as the center of home automation that, via Siri, could control appliances, lighting, a TV, a security system and more, one device oddly stood out: a Mixed Reality Headset.

Why is that cool? Well, because with a mixed reality headset (assuming it's a future Apple device), users would be able to see illuminated interfaces for controlling the HomePod, and perhaps any of the devices associated with the Home Automation Hub of FIG. 22 below, that can't otherwise be seen. No one else in the room would be able to see these interfaces without the augmented and virtual reality headset.

Apple's patent FIG. 22 illustrated below shows a diagram indicating different types of connected electronics that can communicate and/or interact with an array speaker / HomePod.





Apple adds that in some embodiments the array speaker can act as a central hub to facilitate home automation. Notably, a mixed reality headset is described as working uniquely with the HomePod.



The one device that stands out in patent FIG. 22 above is the Mixed Reality headset. How Apple describes it in the filing is crazy cool:

"In some embodiments, the array speaker can be configured to interact with wearable display #2218. The wearable display can take the form of augmented reality or virtual reality goggles that present digital content to a user. When the wearable display is an augmented reality display, it can overlay various control interfaces around the 'array speaker' [going forward, 'array speaker' will be referred to as 'HomePod' as shown in FIG. 22].

For example, virtual content could overlay a convex user interface atop the HomePod to make the user interface larger. In some embodiments, the enlarged user interface could include an expanded display and enlarged control manipulation regions that allow a user to control the HomePod more efficiently and/or with a greater degree of options.

For example, the user interface could be configured to display a virtual graphics equalizer allowing a user to increase or reduce treble and/or bass output associated with the audio being generated by the HomePod.
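To make the idea concrete, here is a minimal sketch of what that virtual equalizer overlay might expose to the headset. This is purely illustrative: the class, the dB range and the method names are our assumptions, not anything from Apple's filing.

```python
# Hypothetical sketch of the treble/bass control a virtual equalizer
# overlay could expose; all names and ranges are illustrative only.
class VirtualEqualizer:
    MIN_DB, MAX_DB = -12.0, 12.0  # assumed adjustment range in decibels

    def __init__(self):
        self.bass_db = 0.0
        self.treble_db = 0.0

    def adjust_bass(self, delta_db):
        # Clamp the setting so the overlay can't push gain out of range.
        self.bass_db = max(self.MIN_DB, min(self.MAX_DB, self.bass_db + delta_db))

    def adjust_treble(self, delta_db):
        self.treble_db = max(self.MIN_DB, min(self.MAX_DB, self.treble_db + delta_db))

    def gains(self):
        # Convert the dB settings to linear gain factors for an audio pipeline.
        return {band: 10 ** (db / 20)
                for band, db in (("bass", self.bass_db), ("treble", self.treble_db))}
```

The headset would only need to send small dB deltas as the user drags the enlarged sliders, with the speaker applying the resulting gains.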

In some embodiments, a user could be presented with an overlay that visualizes the various regions of the room covered by each of a number of speaker drivers contained within the HomePod. The user would then be able to adjust audio output specific to a particular region associated with one or more speaker drivers. For example, the user could identify only the depicted regions containing individuals listening to the audio output from the HomePod.

Furthermore, the user could reduce the audio output for a first user positioned in a first region of the HomePod associated with a first audio driver and increase the audio output for a second user positioned in a second region of the HomePod associated with a second audio driver. In this way, listeners can enjoy audio at a desired volume, and the virtual interface allows the user to quickly identify the regions within which various listeners are located.
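The per-region control described above boils down to a mapping from drivers to room regions with independent volume levels. The sketch below shows one way that could be modeled; it is not Apple's implementation, and the region names and driver IDs are invented for illustration.

```python
# Illustrative model of per-region volume control: each speaker driver
# covers a named region of the room, and a region-level adjustment only
# touches the drivers covering that region. Purely speculative.
class RegionalSpeaker:
    def __init__(self, driver_regions):
        # driver_regions: {driver_id: region_name}
        self.regions = driver_regions
        self.volumes = {d: 1.0 for d in driver_regions}  # full volume by default

    def set_region_volume(self, region, volume):
        """Scale output only for drivers covering the given region."""
        for driver, r in self.regions.items():
            if r == region:
                self.volumes[driver] = max(0.0, min(1.0, volume))

speaker = RegionalSpeaker({0: "sofa", 1: "sofa", 2: "kitchen"})
speaker.set_region_volume("kitchen", 0.3)  # quieter for the kitchen listener
speaker.set_region_volume("sofa", 0.9)     # near full volume at the sofa
```

The headset overlay would simply visualize the region boundaries and forward the user's chosen level for each region.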

In some embodiments, HomePod can include various indicia that help circuitry and sensors associated with the wearable headset display to orient the virtual content relative to HomePod.



For example, since the HomePod is cylindrical, it could be difficult to determine the radial position of each of the speaker drivers within the HomePod. Small indicia such as decorative symbols could be embedded within the acoustic fabric covering the HomePod. In this way, the various listening zones could be more accurately associated with the HomePod.

In some embodiments, HomePod can include optical sensors [cameras] configured to identify the position of various listeners in a room and then change the audio output to improve the audio experience for the identified listeners.

In some embodiments, the wearable headset display device can be configured to receive optical commands from HomePod. For example, a display associated with a user interface can be configured to output particular patterns of light.

Optical sensors of the wearable headset display device can identify the patterns of light and in response vary the display in some manner. For example, the type, size and orientation of virtual controls displayed by the wearable headset display can be varied in accordance with the output of the display associated with the user interface.
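That optical-command scheme amounts to a lookup from recognized light patterns to changes in the headset's virtual controls. The fragment below is a speculative illustration of that idea; the pattern names and UI fields are invented, not from the patent.

```python
# Speculative sketch of the optical-command idea: the HomePod's display
# flashes a light pattern, and the headset maps each recognized pattern
# to a change in its virtual controls. All names here are invented.
PATTERN_COMMANDS = {
    "double_pulse": {"control": "equalizer", "size": "large"},
    "slow_fade":    {"control": "volume",    "size": "small"},
}

def apply_optical_command(pattern, current_ui):
    """Return the updated virtual-UI state for a recognized pattern;
    unknown patterns leave the UI unchanged."""
    update = PATTERN_COMMANDS.get(pattern)
    return {**current_ui, **update} if update else current_ui
```

The point of such a scheme is that the speaker can reconfigure the headset's overlay without any radio link, using only light the headset's cameras already see.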

Apple's patent application was originally filed in Q4 2017 and published in Q4 2018. Considering that this is a patent application, the timing of such a product coming to market is unknown.

Last week Patently Apple posted a report titled "Apple has shown great interest with In-Air Gesturing via Multiple Patent Applications and their Pursuit of Leap Motion."





Beyond a mere mixed reality headset, it appears from Apple's patents and actions that they're working on new ways to enhance the experience. Whether it's using advanced in-air gesturing to control a virtual thermostat seen in the headset or viewing and interacting with an illuminated interface on the HomePod to control audio anywhere in the home, we see that Apple is once again looking at the HomePod and a mixed reality headset holistically rather than rushing to copy products from their competitors.