When a person aims the laser pointer at a virtual object and selects the audio location control, the VR system plays a short impulse response tone at the location of the controller. The tone is then repeated several times as it quickly progresses toward the location of the virtual object. Because all audio is processed using the Google VR Spatial Audio plugin, each tone provides enough information to convey the distance and relative location of the object in the virtual space.
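The tone progression can be thought of as replaying the impulse at points interpolated between the controller and the target, with each point handed to the spatial-audio engine. A minimal sketch of that idea follows; the names (`Vec3`, `tone_positions`, the step count) are illustrative and not taken from the actual prototype:

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def lerp(a: Vec3, b: Vec3, t: float) -> Vec3:
    """Linear interpolation between two points."""
    return Vec3(a.x + (b.x - a.x) * t,
                a.y + (b.y - a.y) * t,
                a.z + (b.z - a.z) * t)

def tone_positions(controller: Vec3, target: Vec3, steps: int = 4) -> list:
    """Positions at which the impulse tone is replayed.

    The first position is the controller itself and the last is the
    selected object; each would be passed to a spatialized audio
    source so the listener hears the tone travel toward the object.
    """
    return [lerp(controller, target, i / (steps - 1)) for i in range(steps)]
```

For example, `tone_positions(Vec3(0, 1, 0), Vec3(2, 1, -3))` yields four points: the first at the controller, the last at the object, with two evenly spaced stops in between.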

To test our prototype, we challenged participants to find and pick up a toy laser gun within the virtual room, navigate to the window, and finally shoot at a duck moving outside the window. We ran six non-visually-impaired people through the prototype, and all of them completed the challenge successfully. After the task was completed, four of them went through the experience again, this time with full vision. Because they had navigated the room by sound, we found that they were already familiar with their surroundings.

