An Apple patent application published by the U.S. Patent and Trademark Office on Thursday describes a system that may one day replace the infrared proximity sensors deployed in current iPhones with sonar-like technology.

Apple's invention for "Passive proximity detection" would eliminate the need for the current IR sensor, replacing it with a system that detects and processes sound waves to determine how far away an object is from a portable device.

Much like passive echolocation or a loose interpretation of passive sonar, the filing describes a system that takes two sound wave samples, a "before" and an "after," and compares the two to determine if an external object's proximity to the device changed. "Sampling" occurs when a transducer, such as a microphone, picks up ambient sound and sends a corresponding signal to the device's processor for analysis.
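The patent filing, as described, does not spell out how the "before" and "after" samples are compared. As a rough illustration only, the idea of flagging a proximity change from two ambient-sound samples might be sketched like this (the band limits, threshold, and function names here are assumptions, not anything from the filing):

```python
import numpy as np

def band_energy(samples, rate, lo=300.0, hi=3000.0):
    """Energy of a signal within a frequency band, via FFT magnitudes.
    The 300-3000 Hz band is an arbitrary illustrative choice."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(np.sum(spectrum[mask] ** 2))

def proximity_changed(before, after, rate, threshold=0.25):
    """Compare a 'before' and an 'after' ambient-sound sample; a large
    relative shift in band energy suggests an object moved toward or
    away from the microphone. Threshold is a made-up tuning value."""
    e_before = band_energy(before, rate)
    e_after = band_energy(after, rate)
    if e_before == 0:
        return e_after > 0
    return abs(e_after - e_before) / e_before > threshold
```

In practice a real implementation would have to contend with noise, room acoustics, and changing ambient sound; this sketch only shows the comparison step in its simplest form.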

The invention relies on basic acoustic principles as applied to modern electronics. For example, the equalization curve of a microphone's signal from an audio source changes when the device moves toward or away from an object that "variably reflect[s] elements of the sound wave."

This effect may be noticed when sound is reflected by soft material as opposed to a hard surface. Generally, sound reflected off the soft surface will seem muted when compared to the same sound reflected off a hard surface located at the same distance and angle from an audio transducer and a sound source.

In one of the invention's embodiments, two microphones are situated on different planes of a device and detect the subtle changes across the broad audio spectrum caused by interference when a sound wave interacts with an object.

To relate this to a common phenomenon: when a seashell is held up to one's ear, a resonant cavity is formed that amplifies ambient sounds. This high-Q filtering produces the ocean-like sound one hears.

In another example, response signals produced by two microphones located at either end of a device can be compared to determine whether an object is nearer to one or the other. When a user's face is close to the top of a device, as is usual when talking on the phone, the microphone near the ear will produce a different reactance ratio than the microphone at the device's base.



Microphones located at two ends of an iPhone.

Basically, two transducers, or microphones, detect slight changes in ambient sound and send corresponding signals to a processor, which then compares the two to determine whether an object is in close proximity to either of the mics.
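Again purely as an illustration of the comparison just described, and not of anything specified in the filing, a two-microphone check might compare the broadband energy each mic picks up (the 0.5 imbalance ratio and the labels "top"/"bottom" are invented for this sketch):

```python
import numpy as np

def nearer_mic(top_samples, bottom_samples, imbalance=0.5):
    """Guess which microphone an object (e.g., the user's face) is
    closer to by comparing total signal energy from each mic.
    An object close to a mic damps and colors the ambient sound it
    picks up, so a pronounced imbalance suggests proximity to one end.
    Returns "top", "bottom", or "neither"."""
    e_top = float(np.sum(np.square(top_samples)))
    e_bottom = float(np.sum(np.square(bottom_samples)))
    if e_bottom > 0 and e_top / e_bottom < imbalance:
        return "top"
    if e_top > 0 and e_bottom / e_top < imbalance:
        return "bottom"
    return "neither"
```

A shipping implementation would presumably compare spectral shape rather than raw energy, since overall loudness varies with the environment; the energy ratio here just keeps the sketch short.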

Monitoring of the microphones can be continuous or set to take samples at predetermined intervals, such as after a user begins to speak. Placement of the microphones can also be tweaked; in some cases they can even be located next to each other.

Finally, a more active detection method is proposed, where an internal speaker generates noise, taking the place of ambient sound waves.



Illustration of peak frequency compared to ambient noise signals produced by mics.
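The active variant can be pictured as emitting a known tone and watching its level at the microphone: a nearby surface reflects the tone back, raising its peak above the open-air baseline. The following is a minimal sketch under that assumption; the tone frequency, baseline handling, and 1.5x factor are all invented for illustration:

```python
import numpy as np

def peak_frequency_level(samples, rate, tone_hz):
    """Magnitude of the received signal at the emitted tone's FFT bin."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    bin_idx = int(np.argmin(np.abs(freqs - tone_hz)))
    return float(spectrum[bin_idx])

def object_nearby(samples, rate, tone_hz, baseline_level, factor=1.5):
    """A nearby surface reflects the speaker's tone back into the mic,
    raising the level at the tone's frequency above an open-air
    baseline measured in advance. The factor is a made-up threshold."""
    return peak_frequency_level(samples, rate, tone_hz) > factor * baseline_level
```

This mirrors the figure's idea of a peak frequency standing out against ambient noise, though the real system would need to account for the direct speaker-to-mic path as well as the reflection.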

As portable electronic devices become ever smaller, the need to develop space-saving components, or to combine parts so they serve a number of uses, becomes more pressing. Such is the case with Apple's latest iPhone 5, a device that packs 4G LTE, Wi-Fi and Bluetooth communications, a battery that can last for days, a 4-inch Retina display, two cameras, and a litany of other features into a chassis only 7.6 mm deep.

Space is already at a premium with the iPhone, as evidenced by the new Lightning connector, which Apple's Worldwide Marketing chief Phil Schiller said was needed to create such a thin device. Moving forward, the company is rumored to be adding near-field communication (NFC) for e-wallet payments, which would take up even more precious room.