Seth Horowitz is clearly into bats. So much so that it’s one of the first things you notice when talking with him. A bioacoustician and former Brown University neuroscientist, he is utterly captivated by the way they navigate the night sky. “They’re in the dark moving at up to 30 miles an hour, and they’re doing all these crazy acrobatics,” he says, his voice swooping up and down like a bat surfacing from a plunge. “Meanwhile, they’re using echolocation the way we use our eyes.”

Now CEO and chief neuroscientist of NeuroPop—a company that explores applications of sound technology—Horowitz probes the role of auditory information in our perception of the world. And there are few people for whom sound is more critical than those who are blind. Most people who are blind use sound to guide them through the world, and a few have fine-tuned their ears in such a way that they truly echolocate, like bats. But not everyone can.

Horowitz believes he might be able to change that. At NeuroPop, he's in the beginning stages of development on an assistive device for the blind that won't quite give them sight, but if it works, it could come pretty close.

At Brown, Horowitz was known for his peculiar experiments. One of them involved hooking up a microphone to a billiard ball so he could hear what a game of pool sounded like. As part of another, he strapped lasers on the backs of bats—a stunt that earned him the nickname "Dr. Evil" among students and faculty.

It's easy to get sidetracked speaking with Horowitz—his enthusiasm is contagious. You find yourself asking him about a million things, things you had never intended to ask him, simply because he makes it all sound so interesting. And occasionally unexpected. Like when he chirped midway through one of our conversations.

For Horowitz, chirping isn't a weird tic or nervous habit—it happens to be part of his job. He wants to know if humans can use technology to emulate nature's use of ultrasonic sound. Bats—his primary inspiration—send out high-frequency "chirps" and analyze the time delay of the returning echoes so they can fly swiftly through the night. "Everyone can listen for very fine-grain echoes," he says. "But you really have to train yourself to do it."

For most of human history, we have been in the dark when it comes to echolocation. It wasn't until the late 1700s, when Italian priest and physiologist Lazzaro Spallanzani blindfolded bats and observed them flying, that we caught the first glimpse of it. Spallanzani's experiments took a perverse turn when, in an effort to eliminate various senses, he lopped off his subjects' ears and tongues. Though his methods were questionable by modern standards, his conclusions were not. Unfortunately, Spallanzani's colleagues thought it strange and illogical that an animal could "see" with anything other than its eyes, and his theory lay dormant for another 140 years.

During that time, scientists variously hypothesized that bats navigated by detecting changes in airflow on their wings or that they possessed some mysterious "new sense." Finally, in the 1930s, Harvard grad student Donald Griffin formally discovered echolocation in a series of experiments involving an electroacoustical meter and a soundproof chamber. There, he learned that bats emit controlled pulses of sound above the range of human hearing, painting the room in sound. The echoes that return to their ears flesh out the soundscape with remarkable precision.

In the decades since Griffin's early work, we have discovered that bats' approaches to echolocation are incredibly diverse. Some species, like the big brown bat, send out short chirps that span a wide range of frequencies, a strategy known as frequency modulation, or FM. Others use constant frequency, or CF, emitting a single frequency over a longer period, anywhere from 5 to 30 milliseconds. There are also species, like horseshoe and mustached bats, that use a CF-FM hybrid call.

There are also different ways to process the returning echoes. All species’ cochleae—the spiral-shaped section of the inner ear devoted to hearing—are organized in such a way that different parts of the sensory tissue process different frequencies. But among bats that use CF or CF-FM hybrid calls, the portions of their cochlea that sense their particular call’s ultrasonic pitches are especially thick with nerve endings. (In humans, our cochleae sense frequencies in a logarithmic progression throughout their length.) These bats, says Elizabeth Olson, a biophysicist at Columbia University, “essentially tune their radio to their own frequency.”
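That logarithmic layout has a classic quantitative description: Greenwood's frequency–position function, whose published constants for the human cochlea are used in this sketch. Position runs from the low-frequency apex (x = 0) to the high-frequency base (x = 1).

```python
def greenwood_frequency(x: float) -> float:
    """Greenwood's fit for the human cochlea: characteristic frequency (Hz)
    heard at relative position x (0 = apex, 1 = base)."""
    return 165.4 * (10 ** (2.1 * x) - 0.88)

# Frequencies are laid out logarithmically along the cochlea: each step
# of position roughly multiplies the frequency rather than adding to it.
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x={x:.2f}: {greenwood_frequency(x):8.0f} Hz")
```

The endpoints land near 20 Hz and 20 kHz, the familiar limits of human hearing; a CF bat's cochlea, by contrast, devotes a disproportionately large stretch of tissue to the narrow band around its own call.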

In bats that use FM calls, the ability to echolocate is all in their brains. They broadcast a rapid sonic "sweep" composed of pulses ranging from 20 kHz to 100 kHz, each of which is tightly etched into their neural "muscle memory," allowing them to identify extremely subtle pitch changes in the spectrum of returning echoes. What's more, they can discriminate between impulses arriving as little as one microsecond apart. That's hundreds of times finer than humans can discern and far faster than a fully developed nervous system should be able to respond.
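To put those timing figures in spatial terms: an echo's round-trip delay maps to target distance by d = c·t/2. A back-of-envelope sketch, assuming sound travels at roughly 343 m/s in air:

```python
# Convert an echo's round-trip delay into target distance: d = c * t / 2
# (the sound travels out and back, hence the division by two).
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def delay_to_distance(delay_s: float) -> float:
    """Round-trip delay (seconds) -> one-way distance to the target (meters)."""
    return SPEED_OF_SOUND * delay_s / 2

# A target 1 m away returns its echo after about 5.8 ms:
echo_delay = 2 * 1.0 / SPEED_OF_SOUND
print(f"{echo_delay * 1000:.1f} ms round trip for a target 1 m away")

# The one-microsecond discrimination described above corresponds to a
# range resolution of about 0.17 mm:
print(f"{delay_to_distance(1e-6) * 1000:.2f} mm per microsecond of delay")
```

A millimeter-scale range resolution from microsecond timing is what lets an FM bat resolve the "acoustic texture" of something as small as an insect in mid-flight.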

The speed and fidelity at which bats process certain sounds stumped scientists for a long time. It wasn’t until 2007 that Horowitz and a team of researchers at Brown’s Bat Lab found the answer lurking in bats’ nervous systems.

They discovered that big brown bats, which use FM to echolocate, retained gap junctions in certain neurons that were devoted to processing sound. In the nervous system, gap junctions are often precursors to chemical synapses, both of which help transfer information throughout the body. Gap junctions are critical to fetal development, synchronizing muscle cells during labor and playing a key role in processes like cell migration and neural circuit formation. “If you want to lay down the initial structure for something, gap junctions are ideal,” Horowitz says. But in most mammals—including humans—expression of neuronal gap junctions decreases sharply after birth, giving way to chemical synapses and increased synaptic activity.

Somewhere in the big brown bats’ evolutionary history, though, one or more mutations let certain cochlear cells retain their gap junctions into adulthood. Gap junctions can send signals faster than synapses, making for speedy connections that can process sound much more quickly. Some of those gap junctions also function as signal filters, allowing a bat’s brain to distinguish different portions of the returned echo.

Here’s how it works. Say a big brown bat is on the prowl for food. The higher and lower pitches of its chirps detect different objects. As it searches for small prey—like a nearby June bug—it listens for subtle changes in the higher pitches of the echoes. This helps it recognize an insect by its “acoustic texture,” a detailed bit of information that’s similar to what a human might see under a microscope. Then, like the flip of a lens in an optometrist’s office, it starts hearing lower frequencies, which help it to avoid bigger objects, like a tree or another bat. Thanks to the sound filtering provided by the speedy gap junctions, big brown bats can discriminate between frequencies and time delays in just a few millionths of a second. In one fell sweep, they have all the frequencies they need to swoop through a three-dimensional soundscape.
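The high-band/low-band split described above can be caricatured in a few lines. This is only an illustration of the idea, not Horowitz's filter: it compares an echo's spectral energy above and below a hypothetical 50 kHz cutoff to decide which "lens" the information belongs to.

```python
import numpy as np

SAMPLE_RATE = 250_000  # Hz; must exceed twice the highest frequency of interest

def band_energies(echo: np.ndarray, cutoff_hz: float = 50_000):
    """Split an echo's spectral energy at cutoff_hz -> (low band, high band)."""
    spectrum = np.abs(np.fft.rfft(echo)) ** 2
    freqs = np.fft.rfftfreq(len(echo), d=1 / SAMPLE_RATE)
    low = spectrum[freqs < cutoff_hz].sum()
    high = spectrum[freqs >= cutoff_hz].sum()
    return low, high

# Simulated echoes: a large obstacle like a tree reflects mostly low
# frequencies, while a small insect reflects mostly high ones.
t = np.arange(2048) / SAMPLE_RATE
tree_echo = np.sin(2 * np.pi * 25_000 * t)
insect_echo = np.sin(2 * np.pi * 80_000 * t)

low, high = band_energies(tree_echo)
print("tree echo is low-band dominant:", low > high)
low, high = band_energies(insect_echo)
print("insect echo is high-band dominant:", high > low)
```

The bat's gap-junction circuitry performs something functionally like this split, but in analog hardware and within microseconds of the echo's arrival.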

False Starts

To Horowitz, the way bats process sound is the key to unlocking an entirely new class of assistive devices, one that could allow blind people to navigate with unprecedented freedom. His vision is ambitious. For the most part, similar previous attempts have failed spectacularly. But then again, most of them were just “clear path indicators” that warned the user of objects in the way, buzzing more intensely when an obstruction neared. They also tended to use fixed sound frequencies, meaning the information returned from those pulses was fairly simple. They had names like “Sonic Pathfinder,” “Sonic Torch,” and “Sonicguide.” Though radical for their time, they were hardly used outside of research labs.

Critics said some models presented users with too much unnecessary detail. Others complained that the battery packs were too big. The devices also couldn’t tell you whether an object was large or small or what material it was made out of. In sum, Horowitz says, they were just plain difficult to use. Newer devices are currently being marketed to the blind community, but today, most people still rely on a long cane or guide dog to get around.

Horowitz hopes his device will be viewed as a serious competitor to those time-tested techniques, but he faces a series of challenges, not least of which are technological. “I don’t know of anyone who uses a device that would enhance echolocation,” says Lore Thaler, a lecturer in psychology at Durham University in the United Kingdom and an expert on human echolocation.

Moving in the Modern World

While long canes and guide dogs are common in the blind community, they certainly aren't the only available tools. Even among those who use a cane, for example, different people prefer different techniques. That's partially due to personal preference, but also because there's a wide range of abilities among people who are blind. Some can sense light but not see objects, others can echolocate with ease, while for still others the concept of space is so challenging that they navigate by running their fingers along the wall. For that last group, says Paul Doerr, a mobility instructor, "the difference between my being right in front of you and somewhere else is sort of colossal."

Nowhere is that diversity more apparent than at the school where Doerr teaches, Perkins School for the Blind, located in Watertown, Massachusetts. Founded in 1829, it's the oldest institute of its kind in the United States. (Helen Keller is among its most famous students.) Because Perkins' residents are young and their motor skills still malleable, adult instructors can identify subtle variations from person to person and train them to maneuver accordingly.

A surprising number will end up using echolocation. "Maybe 20 percent of blind people would say yes, I echolocate," Thaler says. "That typically means they use mouth clicks. That said, there might be a high proportion of people who also use sound reflections kind of implicitly," she adds, relying on ambient sound instead of clicks.

“I have some kids who are sort of like cats,” Doerr says. “They get around beautifully. They’re graceful…they never trip over anything.” Many of these students invent their own echolocation strategies—snapping their fingers, clicking their tongues, whacking their canes.

But it doesn’t come naturally to everyone, even those trained from a young age. Thaler and her colleagues have preliminary evidence that suggests there might be a correlation between sighted people’s ability to echolocate using clicks and their ability to create visual imagery in their minds. Those sighted people, Thaler says, “don’t describe any particular visual experience, but they’re actively trying to create a spatial layout of the world.” While blind people who echolocate aren’t necessarily trying to recreate a visual experience either, they are trying to create a similar map of their surroundings.

Even for those who excel at it, the world we live in isn't ideal for echolocation, at least for humans. The high frequencies in the clicks many people use attenuate quickly as they travel through air, so the echoes that return are often too faint to be useful. Bats' specialized auditory and nervous systems have evolved to overcome this problem, but for humans, echolocation doesn't come naturally. Though it might be easier with a little help.

A More Sound Device

For Horowitz, the discrepancies between humans and bats provide something of a roadmap. The current blueprint for his sensor is a silicon-and-code version of the biological signal filter in FM bats. It would also emulate an FM bat’s chirp, sending out a wide range of frequencies that would return information about a person’s surroundings with different levels of detail. For example, one portion of the sound would sense large, stationary objects while another could pick up smaller, fast-moving obstacles. “We just hope this device will be a more natural experience than some of the big chirping devices that have come around in the past,” he says.
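A signal of the kind such a sensor might emit can be sketched as a linear frequency sweep, like an FM bat's downward chirp. The sample rate, duration, and frequency endpoints below are illustrative assumptions, not NeuroPop's specifications:

```python
import math

SAMPLE_RATE = 250_000  # Hz; illustrative, comfortably above Nyquist for 100 kHz
DURATION = 0.003       # a 3 ms chirp, within the range of real FM bat calls
F_START, F_END = 100_000.0, 20_000.0  # sweep downward, like many FM bat calls

def fm_chirp() -> list[float]:
    """Linear FM sweep: instantaneous frequency glides from F_START to F_END."""
    n = int(SAMPLE_RATE * DURATION)
    samples = []
    for i in range(n):
        t = i / SAMPLE_RATE
        # Phase of a linear chirp: 2*pi*(f0*t + (f1 - f0)/(2*T) * t^2),
        # so the instantaneous frequency is f0 + (f1 - f0)*t/T.
        phase = 2 * math.pi * (F_START * t + (F_END - F_START) / (2 * DURATION) * t ** 2)
        samples.append(math.sin(phase))
    return samples

chirp = fm_chirp()
print(len(chirp), "samples,", DURATION * 1000, "ms sweep from 100 kHz down to 20 kHz")
```

Because every instant of the sweep sits at a different frequency, each slice of the returning echo can be tied to a specific moment of emission, which is what makes the multi-resolution sensing described above possible.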

Ideally, he would be able to offer this kind of assistive device to people all over the world who are blind or visually impaired. “They’d have a Google Glass-type aesthetic—slick and non-intrusive—that could become a way of life for the people who use them,” Horowitz says.

Cramming the technology into a small, unobtrusive package will be the easy part. Tweaking the algorithms will be trickier. The real hurdle, though, is presenting the information to users in a way that paints a "picture"…except it's not a picture.

“We have to consider human processes when using a biologically-inspired sensor,” Horowitz says. While bats can decompose echoes into the fine details of a June bug, for example, humans’ understanding of sound isn’t as innately intricate. So Horowitz says his device will distill a person’s surroundings down to their essence. It’s similar to the way we learn to differentiate, through echoes and vibrations, the sounds of a booming cathedral and a tiny closet. As we grow, we begin to recognize these patterns. To adults, Horowitz says, “they’re immediately discernable. But it’s also a learned process.”

Learning to use Horowitz's biomimetic assistive device would be like learning to use any other tool. Early on, it'll feel awkward and cumbersome. "Someone who wants to learn how to use this device would have to be exposed to a simple room of a simple shape first," he says. As comfort with the device grows, users would graduate to ever more complex spaces until they gain enough proficiency to go about their daily lives.

"You're not going to get the fine level of discrimination that a bat would get—it's a prosthetic system," he admits. Still, he's optimistic about the device's prospects and what it could do for the blind community. "If you put this new thing on, and if you're able to navigate with a couple weeks' training, it would revolutionize the level of autonomy that blind people have."

Envisioning the Future

“I’d be the first to try it,” says Robert, a 15-year-old student at Perkins, when I ask him to imagine a device like Horowitz’s. He and his two peers, Shae, who is also 15, and Kenny, who is 19, met with me one early fall day in a Gothic building at the center of Perkins’ campus.

People who are visually impaired face many challenges in a world that privileges sight. Even a task as simple as crossing the street can present a number of obstacles. While unaided echolocation can tear down some walls, it certainly doesn’t eliminate them all. And it doesn’t work for everyone.

A successful biomimetic device for the blind, then, would have to accommodate a wide variety of preferences. Take Robert—he can see some amount of light up to 15 or 20 feet away, giving him spatial information that's not available to every blind person. Shae, on the other hand, has some vision in one eye, and he says his quiet personality means he's usually listening intently, allowing him to quickly pick out sounds. Their English teacher, Jeff Migliozzi, who has been using a cane for 50 years, is hesitant to use anything else. Even in this small room, it's clear that Horowitz's device doesn't just have to meet one high standard; it has to meet perhaps a dozen or more of them.