[This is part two of a two-part column. Read part one, “Here Come The Animals”, here.]



“We are born trapped in our own selfish skins, and we open our eyes to the rings of existence around us.” – Adam Gopnik, “Dog Story,” The New Yorker

Harold: “You sure have a way with people.” Maude: “Well, they’re my species!” – Harold and Maude

The other day I went for a hike in High Park, a large tract of forest in Toronto’s West End. Nature relaxes me. I watch the squirrels and the birds and place my hand on the trunks of the largest trees, feeling for something I can’t readily describe. At one point on a quiet trail I realized I wasn’t alone. Coming towards me was another hiker, weaving drunkenly, catching his feet in the roots. He was texting on his smart phone. So, actually, I was still alone. He was only notionally in the forest. His mind was elsewhere, as my mind is so often elsewhere, flitting through information space, that endlessly diverting playground.

It’s easy to criticize technology in this way—as the great distractor, dissociating us from our bodies, and from nature. If there is a Devil, surely the iPhone is his spawn, deracinating us, slowly turning the human species into nothing more than brains in vats, and transforming the green Earth into a giant dumpster for our byproducts. Except … except God and the Devil are always neck and neck, aren’t they, bobbing and surging like Olympic rowers in a race that never ends. And I bet God has an iPhone too – actually, He probably has an early prototype of Google Glass, Google’s much-anticipated “augmented reality” spectacles. But even as these new God-gadgets threaten new ways for humans to check out, they may also promise new ways for humans to check in.

For this column, a thought-experiment. Consider the Smart Phone – or let me consider mine. It’s pretty old at this point – a third generation iPhone that my friend Roy gave me when he upgraded a couple years ago. A lot of the new apps don’t even work on it. But “SoundHound” does. When I’m in some public space and I hear a song I like, I hit the app and it listens too, processingprocessingprocessing, until eventually it blinks out the artist and track title. Amazing!

My iPhone is like a little sensing organism. It hears (microphone), sees (camera), feels (touch screen), balances (gyroscope), and is self-aware enough to know both its acceleration (accelerometer) and position (GPS). In these different ways my iPhone senses the external world and, via different apps, generates an appropriate response. This is not a bad description of a crude mammalian sensory-motor system.

Here is my big idea. Every week some new bit of research on the chemical sensors of the octopus, or elephant sub-sonic hearing or bird navigation, makes its way into the scientific journals. It’s mostly other experts who see it. But what if we began to apply that research in the way we use our technology? What if we used it to build Animal-Empathy Apps for every organism?

Think of it as the ultimate design challenge, a new kind of artist–scientist collaboration, and a new kind of product assembly line. At one end is a vast, growing Open Source Wiki of consolidated information about different animal sensory systems and ecology and comparative neurobiology and behavioral data and all the rest. At the other end is the ongoing creative evolution of better and better interfaces. What begins as a gimmick gets increasingly detailed and immersive and intelligent. Now, when our technology mediates our experience, it does so through the simulated prisms of an alien nervous system.

So you start, say, with a Whale Watching app. On the main screen are four little images:

“What is it like to be a …

1. killer whale?

2. humpback?

3. bottle-nosed dolphin?

4. right whale?”

Choose one and the screen goes dark. The view/camera function opens, altered now so that everything looks a little bluer and more fish-eyed (the whale’s eye has more rod cells than cones, thus it sees more of the blue end of the colour spectrum, plus all marine mammal eyes have stronger, more spherical lenses). A separate algorithm adds texture to moving shapes in an imaginative recreation of echolocation, as the species-specific song or clicks or whistles play underneath. Simple – underwhelming, perhaps – but still evocative.
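As a toy illustration of what version one of that filter might look like, here is a minimal sketch in Python: a blue-shift on the colour channels plus a crude radial “fisheye” remap. The mixing weights and distortion strength are illustrative guesses, not measured cetacean optics.

```python
import numpy as np

def whale_eye_filter(img: np.ndarray) -> np.ndarray:
    """Crude 'whale vision' pass: tint toward blue and apply a
    fisheye-style radial remap. All constants are illustrative."""
    h, w, _ = img.shape
    # 1. Blue shift: damp red, keep blue (a rod-dominated, blue-biased retina).
    tint = np.clip(img.astype(float) * np.array([0.5, 0.8, 1.0]), 0, 255)

    # 2. Radial remap: each output pixel samples a source pixel pushed
    #    outward from the centre, bending straight lines near the edges.
    yy, xx = np.mgrid[0:h, 0:w]
    cx, cy = (w - 1) / 2, (h - 1) / 2
    nx, ny = (xx - cx) / cx, (yy - cy) / cy      # normalised coords in [-1, 1]
    r2 = nx ** 2 + ny ** 2
    scale = 1 + 0.35 * r2                        # distortion strength (arbitrary)
    sx = np.clip((nx * scale * cx + cx).astype(int), 0, w - 1)
    sy = np.clip((ny * scale * cy + cy).astype(int), 0, h - 1)
    return tint[sy, sx].astype(np.uint8)
```

The per-channel weights could be swapped out for anything the sensory-biology wiki suggested; the structure (tint, then remap) is the whole idea.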

Version two goes a step further and uses the toothed whale’s primary sense: echolocation. Fishermen already use cheap sonar devices to find fish. Portable ultrasound. Works above water and below. Now, when you click on the killer whale you hear real functioning echolocation clicks and watch a visual reproduction of how a killer whale might experience … er, Prospect Park. On roller-blades. Minus 99% of what’s actually true and important about what it’s like to be a killer whale (you have to begin somewhere).
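The physics underneath that sonar trick is one line of arithmetic: a click travels out, bounces back, and the round-trip time gives the range. A minimal sketch, using the standard approximate speed of sound in seawater:

```python
SPEED_OF_SOUND_SEAWATER = 1500.0  # metres per second, approximate

def echo_range(round_trip_s: float) -> float:
    """Distance to a target, given the round-trip time of a sonar ping.
    The sound covers the distance twice (out and back), hence the halving."""
    return SPEED_OF_SOUND_SEAWATER * round_trip_s / 2
```

So an echo that returns in a tenth of a second puts the target 75 metres away, which is roughly the scale a killer whale works at when it interrogates the water ahead of it.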

Whales are just for starters. We could have apps for dogs and cats, calibrated to their particular visual and auditory systems – specialized colour spectrum, amplified sounds, higher frequencies. Imagine the potential market, given the pet-owner demographic: “Look dear, this is how Dexter sees the world.” Like a dog sniffing a pole, our iPhones could have chemical sensors that detect olfactory signatures and translate them into an undulating tapestry of indexed and colored airflows.
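The dog-vision half of that app is the easiest to sketch. Dogs are dichromats, with roughly blue- and yellow-sensitive cone types, so red and green collapse toward a single yellowish signal. A minimal Python sketch, with mixing weights that are illustrative rather than a real colorimetric model:

```python
import numpy as np

def dog_view(img: np.ndarray) -> np.ndarray:
    """Rough dichromat simulation: fold red and green into one
    'yellow' channel, leave blue alone. Weights are illustrative."""
    rgb = img.astype(float)
    yellow = 0.6 * rgb[..., 0] + 0.4 * rgb[..., 1]  # red + green collapse
    blue = rgb[..., 2]
    out = np.stack([yellow, yellow, blue], axis=-1)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Run a photo of a red ball on green grass through this and the two colours become nearly indistinguishable shades of dun, which is exactly the party trick “Look dear, this is how Dexter sees the world” requires.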

Insert an infrared sensor and you could enter the world of the mosquito – how do they find you? (Actually, by smell – but blood-sucking bed bugs are attracted to your body heat, as are pit vipers and vampire bats.) Or an app for horses and cows. Animal science professor Temple Grandin, famously autistic, “thinks in images” and says livestock do the same. She’s been able to design more humane cattle corrals because she sees what will spook the animals: shadows, reflected light, sudden changes in contrast, small objects, blowing clothing – specific details that most of us would overlook but many farm animals will balk at. An animal-empathy app could find practical application here, scanning a particular environment for farmers and highlighting potential problems. Here we go, into the cow.

Although some of this functionality is out of reach at the moment, the technology is getting cheaper and faster and more portable (and responsive) by the day. As it does so, expect some radical sensory extension: tools to visualize the chemical world of plant information exchange, the electrical fields of amphibians, the fluxing and shifting magnetic field of the planet itself – the list goes on and on. Put three artists and three biologists and three ecologists in a room and you’d have fifty cool Animal-Empathy App ideas in under an hour. (My friend Marni suggests The Kafka App: “the alarm that wakes you up in the mind of a cockroach.”)

Apps are only the beginning. Anachronistic aquariums and zoos – bad ways to deal with the legitimate human desire to seek out nonhuman nature – could be slowly replaced by the next generation of interpretive centers, all kitted out with the latest animal sensory system technology. (“How was the zoo today?” “Great. I was an anaconda for a while, then an orangutan named Namu, and then I was a sloth for a really looooong time.”)

Keeping pace with our crimes, the human capacity for empathy continues to expand. Over the past two hundred years, in part via the printed word (that first generation of perspective-expanding technology), our circle of empathy has moved far beyond the small ruling world of white male property owners. Lately we’ve seen a new cultural focus, in works like The Curious Incident of the Dog in the Night-Time, on stories that dramatize the interior landscape of mental illness, for instance – the world of autistics, schizophrenics and depressives. We go to movies like Melancholia, or Spider. And non-human nature will be next. There is nowhere else to go.

This is where the arrow of consciousness studies is pointing: out beyond us, to the next concentric ring, in what we might hope is an ever-expanding circle of empathy and understanding. Novelist Barbara Gowdy has already taken her readers there in her novel The White Bone, a story about the persecution of a herd of elephants by poachers, told entirely from the elephants’ point of view. There are many other examples.

My own view is that the more we are able to see ourselves as part of the community of nature, the more sympathetic and respectful and conscious we may become. As I argued in my last column, something of an animal’s experience can be known to us – certainly more than is known at present. We used to believe that infants didn’t feel pain either; we know better now.

Now let’s go back to my walk in High Park, this time with a different outcome. Along comes the other hiker, still stumbling on roots. But now it’s because he’s looking up, into the canopy. He’s wearing a pair of Android SmartShades™ and he’s launched the top-selling Bird World App. This bit of software doesn’t just tell you what species are singing. Programmed into its database is an encyclopedia of bird sounds, from the begging calls of hatchlings to companionship appeals, warnings of threats, aggression challenges and more. The hiker now has access to the rich and multilayered story of the forest, as the app tracks the movement of potential predators, identifies nest areas and flight arcs and opens up a world few of us know is even there.
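At its core, that encyclopedia is just a lookup table from an identified call to a plain-language gloss. A hypothetical sliver of what the Bird World database might contain, with the entries and function names being my own illustrative inventions:

```python
# Hypothetical call catalogue: (species, call type) -> plain-language gloss.
# Entries are illustrative examples, not a real ornithological database.
CALL_CATALOGUE = {
    ("black-capped chickadee", "chick-a-dee"): "alarm: predator nearby",
    ("black-capped chickadee", "fee-bee"): "territorial song",
    ("american robin", "whinny"): "mild alarm",
    ("blue jay", "jeer"): "mobbing call: raptor spotted",
}

def interpret_call(species: str, call_type: str) -> str:
    """Return a human-readable gloss for a recognized call,
    or a fallback message for anything not in the catalogue."""
    return CALL_CATALOGUE.get(
        (species.lower(), call_type.lower()),
        f"unrecognized call for {species}",
    )
```

The hard part, of course, is everything upstream of the lookup: classifying the raw audio into species and call type in the first place.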

Looking up at the birds – enchanted, distracted – the hiker doesn’t notice my hand reach for his $800 SmartShades. I pluck them easily from his face and disappear into the trees before he realizes what’s happening.

Lesson number one in the expanded-empathy future: never trust a human.