For some, the word “technology” might evoke cold imagery of steely robots and complex computer algorithms. But a talk on “empathetic technology” at this year’s Wired Health conference did a lot to change this perception.

Our smart devices may soon know how we are feeling even before we do. With approximately 39 million people in the United States currently owning a smart speaker, technology that caters to our needs is increasingly ubiquitous, taking up ever more of our personal space. But smart devices can do much more than merely play our favorite song or search the internet when we ask them to. Smart speakers may soon be able to diagnose us or tell how we are feeling.

At Wired Health — an annual conference that brings to the fore the latest developments in health tech — neuroscientist and technologist Poppy Crum, Ph.D., gave a talk aptly titled “Technology that knows what you’re feeling.” Treading a fine line between ominous and hopeful, the title made a powerful point: soon, consumer technology may know our mental and physical states before we do.

But how, exactly, can technology achieve this? How can we harness its potential to help us elucidate mental and physical conditions, and what role does empathy play in all of this? These are some of the questions that Crum answered at Wired Health — an event that this year took place at the Francis Crick Institute in London, United Kingdom.

What is empathetic technology?

Crum, who is the chief scientist at Dolby Laboratories in San Francisco, CA, and an adjunct professor at Stanford University’s Center for Computer Research in Music and Acoustics, defines empathetic technology as “technology that is using our internal state to decide how it will respond and make decisions.”

So how can technology read our internal states? Crum’s talk at Wired Health featured some interesting examples of neurophysiological “giveaways” that the right type of technology can now pick up easily — a phenomenon the scientist referred to as “the end of the poker face.”

For instance, as Crum showed in her talk, when we are feeling overwhelmed by a cognitive load — or, in simpler terms, when we are struggling to understand something — our pupils dilate. Pupillometry research from the last few decades has shown that we can track multiple cognitive processes, such as memory, attention, and mental load, by examining the behavior and measuring the diameter of our pupils.

In fact, this is an experiment we can all “try at home.” In 1973, renowned psychologist Daniel Kahneman wrote:

“Face a mirror, look at your eyes and invent a mathematical problem, such as 81 times 17. Try to solve the problem and watch your pupil at the same time, a rather difficult exercise in divided attention. After a few attempts, almost everyone is able to observe the pupillary dilation that accompanies mental effort.”

Further experiments have shown that skin conductance, also known as galvanic skin response, can help predict a person’s emotional response when watching a movie or a football match. How much sweat a person’s skin secretes, as well as changes in the electrical resistance of the skin, can predict “stress, excitement, engagement, frustration, and anger.” Furthermore, humans exhale chemicals, such as carbon dioxide and isoprene, when they feel lonely or scared.
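To give a rough sense of how a device might turn a raw skin-conductance trace into an "emotional response" signal, here is a minimal, purely illustrative sketch: it flags moments where a galvanic skin response reading jumps above its recent moving-average baseline. The readings, window size, and threshold are all hypothetical and not taken from Crum's work.

```python
# Illustrative sketch (not Crum's method): flagging arousal spikes in a
# galvanic skin response (GSR) trace by comparing each sample to a
# trailing moving-average baseline. All values here are hypothetical.

def moving_average(signal, window):
    """Trailing moving average; early samples average over what exists so far."""
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def arousal_events(gsr, window=3, threshold=0.5):
    """Return indices where the reading exceeds its recent baseline by > threshold."""
    baseline = moving_average(gsr, window)
    return [i for i, (x, b) in enumerate(zip(gsr, baseline)) if x - b > threshold]

# Hypothetical skin-conductance readings (microsiemens), with a spike mid-trace:
trace = [2.0, 2.1, 2.0, 3.5, 3.6, 2.2, 2.1]
print(arousal_events(trace))  # → [3, 4]
```

A real system would, of course, use calibrated sensors and far more sophisticated signal processing, but the principle is the same: the interesting information is the deviation from a person's own baseline, not the absolute value.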
In fact, in the TED talk below, Crum tracked the carbon dioxide that members of the audience exhaled as they watched suspenseful scenes from a thriller movie.

Waller also uses a pair of glasses to simulate vision problems, and other researchers have used immersive technology, such as virtual reality simulators, to recreate the experience of living with “age-related macular degeneration, glaucoma, protanopia, and diabetic retinopathy.”