It looks like Apple hopes your emotions are written all over your face.

As first reported last week by the Wall Street Journal, the technology giant has acquired Emotient, a startup that uses artificial intelligence for "emotion detection."

CBC Radio technology columnist Dan Misener explains the coming wave of emotionally aware computers.

What do we know about Emotient?

It's a company that specializes in what's called "automated facial expression recognition." Basically, that involves teaching a computer to look at human faces and recognize emotions.

The company says its technology can recognize a number of different facial expressions — joy, sadness, surprise, anger, fear, disgust, and contempt.

And the company says it can identify those emotions in real time, even in busy environments like a retail space.

Details of the company's acquisition by Apple — like how much Apple paid — haven't been revealed, and Apple hasn't said anything about what it plans to do with the technology.

But we do know this is part of a growing field, called "affective computing," and there's a huge amount of money and attention going into it right now.

How does a computer recognize emotions in someone's face?

Michel Valstar is a computer science professor at the University of Nottingham, and he specializes in getting computers to recognize facial expressions.

He said these kinds of systems look at facial features and the geometry of a person's face.

"What kind of wrinkles are there, what kind of shape does the corner of your mouth have, how much white is there in the eyes," are facial features Valstar said a computer might analyze.

"Or you can go the other way, and you can look at the geometric shape of the face. Because when you smile, obviously the corners of your mouth go out and up. They will indicate to a system that you are smiling," he said.

By looking at the textures and geometry of someone's face, and combining those with sophisticated pattern-matching, a computer can measure what are called the six "basic expressions" — anger, disgust, fear, happiness, sadness and surprise.
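As a rough illustration of the geometric approach Valstar describes — not Emotient's actual method — the sketch below classifies a smile from hypothetical 2-D facial landmark coordinates. The landmark names, coordinates, and threshold are all invented for the example; real systems combine many such features with learned pattern-matching.

```python
# Illustrative sketch only: detect a smile from hypothetical facial
# landmarks using simple geometry, as described in the article.

def mouth_corner_lift(landmarks):
    """Average vertical lift of the mouth corners above the mouth centre.

    `landmarks` maps names to (x, y) points in image coordinates,
    where the y-axis points down, so raised corners have smaller y.
    """
    left = landmarks["mouth_left"]
    right = landmarks["mouth_right"]
    centre = landmarks["mouth_centre"]
    # Positive lift means the corners sit above the centre of the mouth.
    return ((centre[1] - left[1]) + (centre[1] - right[1])) / 2.0

def classify(landmarks, threshold=2.0):
    """Crude two-class decision: 'happiness' if the corners lift enough."""
    return "happiness" if mouth_corner_lift(landmarks) > threshold else "neutral"

# Invented landmark positions for a smiling face and a neutral face.
smiling = {"mouth_left": (40, 95), "mouth_right": (80, 95), "mouth_centre": (60, 100)}
neutral = {"mouth_left": (40, 100), "mouth_right": (80, 100), "mouth_centre": (60, 100)}

print(classify(smiling))  # happiness: corners sit 5 px above the centre
print(classify(neutral))  # neutral: no lift
```

A production system would extract dozens of such geometric and texture features and feed them to a trained classifier rather than a hand-set threshold.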

What's driving the push to develop computers that can sense emotions?

The earliest adopters of this technology have been advertisers and market researchers.

Let's say you have a video of customers using your product, or walking past a retail display, or watching an advertisement. You can use software like Emotient to get a sense of how people feel about your product.

Emotient's emotion detection technology could be used to give retailers a sense of how customers passing by feel about a product. (Brian Burnett/CBC)

However, that brings up some obvious privacy and ethical questions.

It's one thing for a computer to analyze your face if you know it's happening, and you've agreed to be part of a market research group.

But it's another thing for people simply walking down the street or browsing through a store to have their faces analyzed without their knowledge or consent.

Who else might be interested in using emotionally aware computers?

Valstar focuses his research on medical conditions that alter expressive behavior, like Alzheimer's disease, depression, or chronic pain. And he said computers that can recognize facial reactions could be beneficial in his work.

"If you could measure how often somebody smiles and how intensely people smile, but also whether they look at you or avert their gaze, those are very strong objective signals that you could use in diagnosis and monitoring of people with varying medical conditions," he said.

He also said facial expression technology could offer clinicians more objective measurements than they can achieve right now, because facial expressions can be difficult to monitor, and interpreting them can be subjective.

How will Apple use the face-reading technology it's acquired?

It's unclear at this point, because Apple hasn't said what it plans to do with Emotient's technology. But many within the facial expression recognition community have been speculating.

One possible use Apple may have for Emotient's technology is improving the way the digital assistant Siri relates to users. (Suzanne Plunkett/Reuters)

One obvious possibility is that Apple will use this technology to improve Siri, its personal assistant software. If Siri knows that you're happy, or sad, or tired, or bored, it could tailor its responses to you.

Another possibility is using the technology to tailor recommendations. Apple has a large content business — it sells e-books, TV shows, movies, and music. Emotion detection could help Apple recommend something that suits your mood.

There are also rumours Apple is working on its own car. It's not hard to imagine this sort of technology monitoring drivers, especially for safety purposes. For example, the technology could ensure a driver's eyes are on the road and they're paying attention. Or it could look for signs of drowsiness.
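To make the drowsiness idea concrete, here is a minimal hypothetical sketch: flag the driver when a per-frame measure of eye openness (such as an eye-aspect-ratio derived from facial landmarks) stays below a threshold for several consecutive frames. The threshold, frame count, and sample data are all invented for illustration.

```python
# Hypothetical drowsiness monitor: alarm when eye openness stays low
# for several consecutive video frames. Values are illustrative only.

def drowsy_frames(openness_per_frame, threshold=0.2, consecutive=3):
    """Return the frame indices at which a drowsiness alarm would fire."""
    alarms, run = [], 0
    for i, openness in enumerate(openness_per_frame):
        # Extend the run of low-openness frames, or reset it.
        run = run + 1 if openness < threshold else 0
        if run >= consecutive:
            alarms.append(i)
    return alarms

# Simulated per-frame eye openness: alert, then eyes drifting closed.
frames = [0.35, 0.33, 0.18, 0.15, 0.12, 0.30, 0.10, 0.09, 0.08]
print(drowsy_frames(frames))  # [4, 8]
```

Requiring several consecutive low frames, rather than reacting to a single one, keeps ordinary blinks from triggering the alarm.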

When asked about the possibility of Emotient's technology being used inside an Apple car, Michel Valstar said, "I think this is getting to an area that I'm not allowed to talk about."

But wherever it ends up, this is probably not the last we've heard about emotionally aware computing — just the beginning.