It’s flu season, and many of us find ourselves glancing nervously at anyone coughing or sniffling in our vicinity. But how, besides shielding ourselves from public sneezers, do we avoid coming into contact with infections?

It turns out our brains are quite finely tuned to detect illness in others. New research suggests that subtle facial cues alert us to infections mere hours after they take hold. This research could one day help train AI systems to detect illness as well.

A study published in the journal Proceedings of the Royal Society B took 16 healthy volunteers and injected them, at different times, with both a placebo and an E. coli endotoxin that triggers temporary flu-like symptoms. The volunteers, who did not know which injection they had received, were photographed two hours after each shot. These photographs were then shown to 62 participants, who were asked to judge whether the person in each picture was healthy or sick after viewing the photo for only five seconds.

The participants correctly identified a sick person only 52 percent of the time, hardly better than chance, but they identified a healthy person 70 percent of the time. Facial features associated with judgments of sickness included redder eyes; duller, paler skin; a more swollen face; a droopier mouth and eyelids; and paler lips. The sick photos were also rated as looking more tired.

“We expected that people would be better than chance at detecting sick people, but far from 100 percent since they were only allowed to see a photo for a few seconds,” says John Axelsson, a professor at Stockholm University and a co-author of the study. “We expect people to be a lot better when they can interact for real with someone and then also use other cues such as biological motion, smell, etc.”

The research was limited by the study’s small size and by the fact that all the volunteers were healthy and Caucasian, Axelsson says. Further research is needed to look at different ethnic groups, different ages and people with chronic disorders. More studies could also identify additional features important to our judgments of sickness and health beyond the ones found here, and could show whether we treat people who appear ill differently.

Despite these limitations, Axelsson hopes a better understanding of non-verbal signs of sickness can help doctors improve their diagnoses. The sickness signs identified by the study will also “very likely” one day be used to train AIs to detect illness, he says, though this is not part of his own research.

Other recent research has shown just how much subtle facial features and movements can reveal about our health and mental states, says Mark Frank, a communications professor at the University at Buffalo, State University of New York, who studies facial expressions. The presence or absence of certain tiny facial movements can indicate disorders like Bell’s palsy or brain tumors. Microexpressions, fleeting looks often too quick to register consciously, can reveal schizophrenia or whether a person with depression is recovering.

“Subtle movements in eyelids can reveal fatigue and can even predict when a driver is more likely to crash his or her vehicle,” Frank says.

Understanding what our faces say about our health will be important in training AIs, Frank says. AIs could help humans do real-time analysis and decision-making, which could be especially important when people are “overwhelmed by too much information.”

One could imagine an illness-detecting AI being used in airports, for example, scanning thousands of faces per second. Airports in some parts of the world already use temperature scanners to weed out potentially sick individuals; an AI could improve upon such technology to identify people who are sick without fevers. Such technologies would likely bring up privacy concerns, as well as debates over whether they're effective as containment strategies.

Developers are already working on a variety of neural networks – systems that learn on their own by analyzing huge amounts of data – to detect signs of illness earlier or better than humans can. Recent examples include an algorithm that reads chest X-rays and diagnoses pneumonia, an AI for spotting very early lung cancers on CT scans, and a Google technology for detecting early signs of eye diseases that can cause blindness. But for a neural network to learn, it needs to be told what to look for. Which means humans need to teach it. Which means humans need to know. Studies like Axelsson’s, which show what facial changes are associated with sickness, could give humans the tools to do the teaching.
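That labeling requirement is the heart of supervised learning, and it can be sketched in a few lines of code. The example below trains a single perceptron, the simplest kind of neural network, to separate "sick" from "healthy" examples. The feature names and numbers are invented purely for illustration; they are not from Axelsson's study or any real diagnostic system. The point is only that every training example carries a human-supplied label:

```python
# Toy supervised-learning sketch: a single perceptron learns to flag
# "sick" faces only because humans labeled every training example.
# Features and values below are invented for illustration.

# Each example: (eye_redness, skin_pallor) on a 0-to-1 scale, plus a
# human-assigned label (1 = "sick", 0 = "healthy").
training_data = [
    ((0.9, 0.8), 1),  # a human rater labeled this photo "sick"
    ((0.8, 0.7), 1),
    ((0.7, 0.9), 1),
    ((0.2, 0.1), 0),  # a human rater labeled this photo "healthy"
    ((0.1, 0.3), 0),
    ((0.3, 0.2), 0),
]

def train_perceptron(data, epochs=20, lr=0.1):
    """Adjust weights toward each mislabeled example's true label."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in data:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred  # nonzero only when the guess is wrong
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

w, b = train_perceptron(training_data)
```

Without the 1s and 0s attached by a human, the update rule has nothing to correct against, which is why studies cataloging the facial signs of sickness matter to AI developers.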

In the meantime, now you know to stay away from people with subtly drooping eyelids (though maybe they’re just tired). Better yet, just get a flu shot.