The seemingly lonely people swore more, and talked more about their relationship problems and their needs and feelings. They were more likely to express anxiety or anger, and to refer to drugs and alcohol. They complained of difficulty sleeping and often posted at night. The non-lonely control group, perhaps fittingly, began a lot more conversations by mentioning another person’s username. They also posted more about sports games, teams, and things being “awesome.”

This study was far from a perfect window into Twitter users’ souls. Certainly, people can talk about their needs and feelings without being lonely. But natural-language processing is nevertheless making it easier for scientists to understand what different emotions look like online. In recent years, researchers have used social-media data to predict which users are depressed and which are especially happy. As the tools for analysis grow more sophisticated, a widening array of emotions and mental-health conditions can be predicted from the words that people are already typing into their phones and computers every day.


In some cases, researchers can unearth fine-grained differences within amorphous emotions. Take, for instance, empathy. There’s long been an idea in psychology that there are two types of empathy: “Beneficial” empathy, or compassion, involves sympathizing with someone and trying to help that person. Meanwhile, “depleting” empathy entails feeling someone’s actual pain—and suffering yourself in the process. For a paper that is still undergoing peer review, another group of researchers at the University of Pennsylvania analyzed social-media language to determine how these two types of empathy are expressed. They found that people who demonstrate compassion tend to say things like “blessed,” “wonderful,” “prayers,” or “family.” Those who express depleting empathy use words like “me,” “feel,” “myself,” and “anymore.”

This might seem like a minor distinction, but according to Lyle Ungar, one of the study’s authors, distinguishing between the two can help people in caregiving professions, such as doctors, understand when their empathy might be counterproductive. Depleting empathy can lead to burnout. “I can really care about you and not suffer with you,” Ungar says. “I can worry that there’s poverty in Africa and donate money to charity without feeling what it’s like to have malaria.”

Beyond common emotions, language-analysis technology might also shed light on more serious conditions. It might one day be used to predict psychosis in patients with bipolar disorder or schizophrenia. Episodes of psychosis, or losing touch with reality, can be shortened or even stopped if caught early enough, but many patients are too far gone by the time loved ones realize what’s happening. And it’s difficult for people going through psychosis to recognize it themselves while they’re in the midst of it.