By Matthew Warren

The age of social media has opened up exciting opportunities for researchers to investigate people’s emotional states on a massive scale. For example, one study found that tweets contain more positive emotional words in the morning, which was interpreted as showing that most people are in a better mood at that time of day.

The premise of this line of research is that our word choices reflect our psychological states – that if someone uses more positive or negative emotional words, this is a good indication that they are actually experiencing those emotions. But now a new study has thrown a spanner in the works, finding that – for spoken language at least – this assumption might not hold up. In their preprint posted recently on PsyArXiv, Jessie Sun and colleagues found that emotion-related words do not in fact provide a good indication of a person’s mood, although there may be other sets of words that do.

Sun’s team asked 185 American university students to wear a recording device for a week; the device captured a 30-second snippet of sound every 9.5 minutes. Four times per day, the participants also completed a survey via text message measuring the positive and negative emotions they had experienced over the previous hour.

The team ended up with a whopping 150,000 recordings, which research assistants transcribed over the course of two years, weeding out clips that contained no speech or just a few words. They then scored each recording according to how many positive and negative emotion words it contained (like “sweet” or “hurt”), by running the text through an analysis programme called the Linguistic Inquiry and Word Count (LIWC), which contains dictionaries of words associated with different topics. Finally, the team averaged the scores for all clips from the same three-hour period surrounding each questionnaire, ending up with 1,579 language-based emotion measurements that they could directly compare to participants’ self-reported mood.
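The dictionary-based approach described above can be illustrated with a minimal sketch. Note that the word lists below are illustrative stand-ins, not the real (proprietary) LIWC dictionaries, and the function names are hypothetical:

```python
# Toy sketch of dictionary-based emotion scoring, in the spirit of LIWC.
# POSITIVE and NEGATIVE are illustrative stand-ins for LIWC's dictionaries.

POSITIVE = {"sweet", "love", "happy", "great"}
NEGATIVE = {"hurt", "sad", "angry", "awful"}


def emotion_scores(transcript: str) -> dict:
    """Return the proportion of positive and negative emotion words in one clip."""
    words = [w.strip(".,!?").lower() for w in transcript.split()]
    if not words:
        return {"positive": 0.0, "negative": 0.0}
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return {"positive": pos / len(words), "negative": neg / len(words)}


def window_average(clips: list[str]) -> dict:
    """Average per-clip scores across all clips in one three-hour window."""
    scores = [emotion_scores(c) for c in clips]
    n = len(scores)
    return {
        "positive": sum(s["positive"] for s in scores) / n,
        "negative": sum(s["negative"] for s in scores) / n,
    }
```

For example, a window containing the clips “sweet day” and “it hurt” would average out to a positive score of 0.25 and a negative score of 0.25 – it is these averaged scores that the researchers compared against self-reported mood.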

The researchers found that – contrary to the assumptions of some past studies – the number of positive and negative emotional words was not associated with participants’ actual mood. “Our findings suggest that researchers should not assume that fluctuations in … [the use of emotional words] … can be used as a proxy for subjective emotion experience, at least for spoken language”, the authors write.

But the recordings did contain some emotional information: the research assistants’ assessment of the speaker’s emotions, based on listening to the recordings, was associated with participants’ rating of their own mood. The authors suggest that the human raters were picking up on non-verbal cues relating to emotion – things like intonation and volume – that the programme itself was missing.

In an exploratory analysis, the authors also examined whether any other sets of words unrelated to emotion could predict participants’ mood. They found that greater use of words related to socialising, like “you” or “we”, was associated with experiencing more positive emotion, while use of maths words, like “minus” and “number”, was related to less positive emotion. However, these associations were weak, so may not be useful measures of emotion, say the authors.

There are other possible explanations for the null results in the study, which the authors acknowledge. It could be that a person’s use of emotion-related words taps into an aspect of emotion that also isn’t captured by self-report questionnaires – perhaps one that participants themselves aren’t consciously aware of. Alternatively, the dictionaries themselves may not always reflect how people use words: for example, the word “pretty” appears in the positive dictionary, but could be used in a negative context (e.g. “it was pretty terrible”). But even taking into account these limitations, the new study demonstrates the importance of checking the validity of tools in psychology research, to make sure that they are actually measuring what we think they are measuring.

The Language of Well-Being: Tracking Fluctuations in Emotion Experience through Everyday Speech

Matthew Warren (@MattbWarren) is Staff Writer at BPS Research Digest