26 Oct 2016

By Emma Goodman

A new Reuters Institute report, Brand and Trust in a Fragmented News Environment, has found that many news users prefer an algorithm to choose their news rather than an editor. Although most of those interviewed for the report had not previously given much thought to how their news was curated, when asked, they preferred algorithms, particularly the younger and more technologically engaged among them. This was despite the fact that most had concerns about the accuracy of news content on social media, and many trusted the experience of established news brands.

Aggregators such as Google News or Apple News use algorithms to surface stories in response to search terms or past consumption. Social networks such as Facebook, which is far more widely used than any aggregator, also use algorithms to decide which stories feature most prominently in a user’s feed, from among the brands that a user follows and posts from friends.
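The ranking logic described above can be sketched in a few lines. This is a purely hypothetical illustration, not any company's actual system: the function names, topic labels, and the simple "count past topics" scoring rule are all invented to show how past consumption can drive story selection.

```python
# Hypothetical sketch: rank candidate stories by overlap with the
# topics a user has read before. All names and data are invented.

def rank_stories(stories, reading_history):
    """Order stories by how often the user has read their topics before."""
    topic_counts = {}
    for past_story in reading_history:
        for topic in past_story["topics"]:
            topic_counts[topic] = topic_counts.get(topic, 0) + 1

    def score(story):
        # A story scores higher the more its topics match past reading.
        return sum(topic_counts.get(t, 0) for t in story["topics"])

    return sorted(stories, key=score, reverse=True)

history = [{"topics": ["politics", "economy"]}, {"topics": ["politics"]}]
candidates = [
    {"title": "Transfer news", "topics": ["sport"]},
    {"title": "Budget vote", "topics": ["politics", "economy"]},
]
ranked = rank_stories(candidates, history)
# "Budget vote" ranks first because it matches the user's past reading
```

Real systems combine many more signals (recency, social connections, engagement predictions), but the basic principle, surfacing stories in response to past consumption, is the same.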

The report is based on focus group discussions in the UK, US, Germany and Spain, with both younger (20-34 years old) and older (35-54) news users, on how they come across news in a distributed environment where aggregators and social media are increasingly important intermediaries. It provides a qualitative supplement to the quantitative research conducted for the 2016 Digital News Report.

The popularity of algorithmically selected news is in line with the wider 2016 Digital News Report survey, where 36% of respondents were happy to have news selected for them automatically based on what they have read before, compared to 30% who said that having stories selected by editors and journalists was a good way to get news.

Why are algorithms more popular?

Those who favored algorithms did so because of their perceived independence from editorial and political agendas, and because of the potential for personalization based on prior reading habits. Since algorithms present a variety of sources, users felt they had the chance to use their own judgment in interpreting a story. The research found that trust in news was associated with engaging with a variety of sources, and, as the DNR survey found, many people are starting to see themselves as editors of their own news consumption, mixing and matching a range of different sources and proactively managing their various feeds.

Those who preferred editors, mainly older and less tech-savvy users, appreciated human expertise and the manageable way in which news is presented. Older participants, particularly those in Germany, were also wary of their personal data being used to make content selections. A participant in one of the German focus groups said, “someday... your data is sold and the purchaser may get a picture of you and each single one of us...what you’re interested in... who you are.”

Editorial curation carried its own perceived risk: that content might be made less prominent as a result of political or commercial considerations at news organisations. One UK participant explained that they would be “more inclined to trust an algorithm that takes from a full range rather than one editor. I think that one editor could be exposed quite quickly.”

What are the implications of this preference for algorithms?

Even though many news users are increasingly comfortable with algorithmic filtering of news, they are not blind to the possible pitfalls. The focus groups identified a range of common concerns across all four countries.

The filter bubble

Interviewees thought that algorithms helped introduce them to a broader range of content and brands based on their interests, but because algorithms select news according to past consumption and preferences, there is also a risk of getting caught in a ‘filter bubble’ of one’s own interests. As a UK focus group participant put it, “Is it a little bit claustrophobic in there? It’s just like you’re getting what you want. Maybe it’s nice to get things that you wouldn’t necessarily ask for?”
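The feedback loop behind the filter bubble can be made concrete with a toy simulation. This is an invented, deliberately extreme illustration, not a real recommender: the topic list and the "always show the most-read topic" rule are assumptions made to show how a ranker driven only by past engagement collapses a feed onto one topic.

```python
# Hypothetical sketch of the filter-bubble feedback loop: each
# recommendation is fed back into the reading history, which in turn
# drives the next recommendation. Topics and the rule are invented.

TOPICS = ["politics", "sport", "culture", "science"]

def next_story(history):
    # Surface whichever topic the user has read most so far
    # (ties broken by list order) -- past consumption is the only signal.
    return max(TOPICS, key=lambda t: history.count(t))

history = ["politics"]  # the user starts with a single political story
for _ in range(10):
    history.append(next_story(history))

# Every subsequent recommendation is "politics": the bubble has closed.
```

Real feeds mix in other signals and some randomness, which softens, but does not eliminate, this reinforcing dynamic.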

This was also a concern for DNR survey respondents, but it does not necessarily seem to change behaviour. “We may not trust algorithms very much when we stop to think about how they work, but the services they enable are amazing and we would not want to be without them,” wrote Rasmus Kleis Nielsen regarding the DNR findings.

The ‘democracy’ of algorithms

Some study participants preferred editors because they could be held accountable – as one noted, “You can write to the editor if you’ve got a complaint or you’ve got a comment” – but many perceived algorithms as democratic. Many media scholars and others have written about how problematic this perception of neutrality is: algorithms can never actually be neutral, because they are always built according to specific principles. The proprietary algorithms upon which tech companies build their businesses will also never be transparent, and as these companies are not public utilities, they will never be accountable. As Emily Bell wrote in the Guardian: “Facebook does not see itself as responsible for the information diet of the world, even though this is exactly what it is becoming.”

Beyond the filter bubble, there is the risk of algorithmic discrimination, whether intentional or not. Algorithms can reproduce and reinforce human prejudices, as several studies have shown, either in the way that they are constructed or in the way that they learn from use. As US academic Zeynep Tufekci has pointed out, Facebook’s algorithm, for example, values posts that can be shared and liked – often content that is “designed to generate either a sense of oversize delight or righteous outrage and go viral” – which means that some important stories are unlikely to appear in users’ news feeds and become popular.

What does this mean for news providers?

News organisations are already frustrated by their lack of control over how news is consumed on Facebook. Having encouraged news providers to put more content on the social network through initiatives like Instant Articles, Facebook then announced that it was further prioritizing stories from friends and family. Facebook’s ‘trending’ news section has also been a source of controversy: there were allegations of bias when it was revealed to be curated by editors, followed by problems with offensive and inaccurate content after Facebook dismissed those editors.

This preference for algorithmic curation suggests that a key element that established news outlets can offer – an informed, prioritized selection of editorial content – is becoming less important for some readers, who are increasingly comfortable with algorithmic filtering (even if the content itself may still come in large part from news media). But the focus group discussions also clearly document that many still value established, trusted brands, and enjoy watching TV news and reading print papers, where the experience is always carefully curated by editors and journalists. News organisations therefore have an opportunity to differentiate themselves more effectively from intermediaries by offering a distinct and potentially superior way of getting news.

What can news organisations do to build on their audiences’ trust in their ability to select the news? One option is to find ways to distribute their expert curation more widely off-site, for example through personalized, targeted newsletters or mobile apps. Another is to shift the focus of curation from the home page to the article page, better catering to audiences arriving on news sites via social media. The research shows that trust in news is primarily associated with content and its perceived accuracy, impartiality and tone, so above all, it remains important to do good journalism.