For a service used by three in every 10 people on the planet, Facebook increasingly seems to lack friends. Governments and campaigners accuse it of paying too little tax, with chancellor Philip Hammond’s last Budget speech targeting the company through a planned tax on digital services. Facebook has accepted criticism that it was used to spread hate speech in Myanmar, contributing to 700,000 Rohingya refugees fleeing the country.

Speakers at two sessions at the recent IF Oxford Science and Ideas Festival, chaired by Computer Weekly’s business applications editor Brian McKenna, focused on Facebook – although the problems they described and solutions they proposed could be applied more broadly.

“Facebook is designed for a better species than ours,” said Siva Vaidhyanathan, professor of modern media studies at the University of Virginia. “We are a vengeful species, a lustful species, a shallow species. We are a species filled with people who are looking out for the short term, for instant gratification.” Most human institutions help to counteract those weaknesses, he added – but Facebook exploits them.

Vaidhyanathan, author of the recently published Antisocial media: how Facebook disconnects us and undermines democracy, pointed out how Facebook’s scale means its flaws are magnified enormously. On 30 October, the company said it had 2.27 billion monthly active users, up 10% from a year ago. “I can’t think of anything else in the world, except oxygen and water, that affects that many people regularly,” he said.

Those users are exposed to algorithmic amplification, where content that generates comments, clicks, shares or likes is brought to more people’s attention through their newsfeeds, said Vaidhyanathan. “What is not favoured on Facebook? Things that are carefully argued, soberly assessed, deeply researched, modestly presented – the kind of material that we depend on to think collectively about our problems.”

He added: “What does fly on Facebook? Things that generate strong emotions. So baby pictures and puppy pictures, but also hate speech, calls to genocide, conspiracy theories.”

Painful paradox of Facebook

In Antisocial media, Vaidhyanathan notes that Facebook has a tolerant culture and a missionary zeal to connect and empower people, but argues that the internet giant has become complicit in the rise of extremists and terrorists around the world. “The painful paradox of Facebook is that the company’s sincere devotion to making the world better has invited nefarious parties to hijack it to spread hatred and confusion,” he writes.

In Oxford, Vaidhyanathan said Facebook damages public debate in other ways. Through its own highly successful advertising service, it undermines journalism that used to be paid for by advertising. It has caused further damage by allowing anonymous, targeted political advertising. The company recently opened an archive of political adverts and now requires buyers to identify themselves, but does not appear to carry out robust checks on them. Before the recent US mid-term elections, Vice News applied to advertise on behalf of all 100 senators and was approved every time. An application supposedly on behalf of Islamic State was also approved – although one for Facebook founder Mark Zuckerberg was refused.

Vidya Narayanan, a researcher at Oxford University’s Oxford Internet Institute, said Facebook and other social media provide a platform to people who have lacked one, particularly in countries such as India. “It is an empowering thing, as maybe their voices have been suppressed in the past,” she said, speaking after the event.

Because of this, Narayanan warned against heavy-handed controls. “It is important not to over-regulate, because I strongly feel these platforms offer people a means of expressing their thoughts and opinions and a way of being connected with the outside world,” she said, citing older people and others who are physically isolated.
If there is to be regulation, it should recognise those benefits and look to maintain them, she added. But there are problems, and newer social media services – such as the encrypted messaging system WhatsApp, which is owned by Facebook – may coarsen discussions even further.

The Oxford Internet Institute tracked political discussions in Brazil on WhatsApp in advance of October’s presidential election, in which far-right candidate Jair Bolsonaro beat left-winger Fernando Haddad. Researchers joined groups, announced themselves and stayed only if no one objected. They recorded the images, videos and links to news sources shared within each group, rather than any personal data. WhatsApp groups are private and Facebook does not monitor what is shared.

“There is a lot of freedom to express yourself in any way that you deem fit, and we’ve seen some pretty dire content in some of these groups,” said Narayanan, adding that the researchers developed new categories for classification – including “hate, gore and porn”. The ongoing research has found material created exclusively for WhatsApp groups, including memes and jokes about politicians and ideas, sometimes appearing in several groups. “It is hard to track where these messages originate,” she said.

The institute also looked at data from Twitter accounts in Brazil, and found polarisation between supporters of Bolsonaro and Haddad. “There was almost no middle group,” said Narayanan. Facebook was not included in the research, as it largely refuses to co-operate with academics.

Tools and humans

The profound influence that tools have on their users was discussed by Nigel Shadbolt, principal of Jesus College, Oxford and co-founder of the Open Data Institute, in another IF Oxford event linked to the one on social media. Recent discoveries have shown that hominids had been making tools for 200,000 generations before Homo sapiens developed, he said. “These tools allowed the species to master its environment; they also changed everything from the fine motor control in hands and fingers to the cortex. It is also thought that it drove other functions, such as sociability and language development.”

Research also suggests that the development of more intricate tools activates more parts of the human brain, said Shadbolt. “We often think that we made our technology. But our technology made us, and is continuing to make us.”

So what might provide answers? In the same Augmented humanity event at which Shadbolt spoke, Helena Webb, a senior researcher at Oxford University’s department of computer science, outlined two research projects. Digital Wildfire, a collaboration between Oxford, De Montfort, Warwick and Cardiff universities that ended in 2016, looked at how social media’s tendency to spread harmful material could be lessened.

It found that legal measures and controls by social media platforms were limited by the time-lag between publication and removal, which gives content time to have an impact, and by their focus on individual pieces of content or users rather than a wider “digital wildfire”. The research found greater benefits from self-governance.

Webb said social media users may be wary of replying to someone expressing inflammatory views. “A lot of the time, the assumption is that the person is doing it for attention,” she said.
“If you reply and give them attention, that’s giving them what they want.” However, ignoring material also means it goes unchallenged, she said.

The research analysed data from Twitter on challenges to sexist, homophobic and racist comments. It found that when a single person responded, an online discussion developed. “The hateful content is spread as the conversation continues,” said Webb. But something else happens when a group gets involved: “The conversation shuts down more quickly if you have multiple people coming in to disagree,” she said. Although this did not show any change in users’ opinions, it did represent users self-regulating such content, she pointed out.

Webb said platforms could encourage this process, but it is not without problems. “Sometimes the way in which people respond can be just as inflammatory as the original post,” she said. In some cases, multiple negative responses turn into a “pile-on” – a deluge of condemnation that may leave the person who made the first comment professionally damaged.