Facebook CEO Mark Zuckerberg testifies before the United States Senate on Capitol Hill about the Cambridge Analytica scandal in April 2018.

Facebook intentionally and knowingly violated both data privacy and anti-competition laws, according to a new report from the UK government.

The social media giant and its executives were referred to as “digital gangsters” in the Digital, Culture, Media and Sport (DCMS) select committee’s report.

The document was released today after an 18-month investigation into fake news and Facebook’s privacy practices.

“Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law,” the report states.

The investigation was launched in 2017 amid growing concerns about the influence of false information and its ability to spread on social media, potentially impacting elections.

In March 2018, the Observer, the New York Times and Channel 4 revealed that Cambridge Analytica had secretly acquired data harvested from tens of millions of Facebook users’ profiles.

The firm, best known for its work on Donald Trump’s US presidential election campaign, sold the data to political clients and used it to design software to predict and influence voters’ choices at the ballot box.

Up to 87 million people were affected by the data breach, the largest in Facebook’s history. Cambridge Analytica has since collapsed into administration.

The report has called for the following:

Compulsory Code of Ethics for tech companies overseen by independent regulator

Regulator given powers to launch legal action against companies breaching code

Government to reform current electoral communications laws and rules on overseas involvement in UK elections

Social media companies obliged to take down known sources of harmful content, including proven sources of disinformation

‘Democracy is at risk’

Speaking about the investigation, Damian Collins, MP and chair of the DCMS committee, said “three big threats to our society” were identified.

“Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day. Much of this is directed from agencies working in foreign countries, including Russia.

“The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.

“Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.

“These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the ‘move fast and break things’ culture often seems to be that it is better to apologise than ask permission.”

Collins said “a radical shift in the balance of power between the platforms and the people” is needed, adding: “The age of inadequate self-regulation must come to an end.”

He said the inquiry also found that electoral regulations in the UK are “hopelessly out of date for the internet age” and need to be reformed.

Zuckerberg still ‘ducking questions’

Collins stated that if Facebook CEO Mark Zuckerberg “doesn’t believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world”.

Zuckerberg “still has questions to answer yet he’s continued to duck them, refusing to respond to our invitations directly or sending representatives who don’t have the right information”.

“[He] continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world’s biggest companies,” Collins added.

Russian disinformation and Brexit

The committee has repeatedly asked Facebook in written correspondence and oral evidence about Russian activity on Facebook and knowledge of Russian advertisements that ran during the presidential election in America in 2016.

The New York Times reported in November 2018 that Facebook had discovered Russian-linked activity on its site in 2016, attempting to disrupt the US election.

MPs have concluded that two senior executives from Facebook who appeared as witnesses left them with the impression they had “deliberately misled the committee or they were deliberately not briefed by senior executives at Facebook, about the extent of Russian interference in foreign elections”.

The report calls on the UK government to make a statement about how many investigations are currently being carried out into Russian interference in UK politics. It notes that MPs want to find out what impact this may have had on previous British elections and the Brexit referendum in 2016.

Hefty fines

The report repeats a number of recommendations from the committee’s interim report published last summer. It calls for the government to reconsider a number of recommendations to which it did not respond and to include concrete proposals for action in its forthcoming White Paper on online harms.

The document recommends that clear legal liabilities be established for tech companies to act against harmful or illegal content on their sites, and calls for a compulsory Code of Ethics defining what constitutes harmful content.

An independent regulator should be responsible for monitoring tech companies, backed by statutory powers to launch legal action against companies in breach of the code. Companies failing obligations on harmful or illegal content should face hefty fines, the report states.

“Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites,” the report states.

It repeats its recommendation for new independent regulation to be funded by a levy on tech companies operating in the UK.

Abusive content

Responding to the report, Facebook said it was “pleased to have made a significant contribution” to the committee’s investigation by answering over 700 questions.

“We are open to meaningful regulation and support the committee’s recommendation for electoral law reform,” said Karim Palant, the company’s UK public policy manager, in a statement.

“We have already made substantial changes so that every political ad on Facebook has to be authorised, must state who is paying for it, and is then stored in a searchable archive for seven years.”

Palant said Facebook supports privacy legislation and has made recent changes in this regard, stating: “While we still have more to do, we are not the same company we were a year ago.”

He added that Facebook has increased its team working on abusive content to 30,000 people and “invested heavily in machine learning, artificial intelligence and computer vision technology to help prevent this type of abuse”.