Senior Minister of State for Law and Health Edwin Tong, taking part in a first-of-its-kind international hearing on fake news and disinformation in London, questioned a senior Facebook executive as to why his company initially refused to take down a hate post in Sri Lanka during a period of religious violence and anti-Muslim riots in March.

While Mr Richard Allan, Facebook's vice-president of policy solutions, agreed it was a "serious and egregious" mistake, he maintained that Facebook is still "best placed" to remove content that causes harm.

The exchange took place in the British Parliament yesterday, in a hearing that involved 24 parliamentarians from nine countries, including Singapore.

The others were from Argentina, Belgium, Brazil, Canada, France, Ireland, Latvia and Britain, representing about 447 million people.

Besides Mr Tong, Singapore MPs Sun Xueling and Pritam Singh also attended the hearing. The trio are members of Singapore's Parliamentary Select Committee tasked with finding ways to combat online falsehoods.

In his question, Mr Tong referenced a Facebook post in the Sinhala language which read, "Kill all Muslims, don't even let an infant of the dogs escape". A report said a user had pointed out to Facebook it violated its hate speech policy.

But Facebook replied then that it did not go against its "community standards" and told the user to directly block the person posting it, among other options.

In a still taken from UK Parliament video footage, Facebook's vice-president of policy solutions Richard Allan gives evidence at an international hearing on fake news in London on Nov 27, 2018. PHOTO: AFP

Mr Tong asked if Facebook's initial refusal to take down the inflammatory post - which led to the Sri Lankan government banning access to the platform - showed that the social media site "cannot be trusted to make the right assessment" on what can appear on its platform.

Mr Allan admitted it was a "simple error" on the part of a Facebook staff member, saying: "We make mistakes. Our responsibility is to reduce the number of mistakes."

He added: "We are investing very heavily now in artificial intelligence, where we would precisely create a dictionary of hate-speech terms in every language."

Ms Sun asked if Facebook would work with the authorities to take down false information and close accounts that put out fake news.

Mr Allan said that while it would work with the authorities, "the best person to make a decision about whether that claim is true or false is not Facebook... it's the relevant judicial authority in any country".

Mr Singh asked how Facebook was dealing with election tampering. In the 2016 US presidential election, Russian actors allegedly meddled by buying and posting advertisements designed to sow discord.

Mr Allan said Facebook will set up a task force of security and legal specialists for every significant election to tackle such interference.

When Mr Singh asked what he meant by "significant election", Mr Allan replied: "Our current resourcing allows us to look at all national elections. So, if there's a national election in Singapore, for example, that would be covered."

After Mr Allan's almost three-hour testimony, the parliamentarians signed a declaration, "International Principles for the Law Governing the Internet", which called, among other things, for tech companies to be accountable to their users.