No matter their protestations, the big tech firms will always be asked to do more to tackle the manifestations of society's problems on their platforms. And yesterday's four-hour evidence session in front of British parliamentarians was no different.

In an intense hearing – for which the members of the digital, culture, media and sport committee selflessly flew to Washington DC, USA, on the public purse – reps from Twitter, Facebook and Google were questioned about their platforms' massive reach and influence, and their efforts to tackle the spread of misinformation.

Throughout the session, the frustration the MPs felt about being unable to – or perhaps being perceived as unable to – properly hold these tech giants to account was palpable.

In language that at times verged on the inflammatory, the politicians accused the companies of hypocrisy, of failing in their duty of trust to users, and of caring more about ad revenue than weeding out fake news.

Many exchanges exemplified the fraught relationship between policymakers and tech firms. MPs struggled to find the right words to get their point across, stymied either by a lack of technical understanding or the notorious opacity of the businesses in front of them.

"This is the problem, Mr Milner," Ian Lucas said to Facebook's UK policy manager Simon Milner. "You have everything. You have all that information; we have none of it because we can't see it."

Although much of the debate echoed previous discussion, some new nuggets of information came to light, including the extent of Russian bots' interference in the Brexit referendum.

Nick Pickles, the UK policy manager of Twitter, said that – after the further investigation pushed for by the committee – it had identified "a very small number" of accounts linked to the largest Russian actor, the Internet Research Agency.

There were 49 active during the campaign – equivalent to 0.005 per cent of the accounts that tweeted about the referendum – and they sent a total of 942 tweets. That is less than 0.02 per cent of the total tweets about Brexit, he said.

Those tweets received a total of 461 retweets and 637 likes – on average, fewer than 10 retweets and 13 likes per account.

In comparison, during the US election the same agency controlled some 3,814 bot accounts that generated 175,993 tweets.

Both Pickles and his US counterpart, Carlos Monje, pointed to extra efforts the platform was taking against dodgy accounts. Twitter now takes action against 10 times as many accounts as last year, they said, and challenges some 6.4 million accounts each week, a figure Monje said was up 60 per cent on October.

Pickles also highlighted Twitter's "penalty box" system, which aims to "change people's behaviour" by requiring those who break community rules to delete the offending tweet and often provide a phone number in order to be let back on. He said 65 per cent of accounts only go through this process once.

However, much of the interrogation focused not on abuse or bots, but on lies.

The committee was insistent (sometimes to the point of absurdity) that it was Twitter's job to seek out and remove "lies" on the platform, with committee chair Damian Collins asking why "telling lies on Twitter isn't a breach of the terms".

Pickles countered: "We don't have rules based on truth," adding that decisions on account takedowns had to be based on context – for instance, whether an account was created solely to abuse someone.

Beyond that, Pickles said, it wasn't for Twitter to revoke access because someone had said something untrue.

"I don't think tech companies should be deciding during elections what is true and what is not true. And that's what you're asking us to do," he said, mentioning the infamous Brexit promise of £350m a week for the NHS.

This wasn't good enough for the MPs, though, who variously said they were "astounded" and "staggered" at what they deemed an abdication of responsibility.

This centred on the platform's global reach and power. "We're not looking at an 18th-century broadsheet," said Tory MP Giles Watling, while colleague Rebecca Pow asked: "What is this [spread of misinformation] doing to our children?"

Pickles conceded that a "big public debate" was yet to be had in full, but maintained that it was not for tech firms to become the arbiters of truth, emphasising the distinction between abusive behaviour and people's viewpoints.

Despite this, the overall tone of the exchange between Twitter and the committee was less fraught than the committee's exchanges with Facebook's two employees – who also received the longest grilling.

The most heated part of that discussion was over Facebook's investigation into Russian influence in elections, which the committee criticised for not being sufficiently detailed.

Milner said the full report was due at the end of the month, but repeatedly noted that the UK government had provided the firm with no intelligence about suspected accounts.

"What we haven't had is information that's enabled us to target one particular page or a particular phenomenon," Milner said. Although he added that he wasn't suggesting that "there is nothing", he said that in the US the firm had been given an intelligence report that had helped it identify Russian actors and postings.

But this argument got short shrift from Collins, who later said: "You haven't looked! You haven't looked, have you?"

Milner also came under fire for Facebook's inability to say where money for election campaign ads had come from. Committee member and Labour MP Ian Lucas said that if such funds came from overseas, Facebook could be "facilitating an illegal act".

In response, Milner said he hadn't heard that analysis before, noting that Facebook could see which account paid for an ad but had no way of preventing someone overseas from paying for election ads. He also argued that this was more a matter for the Electoral Commission.

Facebook is working to boost advertising transparency, though. The committee was told it was developing a system that would allow people to see what pages adverts appear on, and what other adverts there are on that page. This will also cover so-called dark ads – those seen just by the sender and receiver (and of course Facebook).

In comparison, Google and YouTube got off lightly – but they were also the first session, so maybe the MPs hadn't quite warmed up to their task by then.

The representatives from the businesses said that their investigation into Russian influence would also deliver its findings at the end of the month.

YouTube's global public policy director, Juniper Downs, said there was no evidence of Russian influence in ads placed during the Brexit campaign, but admitted it hadn't looked at account activity, and promised to cooperate with the committee on that.

In discussions on fake news, Downs and Google's vice-president of News, Richard Gingras, emphasised the importance of users' trust in their platforms and efforts to "surface" trusted accounts and demote sensationalist or fake content.

However, the committee was sceptical of Downs' assertion that she didn't know what YouTube's annual advertising revenue was when she was asked to put the company's "tens of millions of dollars" spend on tackling fake news into context.

A back-of-the-envelope calculation from the committee's clerk put global revenue at about $10bn – making the proportion spent on fighting spam about 0.1 per cent.

"That's a small sticking plaster over a gaping wound," said Collins.

But Downs countered that there was "no constraint on the resources we will put into getting this right" and that the firm would invest more if necessary.

She also admitted that the platform's "view next" recommendation algorithm needed work, saying that – although news content only makes up about 2 per cent of viewing – the firm was "not proud" of evidence that showed false information was shown to people through this system.

"We don't need extra motivation to get this right," she said.

Nonetheless, the promises made by Downs and her counterparts appeared to have done little to assuage the concerns of the angry mob of MPs.

Collins told reporters afterwards that the billions of dollars the firms make "massively outweighs the relatively small amounts they invest in harmful and difficult content". ®