
Facebook, Google and Twitter offered members of Congress an apology Tuesday but wanted lawmakers to understand it's not all their fault.

The House Judiciary Committee held its second hearing on the content filtering practices of tech giants like Facebook, Twitter and Google, looking to learn how social media blocks content and if there's any political bias involved.

The tech giants testifying walk a thin line when it comes to content moderation. They've all discussed a societal responsibility that involves pushing back against disinformation, hate speech and terrorist content, but they've run into trouble with what they choose to take action against.

At a Facebook event last week at the social network's New York headquarters, the company explained why InfoWars, a notorious conspiracy theory site, was still allowed on the platform despite Facebook's fight against disinformation.

During the hearing, Rep. Ted Deutch, a Democrat from Florida, pressed Facebook on its issues with InfoWars, asking, "How many strikes does a conspiracy theorist who attacks grieving parents and student survivors of mass shootings get? How many strikes are they entitled to before they can no longer post those kinds of horrific attacks?"

Facebook's head of global policy management, Monika Bickert, doubled down on the social network's position, which was to delete posts but keep the source up in the name of free expression.

"If they posted sufficient content that it violated our threshold, that page would come down," Bickert said. "That threshold varies, depending on the severity of different types of violations." Bickert said InfoWars didn't reach that threshold, but she didn't specify what the limits are.

It's not just what Facebook isn't taking down that's bothered lawmakers. Politicians also raised concerns about what posts Facebook has been removing.

Rep. Bob Goodlatte, the chairman of the Judiciary Committee and a Republican from Virginia, pointed to Facebook's algorithm mistakenly blocking the Declaration of Independence on the Fourth of July.

"Think about that for a moment," Goodlatte said in his opening remarks. "If Thomas Jefferson had written the Declaration of Independence on Facebook, that document would never have seen the light of day. No one would be able to see his words because an algorithm automatically flagged it."

Facebook, Twitter and Google declined an invitation to the first hearing, in April, but sent executives to testify Tuesday. Along with Bickert, the witnesses included Juniper Downs, head of public policy and government relations for Google-owned YouTube, and Nick Pickles, Twitter's senior strategist on public policy.

At the April hearing, the committee questioned Trump supporters Lynnette "Diamond" Hardaway and Rochelle "Silk" Richardson, otherwise known as internet personalities Diamond and Silk, who claimed Facebook was silencing conservative voices on the social network. Bickert referenced the duo at the hearing Tuesday, telling members of Congress that Facebook was committed to making sure the social network was open for all voices.

"We badly mishandled our communications with them," Bickert said, "and since then we've worked hard to improve our relationship. We appreciate the perspective that they add to our platform."

Downs said YouTube doesn't target political beliefs but does demonetize videos and block content it considers "dangerous, illegal or illicit."

"We don't always get it right," Downs said, "and sometimes our system makes mistakes."

Twitter's testimony echoed Google's and Facebook's remarks, with the company telling Congress that it doesn't censor political views but that its team has made mistakes. Pickles apologized for Twitter blocking a Senate campaign announcement ad for Rep. Marsha Blackburn, a Republican from Tennessee, last October.

"Every day we have to make tough calls, we do not always get them right," Pickles said. "When we make a mistake, we acknowledge them, and we strive to learn from them."

As the hearing was going on in Washington, Eva Guidarini, a member of Facebook's US politics and government outreach team, offered another explanation of Facebook's content moderation policy at an event in New York.


She said Facebook took action against posts that caused "significant real world harm," like voter misinformation that told people they could vote by sending text messages. With content like conspiracy theories, though, it's more of a gray area, she said.

"When it comes to information being marked false by a fact-checker, we know there's often times disagreements about what fact-checkers say," Guidarini said.

In a statement, a Facebook spokeswoman said the company removes content that violates its community standards but has no policy requiring that everything posted on Facebook be true.

"But there's a very real tension here: We work hard to find the right balance between encouraging free expression and promoting a safe and authentic community, and we believe that down-ranking inauthentic content strikes that balance," the statement reads.

At the hearing, executives from all three tech companies also argued that they weren't legally responsible for harmful content posted on their platforms.

Goodlatte countered by saying property owners have an obligation to make sure their grounds are safe, while clubs have a legal obligation to make sure drugs aren't being sold. He wanted to know why Facebook, Twitter and Google were exempt from legal action if their social networks fostered an unsafe environment.

Goodlatte wasn't the only lawmaker with those concerns. Rep. Darrell Issa, a Republican from California, also asked why social networks weren't held legally accountable for what their users publish.

Representatives from Google, Twitter and Facebook pointed to Section 230 of the Communications Decency Act, which largely shields tech companies from legal liability for content their users post.

"We believe," Downs said, "that the openness that's enabled by 230 has brought tremendous benefits to the world."

First published July 17, 8:24 a.m. PT

Update, 12:21 p.m.: Adds details from the hearing and comments from Facebook staff.

Update on July 18, 7:17 a.m. PT: Adds a response from a Facebook spokeswoman.
