On Wednesday morning, the Senate Committee on Commerce, Science, and Transportation approved the Stop Enabling Sex Traffickers Act (SESTA) after weeks of back and forth with major tech companies like Google and Facebook over the language used in the bill. SESTA will now head to the full Senate for consideration.

The passage of SESTA would overhaul Section 230 of the Communications Decency Act, a 1996 law that protects web service providers from legal culpability for what users post on their sites. The new legislation adds an exception to Section 230 that would hold platforms liable for ads and content that facilitate sex trafficking.

SESTA’s introduction to the Senate in August caused a fracas in the tech world: Digital rights advocates claim it promotes censorship, while lawmakers see it as a way to hold internet companies accountable for what occurs on their platforms.

“The original meaning of Section 230 has been turned on its head,” David Golumbia, an associate professor of English at Virginia Commonwealth University and the author of The Cultural Logic of Computation, told me in an email. “Google, Facebook, and its digital rights advocates like the Electronic Frontier Foundation (EFF) objected so strongly to this effort because they rely on Section 230 to make these companies unlike any companies that have ever existed before: media companies in every way, but unaccountable as media companies always have been.”

The issue of whether a web platform is legally responsible for user-generated content has risen to prominence in US politics following a string of lawsuits alleging that Backpage.com, a classified ads site, plays a central role in facilitating the sex trafficking of minors.

Backpage.com was protected in these cases by freedom-of-speech appeals and by Section 230. In each case, judges ruled that categorically outlawing escort ads violated the First Amendment or placed an “impossible burden” on websites like Backpage.com to review their millions of postings.

SESTA aims to add an exception for sex trafficking to Section 230 protections for websites that host user-generated content. This would effectively require online service providers to self-police their platforms for content and advertisements that facilitate sex with minors. In this respect it is less extreme than other sex trafficking prevention legislation, such as a number of state bills floated last year that aimed to classify all internet-connected devices as “porn vending machines.”

Since its introduction to the Senate committee in early August, SESTA has drawn the ire of major Silicon Valley companies like Facebook, Amazon, Twitter, and Google, as well as digital rights groups like the EFF.

Until recently, these tech companies fought the bill under the umbrella of the Internet Association, a lobbying organization formed in 2012. They argued that the bill would significantly hurt their business models by requiring these new policing measures and exposing them to far more legal action.

According to Axios, Google lobbied especially hard for changes to the bill, including a provision that would have required the Justice Department to sign off on legal actions brought against sites that violated the law. This proposed change was rejected by the bill’s sponsors on the grounds that it would significantly weaken its effectiveness.

Earlier this month, however, the Internet Association reached a “compromise” with lawmakers after they made what the association’s president, Michael Beckerman, called “important changes” to the bill. In the days leading up to the Senate commerce committee’s vote, the Internet Association reversed its position and threw its support behind the legislation.

According to an Internet Association spokesperson, one of the key changes that swayed the association concerned the clause defining when a company “knowingly” facilitates sex trafficking on its site. Under the bill’s original language, building bots and other automated tools to hunt for sex traffickers on a platform could have been treated as evidence that the platform’s owner knew its platform was used for sex trafficking, making the development of those tools punishable even though they were built for the exact opposite purpose. The amended language makes clear that developing such tools does not, on its own, constitute that knowledge.

Despite the Internet Association’s recent support for SESTA and the Senate commerce committee’s adoption of the bill, digital rights groups are far from satisfied. The Electronic Frontier Foundation said SESTA was “still an awful bill” and signed a letter to the Senate commerce committee along with other digital rights groups like Engine and the Center for Democracy and Technology.

Due to the scale of many web platforms like Google, Facebook or Twitter, compliance with SESTA would almost certainly require automation to filter out content that facilitates sex trafficking. According to Elliot Harmon, a staff activist at the EFF, technology that can do this perfectly, or even efficiently, doesn’t exist, and given the high stakes involved it is unlikely that a company would trust its filters to work perfectly every time.

He pointed to the recent “sentiment analyzer” released by Google, which flagged phrases like “I’m a homosexual” as toxic, as an example of how far web filtering technology has to go.

“It’s not to say that automated filters are useless,” Harmon told me on the phone. “They are useful as an aid to human moderation, but they are not effective as a replacement for that human moderation element.”

This means that these web companies are more likely to err on the side of caution, over-policing their platforms and censoring content that has nothing to do with facilitating sex trafficking, he said. Some of the most likely casualties of these less-than-perfect automated systems, Harmon said, will be sex trafficking victims themselves, whose voices will be silenced when they try to tell their stories or seek help.

“Lawmakers believe they can generate advancements in technology by passing laws that require those advancements in technology, like they’ll come about if only the nerds worked harder,” Harmon said.

The EFF claims SESTA will especially harm smaller web companies and nonprofit organizations like Wikipedia or the Internet Archive, which lack the resources that Google and other tech giants have to effectively police their platforms. It claims this will result in a disproportionate amount of censorship on smaller platforms as they struggle to comply with the law.

The EFF and many other digital rights groups argue that the protections afforded to web platforms by Section 230 of the Communications Decency Act are a major part of what has allowed the internet to grow and innovate over the years.

“Section 230 was introduced in the first place because courts were treating platforms that tried to moderate their users differently and held them responsible for the bad content that slips through,” Harmon said. “What you lose when you weaken 230 is the safety and incentive to create a filter that both loses bad content and keeps legitimate content.”

Golumbia and other supporters of SESTA, on the other hand, think the Communications Decency Act has strayed far from its original purpose and is ready for an update.

“We are not talking censorship, we are talking the prosecution of people for abetting the vicious crime of human trafficking,” Golumbia, who advocates against code as a form of speech, told me. “Every other kind of media is held liable for participating in crimes like this, and we don’t call it ‘censorship.’ [But] the EFF has redefined ‘speech’ so that it now means the total operation of digital tech companies.”