Senator Lindsey Graham (R-SC) is the latest Republican member of Congress intent on demonstrating his anti-“Big Tech” credentials. His colleagues Sen. Joshua Hawley (R-MO) and Sen. Ted Cruz (R-TX) have led the charge against Silicon Valley, focusing on the weak claim that large firms such as Google, Facebook, and Twitter are censoring conservative activists and lawmakers. In a draft bill leaked last week, Graham takes a different approach, though it is no less misguided. If passed as written, Graham’s bill, which targets Big Tech’s supposed unwillingness to aid law enforcement, would put the security and privacy of law-abiding Americans and residents at risk.

Graham’s bill is titled the EARN IT Act, an elaborate acronym that stands for the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act. As the not-so-subtle title suggests, Graham is unhappy with what he perceives to be Big Tech’s lack of effort and cooperation when it comes to combating crime, especially child sexual exploitation. During a Senate Judiciary hearing on encryption in December, Graham issued a warning to Facebook and Apple: “this time next year, if we haven’t found a way that you can live with, we will impose our will on you.” In the weeks since the hearing, Graham has clearly thought about ways he might impose his will on Big Tech. The result is the EARN IT Act, co-sponsored by Graham’s colleague Sen. Richard Blumenthal (D-CT).

At the heart of the EARN IT Act is an amendment to Section 230 of the Communications Decency Act of 1996. That law shields “interactive computer services” such as YouTube, Facebook, and Twitter from liability for most content posted on their platforms by third-party users. The EARN IT Act conditions Section 230’s intermediary liability protections related to child sex abuse material (CSAM) on tech platforms’ adherence to a code of best practices approved by a commission. The commission would be made up of the Attorney General, the Federal Trade Commission chairman, the Department of Homeland Security secretary, and members appointed by the majority and minority from both houses of Congress. Platforms would be treated as the “publishers” of CSAM uploaded by users unless they submit a certification of compliance with the code. The bill imposes criminal penalties for false certifications and creates a new civil cause of action to be used against firms that don’t comply.

Section 230 has become a popular target for modification or repeal. Graham’s bill treats it like a carrot. Despite Section 230’s broad protections, under 18 U.S. Code § 2258A platforms are already required to report any CSAM they discover to the National Center for Missing and Exploited Children, and platforms made aware of CSAM face severe criminal penalties for failing to deal with it. These existing legal obligations have spurred platforms to police CSAM far more effectively than other sorts of illegal content.

Platform reporting requirements render the issue visible, at least in America, drawing critical attention to platform moderation efforts. When firms become more effective at identifying illegal material, the resulting reports are taken as evidence of greater CSAM prevalence rather than better discovery mechanisms. Nevertheless, cross-platform arrangements to remove previously logged CSAM, such as PhotoDNA, have reduced its spread.

Elsewhere, the problem is less visible but more prevalent. Yandex, a popular Russian search engine, neither participates in PhotoDNA nor reports the child sexual abuse material it discovers. While Google and Microsoft’s implementation of PhotoDNA cut CSAM searches by more than half, Yandex remains a popular vector for pedophiles. The hyper-transparency associated with digital record keeping and reporting makes it easy to blame the messenger. But doing so is a mistake.

While AG Bill Barr can enforce reporting and takedown requirements, he cannot prohibit the provision of encrypted messaging services or demand backdoor access to them. Existing snooping authorities, formalized in the Communications Assistance for Law Enforcement Act, exempt “information services” from wiretapping access mandates.

However, by conditioning Section 230’s shield on compliance with an indeterminate, evolving code of best practices, Graham and Barr could finally dissuade platforms from providing Americans with secure encrypted messaging. Encryption is useful to criminals, child predators included. However, the vast majority of encrypted messages flow between lawful users trying to maintain a modicum of privacy and security in an often-threatening digital world. Holding Section 230 hostage in an attempt to limit public access to encryption threatens speech freedoms and endangers privacy and security.

Those proposing the effective end of encryption have difficult questions to answer about security. If companies felt compelled to eliminate encrypted messaging services, their devices would become a major target of foreign intelligence agencies and criminals. Such a mandate would put the privacy and security of law-abiding citizens and residents at risk.

It’s true that criminals take advantage of encryption. But so do journalists, activists, dissidents, Congressional staffers, intelligence officials, and (as of early January) the 82nd Airborne. All would have their privacy and security compromised by mandated law enforcement access to communications. Organized criminals and oppressive governments wouldn’t hesitate to take advantage of the end of encryption. The results could be deadly.