According to a Bloomberg report, Senator Lindsey Graham has drafted a bill that aims to strip providers of “interactive computer services” of their immunity from lawsuits over child sexual abuse material (CSAM) posted on their platforms. The “discussion draft” would affect well-known apps such as WhatsApp, as well as email and cloud services such as iCloud. The draft bill is dubbed the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act.
According to the draft, a “National Commission on Online Child Exploitation Prevention” would be established, chaired by the Attorney General, whose purpose is to “develop recommended best practices for providers of interactive computer services regarding the prevention of online child exploitation conduct.” These recommendations shall include practices for “identifying, categorizing, and reporting material related to child exploitation or child sexual abuse.”
Complying with these best practices would earn an “interactive computer services” provider a “certification” granting it immunity in accordance with Section 230 of the Communications Decency Act. If “an officer of a provider of an interactive computer services” fails to abide by the practices or submits “false statements” of compliance, he or she faces imprisonment of up to two years.
Some may argue that “at last, these tech giants are being held accountable for all this disgusting material circulating on the web.” But it is not as simple as that. Opponents of end-to-end encryption have, from time to time, used various arguments to justify their stance. Since the “terrorism” argument failed, they are now targeting the public’s emotions against CSAM traders.
The real target of this draft bill is encryption. Although encryption is never mentioned by name, end-to-end encryption in particular is likely to be deemed contrary to “best practices” for preventing CSAM: if a provider cannot “see” the contents of files on its service because they are encrypted, it is much harder for the provider to detect CSAM.
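To see why the provider is blind, consider a toy sketch of the end-to-end principle. This is a one-time pad, not a real protocol such as Signal’s; the point is only that the relaying server handles ciphertext while the key lives exclusively at the two endpoints:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each byte with a key byte. Toy illustration only --
    # real end-to-end encryption uses vetted protocols, not hand-rolled XOR.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"see you at noon"
key = secrets.token_bytes(len(message))  # shared only between the two endpoints

ciphertext = xor_cipher(message, key)    # this is all the provider ever relays

# Only an endpoint holding the key can recover the plaintext.
assert xor_cipher(ciphertext, key) == message
```

The server in the middle stores and forwards `ciphertext` but, lacking `key`, cannot inspect its contents; that is precisely what makes content scanning on the provider’s side impossible without weakening the scheme.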
At this point, let’s pause for a moment and look at what existing legislation says about providers of “interactive computer services” and illegal content. Riana Pfefferkorn, Associate Director of Surveillance and Cybersecurity at the Stanford Center for Internet and Society, wrote a detailed analysis of the EARN IT Act in which she provides the legislative background.
Section 230 of the Communications Decency Act of 1996 says, in essence, “that online platforms (“providers” of “interactive computer services”) mostly can’t be held liable for the things their users say and do on the platform.” For example: If you defame me on LinkedIn, I can sue you for defamation, but I can’t sue the platform. “Without the immunity provided by Section 230, there might very well be no Twitter, or Facebook, or dating apps, or basically any website with a comments section,” explains Pfefferkorn.
In addition, while the Communications Assistance for Law Enforcement Act of 1994 (CALEA for short) requires telecommunications carriers (e.g., phone companies) to make their networks wiretappable for law enforcement, it does not require “information services” to be designed to be surveillance-friendly.
“In passing these two laws, Congress made a wise policy choice not to strangle the young Internet economic sector in its cradle. Exposing online services to crippling legal liability would inhibit the free exchange of information and ideas; mandating that “information services” be surveillance-friendly to the U.S. government would hurt their commercial viability in foreign markets. Congress chose instead to encourage innovation in the Internet and other new digital technologies. And the Internet bloomed,” says Pfefferkorn.
More important is that the US already has a federal law (Chapter 110 of Title 18 of the U.S. Code) that makes CSAM illegal. Section 2258A of the law imposes duties on online service providers, such as Facebook. The law mandates that providers must report CSAM when they discover it on their services, and then preserve what they’ve reported (because it’s evidence of a crime). Providers “who fail to comply with this obligation face substantial (and apparently criminal) penalties payable to the federal government.” The duties of the online service providers do not include any duty to proactively monitor and filter content on the service to look for CSAM. Section 2258A only requires providers to report CSAM they “obtain actual knowledge of.”
It is because of this federal law that tech companies reported over 45 million online photos and videos of children being sexually abused, according to a groundbreaking report by the New York Times. Pfefferkorn notes that Section 230 explicitly states that “Nothing in this section shall be construed to impair the enforcement of … [chapter] 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.” In other words, federal prosecutors can hold providers accountable for CSAM on their services.
The obvious question arises: if legislative provisions already force platforms such as Facebook or WhatsApp to report CSAM when they discover it, why is there a need to amend Section 230?
It is more than obvious that the goal of this bill is to ban end-to-end encryption without banning it. It is an underhanded maneuver with many problems. The idea of the draft bill is to make providers EARN Section 230 immunity for CSAM claims “by complying with a set of guidelines that would be developed by an unelected commission and could be modified unilaterally by the Attorney General, but which are not actually binding law or rules set through any legislative or agency rulemaking process,” says Pfefferkorn.
The bill would allow the “unelected commission” to set best practices that make it illegal for online service providers to offer end-to-end encryption, even though this is entirely legal under existing federal law, specifically CALEA.
What is also worrying is that the proposed bill appeals to society’s sentiments and emotions against CSAM traders instead of providing factual proof that banning encryption will solve the problem. It is a post-truth bill by science-deniers who cannot understand that a backdoor to encryption means broken encryption. It means systems susceptible to all kinds of adversarial attacks and actors.
Further, the proposed bill will have no effect on the root of the problem: the CSAM traders. The tech companies might be tempted to comply with the “best practices” for fear of losing Section 230 immunity, but that threat will have no effect on the bad actors in the CSAM ecosystem: dark web sites devoted to CSAM, which already don’t qualify for Section 230 immunity because they directly host and serve the illegal content on their sites. As a result, CSAM traders will abandon the “certified, good” platforms for the “bad” ones, where it will remain very hard for law enforcement to track them down.
On the other hand, the bill can do nothing to deter CSAM traders from encrypting their material offline and then sending it through WhatsApp or Messenger, even if these platforms no longer offer end-to-end encryption. “It will just move the place where the encryption happens to a different point in the process. File encryption technology is out there, and it’s been used by CSAM offenders for decades; the EARN IT Act bill can’t change that,” concludes Riana Pfefferkorn.
It is apparent that the Graham bill will do nothing to stop CSAM from circulating on the web.
It is about time we ended this meaningless debate. We all agree that CSAM has to end, for the sake of the innocent souls of our children. Both sides, law enforcement and tech companies, need to focus their efforts and resources not on proving who is more correct, but on investing in new privacy-enhancing technologies such as differential privacy, homomorphic encryption, and secure multi-party computation. In a previous article of ours, we highlighted the progress academia has made in using homomorphic encryption for crime detection. Microsoft has already made PhotoDNA, which uses hashing algorithms to detect CSAM, available to law enforcement.
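The idea behind hash-based detection can be sketched in a few lines. PhotoDNA itself uses a robust perceptual hash that tolerates resizing and re-encoding; the cryptographic SHA-256 below, along with the placeholder byte strings, is a simplification for illustration only. Matching works by comparing a file’s hash against a database of hashes of already-identified material, so the detector never needs the original images themselves:

```python
import hashlib

# Hypothetical database of hashes of previously identified material.
# In practice this would be distributed as hashes, never as images.
known_hashes = {hashlib.sha256(b"previously-identified-image-bytes").hexdigest()}

def matches_known_material(image_bytes: bytes) -> bool:
    # Hash the uploaded file and check it against the known-hash database.
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

assert matches_known_material(b"previously-identified-image-bytes")
assert not matches_known_material(b"ordinary-holiday-photo-bytes")
```

The design choice matters: because only hashes are exchanged, a service can screen uploads against known material without keeping a library of the material itself, which is what makes the approach usable by platforms and law enforcement alike.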
Academia, law enforcement and tech companies need to cooperate and collaborate to advance these privacy enhancing technologies for the good of our society.