it’s one that just got louder. Far from over, the bedraggled debate reared its head afresh with government cries to halt Facebook’s plans to end-to-end encrypt Messenger, its widely used messaging app. This would follow in the footsteps of Facebook-owned WhatsApp, but with a second international data-sharing agreement underway and a lingering misunderstanding of what encryption can do, it’s hard to see which way the cards will fall on this one.
Does the US-UK agreement guarantee against backdoors, or will the application imply it? Will Facebook respond to pressure to keep Messenger free of end-to-end encryption, or will privacy win out? And if it does, at what cost? And lastly, are those costs worth bearing in light of the alternative? All questions we’re still grappling with, in this week’s Encryption Digest.
It becomes increasingly hard for any thoughtful person to derive a “right answer” to the encryption debate, if there is one. And it becomes increasingly hard to determine which tradeoffs we are willing to, well, trade off.
The pressure on Facebook to encrypt all Messenger chats (as it did with WhatsApp) is increasingly strong, with the same weary arguments on both sides. Law enforcement wants a way to police criminal activity, while tech companies and privacy rights groups don't want bad actors to enter those same backdoors. And, they want privacy. We know all of this.
However, why is Facebook—which just over a year ago was tangled up in the Cambridge Analytica scandal—suddenly so concerned about protecting our information? And regardless of the reason, what will be the implications if it does?
Last year, you may remember an MS-13 investigation in California in which the Department of Justice asked Facebook to allow access to suspects’ voice calls as part of its surveillance of the accused gang members. It was 2018, and Facebook would not comply. “Facebook’s position is that it would either have to remove encryption from Messenger altogether or 'hack' the individual that the government wants to listen in on. Right now, the company seems unwilling to do either,” reported Chris Welch of The Verge.
It’s 2019, and so far, the social media giant has not changed its stance. However, Facebook Messenger does not yet support end-to-end encryption across all chats, and Facebook has responded to pressure to monitor illicit behavior in the form of content moderation. That moderation has led to “16.8 million reports to the U.S. National Center for Missing & Exploited Children (NCMEC)” and “2,500 arrests by U.K. law enforcement.” Additionally, “more than 99% of the content Facebook takes action against—both for child sexual exploitation and terrorism—is identified by [its] safety systems, rather than by reports from users.” With end-to-end encryption, these monitoring capabilities would no longer be possible, even for Facebook.
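Why does end-to-end encryption disable those safety systems? Because server-side scanning requires the platform to read message content in the clear. The toy sketch below (hypothetical; real systems use perceptual hashes such as PhotoDNA rather than exact SHA-256 matches) shows the basic shape of the check, and why it has nothing left to work with once the server sees only ciphertext:

```python
import hashlib

# Hypothetical blocklist of hashes of known illicit content.
# (Real platforms maintain databases of perceptual image hashes.)
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example of known illicit content").hexdigest(),
}

def scan_message(plaintext: bytes) -> bool:
    # The platform can only run this check because it can read the
    # plaintext. Under end-to-end encryption the server holds only
    # ciphertext, so this comparison becomes impossible server-side.
    return hashlib.sha256(plaintext).hexdigest() in KNOWN_BAD_HASHES

print(scan_message(b"example of known illicit content"))  # True
print(scan_message(b"an ordinary message"))               # False
```

The check matches hashes, not identities or intentions: it works on whatever bytes the server can see, which is exactly what end-to-end encryption takes away.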
Unfortunately, for every crime ring that gets caught, swaths of political activists around the world stay offline or are silenced because of prying, censoring governments. North Koreans can’t send a text without state supervision, and the Uyghurs are left with no better form of social solidarity than TikTok. If they had end-to-end encrypted apps, maybe organization, recourse and revolution would be possible.
At the end of the end-to-end rainbow lies a duplicitous pot of gold. Something significant will be lost in either case and with Facebook standing as an unexpected last guard, it might just be up to them to determine what.
As the UK and US band together to present a united front against end-to-end encryption, international opinions continue to ferment, decisions push to a close, and legislators still may not fully understand encryption.
At the Summit on Lawful Access last Friday, the Justice Department made an impassioned and practical appeal against fully encrypted “lawless” zones where perpetrators of child pornography could flourish. It’s a good point, and one of the more salient ones against government-inaccessible cryptography.
FBI director Christopher Wray expressed annoyance at being accused of “trying to weaken encryption or weaken cybersecurity more broadly. We're doing no such thing.” And, to his credit, maybe the cybersecurity community hasn’t done a great job of explaining practical applications. Instead of a backdoor, he insists the FBI would much rather use a “front door” as they’ve always done, with a warrant and respect for Fourth Amendment rights.
However, a backdoor (or any door) is like a bucket of water that will wet all hands. There is no way in the anatomy of RSA, ECC, AES or any other cryptographic algorithm to make decryption party-specific; an understandably baffling concept for an agency that has built its culture on person-centric clearance levels and selective exclusivity. Unfortunately, there is no retina scan at the encryption backdoor.
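The point can be made concrete with a toy cipher. In the sketch below (an illustrative XOR construction built on SHA-256, not a real cipher and not how Messenger works), notice that the decryption routine takes a key and nothing else; it has no parameter for *who* is asking:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the key.
    # Toy construction for illustration only -- NOT a real cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream: the same function encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = b"shared secret"  # whoever holds this can decrypt
ciphertext = xor_cipher(key, b"meet at noon")

# Decryption checks nothing about the caller's identity -- only the key.
# A warrant-bearing agent and a hostile government holding the same key
# get the same plaintext back.
print(xor_cipher(key, ciphertext))  # b'meet at noon'
```

The math is indifferent to credentials: any party-specific restriction has to live in policy and key management around the algorithm, never inside it, and that surrounding layer is exactly what a “front door” would have to trust.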
“Hackers are a danger, but so are violent criminals, terrorists, drug traffickers, human traffickers, fraudsters and sexual predators,” argues Attorney General William Barr. And he’s not wrong. But comparing apples to apples, “hackers” could be parsed into spying dictatorships, crippling mega-breaches and hostile governments taking a swipe at, say, the coding of US long-range missiles sitting in the cryptographically vulnerable hands of defense contractors.
It’s a tough call.
Maybe we’re partly to blame. Maybe, as cybersecurity experts, we should come down from our vantage point more often to educate about the mechanics of cryptography. Maybe we should be generous about the fact that 30 years ago most politicians were building their careers and cybersecurity was barely a buzzword. In any event, we can take time to step out from behind our monitors to quell some rumors that have been floating around about the nature of encryption and the debate at large. And for once, maybe we’ll put it in plaintext.
Attorney General Barr said that information security "should not come at the expense of making us more vulnerable in the real world."
We couldn’t agree more. However, the dangers of criminals, pedophiles and dark web drug dealers must be pitted against nation state attacks, government censorship and potential compromise of financial institutions were those same backdoors to be exploited by bad actors.
The FBI claimed that in 2017 the number of devices it could not access was close to 8,000. That’s 8,000 people who could have potentially gotten away with criminal activity because incriminating evidence may be behind a lock screen.
The Washington Post investigated and put that number closer to 1,000. This may be beside the point, but many run-of-the-mill cell phone manufacturers may not put as much time into security R&D as, say, Apple or Samsung. “Companies recognized guarding against state surveillance is a bottom line issue for them... It is a question of financial interest to these companies to be able to convince their users that their data is secure with them," explains Mark Rumold, senior staff attorney at the Electronic Frontier Foundation (EFF). Because privacy comes at a premium, the risk to public safety may not yet be as ubiquitous as it seems.
In a fight that’s turned to stalemate, arguments can become breeding grounds for misinformation, and innocently so. Most politicians don’t speak cybersecurity or have a 360-degree understanding of the threat landscape. We do. And in a decision that affects us all, it helps to make that knowledge accessible in the simplest ways possible.
A bilateral agreement was put forth that would require American social media firms to share private messaging data with British law enforcement. Known as the US-UK Bilateral Data Access Agreement, it spans both jurisdictions and is part of a joint effort to speed investigations against "terrorism, transnational organized crime, and child exploitation.”
This was spurred in part by the investigation of Matthew Falder of the UK, whose 137 convicted criminal offenses included blackmail, child sexual abuse and sharing of indecent images. Government efforts were stymied in the international pursuit, and the agreement seeks to streamline access to incriminating information. “Under its terms, law enforcement, when armed with appropriate court authorization, may go directly to tech companies based in the other country to access electronic data, rather than going through governments, which can take years,” states the Department of Justice’s Office of Public Affairs.
This is not an automatic sanction for encryption backdoors. The understandable balking in the media may have colored the issue, but Facebook’s Mark Zuckerberg asserted, “We oppose government attempts to build backdoors because they would undermine the privacy and security of our users everywhere.”
The CLOUD Act, under which the agreement was negotiated, may provide some context for what this treaty aims to do. Passed in 2018, it “allow[s] for companies to provide available information when we receive valid legal requests and do not require companies to build back doors,” according to Facebook.
The agreement is subject to “a six-month Congressional review period mandated by the CLOUD Act, and the related review by UK’s Parliament.” While a full version of the 2018 CLOUD Act was made available for review on the Department of Justice’s website, a copy of the upcoming treaty was not. While all signs seem to point in the right direction, it remains to be seen whether the agreement will explicitly exclude backdoors. With the battle for encrypted access still under way, we can only speculate how this might land.
Failure to correctly manage the machine identities that protect end-to-end encryption could lead to data breaches, much like the one at Equifax. See how a TLS certificate undermined even the best encryption.