This week it matters. The open and fertile terrain that gave rise to innovations like Facebook and Amazon, WikiLeaks and countless open source projects may soon have to answer to the Department of Justice before deploying anything new. Ever. The anti-encryption-backdoor debate and the issue of government data access are approaching a reckoning: a bill drafted earlier this year by Senators Lindsey Graham (R-SC) and Richard Blumenthal (D-CT) could put any and all internet companies under the jurisdiction of the Attorney General of the United States.

This time, the legislation targets Section 230, the landmark law protecting free speech online. Under the proposed EARN IT Act, many of those protections would be withheld unless internet companies comply with "best practices," generally understood to mean encryption backdoors. In the name of child protection, we arrive at a decisive moment where the fate of personal data privacy, free speech and internet innovation hangs in the balance, with full government oversight and individual privacy rights at an impasse. If we had to choose just one, the Graham-Blumenthal bill would make us do so. From here, it is up to Congress to decide.
In America, the encryption battle may finally be coming to a head. Are we weeks away from adopting practices similar to Australia's lawful-access backdoor laws of 2018? Will we have our own industry fallout to report two years later, as heavyweights like Google and Facebook fall in line behind the Attorney General and allow their cryptographic safeguards to be weakened?
First, let’s find out what’s at stake.
Crimes that didn’t need a backdoor to catch.
This brings up more questions than answers.
Technologists have been speaking out against the bill for the very reason it was written: safety. Alongside child safety are the possibly unforeseen effects on national safety. In keeping with this concern, the EFF even embedded in its article a tool that emails your local Congressman a pre-written letter expressing support for end-to-end encryption. I filled one out.
Congress has some weighty decisions to make, and how this vote falls could determine more than political rivalries and WhatsApp chats. It sets a fundamental precedent for free speech online, and for who is responsible for it, and why. And, considering some of the nuances of the bill, it poses some questions as to practicality and motive.
A weakening of encryption across all platforms leaves every bit of infrastructure that's online (which today means virtually every bit of infrastructure) vulnerable to cyberattack. Weak for one means weak for all. Military, law enforcement and national defense administrators will have to deal with the repercussions in our national security posture, facing cyberthreats from an increasingly compromised position.
As the EFF states, “International diplomats from many countries, including the U.S. State Department, rely heavily on encrypted services to get their work done. The U.S. military also relies on encryption, and Congressman Ro Khanna has spoken up about the importance of encryption to national defense.”
If you, Android app developer, haven't done so already, Google suggests you encrypt all of your app data. Now.
Aside from the keynote reason, here's why. There are two ways app data is stored: in a sandbox on the device itself, where app information can't be shared with other apps, and externally, where security is often weak or nonexistent. Google suggests we cut down on the risk of storing personally identifiable information (which we all give out so readily with an Accept and Continue button) by sandboxing our externally housed data as well.
In addition, Google suggests that app developers encrypt that external, sandboxed information to add an extra layer of protection. That sounds like security best practice to me.
To aid in this effort, Google points to its security library, Jetpack Security (JetSec), which lets developers encrypt and decrypt files containing sensitive information collected by their apps.
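JetSec's EncryptedFile API wraps this kind of authenticated file encryption for you; since the Android classes won't run off-device, here is a plain-JVM Kotlin sketch of the underlying idea: AES-256-GCM with a random IV stored alongside the ciphertext. The function names are my own, and on a real device the key would come from the hardware-backed Android Keystore rather than an in-memory KeyGenerator.

```kotlin
import java.io.File
import javax.crypto.Cipher
import javax.crypto.KeyGenerator
import javax.crypto.SecretKey
import javax.crypto.spec.GCMParameterSpec

// Write the file as [12-byte random IV][AES-GCM ciphertext + auth tag],
// roughly what an encrypted-file helper does under the hood.
fun encryptToFile(file: File, plaintext: ByteArray, key: SecretKey) {
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, key)  // provider picks a fresh random IV
    file.writeBytes(cipher.iv + cipher.doFinal(plaintext))
}

// Recover the IV from the file header, then decrypt and authenticate.
fun decryptFromFile(file: File, key: SecretKey): ByteArray {
    val bytes = file.readBytes()
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.DECRYPT_MODE, key, GCMParameterSpec(128, bytes, 0, 12))
    return cipher.doFinal(bytes, 12, bytes.size - 12)
}

fun main() {
    val key = KeyGenerator.getInstance("AES").apply { init(256) }.generateKey()
    val file = File.createTempFile("secret", ".bin").apply { deleteOnExit() }
    encryptToFile(file, "user token".toByteArray(), key)
    println(String(decryptFromFile(file, key)))  // prints "user token"
}
```

Because GCM is authenticated encryption, any tampering with the stored bytes makes decryption fail outright rather than silently returning garbage, which is one reason it is the mode JetSec favors.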
Is there anything to gain by Google simultaneously issuing a strong suggestion to encrypt all Android app data and offering a product that can do so? Maybe, but it might be beside the point. The bigger issue might be that we simply can’t afford to leave personal data exposed.
Additionally, Google suggests adding biometrics as an extra security feature. Seeing as Android holds a 75% share of the mobile phone market, Google owns Android, and big players have lately been prime targets for data breaches, I'm going to have to think about that one. For better or worse, that information is being stored somewhere, and we know biometrics databases get hacked.
Here's trying to prevent what we can, while we can; one fully-encrypted sandbox at a time.
We all want faster internet. But at what cost? It may not make a difference.
So, you can download a full two-hour movie in 3.6 seconds or scroll rapidly through Feedly with zero latency. Cool. 5G is purportedly 100 times faster than its 4G predecessor at top speeds. Very cool. What's not so cool is that 5G rolled out with sub-par cryptographic protections. And that leaves us with a sad bit of irony: you could get hacked even faster.
This seems to be a pressing issue these days. DevOps guys (or simply "developers" to the up-and-coming) have directives to roll new stuff out, fast. InfoSec is struggling to keep a handle on all the cats they have to herd, with developers often taking the route that seems sensible at the time: relying on the built-in security options of the software they're using, or (even worse) just rolling out packages with less-than-robust security oversight. That could be the issue. Or there could be others.
That’s up to you to decide.
The (Lingering) Issue of Unencrypted Pre-Authentication Messages
The Huawei and Government Backdoor Issue
Not meant to be safe?
Ultimately, it looks like mobile phones, their networks and associated apps were not intended to be places of high-walled security and impenetrable privacy. Conscientious researchers are hoping to ameliorate that with industry standard fixes, and “the greatest thing that has happened in cellular,” the 5GReasoner proposal, lays out a potential framework for making 5G cryptographically safe.
According to Schneier, however, the problem stems back farther than Jover, the proposal, or any after-market fix can reach: the problem is systemic, a carry-over of the encryption debate between backdoors and privacy and of the motivations that underlie it.
In a comment to CSO, Schneier put it bluntly: “Nobody wants 5G security...Governments like spying on 5G. Carriers don’t care very much. They’ll do what the law says.”