I’m thinking that maybe I should start using a brick. An actual, everyday brick. But only if it runs on Android OS. Apparently, the connection between our cell phone towers and our phones isn’t encrypted, and there are no imminent plans to encrypt it. On that topic, Facebook says Jeff Bezos got his personal messages exfiltrated because he was using an iPhone, and Telegram says it was because he was using a Facebook product. Interestingly, they both might have a good point: government phone hackers are now saying Android is putting up a fight in the “don’t decrypt me” category, while iPhones (up to and including the iPhone X) have fallen prey to industry hacking devices. So, until some things get sorted out, I may opt for a VPN-protected carrier pigeon.
Someone failed Jeff Bezos’ iPhone. Telegram says it’s WhatsApp.
Let’s point some fingers. Telegram founder Pavel Durov argues that if Jeff Bezos had only been using his encrypted platform, his personal information would still be safe.
Facebook says that it was Apple’s OS that let him down, not its own brainchild, WhatsApp.
Either way, if any of us want to avoid a similarly uncomfortable fate, maybe we should take stock of what we actually know.
I’m right. So what?
In a straight-between-the-eyes answer to the question of his bias, Durov replies, “Of course I consider Telegram Secret Chats to be significantly more secure than any competing means of communication – why else would I be developing and using Telegram?”
Things might have been different a year ago. “A year ago we couldn’t get into iPhones, but we could get into all the Androids. Now we can’t get into a lot of the Androids,” revealed Detective Rex Kiser, who conducts digital forensic examinations for the Fort Worth Police Department, in a comment to Vice.
Times have changed, and as the battle to secure the encryption front faces increasing pressure from both sides, it seems Android has held the line—even improved it.
Take results from one frequently used hacking tool, for example. Cellebrite, a contractor popular with government agencies, makes a device designed to break the unbreakable, allowing access to a user’s private files for the purpose of providing incriminating evidence in criminal trials.
It can crack all iPhones up to and including the iPhone X. When up against Android devices, however, it has a harder time swiping data (GPS, social media, browsing history) from phones like the Samsung Galaxy S9, the Google Pixel 2 and the Huawei P20 Pro, from which it got literally nothing.
However, government agencies can still crack stubborn tech by other means—just not with the Cellebrite tool. According to a report from the Justice Department’s Inspector General, the FBI could have consulted its own Remote Operations Unit in the wake of the 2015 San Bernardino shooting; the report suggests the FBI “could have done more” to crack the phone with available internal means before going to Apple. The process is more “labor intensive and takes more time and resources,” but ultimately, it can be done.
In a very real way, this is a case study of what could happen if backdoors really were implemented. In some cases, they might have already been. The vulnerability cuts across phones, operating systems and hardware security right to the code. The same backdoors that gather data on criminals and pedophiles gather data on you, me and Jeff Bezos.
Should we start making things safer? That seems to be a question we’re asking ourselves as we catch our breath in the wake of breakneck technological advancements like the internet and 5G.
The answer, these days, seems to be yes. We might wonder why we didn’t make them “safe” to begin with, but the dot-com era was a forgivable utopia of idealists.
However, safety now has to compete with legacy technologies and industries that (literally) aren’t built for it. Take cell towers, for example. Your phone connects to the nearest available cell tower—no authentication, no credentialing required. (Some older phones couldn’t handle authentication anyway.) But in that ease lies the vulnerability to stingray attacks: false base stations that broadcast the same “system information messages” as real cell towers and pass themselves off as the real thing. Your mobile device can’t tell the difference.
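To see why the impostor wins, picture the selection logic from the phone’s side. Here’s a toy sketch (real baseband behavior is vastly more complex, and all the names and values below are made up for illustration): nothing in an unsigned broadcast proves who sent it, so the phone just latches onto the strongest signal.

```python
# Toy model of unauthenticated tower selection. Illustrative only --
# real cell selection involves much more than signal strength.

def pick_tower(broadcasts):
    """A phone attaches to the strongest tower whose broadcast 'looks
    right' -- nothing in the message authenticates the sender."""
    return max(broadcasts, key=lambda b: b["signal_dbm"])

real_tower = {"operator": "ExampleCell", "cell_id": 0x1A2B, "signal_dbm": -95}
# A stingray copies the exact same unsigned system information messages...
stingray   = {"operator": "ExampleCell", "cell_id": 0x1A2B, "signal_dbm": -60}

# ...and wins purely by broadcasting louder than the real tower.
chosen = pick_tower([real_tower, stingray])
```

Since the two broadcasts are byte-for-byte identical apart from signal strength, there is simply nothing for the phone to check.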
Cue Yomna Nasser, research engineer, and her talk at the USENIX Enigma security conference on Monday. She believes there is a way to provide encryption for pre-authentication messages (those pings between the cell tower and your phone), and her full plan is laid out here.
It works similarly to HTTPS encryption and would require phones to check the cell tower’s digital certificate prior to connecting. Just like online. With the push of the last decade to “encrypt the internet,” this seems like a logical next move.
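The control flow is the interesting part: verify first, attach second. A minimal sketch of that idea follows. Real proposals use public-key certificates signed by an operator or certificate authority, just as HTTPS does; here an HMAC over a pre-shared key stands in for the signature check purely to keep the example self-contained, and the key and message contents are invented for illustration.

```python
import hashlib
import hmac

# Hypothetical trust anchor provisioned on the phone (stand-in for an
# operator's public key / certificate chain in a real design).
OPERATOR_KEY = b"example-operator-key"

def sign_broadcast(msg: bytes, key: bytes = OPERATOR_KEY) -> bytes:
    """Operator-side: tag the system information broadcast."""
    return hmac.new(key, msg, hashlib.sha256).digest()

def phone_should_attach(msg: bytes, tag: bytes, key: bytes = OPERATOR_KEY) -> bool:
    """Phone-side: verify the broadcast BEFORE connecting -- analogous
    to a browser checking a server certificate before sending data."""
    return hmac.compare_digest(sign_broadcast(msg, key), tag)

legit = b"system information: ExampleCell cell 0x1A2B"
tag = sign_broadcast(legit)

assert phone_should_attach(legit, tag)              # real tower: connect
assert not phone_should_attach(legit, b"\x00" * 32)  # forged tag: refuse
```

A stingray can copy the broadcast bytes, but without the operator’s signing key it can’t produce a valid tag, so the phone refuses to attach—which is exactly the property today’s unauthenticated broadcasts lack.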
HTTPS similarities aside, there are still a few barriers.
Researcher Roger Piqueras Jover from Bloomberg LP agrees a fix is overdue: “It’s been many, many years, even decades, and we still have the same problems.” However, while awareness might be rising and there may be concessions for safety in the 5G standard, until “option” is no longer part of adoption, implementation might be a long time coming.
As Jover points out, “It’s complicated—the way cellular networks are designed is based on standards developed by industry players with maybe non-aligning incentives.”
While we fight the fight to protect our privacy online, low-tech hackers might be peering into other parts of our personal lives by simply sneaking into our phones. The backdoor debate is far from the only front in the encryption fight. It seems that until the pain of compromise by stingray attack finally hits mobile network providers (or until ink is put to paper), those ideas might be just like our unencrypted data—floating uncomfortably out in the ether.
Make sure you know the basics of symmetric encryption. Our Chalk Talk video gives a refresher course.
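The core idea behind symmetric encryption fits in a few lines: one shared key both scrambles and unscrambles the data. Here’s a toy sketch using only the standard library—a keystream derived from SHA-256, XORed with the message. This is for intuition only; real applications should use a vetted construction like AES-GCM, and the key and nonce below are invented for the example.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the key and a unique nonce."""
    out = b""
    counter = 0
    while len(out) < length:
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out += block
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR with the keystream: the SAME call encrypts and decrypts."""
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key, nonce = b"shared secret", b"never-reuse-this-nonce"
ciphertext = xor_crypt(key, nonce, b"hello, Jeff")
recovered = xor_crypt(key, nonce, ciphertext)  # same key, message back
```

That symmetry—one secret, both directions—is what separates this family of ciphers from the public-key schemes that protect things like HTTPS certificate checks.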