Apple and the FBI, from San Bernardino to Pensacola
Just before 11:00 am on December 2, 2015, Syed Rizwan Farook and his wife, Tashfeen Malik, opened fire on a holiday event for the San Bernardino County Department of Public Health at the Inland Regional Center in San Bernardino, California. The couple, armed with .223-caliber assault rifles and semi-automatic handguns, took the lives of fourteen people and wounded a further 24. An explosive device left at the scene failed to detonate only because of its poor construction, preventing further loss of life. The perpetrators fled and were tracked to a residential neighborhood later that afternoon. Hundreds of shots were fired in a gun battle involving at least 20 police officers; it ended in the deaths of both Farook and Malik. One officer was injured in the shootout.
Four years later, almost to the day, on the morning of December 6, 2019, a Saudi Arabian Air Force officer, Mohammed Alshamrani, opened fire in a classroom building at Naval Air Station Pensacola in Florida, killing three US servicemen and injuring a further eight. The three men who were killed put themselves in harm's way to protect others; the youngest, Mohammed Haitham, was only 19 years old.
Both attacks share a chilling list of common traits: the underpinning of jihadist ideology, the tragic proximity to the holiday season, the senseless loss of human life, and locked iPhones.
Aside from the immediate tragedy, the stories of both San Bernardino and Pensacola have risen to the forefront of the public arena for another reason: encryption. On one side, the FBI and two government administrations who earnestly (and perhaps sincerely) believe that public safety and security could be better served if there were a way around your iPhone's most basic security feature, the lock screen. On the other side, Tim Cook, Apple, and a wider family of alarmed companies who know deep down that creating a "backdoor" to the iPhone and iOS could irreversibly undermine the fundamental security of iOS devices, to the detriment of Apple's customers and user privacy.
Slide to Unlock
The Slide to Unlock gesture is one of the most iconic elements of Apple's iPhone. From a swipe and a passcode to a fingerprint to a face, Apple has invented and reinvented ways of keeping your iPhone personal to you. Make no mistake, our iPhones are more secure now than they have ever been. But how has this seemingly simple, fundamental feature of iOS become such a bone of contention between Apple and the United States government? More importantly, how do the similarities, and indeed the differences, between these two cases show us that the FBI's case for a backdoor to iOS has never been weaker?
The phones
In both cases, the FBI, in the course of its investigation, recovered iPhones allegedly belonging to the respective shooters. In 2015, the FBI recovered an iPhone 5C that reportedly belonged to Syed Farook. The phone was owned by San Bernardino County and issued to Farook as an employee. Just over a week ago, it emerged that in the case of Pensacola shooter Mohammed Alshamrani, the FBI had recovered two iPhones belonging to the gunman, later revealed to be an iPhone 5 and an iPhone 7.
The backing of the court
In both cases, the US judicial system has seemingly sided with the FBI; however, the circumstances differ. Famously, in 2015, the FBI, having recovered Farook's iPhone, spent two months trying to gain access to the phone on its own. (There were even reports that someone tried to reset the Apple ID password while the phone was in FBI custody.) It wasn't until early February that FBI Director James Comey told a Senate Intelligence Committee:
He further lamented that he and other federal officials had long warned about the challenges that powerful encryption posed to national security investigations, a problem that had been exacerbated by Apple's release of iOS 8, an altogether more robust version of iOS as far as security was concerned.
It was the FBI's failure to gain access to Farook's iPhone through its own efforts that led to a request that Apple create a new version of iOS, stripped of iOS 8's troublesome security features, which could then be side-loaded onto the iPhone in question so that investigators could access the device. In Apple's own words:
When Apple refused, the FBI went to the courts, where a federal judge ordered Apple to provide "reasonable technical assistance" to the government. The order made three key demands: bypassing or disabling the auto-erase function (the feature that automatically erases an iPhone after a certain number of incorrect passcode attempts), enabling the FBI to submit passcodes electronically via the device's port, Bluetooth, or Wi-Fi, and disabling the delay between passcode attempts. Essentially, the FBI would be able to brute force the iPhone's passcode, using an unlimited number of attempts without risking delays, disabling the phone for good, or losing the data. Apple, of course, refused and made its objections public in a customer letter issued by Tim Cook on February 16. A legal back-and-forth began and would not cease until late March, when the FBI gained access to the phone using third-party assistance, but more on that later.
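To see what those three demands add up to, consider a back-of-the-envelope sketch with purely illustrative numbers, drawn from nothing in Apple's actual implementation. With auto-erase enabled, an attacker gets at most ten guesses; with the protections stripped away and electronic submission allowed, exhausting a four-digit passcode space is trivial.

```python
# Back-of-the-envelope sketch, not Apple's implementation: what the protections
# targeted by the court order actually buy. All numbers are illustrative.

PASSCODE_SPACE = 10_000   # a 4-digit passcode has 10^4 possibilities
ERASE_AFTER = 10          # auto-erase kicks in after 10 wrong attempts

def chance_before_erase(space: int = PASSCODE_SPACE, limit: int = ERASE_AFTER) -> float:
    """Probability of hitting a uniformly random passcode within `limit` tries."""
    return limit / space

def expected_guesses(space: int = PASSCODE_SPACE) -> int:
    """Average number of tries once the limit, delays, and auto-erase are gone."""
    return space // 2

print(f"Auto-erase on:   {chance_before_erase():.2%} chance of success before the phone wipes")
print(f"Protections off: success after roughly {expected_guesses():,} guesses on average")
```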
Similarly, in the more recent Pensacola shooting, the FBI has the backing of the court, having been granted a warrant to search both phones. As with San Bernardino, Pensacola investigators have once again been thwarted by Apple's iOS security.
The backing of the court remains a consistent feature of both cases. Unlike in the San Bernardino case, however, the FBI has not (yet) sought court backing that would compel Apple to create the backdoor it so desperately wants. As this remains a developing story, there's no reason to rule that course of action out just yet.
Two administrations, one voice
The use of backdoor technology and limits on iPhone encryption is one of only a few policies on which the Trump and Obama administrations seem to agree. As the San Bernardino court battle raged, President Obama spoke out in support of the FBI's efforts:
In decidedly less civil terms, President Trump took to Twitter in the wake of reports that the FBI was struggling to unlock the two iPhones belonging to the Pensacola shooter, stating:
https://twitter.com/realDonaldTrump/status/1217228960964038658
The FBI doesn't need Apple
As already noted, the San Bernardino legal case fizzled out in March of 2016. The reason? Justice Department lawyers told the judge that a third party had come forward with the means to bypass the iOS security in question and that the government no longer needed Apple's assistance. It was later widely reported that Israeli firm Cellebrite (which denies its involvement) helped the FBI gain access to Farook's iPhone at a cost of over a million dollars, a price James Comey said was "well worth" paying, despite the fact that the technique would only work on the iPhone 5C (or older), a model that was more than two years old at the time. Ironically enough, Apple has used Cellebrite devices in its Apple Stores to assist customers who wished to transfer their data from non-iPhone devices to their new iPhone during the in-store setup process.
The age of the iPhones in the recent Pensacola investigation has led to widespread criticism of the FBI. The iPhone 5 and 7 are both known to be vulnerable to several existing means of bypassing iOS encryption: not only the aforementioned Cellebrite, but also Grayshift's GrayKey and the Checkm8 bootrom exploit. As recently as this week, it was reported that the FBI used Grayshift's GrayKey to extract data from a locked iPhone 11 Pro Max, though exactly what data was recovered remains unclear.
As the New York Times has noted, the iPhone 7 is the tougher cookie, but it is still vulnerable to Checkm8, which can remove the 10-attempt passcode limit and automatically try thousands of passcodes until one works. The only thing that might slow this down is the length of the password:
The Times report notes that the FBI may simply have found that, in the case of Pensacola, the iPhones are protected by very good passwords. If that is the case, then Apple can't help the FBI anyway; investigators just need time. If the problem is that the phones were damaged (the gunman shot the iPhone 7 and tried to break the iPhone 5), then again, Apple still can't help, and we're back at square one. The only reasons the FBI might be unable to use existing techniques to crack these latest iPhones are password length or damaged hardware, neither of which an iOS backdoor would solve.
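That point about password length is easy to put numbers on. Apple has said that passcode key derivation is tuned to take roughly 80 milliseconds per attempt on the device itself; taking that figure as an assumption, the rough arithmetic below (illustrative only, not a real cracking tool) shows why length, rather than the attempt limit, is what keeps a forensic tool out.

```python
# Rough arithmetic, not a cracking tool: worst-case time to exhaust a passcode
# space at an assumed hardware-bound rate of ~12.5 guesses per second
# (based on the ~80 ms per-attempt key derivation Apple has described).

ATTEMPTS_PER_SECOND = 1 / 0.08  # assumption for illustration

def days_to_exhaust(alphabet_size: int, length: int) -> float:
    """Days needed to try every possible passcode of the given length."""
    return (alphabet_size ** length) / ATTEMPTS_PER_SECOND / 86_400

print(f"4-digit PIN:         {days_to_exhaust(10, 4):>16,.2f} days")  # about 13 minutes
print(f"6-digit PIN:         {days_to_exhaust(10, 6):>16,.2f} days")  # just under a day
print(f"8-char alphanumeric: {days_to_exhaust(62, 8):>16,.0f} days")  # hundreds of thousands of years
```

A long alphanumeric passphrase pushes the worst case from minutes into centuries, which is exactly the scenario the report hints at; no iOS backdoor changes that arithmetic.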
In both cases, it seems evident that the FBI is less concerned about the three iPhones at the heart of these respective stories and more about encryption in general. Another consistent feature of both stories is how the FBI and the government have tried to use these high-profile tragedies to drum up public support for their cause, strong-arming Apple into compliance in the process.
The big picture
The FBI doesn't need Apple to unlock the iPhones used by the Pensacola shooter (and, as we've just established, Apple can't help anyway), just as it didn't need Apple back in 2016. Both of these stories are about precedent, and an incredibly dangerous one at that. One of the most glaring similarities between the two cases, one that almost no one seems to have mentioned, is that all the suspects are dead, killed either at the scene of the crime or within hours of the act. When the FBI finally gained access to Farook's iPhone in 2016, it found "nothing of significance." Even in the Pensacola case, the Bureau admitted it was only searching the phone out of "an abundance of caution." There is no active suspect, no fugitive, no apparent reason to believe that further lives might be at risk, or that opening up a couple of iPhones might reveal anything to the contrary. Not only does the FBI not need Apple, it doesn't need these iPhones either.
What harm could it do?
So why doesn't Apple just comply, crack open these phones, and let the FBI be on its merry investigative way?
Rich Mogull is a security analyst at Securosis and CISO of DisruptOps. A security veteran of 20 years, Rich has also worked as Research VP on the security team at Gartner, where he was the lead analyst for encryption. More recently, he has focused on cloud security, authoring the Cloud Security Alliance Guidance Document and building its Foundation and Advanced training classes. In short, an expert. We spoke to him about Apple, the FBI, and encryption to shed some light on what exactly Apple means when it says that complying with the FBI could undermine all of our security, permanently.
In layman's terms, Apple's iOS encryption is burnt into your iPhone's hardware — the two are inseparable. The only way to decrypt the data on an iPhone is by using the device itself. Without this feature, a hacker could take your iPhone's data and copy it to another piece of hardware, a big home computer or a cloud server. Then, they could brute force your passcode without the limits on passcode attempts or the danger of wiping the device (something the FBI court order specifically requested Apple disable).
"Having that hardware key means the encryption key is always big and secure and basically impossible to brute force," said Mogull, "even if the customer has "1234" as their passcode."
The other key aspect of iOS encryption is the limitations on passcode entry that are built into the iPhone's hardware:
Apple doesn't have a copy of any device-specific keys, and obviously, it doesn't know what your passcode is. As Rich notes, even with an iPhone in its possession, Apple "can't provide the key to law enforcement and they can't circumvent their own security controls." Apple can technically extract the data, but even then, it can't brute force access to that data.
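The passcode-entry limits mentioned above can be pictured as a simple policy enforced by the device itself. The delay schedule in the sketch below approximates figures Apple has published in its security documentation, but the exact values should be treated as an assumption; the point is that the rate limiting, and the optional erase after ten failures, is enforced by the device, not by anything Apple holds on a server.

```python
# Approximate model of on-device passcode rate limiting. The delay schedule
# roughly follows figures Apple has published, but treat it as illustrative.

DELAY_AFTER_FAILURE = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}  # seconds
ERASE_AFTER = 10  # only if the user has enabled "Erase Data"

def handle_failed_attempt(failed_count: int, erase_enabled: bool = True) -> str:
    """Decide what the device does after the nth consecutive wrong passcode."""
    if erase_enabled and failed_count >= ERASE_AFTER:
        return "erase all data on the device"
    delay = DELAY_AFTER_FAILURE.get(failed_count, 0)
    return f"lock passcode entry for {delay // 60} min" if delay else "allow another attempt"

for n in range(1, 11):
    print(f"after wrong attempt {n:>2}: {handle_failed_attempt(n)}")
```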
So how could Apple comply?
Well, there are a few options, all of which are unbelievably terrible. The important thing to remember, however, is that for all the reasons above, any of these solutions would affect all iOS devices, everywhere, every single one. For the purposes of education, here goes...
1. Switch off some/all of the security controls.
Apple could roll back or switch off all of those pesky features that are holding back the FBI: encryption embedded in the hardware, the limit on the number of passcode attempts, the time delays between wrong guesses, and the feature that erases your iPhone after a certain number of wrong attempts. We could go back to how things used to be, but if you lost your iPhone, anyone could simply brute force it until it unlocked, something that used to happen regularly before Apple improved its security.
2. The master key (backdoor)
Apple could create a "master key" to circumvent iOS security. As Rich notes: "This key would have to work on all iPhones and anyone who got their hands on it could access everyone's phones around the world." More to the point, history shows us that tools like these are impossible to keep under wraps; even the NSA has let some of its super-secret tools slip out in the past. And even if Apple kept the key hidden from its own employees, "it would still be in the position of having to violate their own customers' privacy pretty much hundreds or thousands of times a day as law enforcement requests rolled in."
3. Apple could give the key to the government
Maybe the government could do a better job of protecting the key than Apple (doubtful, but let's hypothesize). If the US government had a tool like this, it would have to provide it to other governments as well; otherwise, countries that didn't have the tool would become "cybersecurity havens," in the way that countries without extradition agreements with the US are used by fugitives. A business traveler wouldn't be able to trust that they could travel from one country to the next and that their data would remain safe. Repressive regimes could access the data of anyone whenever they wanted. On a smaller scale, any kind of corruption could lead to gross invasions of privacy, for example, a law-enforcement officer spying on an ex-partner. The more hands the keys are put in, the more chance there is that they could be lost or stolen.
4. What if we split the key between Apple and the government?
Given how extensive the use of such a key would be, this would be no more secure than any of the other options, and it could actually open up channels of corruption between criminals, government officials, and perhaps even Apple employees. That might sound far-fetched, but the risk, however small, is real, and the consequences dire.
To conclude, there is no way to solve the problem at hand without significantly reducing the security of current iPhones.
The internet knows enough about you anyway...
Fears about backdoors, the FBI, and the cat-and-mouse game between Apple, hackers, and security companies like Cellebrite have changed very little in the years between San Bernardino and Pensacola. What has changed, however, is privacy. With every app downloaded, status posted, link clicked, and location checked in to, you give over another little piece of yourself to the internet.
In the last three years, the tech world has been rocked by scandal after scandal, as well as countless attempts to halt the advancing assault on our privacy: the Cambridge Analytica fiasco, in which it emerged that the Facebook data of at least 87 million people had been harvested and sold to enable targeted election campaigning and advertising; Huawei and the concerns over its 5G hardware; the EU's push for a new standard with the General Data Protection Regulation (GDPR); California's Consumer Privacy Act; and a recent New York Times investigation that revealed 50 billion location pings from the phones of 12 million Americans. Confusingly, we have less privacy than ever before yet are seemingly more aware of how bad the situation is.
Hopefully, this awakening will one day turn the tide of privacy in favor of us, the consumers. But we won't get there unless companies like Apple stand up to these government demands, demands that will erode our privacy even further and could pave the way for even more intrusion. The lie at the heart of it all? That law enforcement and the government need to know more about you than they already do.