Today, we had a spirited internal debate on the FBI’s request for a backdoor into an iPhone owned by one of the killers in the San Bernardino terrorist incident. Apple contends this case is about the right to privacy, not just a terrorist’s but everyone’s, while the FBI insists it is a matter of national security.
Privacy versus Security – we want them both, but where is the fine line to be drawn?
Here we have Josh Yeager, Director of Operations, and Matt Rodenbaugh, Professional Services Team Lead, who hold opposing views:
The Federal Government is Overreaching and Everyone’s Security & Privacy Are Threatened
Early this week, Apple published a letter to their customers that triggered a controversy that swept through the tech industry over the following days. The FBI requested that Apple create software that could be installed on an iPhone to bypass several security restrictions that currently prevent investigators from cracking the phone’s passcode.
Apple rejected their request, saying “Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data. Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.”
I have heard many arguments that Apple should cooperate with the FBI because the San Bernardino investigation is important and the requested software does not cause any problems for the public. I agree that the investigation is important but disagree that the FBI’s request is safe for the public.
If an attacker can modify or replace the software on the phone, they can break through any other security measures. This would be like a house with alarms on every door and window, motion sensors in every room, and the alarm control box on the wall outside the front door. Apple prevents this problem by making the phone unable to turn on unless its software is “signed” with a secret key that only a few people at Apple can use. Every other phone maker does the same thing. This makes it impossible to change any of the phone’s software unless you have access to the secret key.
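The signed-boot check described above can be sketched in a few lines. This is a deliberately simplified illustration with made-up names: real devices use public-key signatures (e.g. RSA or ECDSA) verified by a boot ROM, not an HMAC, but the shape of the check is the same — the device refuses to run any software whose signature does not verify against the vendor’s key.

```python
import hmac
import hashlib

# Hypothetical stand-in for the vendor's closely held signing key.
# (Real signing uses asymmetric crypto, so the device only holds the
# public half; an HMAC keeps this sketch stdlib-only.)
SECRET_KEY = b"vendor-private-signing-key"

def sign_firmware(firmware: bytes) -> bytes:
    """Vendor-side: produce a signature over the firmware image."""
    return hmac.new(SECRET_KEY, firmware, hashlib.sha256).digest()

def boot(firmware: bytes, signature: bytes) -> bool:
    """Device-side: refuse to run firmware whose signature doesn't verify."""
    expected = hmac.new(SECRET_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"official OS build"
sig = sign_firmware(official)
print(boot(official, sig))       # signed firmware boots
print(boot(b"tampered OS", sig)) # modified firmware is rejected
```

The point of the design is visible in the last line: without the secret key, no one can produce a signature that makes modified software boot.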
There is essentially only one way for Apple to comply with the FBI’s request. They must make and sign a tool for the FBI to use to crack this phone. They can make it run only on a specific phone so that it can’t be used to attack other phones, but this means that each time the FBI or another party needs to unlock a phone, Apple will need to modify the tool and sign it again.
Since the secret key is one of Apple’s most important assets, it is held tightly under a rigorous control process. The process requires careful review and approval of every use of the key, and it is deliberately slow and expensive. If the government routinely requests this cracking software for more phones, it will overwhelm that process. This is not a hypothetical scenario: the Justice Department has already demanded that Apple unlock nine more phones.
Every time Apple uses the key, there is a risk that it will be stolen. And if it is ever stolen, every iPhone in the world will become completely vulnerable to attack by anyone who gets a copy of the key. This includes governments, companies, amateur hackers, and even “script kiddies”. This is the core problem with the FBI’s request: they say it would only be used with a valid court order, but there is absolutely no way to guarantee that will remain true. Even if the US government always follows the rules to use this backdoor, its existence will create significant new risks to the security of all Apple users, and the legal precedent will cause the same backdoors to be created on every other platform.
These risks are abstract, but real. I applaud Apple for standing up for their customers’ security.
The FBI is Right to Deal with this Rotten Apple
At the time of this writing, Apple is in the middle of a fight against the FBI. As I currently understand the request, the FBI has acquired an iPhone from the San Bernardino shooting which may contain information that law enforcement can use to identify other guilty parties and obtain justice for the victims and our country. The FBI needs Apple’s assistance to remove the delays imposed after an incorrect password attempt and the trigger that wipes all data off the phone if the password is entered incorrectly 10 times. With those features removed, the FBI would be able to use a computer to crack the phone’s password and gain access to the data on the phone.
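To see why removing those two protections matters, here is a back-of-the-envelope sketch. The attempt rate below is an assumption for illustration (roughly 80 ms per try, limited by the hardware), not an official figure; with the escalating delays and the 10-try wipe gone, cracking reduces to simply trying every passcode in sequence.

```python
# Assumed hardware-limited guessing rate (~80 ms per attempt).
# This number is illustrative, not Apple's actual figure.
ATTEMPTS_PER_SECOND = 12.5

for digits in (4, 6):
    total = 10 ** digits                      # possible numeric passcodes
    worst_case_s = total / ATTEMPTS_PER_SECOND
    print(f"{digits}-digit passcode: {total:,} combinations, "
          f"worst case ~{worst_case_s / 3600:.1f} hours")
```

Under these assumptions a 4-digit passcode falls in well under an hour, which is exactly why the retry delays and the wipe-after-10-failures trigger exist in the first place.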
Apple is taking the stance that writing code to circumvent their own security would be a burden on the company, taking six to ten developers two to four weeks to produce. In addition, this could be a slippery slope for the FBI and the government to go down, opening the door for hackers, criminals, and the FBI itself to use the tool against the public. And if Apple intends to develop and then destroy the modified operating system upon each request, it would create a recurring burden on the company in the future.
First, let me state that I completely agree with Apple’s concerns. We need to ensure we are protecting the public’s civil liberties, and the government should not be able to impose a damaging burden on a company.
What I don’t agree with is how Apple is taking a stand by saying NO and nothing more. We live in the greatest country in the world, one that provides us the freedom to do what we want and protection from those who would harm us. Why doesn’t Apple care about the crime and the victims? Why are they just saying “No” rather than stating their concerns and working with the FBI on a compromise that provides justice but also protects the company and the public? If it’s a burden, ask for financial compensation. If it’s a public security concern, why not offer to do the work in an Apple-controlled environment, copy the data for the FBI, and destroy the modified OS immediately?
In summary, I am not against Apple’s concerns; I’m against their unwillingness to work with the FBI on an acceptable solution. This should not be “all or nothing” but rather a compromise.