Apple last week filed a motion to vacate a federal order requiring the company to create a tool or code to unlock the iPhone of one of the San Bernardino, California, shooters.
The order would set a dangerous precedent and release a powerful means to breach security on potentially millions of phones around the world, Apple argued.
The order transcends one phone and would empower the government to make private companies compromise the security of all their users whenever it sees fit, the company said.
“This is not a case about one isolated iPhone. Rather, this case is about the Department of Justice and the FBI seeking through the courts a dangerous power that Congress and the American people have withheld: the ability to force companies like Apple to undermine the basic security and privacy interests of hundreds of millions of individuals around the globe,” the motion says.
Apple already has tools that could compromise the security of millions of people, so the implication that this code is any different from similar capabilities the company possesses is baseless, according to Stewart Baker, partner at Steptoe & Johnson.
Furthermore, Apple has security in place to protect itself and its users from data breaches, he told TechNewsWorld.
“The code that they’re so worried will get out is no different than any of the other codes they write, in that if it gets out, then bad things will happen. Apple already protects its code very aggressively because they don’t want that to happen, so there’s no super-burden to protecting this code,” Baker noted.
“This is particularly true because in order to install this code on the phone that is the target it is going to be necessary for Apple to sign the code with their super-secret signature,” he added.
“What would happen is that Apple would send this signature to the phone, which will identify itself back to Apple, which means Apple almost certainly has to be right in the middle of any such transaction. It’s not like you can just steal the code and walk off and use it — because you also have to have Apple’s signature,” Baker said.
“If Apple’s signature is compromised, it’s the end of security for everyone, and they’re already in a position where they have to protect that aggressively,” he added.
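Baker's point — that the unlock code is useless without Apple's signature, so the signature, not the code, is the crown jewel — can be sketched in a toy model. The sketch below is illustrative only: Apple's real firmware signing uses asymmetric (public-key) cryptography verified by hardware in the boot chain, while this stand-in uses a symmetric HMAC, and every name in it is hypothetical.

```python
import hmac
import hashlib

# Toy model of the code-signing flow Baker describes. HMAC stands in for
# Apple's real asymmetric signatures; the structural point is the same:
# a device only installs code whose signature verifies.

SIGNING_KEY = b"apple-super-secret-key"  # hypothetical; never leaves Apple

def sign_firmware(firmware: bytes, key: bytes = SIGNING_KEY) -> bytes:
    """Apple's side: produce a signature over the firmware image."""
    return hmac.new(key, firmware, hashlib.sha256).digest()

def device_accepts(firmware: bytes, signature: bytes,
                   key: bytes = SIGNING_KEY) -> bool:
    """The phone's side: refuse to install code that doesn't verify."""
    expected = hmac.new(key, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

unlock_tool = b"hypothetical unlock firmware"
sig = sign_firmware(unlock_tool)

# A properly signed image installs; stolen code with no valid
# signature is rejected, which is why walking off with the code
# alone accomplishes nothing.
assert device_accepts(unlock_tool, sig)
assert not device_accepts(unlock_tool, b"\x00" * 32)
```

In this model, compromising `SIGNING_KEY` breaks everything — which is Baker's closing point about why Apple already guards its signature aggressively.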
The case is a matter of getting information that’s imperative to an ongoing investigation, according to Paul Charlton, a partner at Steptoe & Johnson.
“What we can say with absolute certainty is that if you think about this as something other than a technology company — if you think about this in terms of Apple being the landlord that holds within its building evidence of terrorist activity — there wouldn’t be any doubt in anyone’s mind that the government should be allowed, with the appropriate court authority, … to go in and take what they need,” he told TechNewsWorld.
FBI director James Comey “has made it very clear that what he’s interested in is not a back door, not a wide open door into this apartment complex, if you will, but entry into a specific apartment … to grab this specific piece of information. That seems narrowly tailored and wholly reasonable to me,” Charlton said.
Creating the code the government is asking for would open a Pandora’s box of unforeseen consequences, according to Christopher Maurer, assistant professor of information technology and management at the University of Tampa.
“We see time and time again that there are really good intentions. There might be a real problem, but the government is not addressing the underlying issue and instead is creating other issues in the form of loopholes or unintended side effects,” he told TechNewsWorld.
One such side effect would be a precedent allowing other law enforcement agencies to order phones to be unlocked, noted Chris Calabrese, vice president for policy at the Center for Democracy & Technology.
“The idea that this is no different and that this back door doesn’t create a vulnerability is just not true. What we’re talking about is a precedent that will not just be for the FBI but will almost certainly be for state and local law enforcement, of which there will be tens of thousands across the country. They’re all going to encounter iPhones. They’re all going to want them to be unlocked,” he told TechNewsWorld.
A back door would be a potentially hazardous tool if it fell into the wrong hands, Calabrese added.
“There’s going to have to be an entire process in place on unlocking iPhones somehow, which is to say subverting their security. That’s a giant process designed to be exploited by bad guys. And you just can’t say somehow that this is a one-off,” he said.
Congressional Action Ahead
Congress eventually will have to answer the larger privacy question, Steptoe & Johnson’s Charlton noted.
“We are constantly weighing our rights to privacy versus our need for security. That’s why we have a Fourth Amendment. That’s why we have to get search warrants before we conduct searches on individuals’ homes,” he said.
“Here, that’s exactly what happened. The FBI obtained a valid court order after showing probable cause to believe that there’s evidence of terrorist activity on this phone, and right now that court order is still in place. Absent the lawyers from Apple being able to reverse that order, they’re going to have to turn that information over,” Charlton added.
However, incentives already are in place to ensure that customer data is secure, the Center for Democracy & Technology’s Calabrese maintained, citing the Sony hack.
“There are a lot of incentives to want to build devices that are private and secure. There are reputational harms, potential liability, the requirement that they do a data breach notice if the information gets out,” he said.
“We’ve all seen, for example, what happened with Sony and the devastating result of not having good security in their systems,” Calabrese said.
The National Institute of Standards and Technology has published standards for good security and cryptology, he noted. “There are guidelines in place that help people know what they need to do. There are best practices out there that have nothing to do with legislation.”
On the other hand, rigid mandates might freeze the development of security technology, Calabrese added.
“You don’t want to say you must do the following six things to secure a phone when in three years those things could be totally out of date, but you still have a legal requirement to do them,” he said.
“There’s a push-pull when it comes to whether you should mandate security,” Calabrese added. “Our view is that you need baseline security standards, and you need to let people know what best practices are and then create incentives to get people to meet those best practices without mandating anything in particular.”