Blog Post 9: Government Backdoors

The use of encryption is at the forefront of today’s national defense debate. After recent terrorist events, which may have been organized through encrypted channels, many are shocked and outraged that terrorists can hide their information such that no one, not even the government, can access it. Before these events, many people had probably never heard of encryption. They likely don’t realize that it’s something they use on a day-to-day basis, whether through the internet, mobile phones, etc. Now, to many, it has an evil connotation. And we are left asking what the companies providing these services should do about it. Is a company more ethically responsible for its users’ privacy or for society’s safety?

The main argument for companies providing a backdoor to governments is that encryption does not allow governments to execute warrants for information. This argument makes sense to me. I would be completely against allowing the government to access our data at any time. However, allowing the government access to our data when a court warrant is served seems justified. The Manhattan District Attorney defended this point in a recent paper, stating, “Last fall, a decision by a single company changed the way those of us in law enforcement work to keep the public safe and bring justice to victims and their families.” If we can issue a court warrant and legally obtain physical evidence in a case, we should certainly be able to do the same with digital evidence. However, it appears that this is impossible.

In previous encryption systems, Apple would retain a key that could be used to unlock customers’ data in the case of a search warrant. This seems ideal: a user’s data would be secure except when a warrant was served. However, we do not live in a perfect world. What Apple found is that creating this backdoor, even if designed to be used only by them, left their system vulnerable. Eventually somebody would find a way in. This is the root of our issue. In a perfect world we could create a system that is completely secure except in the case of a search warrant, but that is not possible. Once we create a vulnerability in the system, eventually somebody will exploit it. So in a perfect world, Apple could balance its ethical responsibility to privacy with its responsibility to security. Sadly, the answer to our question is not that easy, so we have to choose a side.
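The key-escrow arrangement described above can be sketched in a few lines. This is a toy illustration only: the XOR “cipher,” the escrow database, and the user ID are all invented for the example and bear no resemblance to Apple’s actual systems, but the structural problem — that the escrowed key serves the warrant and the attacker equally well — is the same.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "encryption" -- NOT real cryptography, just enough to
    # illustrate that whoever holds the key can read the data.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The user encrypts their data with a secret key.
user_key = os.urandom(16)
ciphertext = xor_cipher(b"private message", user_key)

# Key escrow: the vendor keeps a copy of the key so that it can
# comply with a court warrant later (hypothetical escrow store).
escrow_database = {"user-42": user_key}

# Lawful access works exactly as intended...
warrant_key = escrow_database["user-42"]
assert xor_cipher(ciphertext, warrant_key) == b"private message"

# ...but the escrow database is a single point of failure: an attacker
# who breaches it decrypts the very same data, no warrant required.
stolen_key = escrow_database["user-42"]
assert xor_cipher(ciphertext, stolen_key) == b"private message"
```

The point of the sketch is that nothing in the math distinguishes the "lawful" key lookup from the malicious one; the backdoor is a property of the system, not of its user.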

I have to side with Apple in this argument: they are more responsible for their users’ privacy than for what their users do with their system. When the argument is brought up that “isn’t saving lives or protecting our nation worth a little less individual privacy,” I say that we are not losing a little privacy; we are losing all of it. Even if we trust that the government will only use the backdoors legally, we take the chance that a malicious party will exploit them. When the argument is brought up that “if you’ve got nothing to hide, you’ve got nothing to fear,” I say that everybody has something to hide. Even a completely ethical person may have credit card information, company information, etc. on their mobile phone. Many people right now would say national security is worth their privacy, but if you told them their identity might be stolen, I think they would change their minds. National security is very important, but digital privacy is essentially necessary for most people to live normal lives. Thus I think protecting our privacy is more valuable than eliminating the chance that encryption is used against national security.
