In philosophy, one of the classic moral debates concerns the conflict between liberty and security. While this topic covers many issues, the main problem is determining the extent to which liberty should be sacrificed in order to gain security. There is also the practical question of whether the sacrifice actually produces a gain in security.
One of the recent versions of this debate focuses on tech companies being required to include electronic backdoors in certain software and hardware. Put in simple terms, a backdoor of this sort would allow government agencies (such as the police, FBI and NSA) to gain access even to files and hardware protected by encryption. To use an analogy, this would be like requiring that all dwellings be equipped with a special door that could be secretly opened by the government to allow access to the contents of the house.
The main argument in support of mandating such backdoors is a fairly stock one: governments need such access for criminal investigations, for gathering military intelligence and (of course) to “fight terrorism.” The concern is that, without a backdoor, criminals and terrorists will be able to secure their data and thus prevent state agencies from conducting surveillance or acquiring evidence.
As is so often the case with such arguments, various awful or nightmare scenarios are presented in making the case. For example, it might be claimed that the location and shutdown codes for a ticking bomb could be on an encrypted iPhone. If the NSA had a key, it could simply get that information and save the day; without the key, New York will be a radioactive crater. As another example, it might be claimed that a clever child pornographer could encrypt all his pornography, making it impossible to build a case against him and thus ensuring he is free to pursue his misdeeds with impunity.
While this argument is not without merit, there are numerous stock counterarguments. Many of these are grounded in views of individual liberty and privacy—the basic idea being that an individual has a right to such security against the state. These arguments appeal both to liberals (who tend to profess support for privacy rights) and to conservatives (who tend to claim to oppose the intrusions of big government).
Another moral argument is grounded in the fact that the United States government has shown that it cannot be trusted. To use an analogy, imagine that agents of the state were caught sneaking into the dwellings of all citizens and going through their belongings in clear violation of the law, the Constitution and basic moral rights. Then someone developed a lock that could only be opened by the person with the proper key. If the state then demanded that the lock company include a master-key function to allow the state to get in whenever it wanted, the obvious response would be that the state has already shown that it cannot be trusted with such access. If the state had behaved responsibly and in accord with the law, then it could have been trusted. But, like a guest who abused her access to a house, the state cannot and should not be trusted with a key. After all, we already know what it will do.
This argument also applies to other states that have done similar things. In the case of states that are even worse in their spying on and oppression of their citizens, the moral concerns are even greater. Such backdoors would allow the North Korean, Chinese and Iranian governments to gain access to devices, while encryption would provide their citizens with some degree of protection.
The strongest moral and practical argument is grounded in the technical vulnerabilities of integrated backdoors. A built-in backdoor creates a vulnerability by its very existence. To use a somewhat oversimplified analogy, if thieves know that all vaults have a built-in backdoor designed to allow access by the government, they know that a vulnerability exists that can be exploited.
One counterargument is that the backdoor would not be that sort of vulnerability—that is, it would not be like a weaker secret door into the vault. Rather, it would be analogous to the government having its own combination that works on all the vaults. The vault itself would be as strong as ever; it is just that the agents of the state would be free to enter the vault when they are legally allowed to do so (or when they feel like doing so).
The obvious moral and practical concern here is that the government’s combination to the vaults (to continue with the analogy) could be stolen and used to allow criminals or enemies easy access to all the vaults. The security of such vaults would be only as good as the security the government used to protect this combination (or combinations—perhaps one for each manufacturer). As such, the security of every user depends on the state’s ability to secure its means of access to hardware and software.
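To make the practical shape of this worry concrete, here is a minimal, purely illustrative sketch in Python, using the third-party cryptography package; the scheme and the names in it are invented for this post and do not describe any actual proposal or product. It models the vault analogy: each user's data is locked with the user's own key, but that key is also wrapped under a single escrow key, so whoever holds the escrow key, lawful agent or thief, can open every vault.

```python
# Hypothetical key-escrow sketch, for illustration only (not any real proposal).
# Requires the third-party "cryptography" package: pip install cryptography
from cryptography.fernet import Fernet

# The single "government combination": one escrow key meant to open every vault.
ESCROW_KEY = Fernet.generate_key()
escrow = Fernet(ESCROW_KEY)

def make_vault(secret: bytes):
    """Encrypt a user's data with the user's own key, then wrap that key
    under the escrow key -- this wrapping is the built-in backdoor."""
    user_key = Fernet.generate_key()
    ciphertext = Fernet(user_key).encrypt(secret)
    wrapped_key = escrow.encrypt(user_key)
    return ciphertext, wrapped_key

def open_with_escrow(ciphertext: bytes, wrapped_key: bytes) -> bytes:
    """Anyone who holds ESCROW_KEY can recover the user's key and read the data."""
    user_key = escrow.decrypt(wrapped_key)
    return Fernet(user_key).decrypt(ciphertext)

if __name__ == "__main__":
    ct, wrapped = make_vault(b"contents of one citizen's vault")
    # Each vault is as strong as ever against outsiders, yet the security of
    # every vault now reduces to the secrecy of the single ESCROW_KEY.
    print(open_with_escrow(ct, wrapped))
```

The only point of the sketch is structural: once ESCROW_KEY leaks, every wrapped key, and with it every vault, is exposed at once.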
The obvious problem is that governments, such as that of the United States, have shown that they are not very good at providing such security. From a moral standpoint, it would seem wrong to expect people to trust the state with such access, given that the state has shown it cannot be depended on in such matters. To use an analogy, imagine you have a friend who is very sloppy about securing his credit card numbers, keys, PINs and such—in fact, you know that his information is routinely stolen. Then imagine that this friend insists that he needs your credit card numbers, PINs and so on and that he will “keep them safe.” Given his track record, you have no reason to trust this friend nor any obligation to put yourself at risk, regardless of how much he claims that he needs the information.
One obvious counter to this analogy is that the irresponsible friend is not a good analogue for the state. The state has coercive power that the friend lacks, so it can use that power to force you to hand over the information.
The counter to this is that the mere fact that the state has coercive force does not mean that it is responsible—and responsibility is the key concern for both the ethics and the practicality of the matter. That is, the burden of proof would seem to rest on those who claim there is a moral obligation to provide a clearly irresponsible party with such access.
It might then be argued that the state could improve its security and responsibility, and thus merit being trusted with such access. While this has some appeal, there is the obvious fact that if hackers and other governments knew that the keys to the backdoors existed, they would expend considerable effort to acquire them and would, almost certainly, succeed. I can even picture the sort of headlines that would appear: “U.S. Government Hacked: Backdoor Codes Now on Sale on the Dark Web” or “Hackers Linked to China Steal Backdoor Keys; All Updated Apple and Android Devices Vulnerable!” As such, the state would not seem to have a moral right to insist on having such backdoors, given that the keys will inevitably be stolen.
At this point, the stock opening argument could be brought up again: the state needs backdoor access in order to fight crime and terrorism. There are two easy and obvious replies to this sort of argument.
The first is based on an examination of past spying, such as that done under the auspices of the Patriot Act. The evidence seems to show that this spying was completely ineffective at fighting terrorism. There is no reason to think that backdoor access would change this.
The second is a utilitarian argument (which can be cast as a practical or a moral argument) in which the likely harm done by having backdoor access must be weighed against the likely advantages of such access. The consensus among security experts is that the vulnerability created by backdoors vastly outweighs the alleged gains in protecting people from criminals and terrorists.
Somewhat ironically, what is alleged to be a critical tool for fighting crime (and terrorism) would simply make cybercrime much easier by building vulnerabilities right into software and devices.
In light of the above discussion, it would seem that baked-in backdoors are morally wrong on many grounds (privacy violations, creation of needless vulnerability, etc.) and lack a practical justification. As such, they should not be required by the state.