In philosophy, a classic moral debate concerns the conflict between liberty and security. While this conflict covers many issues, the main problem is determining the extent to which liberty should be sacrificed to gain security. There is also the practical question of whether the security gain is effective.
One ongoing debate focuses on tech companies being required to include electronic backdoors in certain software and hardware. A backdoor of this sort would allow government agencies (such as the police, FBI and NSA) to access files and hardware protected by encryption. This is like requiring all dwellings be equipped with a special door that could be secretly opened by the government to allow access.
The main argument in support of mandating backdoors is that governments need such access for criminal investigations, military intelligence gathering and (of course) to “fight terrorism.” The concern is that without a backdoor, criminals and terrorists will be able to secure their data and prevent state agencies from undertaking surveillance or acquiring evidence.
As is so often the case with such arguments, various awful or nightmare scenarios are presented in making the case. For example, the location and shutdown codes for ticking bombs might be on an encrypted iPhone. If the NSA had a key, they could save the day. As another example, it might be claimed that a clever child pornographer could encrypt all his pornography, making it impossible to make the case against him, thus ensuring he will be free to pursue his misdeeds with impunity.
While this argument is not without merit, there are counterarguments. Many of these are grounded in views of individual liberty and privacy, the idea being that an individual has a right to such security against the state. These arguments appeal to both liberals (who profess to like privacy rights) and conservatives (who profess to be against the intrusions of big government when they are not in charge).
Another moral argument is grounded in the fact that the United States government has, like all governments, shown that it cannot be trusted. Imagine that agents of the state were caught sneaking into the dwellings of all citizens and going through their stuff in clear violation of the law, the Constitution and basic moral rights. Then someone developed a lock that could only be opened by the person with the proper key. If the state then demanded that the lock company include a master key function to allow the state to get in whenever it wanted, the obvious response would be that the state has already shown that it cannot be trusted with such access. If the state had behaved responsibly and in accord with the laws, then it could have been trusted. But, like a guest who abused her access to a house, the state cannot and should not be trusted with a key. After all, we already know what it will do.
In the case of states that are even worse in their spying on and oppression of their citizens, the moral concerns are even greater. Such backdoors would allow the North Korean, Chinese and Iranian governments to gain access to devices, while encryption could provide their citizens with some degree of protection.
Probably the strongest moral and practical argument is grounded on the technical vulnerabilities of integrated backdoors. One way that a built-in backdoor creates vulnerability is by its mere existence. To use a somewhat oversimplified analogy, if thieves knew that all safes had a built-in backdoor designed to allow access by the government, they would know what to target.
One counter-argument is that the backdoor would not be that sort of vulnerability—that is, it would not be like a weaker secret door into a safe. Rather, it would be like the government having its own combination that would work on all safes. The safe itself would be as strong as ever; it is just that the agents of the state would be free to enter it when they are legally allowed to do so (or when they feel like doing so).
The obvious moral and practical concern here is that the government’s combination (continue with the analogy) could be stolen and used to allow criminals or enemies easy access. The security of all safes would be only as good as the security the government used to protect this combination (or combinations—perhaps one for each manufacturer). As such, the security of every user depends on the state’s ability to secure its means of access to hardware and software.
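The single-point-of-failure worry in the safe analogy can be made concrete with a toy sketch of a key-escrow design. This is a hypothetical illustration, not real cryptography or any actual proposal: repeating-key XOR stands in for a cipher, and the names (`MASTER_KEY`, `make_device`) are invented for the example. The point it demonstrates is structural: each device has its own key, but a copy of every device key is wrapped under one government key, so a single breach of that key unlocks every device at once.

```python
# Toy model (NOT real cryptography): why a universal escrow
# "combination" is a single point of failure.
import os

def xor(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR stands in for a real cipher in this toy.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

MASTER_KEY = os.urandom(16)  # the state's one "combination"

def make_device(secret: bytes):
    device_key = os.urandom(16)                 # unique per device
    ciphertext = xor(secret, device_key)        # user's encrypted data
    escrow_blob = xor(device_key, MASTER_KEY)   # backdoor: device key wrapped for the state
    return ciphertext, escrow_blob

devices = [make_device(f"user {i} secrets".encode()) for i in range(3)]

# Stealing ciphertexts and escrow blobs alone reveals nothing useful,
# but one breach of MASTER_KEY unlocks every device:
stolen_master = MASTER_KEY
for ct, blob in devices:
    device_key = xor(blob, stolen_master)  # unwrap each device key
    print(xor(ct, device_key).decode())    # read each user's data
```

The design choice the essay criticizes is visible in the last loop: every user's security reduces to the secrecy of one value held by the state, regardless of how strong each individual device's encryption is.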
One obvious problem is that governments, such as the United States, have shown that they are not very good at providing such security. From a moral standpoint, it would seem to be wrong to expect people to trust the state with such access, given the fact that the state has shown that it cannot be depended on in such matters. Imagine you have a friend who is very sloppy about securing his credit card numbers, keys, PINs and such—in fact, you know that his information is routinely stolen. Then imagine that this friend insists that he must have your credit card numbers, PINs and such and that he will “keep them safe.” Given his own track record, you have no reason to trust this friend nor any obligation to put yourself at risk, regardless of how much he claims that he needs the information.
One obvious counter to this analogy is that this irresponsible friend is not a good analogue to the state. The state has coercive power that the friend lacks, so it can use that power to force you to hand over the information.
The counter to this is that the mere fact that the state has coercive force does not mean that it is thus responsible—which is the key concern in regard to both the ethics of the matter and the practical aspect of the matter. That is, the burden of proof would seem to rest on those who claim there is a moral obligation to provide a clearly irresponsible party with such access.
It might then be argued that the state could improve its security and responsibility, and thus merit being trusted with such access. While this does have some appeal, there is the obvious fact that if hackers and governments knew that the keys to the backdoors existed, they would take pains to acquire them and would, almost certainly, succeed. I can even picture the sort of headlines that would appear: “U.S. Government Hacked: Backdoor Codes Now on Sale on the Dark Web” or “Hackers Linked to China Hack Backdoor Keys; All Updated Apple and Android Devices Vulnerable!” As such, the state would not seem to have a moral right to insist on having such backdoors, given that the keys will inevitably be stolen.
At this point, the stock opening argument could be brought up again: the state needs backdoor access to fight crime and terrorism. There are two easy and obvious replies to this sort of argument.
The first is based on an examination of past spying, such as that done under the auspices of the Patriot Act. The evidence seems to show that this spying was completely ineffective in regard to fighting terrorism. There is no reason to think that expanded backdoor access would change this.
The second is a utilitarian argument (which can be cast as a practical or moral argument) in which the likely harm done by having backdoor access must be weighed against the likely advantages of having such access. The consensus among those who are experts in security is that the vulnerability created by backdoors vastly exceeds the alleged gain to protecting people from criminals and terrorists.
Somewhat ironically, what is alleged to be a critical tool for fighting crime (and terrorism) would simply make cybercrime much easier by building vulnerabilities right into software and devices.
In light of the above discussion, baked-in backdoors are morally wrong on many grounds (privacy violations, creation of needless vulnerability, etc.) and lack a practical justification. As such, they should not be required by the state.

If we pause to think about the back door, it has, as a practical matter, always been an alternate route of escape, whether that be from intruders, fires, the long arm of the law or *marauding animals seeking something to eat. There have always been perils. So, in these matters, having a back door is not a matter of ethics; it is THE matter of survival. You have probably visited this before, but, inasmuch as I have not read other such visitations, I could not know. It is always a pleasure to read your posts, because as a Professor, you write like one, still including some wry humor and a dash of irony…all-in-all, an educational and pleasant experience. As I understand it from a former student, the late John Searle had some of this penchant. Thanks, Professor!
[*Of course, hungry animals are not choosy: whether they first encounter a front door or back door is not a matter of planning. It is one of opportunity. And just so.]
Things are getting creepy, where you live and work, Professor. See: Weinberg’s Daily Nous post, today, concerning totalitarian tactics. You probably are already aware. How has the weather been? Ours has been brutal and there’s more coming which MAY miss us in O-H-I-O. I have friends, in the projected path, for whom I am concerned. Query: how do you assess *consciousness*? I have commentary and dialogue with Eric Schwitzgebel, on his Splintered Mind post. Others, including my brother in Canada, chime in from time to time. People don’t always know what I’m sayin’, but they know what I mean. At least I have not lost any Royal titles or privileges. Am so, ahem, thankful for that. I grew up, in a sense, with the disowned Prince. Watched his impending downfall, in the 1970s. Many warning symptoms then are stark reality now. Remember that. Backdoors are connected to contextual reality. Last remarks, on the backdoor issue.
There is an irony abutting security now, that was not so common or obvious not so many years ago. With the advent and advancements of social media, people have lost (or sacrificed) much of their once-common guardianship of privacy. It appears to me that *belongingness* is of paramount importance, and, many trust the sanctity of information sharing in a braver, newer world. This alarms me, because a record album, recorded by Al Kooper, some years ago, was titled: You Never Really Know Who Your Friends Are. I learned that truth the hard way in the 1960s when some acquaintances conspired and succeeded in stealing my guitar. I never was confident enough to confront those who orchestrated the theft because I had no concrete proof; only uncanny coincidence, and, all said and done, these conspirators were supposed to be friends, not thieves. So, after the time that has passed, I have no contact with any of them and no wish to. It disturbs me that social media and other influences are causing harm because people believe we are all one big, happy family. Crooks and all manner of nefarious interests take full advantage of such gullibility—something that by now sensible folks should be wary of, yet aren’t. It is sad.