In philosophy, a classic moral debate concerns the conflict between liberty and security. While this conflict covers many issues, the main problem is determining the extent to which liberty should be sacrificed to gain security. There is also the practical question of whether the supposed security gain is actually achieved.
One ongoing debate focuses on tech companies being required to include electronic backdoors in certain software and hardware. A backdoor of this sort would allow government agencies (such as the police, FBI and NSA) to access files and hardware protected by encryption. This is like requiring that all dwellings be equipped with a special door that the government could secretly open to gain access.
The main argument in support of mandating backdoors is that governments need such access for conducting criminal investigations, gathering military intelligence and (of course) fighting terrorism. The concern is that without a backdoor, criminals and terrorists will be able to secure their data and prevent state agencies from undertaking surveillance or acquiring evidence.
As is so often the case with such arguments, various awful or nightmare scenarios are presented in making the case. For example, the location and shutdown codes for ticking bombs might be on an encrypted iPhone. If the NSA had a key, they could save the day. As another example, it might be claimed that a clever child pornographer could encrypt all his pornography, making it impossible to make the case against him, thus ensuring he will be free to pursue his misdeeds with impunity.
While this argument is not without merit, there are counterarguments. Many of these are grounded in views of individual liberty and privacy, the idea being that an individual has the right to have such security against the state. These arguments are appealing to both liberals (who profess to like privacy rights) and conservatives (who profess to be against the intrusions of big government when they are not in charge).
Another moral argument is grounded in the fact that the United States government has, like all governments, shown that it cannot be trusted. Imagine that agents of the state were caught sneaking into the dwellings of all citizens and going through their stuff in clear violation of the law, the Constitution and basic moral rights. Then someone developed a lock that could only be opened by the person with the proper key. If the state then demanded that the lock company include a master key function to allow the state to get in whenever it wanted, the obvious response would be that the state has already shown that it cannot be trusted with such access. If the state had behaved responsibly and in accord with the laws, then it could have been trusted. But, like a guest who abused her access to a house, the state cannot and should not be trusted with a key. After all, we already know what they will do.
In the case of states that are even worse in their spying on and oppression of their citizens, the moral concerns are even greater. Such backdoors would allow the North Korean, Chinese and Iranian governments to gain access to devices, while encryption could provide their citizens with some degree of protection.
Probably the strongest moral and practical argument is grounded on the technical vulnerabilities of integrated backdoors. One way that a built-in backdoor creates vulnerability is by its mere existence. To use a somewhat oversimplified analogy, if thieves knew that all safes had a built-in backdoor designed to allow access by the government, they would know what to target.
One counter-argument is that the backdoor would not be that sort of vulnerability—that is, it would not be like a weaker secret door into a safe. Rather, it would be like the government having its own combination that would work on all safes. The safe itself would be as strong as ever; it is just that the agents of the state would be free to enter it when they are allowed to legally do so (or when they feel like doing so).
The obvious moral and practical concern here is that the government’s combination (to continue the analogy) could be stolen and used to allow criminals or enemies easy access. The security of all safes would be only as good as the security the government used to protect this combination (or combinations—perhaps one for each manufacturer). As such, the security of every user depends on the state’s ability to secure its means of access to hardware and software.
One obvious problem is that governments, such as that of the United States, have shown that they are not very good at providing such security. From a moral standpoint, it would seem wrong to expect people to trust the state with such access, given that the state has shown that it cannot be depended on in such matters. Imagine you have a friend who is very sloppy about securing his credit card numbers, keys, PINs and such—in fact, you know that his information is routinely stolen. Then imagine that this friend insists that he must have your credit card numbers, PINs and such and that he will “keep them safe.” Given his own track record, you have no reason to trust this friend nor any obligation to put yourself at risk, regardless of how much he claims that he needs the information.
One obvious counter to this analogy is that this irresponsible friend is not a good analogue to the state. The state has coercive power that the friend lacks, so the state can use its power to force you to hand over this information.
The counter to this is that the mere fact that the state has coercive force does not mean that it is thus responsible—which is the key concern in regard to both the ethics of the matter and its practical aspect. That is, the burden of proof would seem to rest on those who claim there is a moral obligation to provide a clearly irresponsible party with such access.
It might then be argued that the state could improve its security and responsibility, and thus merit being trusted with such access. While this does have some appeal, there is the obvious fact that if hackers and governments knew that the keys to the backdoors existed, they would take pains to acquire them and would, almost certainly, succeed. I can even picture the sort of headlines that would appear: “U.S. Government Hacked: Backdoor Codes Now on Sale on the Dark Web” or “Hackers Linked to China Hack Backdoor Keys; All Updated Apple and Android Devices Vulnerable!” As such, the state would not seem to have a moral right to insist on having such backdoors, given that the keys will inevitably be stolen.
At this point, the stock opening argument could be brought up again: the state needs backdoor access to fight crime and terrorism. There are two easy and obvious replies to this sort of argument.
The first is based on an examination of past spying, such as that done under the auspices of the Patriot Act. The evidence seems to show that this spying was completely ineffective in regard to fighting terrorism. There is no reason to think that expanded backdoor access would change this.
The second is a utilitarian argument (which can be cast as a practical or moral argument) in which the likely harm done by having backdoor access must be weighed against the likely advantages of having such access. The consensus among security experts is that the vulnerability created by backdoors vastly exceeds the alleged gains in protecting people from criminals and terrorists.
Somewhat ironically, what is alleged to be a critical tool for fighting crime (and terrorism) would simply make cybercrime much easier by building vulnerabilities right into software and devices.
In light of the above discussion, baked-in backdoors are morally wrong on many grounds (privacy violations, creation of needless vulnerability, etc.) and lack a practical justification. As such, they should not be required by the state.

An obvious consequence of technological advance is the automation of certain jobs. In the past, these jobs tended to be mechanical and repetitive: the sort of tasks that could be reduced to basic rules. A good example of this is the replacement of automobile assembly line jobs with robots. Not surprisingly, it has been claimed that certain jobs will always require humans because these jobs simply cannot be automated. Also not surprisingly, the number of jobs that “simply cannot be automated” shrinks with each advance in technology.
In my previous essay I sketched my view that self-defense is consistent with my faith, although the defense of self should prioritize protecting the integrity of the soul over the life of the body. A reasonable criticism of my view is that it seems inherently selfish: even though my primary concern is with acting righteously, this appears to be driven by a desire to protect my soul. Any concern about others, one might argue, derives from my worry that harming them might harm me. A critic could note that although I make a show of reconciling my faith with self-defense, I am merely doing what I have sometimes accused others of doing: painting over my selfishness and fear with a thin layer of theology. That, I must concede, is a fair point and I must further develop my philosophy of violence to address this. While it might seem odd, my philosophy of violence is built on love.
In his book The Naked Sun, Isaac Asimov creates the world of Solaria. What distinguishes this world from other human-inhabited planets is that it has a strictly regulated population of 20,000 humans and 10,000 robots for each human. What is perhaps the strangest feature of this world is a reversal of what many consider a basic human need: the humans of Solaria are trained to despise in-person contact with other humans, though interaction with human-like robots is acceptable. Each human lives on a huge estate, though some live “with” a spouse. When the Solarians need to communicate, they make use of a holographic telepresence system. Interestingly, they have even developed terminology to distinguish between communicating in person (called “seeing”) and communication via telepresence (“viewing”). For some Solarians the fear of encountering another human in person is so strong that they would rather commit suicide than endure such contact.
Thanks to improvements in medicine, humans are living longer and can be kept alive well beyond the point at which they would once have died. On the plus side, longer life is generally good. On the downside, this longer lifespan and medical intervention mean that people will often need extensive care in their old age, which can be a burden on caregivers. Not surprisingly, there has been an effort to solve this problem with companion robots.
In June 2015 the United States Supreme Court ruled in favor of the legality of same-sex marriage. Many states had already legalized it and most Americans thought it should be legal. As such, the ruling was consistent both with the Constitution and with the democratic ideal of majority rule. There are, of course, those who objected to the ruling and are even now working to undo it.
The United States has libertarian and anarchist threads, which is appropriate for a nation that espouses individual liberty and expresses distrust of the state. While there are many versions of libertarianism ranging across the political spectrum, I will focus on the key idea that the government should impose minimal limits on individual liberty and that there should be little, if any, state regulation of business. These principles were laid out clearly by American anarchist Henry David Thoreau in his claims that the best government governs least (or not at all) and that government only advances business by getting out of its way.
While, as a professor, talking is my business, I am generally reluctant to talk about my faith. One reason for this is a matter of professionalism. As a professor at a state university, it would be both unprofessional and improper for me to preach rather than teach. While some might take the view that a believer should always attempt to spread their belief, consider how you would react if you hired a plumber to fix your sink and she spent the entire time preaching rather than plumbing. If my students seek religion, they can easily find it in the many places of worship in Tallahassee. If I wished to be a religious teacher, I could seek employment at a religious institution. If I wanted to preach rather than teach, I could join the ministry.
There is an old legend that King Midas for a long time hunted the wise Silenus, the companion of Dionysus, in the forests, without catching him. When Silenus finally fell into the king’s hands, the king asked what was the best thing of all for men, the very finest. The daemon remained silent, motionless and inflexible, until, compelled by the king, he finally broke out into shrill laughter and said these words, “Suffering creature, born for a day, child of accident and toil, why are you forcing me to say what would give you the greatest pleasure not to hear? The very best thing for you is totally unreachable: not to have been born, not to exist, to be nothing. The second-best thing for you, however, is this — to die soon.”