Unlike the thinking machines of science fiction, human beings can easily believe inconsistent (even contradictory) claims. I am confident I have many inconsistent beliefs and know that I have many false beliefs. This is because I have turned up such beliefs over the years, one of the benefits (or occupational hazards) of being a professional philosopher. I thus infer I have many left in my mind. I do not know which ones are false; if I knew, I would (I hope) stop believing them. Writing out my ideas, like this, helps: other people can see my claims and subject them to critical assessment. If someone can honestly show that two of my beliefs are inconsistent (or contradictory), I consider that a gift; they are helping me weed the garden of my mind. But not everyone is grateful for this sort of help, although, to be fair, such criticism is often offered out of cruelty rather than honest concern.
While most people do not write extensively about their beliefs, many present their professed beliefs on social media, such as Facebook. Being a philosopher, I have the annoying trait of noting these claims and then assessing whether they can all be true. That is, I automatically check for logical inconsistency and contradiction. Two claims are inconsistent if they cannot both be true at the same time, though they could both be false. Two claims are contradictory if exactly one must be true and the other false.
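The distinction can be sketched as a small truth-table check. This is only an illustrative toy: the `classify` function, the claims, and the domain of "possible worlds" are my own hypothetical examples, not anything from the logic literature.

```python
def classify(claim_a, claim_b, worlds):
    """Classify two claims by checking their truth values across possible worlds.

    Inconsistent: never both true (but possibly both false).
    Contradictory: in every world, exactly one is true.
    Consistent: both true in at least one world.
    """
    pairs = [(claim_a(w), claim_b(w)) for w in worlds]
    ever_both_true = any(a and b for a, b in pairs)
    ever_both_false = any(not a and not b for a, b in pairs)
    if not ever_both_true and not ever_both_false:
        return "contradictory"
    if not ever_both_true:
        return "inconsistent (but not contradictory)"
    return "consistent"

# Toy domain: the integers 0..9 stand in for possible worlds.
worlds = range(10)
print(classify(lambda x: x % 2 == 0, lambda x: x % 2 == 1, worlds))  # contradictory
print(classify(lambda x: x > 5, lambda x: x < 3, worlds))  # inconsistent (but not contradictory)
print(classify(lambda x: x > 5, lambda x: x > 3, worlds))  # consistent
```

The middle case shows why inconsistency is weaker than contradiction: "x is greater than 5" and "x is less than 3" can never both be true, yet at x = 4 both are false.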
As one would suspect, the political beliefs people profess are often inconsistent or even contradictory. I have seen (and perhaps you have too) relatively short Facebook posts putting forth inconsistent sets of political claims. As noted in my previous essay, the professed beliefs of Trump supporters about the pandemic often form such sets. It is a bit jarring to see a single post mock people who take the pandemic "hoax" seriously, assert that the "China Virus" is a dangerous bioweapon, and then finish things off with praise for Trump's great handling of the pandemic and how the Democrats are trying to steal credit for the great vaccine that should be credited to Trump. It gets even stranger when 5G and QAnon get thrown into the post. Efforts to point out the inconsistencies tend only to lead people to respond by angrily doubling down or making threats. I, of course, invite readers to provide their favorite examples of how "the libs" also hold inconsistent sets of beliefs. But keep in mind that inconsistency is a matter of logic: a set of false claims can still be consistent with each other. So how do people profess such sets of clearly inconsistent beliefs? Perhaps the concept of choice blindness can shed some light on the matter.
Back in 2005, Swedish researchers developed the concept of choice blindness after conducting an experiment in which participants chose between two photos of faces. Each participant was asked which photo they found more attractive, and the researcher then used sleight of hand to make the participant think they had been handed back the photo they picked, when in fact they were handed the photo they had not picked. While one would expect the subjects to notice the switch, they generally did not: they accepted the switched photo as the one they had picked and even offered reasons why they had picked that photo (though they had in fact rejected it). Follow-up experiments yielded the same results for the taste of jam, financial choices, and eyewitness reports.
These results could, perhaps, be explained away in terms of weak preferences and other factors. For example, if a person is asked to pick between two photos and at that moment slightly prefers one, then it would not be surprising that they would be amenable to easily changing their mind. One might think that political beliefs would be different, especially in these highly polarized times, yet people seem to suffer from choice blindness here as well.
In 2018 an experiment was conducted in which participants were given a survey about political questions. The researchers gave the subjects false feedback and found that their beliefs tended to shift accordingly. This effect lasted up to a week and, interestingly, lasted even longer when the researchers asked the participants to defend “their choices.” For example, a person who originally favored raising taxes would be asked by the researchers about “their” view that taxes should not be raised. This person would then tend to believe that taxes should not be raised. The researchers’ explanation is a reasonable one: if a person thinks a belief is their belief, they will be free of many of the factors that would have caused them to defend their original belief. This certainly makes sense: if someone believes they believe something, then they will tend to believe and defend it. Roughly put, people believe what they believe they believe—even when they previously did not believe it. So how can this help explain the ability to believe inconsistent or even contradictory claims?
Based on the above, a person can initially believe one claim and then be easily switched to believing (and defending) a claim inconsistent with their original professed belief. For example, a person who initially believes that a carbon tax would reduce emissions could have their belief switched by this method to believing (and defending) the claim that carbon taxes would not reduce emissions. These two claims are inconsistent, yet a person can easily be switched from one to the other without apparently even noticing.
Now consider a person who professes to believe claims that are inconsistent. When they profess one claim, this is analogous to professing their original belief in the choice blindness experiment. When they profess an inconsistent claim, this is analogous to professing belief in the claim they were switched to by the researchers. In the case of holding inconsistent beliefs, the person does the switching themselves when they move from professing one belief to professing a belief inconsistent with the first. As such, a person would hold the first belief and then seamlessly switch to the inconsistent one without noticing the inconsistency. Given that the experiment shows people can be switched to opposite beliefs without noticing, it would be even easier for people to hold inconsistent beliefs without noticing any inconsistency. They believe one claim because they believe it; they believe an inconsistent claim because they believe that as well. That is, people believe what they think they believe and simply ignore or forget any inconsistencies. While this is certainly not the whole story, choice blindness does shed some light on people's ability to profess inconsistent beliefs.