During the COVID-19 pandemic, some politicians argued that America should be reopened because the dire predictions about COVID turned out to be wrong. On the face of it, this appears to be good reasoning that we could use in the next pandemic: if things are not as bad as predicted, we can start reopening sooner than predicted. To use an analogy, if a fire were predicted to destroy your house, but it only burned your garage, then it would make sense to move back in and rebuild the garage. While this reasoning is appealing, it can also be a trap. Here is how the trap works.
Some politicians and pundits pointed out that the dire predictions about COVID did not come true. For example, the governor of Florida said that since the hospitals were not overwhelmed as predicted, it was a good idea for them to return to profitable elective surgery. He also, like some other Republican governors, wanted to reopen very quickly. This reasoning seemed sensible: the pandemic was not as bad as predicted, so we can quickly reopen. There were also those who sneered at the dire predictions and were upset at what they saw as excessive precautions. This can also seem sensible: the experts predicted a terrible outcome for COVID-19, but they were wrong. We overreacted and should have rolled back the precautions when the predictions did not come true. But would this be a wise strategy for the next pandemic?
While it is reasonable to consider whether the precautions were excessive, there is a tempting fallacy that needs to be avoided. This is the prediction fallacy. It occurs when someone uncritically rejects a prediction, and the responses to that prediction, because the predicted outcome failed to occur. The error in the logic occurs because the person fails to consider what should be obvious: if a prediction is responded to effectively, then the prediction is going to be “wrong.” The form of the fallacy is this:
Premise 1: Prediction P predicted X if we do not do R.
Premise 2: Response R was taken based on prediction P.
Premise 3: X did not happen, so prediction P is wrong.
Conclusion: We should not have taken Response R (or no longer need to take Response R).
To use a concrete example:
Premise 1: Experts predicted that the hospitals in Florida would be overwhelmed if we did not respond with social distancing and other precautions.
Premise 2: People responded to this prediction with social distancing and other precautions.
Premise 3: The hospitals in Florida were not overwhelmed, so the prediction was wrong.
Conclusion: The response was excessive, and we no longer need these precautions.
While it is (obviously) true that a prediction that turns out to be wrong is wrong, the error is uncritically concluding that this proves that the response based on the prediction need not have been taken (or that we no longer need to keep responding in this way). The prediction assumes we do not respond (or do not respond in a certain way), and the response is made to address the prediction. If the response is effective, then the predicted outcome will not occur; that is the point of responding. To reason that the “failure” of the prediction shows that the response was mistaken or no longer needed would be a mistake in reasoning. You could be right, but you need to do more than point to the failed prediction.
As a silly, but effective analogy, imagine we are driving towards a cliff. You make the prediction that if we keep going, we will go off the cliff and die. So, I turn the wheel and avoid the cliff. If backseat Billy gets angry and says that there was no reason to turn the wheel or that I should turn it back because we did not die in a fiery explosion, Billy is falling for this fallacy. After all, if we did not turn, then we would have died. And if we turn back too soon, then we die.
The same applies to COVID-19: by responding effectively to dire predictions, we changed the outcome and the predictions turned out to be wrong. But to infer that the responses were excessive, or that we should stop now, simply because the results were not as dire as predicted would be an error.
This is not to deny what is obviously true: it is possible to overreact to a pandemic. But making decisions based on the prediction fallacy is a bad idea. There is also another version of this fallacy.
A variation of this fallacy involves inferring the prediction was a bad one because it turned out to be wrong:
Premise 1: Prediction P predicted X if we do not do R.
Premise 2: Response R was taken based on prediction P.
Premise 3: X did not happen.
Conclusion: The prediction was wrong about X occurring if we did not do R.
While the prediction turned out to be wrong in the sense that the predicted outcome did not occur, this does not disprove the claim that X would have occurred without the response, since the response did occur. Going back to the car analogy, the prediction that we would go off the cliff and die if we did not turn is not disproven by the fact that we turned and did not die. In fact, that is the result we want.
Getting back to COVID-19, the predictions made about what could occur if we did nothing are not disproven by the fact that they did not come true when we did something. So, to infer that these predictions must have been wrong in predicting what would occur if we did nothing would be an error. We do, of course, rationally assess predictions based on outcomes, but this assessment must consider the effect of the response. Sorting out such counterfactual predictions is hard. In complex cases we can probably never prove what would have happened, but good methods can guide us here, which is why we need to go with science and math rather than hunches and feelings.
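The point about responses changing outcomes can be made concrete with a toy epidemic model. The sketch below uses a minimal SIR (susceptible-infected-recovered) simulation; all the parameters are made up for illustration and are not fitted to COVID-19 or any real disease. It shows that the "do nothing" prediction of a large peak never comes true once a response is applied, which is exactly why the failed prediction does not refute itself.

```python
# Toy SIR epidemic model. All rates below are hypothetical illustration
# values, not estimates for COVID-19 or any real pathogen.

def peak_infected(contact_reduction, days=300):
    """Run a simple discrete-time SIR model and return the peak fraction
    of the population infected at any one time.

    contact_reduction: 0.0 means no response; 0.5 means a response
    (e.g., distancing) that halves transmission.
    """
    beta = 0.3 * (1.0 - contact_reduction)  # transmission rate (made up)
    gamma = 0.1                             # recovery rate (made up)
    s, i, r = 0.999, 0.001, 0.0             # susceptible, infected, recovered
    peak = i
    for _ in range(days):
        new_infections = beta * s * i
        recoveries = gamma * i
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        peak = max(peak, i)
    return peak

# The "prediction" is the no-response scenario; the response changes it.
print(f"peak infected with no response:   {peak_infected(0.0):.1%}")
print(f"peak infected with the response:  {peak_infected(0.5):.1%}")
```

In this toy model the no-response peak is several times larger than the with-response peak. Someone looking only at the with-response outcome and declaring the no-response prediction "wrong" would be committing the fallacy the text describes: the large peak was avoided precisely because the response was taken.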
This fallacy draws considerable force from psychological factors, especially in the case of COVID-19. The response that was taken to the virus came with a high cost and we wanted things to get back to normal—so, ironically, the success of the response made us feel that we could stop quickly or that we did not need such a response. As always, bad reasoning can lead to bad consequences and in a pandemic, it can hurt and even kill many people.
Stay safe and I will see you in the future.

Years ago, my coverage of medical testing in my critical thinking class was purely theoretical for most of my students. But COVID-19 changed that. One common type of medical test determines whether a person is currently infected with a disease, such as COVID-19; another determines whether a person has had the infection. While tests are a critical source of information, we need to be aware of the limitations of testing. Since I am not a medical expert, I will not comment on the accuracy of specific methods of testing. Instead, I will look at applying critical thinking to testing.
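One limitation of testing worth knowing is the base-rate effect: even a fairly accurate test can produce mostly false positives when the disease is rare in the tested population. The sketch below works through the arithmetic; the sensitivity, specificity, and prevalence figures are made-up illustration numbers, not figures for any real COVID-19 test.

```python
# Base-rate illustration. All numbers are hypothetical, chosen only to
# show how a rare disease undermines an otherwise accurate test.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that someone who tests positive is actually infected."""
    true_positives = sensitivity * prevalence
    false_positives = (1.0 - specificity) * (1.0 - prevalence)
    return true_positives / (true_positives + false_positives)

# A test that correctly flags 95% of the infected (sensitivity) and
# correctly clears 95% of the healthy (specificity), used in a
# population where only 1% are infected:
ppv = positive_predictive_value(0.95, 0.95, 0.01)
print(f"chance a positive result means real infection: {ppv:.0%}")
```

With these illustrative numbers, a positive result indicates a real infection only about one time in six, because the 5% of false positives among the large healthy majority outnumber the true positives among the small infected minority. This is one reason critical thinking about tests requires asking about the base rate, not just the test's accuracy.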
While it would be irrational to reject the medical claims of health care experts in favor of those made by a political leader, this happened in the last pandemic and will happen again. Why people do this is mainly a matter of psychology, but the likely errors in reasoning are a matter of philosophy.
During the next pandemic, accurate information will be critical to your well-being and even survival. Those of us who are not epidemiologists or medical professionals must rely on others for information. While some sources will provide accurate information, well-meaning people will unintentionally spread unsupported or even untrue claims, and malicious sources will knowingly spread disinformation. Being an expert in a relevant field is the best way to sort out which sources to trust, but most of us are not experts in these areas. We are not helpless, though: while we cannot become medical experts overnight, we can learn skills for assessing the credibility of sources. The next pandemic will make this a matter of life and death, so your well-being and even survival will depend on being able to determine which sources are credible and which are best avoided.
Critical thinking can save your life, especially during a pandemic of pathogens, disinformation and misinformation. While we are not in a pandemic as this is being written, it is a question of when the next one will arrive. As our government is likely to be unwilling and unable to help us, we need to prepare to face it on our own. Hence, this series on applying critical thinking to pandemics.
Anyone familiar with sports knows that if team members don’t work together, things will go badly. So good athletes set aside internal conflicts when on the field and come together to win. This does not mean that an athlete should accept anything a teammate might do without complaint. For example, a good athlete would not allow a teammate to cheat or a coach to abuse athletes. As another example, a good athlete would not tolerate a teammate committing domestic violence or engaging in dogfighting. While we belong to various competing teams, such as nations, during a pandemic we should all be on the same team, since we are playing a deadly game of humans versus pathogens.
As COVID-19 ravaged humanity, xenophobia and racism remained alive and well. For example, an Iranian leader played on fears of America and Israel.