During the COVID-19 pandemic, some politicians argued that America should be reopened because the dire predictions about COVID turned out to be wrong. On the face of it, this appears to be good reasoning that we could use in the next pandemic: if things are not as bad as predicted, we can start reopening sooner than predicted. To use an analogy, if a fire were predicted to destroy your house, but it only burned your garage, then it would make sense to move back in and rebuild the garage. While this reasoning is appealing, it can also be a trap. Here is how the trap works.
Some politicians and pundits pointed out that the dire predictions about COVID did not come true. For example, the governor of Florida said that since the hospitals were not overwhelmed as predicted, it was a good idea for them to return to profitable elective surgery. He, like some other Republican governors, also wanted to reopen very quickly. This reasoning seemed sensible: the pandemic was not as bad as predicted, so we can reopen quickly. There were also those who sneered at the dire predictions and were upset at what they saw as excessive precautions. This can also seem sensible: the experts predicted a terrible outcome for COVID-19, but they were wrong. We overreacted and should have rolled back the precautions when the predictions did not come true. But would this be a wise strategy for the next pandemic?
While it is reasonable to consider whether the precautions were excessive, there is a tempting fallacy that needs to be avoided: the prediction fallacy. It occurs when someone uncritically rejects a prediction, and the responses made to that prediction, because the predicted outcome did not come true. The error occurs because the person fails to consider what should be obvious: if a prediction is responded to effectively, then the prediction is going to be “wrong.” The form of the fallacy is this:
Premise 1: Prediction P predicted X if we do not do R.
Premise 2: Response R was taken based on prediction P.
Premise 3: X did not happen, so prediction P is wrong.
Conclusion: We should not have taken Response R (or no longer need to take Response R).
To use a concrete example:
Premise 1: Experts predicted that the hospitals in Florida would be overwhelmed if we did not respond with social distancing and other precautions.
Premise 2: People responded to this prediction with social distancing and other precautions.
Premise 3: The hospitals in Florida were not overwhelmed, so the prediction was wrong.
Conclusion: The response was excessive, and we no longer need these precautions.
While it is (obviously) true that a prediction that turns out to be wrong is wrong, the error lies in uncritically concluding that this proves the response based on the prediction need not have been taken (or that we no longer need to keep responding in this way). The prediction assumes we do not respond (or do not respond in a certain way), and the response is taken to prevent the predicted outcome. If the response is effective, then the predicted outcome will not occur; that is the point of responding. To reason that the “failure” of the prediction shows that the response was mistaken or no longer needed is an error. You could be right, but you need to do more than point to the failed prediction.
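One minimal way to lay out the structure, reading the prediction as a simple conditional (my own shorthand, with R and X as in the form above), is this:

```latex
\[
\begin{aligned}
&\text{Prediction } P:        && \neg R \rightarrow X \\
&\text{Observed evidence:}    && R \land \neg X \\
&\text{What would refute } P: && \neg R \land \neg X
\end{aligned}
\]
```

A conditional like this is refuted only by a case in which its antecedent holds and its consequent fails. Since the response R was taken, observing that X did not happen cannot by itself refute P, and it cannot show that R was unneeded.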
As a silly but effective analogy, imagine we are driving towards a cliff. You predict that if we keep going, we will go off the cliff and die. So, I turn the wheel and avoid the cliff. If backseat Billy gets angry and says that there was no reason to turn the wheel, or that I should turn back because we did not die in a fiery explosion, Billy is falling for this fallacy. After all, if we had not turned, we would have died. And if we turn back too soon, we will still die.
The same applied to COVID-19: by responding effectively to dire predictions, we changed the outcome and the predictions turned out to be wrong. But to infer that the responses were excessive or that we should stop now simply because the results were not as dire as predicted would be an error.
This is not to deny what is obviously true: it is possible to overreact to a pandemic. But making decisions based on the prediction fallacy is a bad idea. There is also another version of this fallacy.
A variation of this fallacy involves inferring that the prediction was a bad one because it turned out to be wrong:
Premise 1: Prediction P predicted X if we do not do R.
Premise 2: Response R was taken based on prediction P.
Premise 3: X did not happen.
Conclusion: The prediction was wrong about X occurring if we did not do R.
While the prediction turned out to be wrong in the sense that the predicted outcome did not occur, this does not disprove the claim that X would have occurred without the response, since the response did occur. Going back to the car analogy, the prediction that we will go off the cliff and die if we do not turn is not disproven when we turn and do not die. In fact, that is the result we want.
Getting back to COVID-19, the predictions made about what could occur if we did nothing are not disproven by the fact that they did not come true when we did something. So, to infer that these predictions must have been wrong about what would have occurred if we had done nothing would be an error. We do, of course, rationally assess predictions based on outcomes, but this assessment should not ignore the effect of the response. Sorting out such counterfactual predictions is hard. In complex cases we can probably never prove what would have happened, but good methods can guide us, which is why we need to go with science and math rather than hunches and feelings.
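To make the counterfactual comparison concrete, here is a toy sketch in Python. The numbers, growth rates, and capacity are invented for illustration and are not a model of the actual pandemic; the point is only the structure: the prediction concerns the no-response scenario, so the with-response outcome cannot settle it.

```python
# Toy illustration with made-up numbers (not real COVID data): an effective
# response makes the original prediction look "wrong" even if it was right
# about what would have happened with no response.

def daily_cases(days, growth_rate, start=100):
    """Simple exponential growth in daily cases (a deliberately crude model)."""
    cases = [start]
    for _ in range(days - 1):
        cases.append(cases[-1] * growth_rate)
    return cases

HOSPITAL_CAPACITY = 50_000  # hypothetical capacity, chosen for illustration
DAYS = 60

# The scenario the prediction was about: no response, fast spread.
no_response = daily_cases(DAYS, growth_rate=1.15)

# What actually happened: a response slows the spread.
with_response = daily_cases(DAYS, growth_rate=1.03)

print(f"Peak without response: {max(no_response):,.0f}")
print(f"Peak with response:    {max(with_response):,.0f}")
print("Capacity exceeded without response?", max(no_response) > HOSPITAL_CAPACITY)
print("Capacity exceeded with response?   ", max(with_response) > HOSPITAL_CAPACITY)
```

In this toy run the with-response curve stays under the hypothetical capacity while the no-response curve blows past it; looking only at the with-response outcome and declaring the prediction "wrong" ignores the scenario the prediction was actually about.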
This fallacy draws considerable force from psychological factors, especially in the case of COVID-19. The response to the virus came at a high cost, and we wanted things to get back to normal. So, ironically, the success of the response made us feel that we could stop quickly or that we had never needed such a response. As always, bad reasoning can lead to bad consequences, and in a pandemic it can hurt and even kill many people.
Stay safe and I will see you in the future.