While scientists have only fairly recently gotten around to studying cognitive biases, philosophers have been teaching about them for centuries, typically in the form of various logical errors. Still, it is good that the scientific attention to these biases is bringing them to a wider audience.
Every one of us is, of course, loaded down with all sorts of cognitive biases. Some scientists even claim that such biases are hard-wired into the brain, making them part of our actual anatomy and physiology. If so, this would suggest that people might be more or less biased depending on the specifics of their hard-wiring, which would help explain some of the variation in how well people reason.
While we all suffer from cognitive biases (and other biases), we do have the capacity to resist and even overcome them and reason in a more objective manner. Since this takes effort and training (as well as the will to think critically), it is not very common for folks to try to overcome these biases. Hence, bad reasoning tends to dominate.
One standard bias is known as negativity bias. While some people are more prone to focus on the negative than others, apparently we all have an inbuilt tendency to give more weight to negative information than to positive information. This would help account for the fact that people tend to consider a single misdeed to outweigh a large number of good deeds.
Of course, people also have other biases that can lead them to weigh the positive more than the negative. For example, people tend to ignore or downplay negative aspects of people, causes, and things they like while weighing the positive more heavily. This often involves embracing inconsistency by applying different standards depending on what one likes or dislikes (see, for example, how Fox News and MSNBC evaluate various political matters).
Interestingly, this bias seems to occur at the neurological level. The brain shows more neural activity when reacting to negative information than when reacting to positive information. Assuming these results apply generally, we are actually hard-wired for negativity.
The defense against this involves being aware of the bias and exercising even greater caution in assessing negative information, especially when it concerns something we do not like. For example, folks who dislike the Tea Party will weigh negative information about it more heavily than positive evidence and will tend to make little effort to determine whether that evidence has been properly assessed. The same holds true for folks who dislike the Occupy Wall Street movement and its spin-offs: they will take any negative evidence as being quite significant and ignore or undervalue positive evidence.
This bias does help explain a great deal about how people see and assess political events.