Let bygones be bygones. There is one irrefutable fact: we cannot change our past decisions or actions. So what do we usually do about it? We change our perception of what happened and remember our decisions as better than they really were.

This psychological effect, known as hindsight bias, manifests itself when we look back and believe that the events that occurred were more predictable than they actually were at the time a particular decision was made.

What is a cognitive bias?

A cognitive bias is a deviation from ordinary cognitive processing that leads an individual to distort or misinterpret the available information.

These irrational judgments, like hindsight bias, arise from an evolutionary need: they allow our brains to make instantaneous judgments without relying on a more elaborate, and therefore slower, system of interpretation. Although they can lead us into serious errors of interpretation, in certain contexts and situations they help us make faster and more effective decisions.

The concept of cognitive bias was introduced by the psychologists and researchers Daniel Kahneman and Amos Tversky in 1972, following their research into people's inability to reason intuitively with large orders of magnitude. They argued that most important human decisions are based on a limited number of heuristic principles (mental shortcuts we use to simplify reality and solve problems) rather than on a formal analysis of the facts. This theory stood in stark contrast to the rational decision-making model that prevailed at the time.

Hindsight bias: what it is and how it influences us

Hindsight bias commonly comes into play whenever an economic or social crisis occurs. For example, after the global financial crisis of 2008, triggered by the collapse of the housing bubble and the subprime mortgage scandal in the US, many of the economists who had been unable to predict its devastating effects claimed in hindsight that those effects were indeed predictable and that they had known all along what would happen.

This bias also has a lot to do with how human beings remember certain events. Our memory does not work like a computer: memories fade over time, and we reconstruct parts of them as we accumulate new experiences. The psychologist Elizabeth Loftus has spent years investigating so-called "false memories," proposing that the way someone is asked to remember something influences their subsequent description of the memory itself.

These processing errors that bias our memory, as happens with hindsight bias, which leads us to revise the memory of our previous beliefs in favour of the final conclusion, shape our view of ourselves and of the world around us. Historians who distort the outcome or course of a historical battle, and physicians who recall the negative effects of a clinical trial in a biased way, are two examples of professionals affected by this bias.

What does the research say about it?

Although a bias such as hindsight seems, a priori, an easily explained and identifiable error, the vast majority of studies conclude that it is very difficult to judge something that has already happened while completely setting aside the outcome, which also makes its effects hard to counteract. Numerous studies have confirmed the bias, and in recent years researchers have tried to determine whether judges succumb to it to a greater or lesser extent than, for example, members of a jury.

In this regard, a 2001 study of 167 U.S. federal judges concluded that they were affected by hindsight bias to the same extent as ordinary citizens. Another empirical study, conducted by the researchers W. K. Viscusi and R. Hastie in 2002, also concluded that the effects of hindsight bias influenced judges' sentences, though to a lesser extent.

According to the study, although jurors rightly incorporated moral and social judgements into their verdicts, which allowed them to classify a harmful act or behavior as malicious (thereby punishing the defendant and deterring similar conduct in the future), there were so many errors and prejudices that guilty verdicts became an unpredictable lottery. Professional judges, on the other hand, made fewer mistakes, a fact that calls into question the suitability of juries, however democratic they may be in form.

How to combat this and other biases

There is no magic formula guaranteed to keep us free of irrational judgments and biases like hindsight bias, but we can keep certain keys in mind to minimize their effects. The first is to accept an uncomfortable truth: we are not smarter than anyone else, and everyone, without exception, is susceptible to these effects, regardless of our education or how rational we think we are.

Biases, as the evolutionary mechanisms they are, exist for a reason: to speed up decision making and our response to stimuli, problems, or situations that we could not otherwise face, because our cognitive system cannot process all the available information in the time required.

Once we have accepted our own vulnerability to the irrational, the next step is knowing how to handle the information we receive from our environment and from others. It is important to weigh the data and to demand evidence for statements that arouse suspicion. Intuition without the support of reason does not lead to success. We must check all opinions, our own and those of others, against facts and objective data, and be aware that making decisions based on a self-assessment of our own abilities can be misleading.

Finally, beware of always wanting to be right. Listening carefully and trying to understand the real meaning of the information our interlocutor provides may be the best remedy against self-deception. Closing our eyes and ears to the evidence so as not to see our established beliefs endangered is the prelude to one of the greatest evils of our society: fanaticism. To paraphrase the American psychologist Gordon Allport: "People who are aware or ashamed of their prejudices are also those who are on the road to suppressing them."

Other types of biases

Many cognitive biases induce us to make mistakes and irrational judgments, so we cannot focus solely on hindsight bias. There are many others to take into account; among the best known are the following:

1. Bandwagon effect

It consists of believing or doing something because many other people do. That is, the probability of adopting a behaviour or belief increases with the number of individuals who hold it. This bias is partly responsible for perpetuating many of the myths and false beliefs (such as thinking we only use 10% of our brain, or believing that homeopathy works) that are so ingrained in our society today.


2. Anchoring bias

It is the tendency to "anchor" to the first piece of information we receive and rely on it when making subsequent judgments or decisions.

The consequences of this bias are often exploited very effectively by all kinds of vendors and salespeople. A very clear example can be found at car dealerships. The salesperson shows us a vehicle and throws out a specific price (for example, 5,000 euros). That first piece of information, in this case a figure, will stay in our minds throughout the purchase process, so the salesperson starts with the advantage of negotiating on his own terms.

3. Fundamental attribution error

It is the tendency to attribute an individual's observable behavior exclusively to internal traits (such as personality or intelligence). In this way, we simplify reality by discarding a priori any possible influence of situational factors, which are more changeable and less predictable, that might explain the individual's behaviour.

4. Confirmation bias

It arises from favoring, interpreting, and remembering information that confirms our prior expectations and beliefs, thereby dismissing any alternative explanation. We interpret reality selectively (as hindsight bias also does), ignoring facts and situations that do not support our preconceived ideas.

This reasoning error has a very negative influence in, for example, political and organizational settings, where it is common to have to weigh multiple options in order to make a sound decision.

5. Availability bias

It is the tendency to estimate the probability of an event based on how readily or how frequently it comes to mind from experience. For example, if the media report burglaries every day during the summer, we will tend to think these events occur constantly and more often than they actually do, because they will be more present in our memory than other events that are objectively more frequent.
