Being wrong is one of people's most recurrent fears, despite the stoicism with which Greek and Roman philosophy treated it (errare humanum est, as Seneca the Younger put it). Or rather, we fear the anticipated consequences of errors, which for the vast majority tend to be catastrophes imagined in advance, causing a great deal of psychological discomfort and not a few blocks when it comes to making decisions.

What exactly is a mistake?

In principle, we understand a mistake to be an assessment that is maladjusted or invalid in its field of application, whether when making a decision or when carrying out the actions that follow from it. We know it is maladjusted because the prediction we made about the results is not fulfilled. Of course, we only classify it as an error if the mismatch has a negative balance: if, on the contrary, we obtain an unexpected benefit, it immediately becomes a success despite the dissonance.

Numerous studies from various fields have examined how we manage errors, and more or less all of them point in the direction indicated by Haselton and Buss (2000) in their error management theory. In short, when we have to make a decision that involves a certain degree of uncertainty, we can make two types of mistakes.

In type I errors, or false positives, we predict that an event will occur and it ultimately does not happen, while in type II errors, or false negatives, we predict that an event will not occur and it later does. The theory holds that when deciding under uncertainty it is not possible to minimize both probabilities at once: reducing one increases the other.

Which one is better? It depends on the perceived cost, and therefore on the context. An engineer designing a fire protection system will tend to minimize type II errors, since a missed fire would be a real disaster; an alarm should lean toward false positives for obvious reasons. In general, though, we tend to opt for more prudent options when we expect to make a profit, while in a loss scenario we are more willing to take risks (Johnson, 2013).
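The fire-alarm logic above can be made concrete with a small simulation. This is only an illustrative sketch, not anything from the cited papers: the sensor readings, thresholds and the 100-to-1 cost ratio are all invented assumptions chosen to show how weighting one error type more heavily shifts the optimal decision threshold.

```python
import random

random.seed(42)

# Hypothetical smoke-sensor readings (all numbers here are illustrative
# assumptions): real fires tend to produce higher readings, but the two
# distributions overlap, so no threshold avoids both kinds of error.
fires = [random.gauss(0.7, 0.15) for _ in range(1000)]     # fire present
harmless = [random.gauss(0.4, 0.15) for _ in range(1000)]  # no fire

def error_rates(threshold):
    """The alarm sounds when a reading exceeds `threshold`."""
    type_i = sum(r > threshold for r in harmless) / len(harmless)  # false positives
    type_ii = sum(r <= threshold for r in fires) / len(fires)      # false negatives
    return type_i, type_ii

def expected_cost(threshold, cost_fp=1.0, cost_fn=100.0):
    """Weight each error type by its perceived cost (a missed fire is 100x worse)."""
    fp, fn = error_rates(threshold)
    return cost_fp * fp + cost_fn * fn

# Moving the threshold trades one error type for the other:
fp_low, fn_low = error_rates(0.45)
fp_high, fn_high = error_rates(0.65)
assert fp_low > fp_high and fn_low < fn_high

# With false negatives far costlier, the cheapest threshold is a low,
# "jumpy" one: the alarm deliberately leans toward false positives.
best = min((t / 100 for t in range(20, 90)), key=expected_cost)
print(f"cheapest threshold: {best:.2f}")
```

Under these assumed costs the minimum-cost threshold sits well below the point where the two error rates are balanced, which is exactly the "prefer false positives" bias the alarm example describes.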

How do mistakes happen?

Most decisions are made by what Kahneman and other authors call System 1, the autopilot of our mental processes.

Anyone who has tried to put dirty dishes in the fridge, or has searched the whole house for the glasses sitting on their head, knows that our automatisms fail. However, the margin of inaccuracy is a price worth paying in return for the speed, efficiency and adaptability that this automatic mode offers. The most important decisions should ideally be taken with the intervention of System 2, whose action is voluntary and reflective and involves much greater effort.

In general, when we realize we have made a mistake, it comes down to a lack of information when taking a course of action: either the information was inaccessible (it is very hard to know in advance what the working climate will be like at that new and exciting job that seemed like an excellent opportunity), or we misinterpreted what was available, which brings us into the territory of cognitive biases. It is not uncommon to ignore or underestimate data that do not fit our preconceived ideas, or to overestimate rather flimsy evidence.

In fact, beyond the negative consequences the mistake itself may have, what worries us most is the emotional cost of the terrible moment when we realize we were wrong. Managing the frustration of seeing our desires, needs or aspirations unfulfilled is a skill learned from childhood, and not everyone knows how to manage it properly.

Rage at someone external or at ourselves, sadness over the loss of what we anticipated, and the helplessness in which we sometimes find ourselves make for a difficult pill to swallow.

Fear of making mistakes: how can we manage it?

In general, to face error without serious psychological consequences, it helps to keep a few keys in mind.

1. Accept that error is ubiquitous and everyday

We make thousands of decisions a day, most of them handled by System 1, which saves us a great deal of tedious work. So we will be wrong dozens or perhaps hundreds of times. The more accustomed I am to the possibility of error, the less I will suffer when it occurs.

2. Learn to assess the real costs

The cost of an error is not always high, nor is it a tragedy. In fact, of the dozens of mistakes we make every day, we are not even aware of most, because they have no consequences. There are even errors that protect us from more important ones, such as the "positive illusions" that overestimate our capacity to face certain situations and can often help us resolve them (McKay & Dennett, 2009).

3. Assess our biases

Many of the biased decisions we make are, paradoxically, adaptive; looking both ways before crossing the road even when no cars are coming is a behavioural bias, and its cost is minimal. The famous negativity bias has evolutionary roots because it favours survival, although it is not always correct. Biases minimize the cost of errors.

The point is that if we see a bad result repeating itself, there may be a bias of our own that no longer serves us – "don't trust anyone", "men only want sex", and so on. A thoughtful assessment of how we decide is important.

4. Manage emotions adequately

We will be angry, even furious, and we may hyperventilate if we miss a deadline, choose a career we don't like or enter a relationship with a toxic person. But beware of making this unpleasant feeling last longer than recommended. Negative emotions serve to indicate where there is a problem, no more and no less. Our task is then to identify it properly and find solutions.

5. Integrate new information

It is about seeking adaptability in our mental schemas: incorporating new behaviours and adjusting our patterns once we have located what was interfering with our predictions. We humans often modify our ways of doing things, even though in many cases we don't do it consciously.

We don't always look for the maximum benefit, but for the best fit. For this, we need to examine the error carefully. To avoid the influence of our own biases, we can always seek help, whether professional or "amateur"; the insight of another trusted person can be very helpful.

Bibliographic references:

  • Johnson, D., Blumstein, D., Fowler, J., & Haselton, M. (2013). The evolution of error: error management, cognitive constraints, and adaptive decision-making biases. Trends in Ecology & Evolution, 28(8).
  • Haselton, M., & Buss, D. (2000). Error Management Theory: A New Perspective on Biases in Cross-Sex Mind Reading. Journal of Personality and Social Psychology, 78(1), 81-91.
  • Psyrdellis, M., & Justel, N. (2017). Psychological constructs linked to the frustration response in humans. Anuario de Investigaciones, XXIV, 301-310. Universidad de Buenos Aires.
  • Keith, N., & Frese, M. (2005). Self-Regulation in Error Management Training: Emotion Control and Metacognition as Mediators of Performance Effects. Journal of Applied Psychology, 90(4), 677-691.