Illusions of Cognitive Illusions

Cognitive illusions, common patterns of »irrational thinking«, are something we are supposed to overcome: there are books that tell you about these errors and how to avoid them, and you can even get an infographic listing them, warning you in bright colors. However, it seems that some of the research on biases may be biased itself.

One example of a cognitive bias is the »overconfidence bias«: in studies (e.g. 2), people are asked to answer questions (»Which city is larger, Copenhagen or Karachi?«) and to rate their confidence in each answer (»70% sure«).

It seems that people are too confident in their answers: of the answers given with, say, 70% confidence, fewer than 70% turn out to be correct. They are biased.
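What »too confident« means here can be made concrete with a little bookkeeping: group the answers by stated confidence and compare each group's stated confidence with its actual hit rate. Here is a minimal sketch in Python, with made-up illustration data rather than results from the cited studies:

```python
from collections import defaultdict

# Each entry: (stated confidence, was the answer actually correct?)
# Invented data for illustration only.
answers = [
    (0.7, True), (0.7, False), (0.7, True), (0.7, False), (0.7, False),
    (0.9, True), (0.9, True), (0.9, True), (0.9, False),
    (1.0, True), (1.0, True), (1.0, False), (1.0, False),
]

by_confidence = defaultdict(list)
for confidence, correct in answers:
    by_confidence[confidence].append(correct)

for confidence, results in sorted(by_confidence.items()):
    hit_rate = sum(results) / len(results)
    # A positive gap means stated confidence exceeds actual accuracy.
    print(f"stated {confidence:.0%}: correct {hit_rate:.0%}, "
          f"gap {confidence - hit_rate:+.0%}")
```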

However, we should not be overly confident in this bias. An alternative interpretation is offered by Juslin, Winman and Olsson (2000) 7. They argue that the overconfidence effect largely depends on statistical problems the studies are prone to and on a nonrepresentative selection of questions. According to them, there is no overconfidence bias.

Another candidate for a bias that might actually be a decent strategy is the »confirmation bias«: seemingly, the tendency to affirm assumptions rather than to falsify them. In the classic study 10, participants are given the triple (2,4,6) and asked to find out which rule it follows. Participants assume that the rule must be »ascending multiples of 2« (like (20,22,24) and (8,10,12)). They test cases that adhere to this rule. But the actual rule is »any ascending numbers« (e.g. (3,42,178)). Since most participants only tested triples that fit the »ascending multiples of 2« rule, they did not discover the actual rule.

Participants focused on testing cases they expected to confirm their assumption. In this particular setting, that strategy was not helpful.
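The asymmetry is easy to simulate. In the sketch below the rule encodings are my own reading of the setup, not Wason's materials: every triple that fits the participants' narrow hypothesis also fits the broader true rule, so »positive« tests always get a yes and can never falsify the guess:

```python
def true_rule(t):
    """The experimenter's hidden rule: any ascending numbers."""
    a, b, c = t
    return a < b < c

def hypothesis(t):
    """The participants' guess: even numbers ascending in steps of 2."""
    a, b, c = t
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Positive tests: triples chosen because they fit the hypothesis.
for t in [(2, 4, 6), (8, 10, 12), (20, 22, 24)]:
    print(t, "rule:", true_rule(t), "guess:", hypothesis(t))  # always yes/yes

# Only a triple *outside* the guess can show that the rule is broader:
t = (3, 42, 178)
print(t, "rule:", true_rule(t), "guess:", hypothesis(t))  # yes/no
```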

But interestingly, such a strategy is in some cases the only way to disconfirm a theory: it is the only way to find false positives, cases in which your preliminary theory says that something will be the case but it actually is not. If your hypothesis is »if the street is wet, it rains«, you cannot find out that this is not always right by only »testing« dry streets.
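The same point in code: a conditional hypothesis can only fail on observations where its antecedent holds, so only wet streets carry any falsifying power. The observations are invented for illustration:

```python
# Hypothesis: if the street is wet, it rains (wet -> rain).
observations = [
    {"wet": False, "rain": False},  # dry street: says nothing about the rule
    {"wet": True,  "rain": True},   # wet street, rain: consistent
    {"wet": True,  "rain": False},  # wet street, no rain (burst pipe): counterexample
]

for obs in observations:
    # The conditional can only be violated when the antecedent is true.
    falsifies = obs["wet"] and not obs["rain"]
    print(obs, "-> falsifies" if falsifies else "-> consistent")
```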

In many real-life situations it makes much sense to use such an approach: if you need to choose a new person to employ, you will invite those applicants you assume to be good candidates.

The »confirmation bias«, then, is not about confirming one's pet theory. It is a bias towards working with what the theory assumes to be true: a »positive test strategy«. And in many situations this strategy performs very well and even helps to falsify wrong assumptions 8.
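Klayman and Ha's point can be sketched in the same style: when the hypothesis is *broader* than the hidden rule (rather than narrower, as in Wason's task), positive tests do produce falsifying »no« answers. The encodings are again my own illustration, not their materials:

```python
def true_rule(t):
    """Suppose the hidden rule is: ascending even numbers."""
    a, b, c = t
    return a < b < c and all(x % 2 == 0 for x in t)

def hypothesis(t):
    """The guess overshoots the rule: any ascending numbers."""
    a, b, c = t
    return a < b < c

# All of these are positive tests of the (too broad) guess ...
for t in [(2, 4, 6), (3, 5, 7), (1, 10, 100)]:
    if hypothesis(t) and not true_rule(t):
        print(t, "gets a »no«: the hypothesis is falsified")
    else:
        print(t, "gets a »yes«: consistent so far")
```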

Other cognitive biases are disputed as well. In »Rationality for Mortals« 3, the psychologist Gerd Gigerenzer lists twelve »cognitive illusions« which can – plausibly and based on research – be interpreted as working well in many real-life situations 4. Not only that: they may often be better than the suggested alternative »rational« approaches. Many of these »rational« approaches assume that more knowledge, more data and more computation will always perform better – which is not the case, due to overfitting 5 and to a failure to consider the environment 6 in which the decisions are taken.
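The overfitting point can be illustrated with a toy example (synthetic data, not an example from the book): a model with more parameters fits the observed data better, yet predicts new data worse:

```python
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(0, 0.2, x_train.size)  # noisy linear truth
x_test = np.linspace(0, 1, 50)
y_test = 2 * x_test + rng.normal(0, 0.2, x_test.size)

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    # The degree-9 polynomial hugs the training noise and typically
    # does much worse on the new points.
    print(f"degree {degree}: train error {train_err:.3f}, test error {test_err:.3f}")
```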

This is what Gigerenzer calls »ecological rationality«: the behavior may not always be (mathematically) optimal in an artificial context, but it will perform rather well (also mathematically) in typical »real-life« situations. These methods or »hacks« are often not only quick to apply, they may even perform better than the elaborate analysis you could not carry out in an everyday situation anyway.

This does not mean that everything that looks like a bias is actually a great thing to pursue. But abandoning the intuitive method and resorting to a seemingly superior theoretical solution can also lead to exactly the errors one wanted to avoid.


Notes

  • Update 2018-08-25: An interesting view on the cultural implications of the concept of cognitive illusions is given in Shaw, Tamsin. "Invisible Manipulators of Your Mind." The New York Review of Books, 2017.

  • Update 2020-07-06: The idea of a cognitive bias also suggests that there is a single unbiased view towards which our thinking should be corrected. Feminist authors criticize the abstract, mathematical idea of rationality and of 'good' thinking, since it ignores context, in particular people's social situation and embodiment. See e.g. 9 or 1.



  1. Adam, Alison. Artificial Knowing: Gender and the Thinking Machine. London; New York: Routledge, 1998.

  2. Fischhoff, Baruch, Paul Slovic, and Sarah Lichtenstein. "Knowing with certainty: The appropriateness of extreme confidence." Journal of Experimental Psychology: Human Perception and Performance 3.4 (1977): 552.

  3. Gigerenzer, Gerd. Rationality for Mortals: How People Cope with Uncertainty. Oxford University Press, 2008.

  4. Ibid., table 1.2.

  5. Ibid., p. 53.

  6. Ibid., p. 54.

  7. Juslin, Peter, Anders Winman, and Henrik Olsson. "Naive empiricism and dogmatism in confidence research: A critical examination of the hard–easy effect." Psychological Review 107.2 (2000): 384.

  8. Klayman, Joshua, and Young-Won Ha. "Confirmation, disconfirmation, and information in hypothesis testing." Psychological Review 94.2 (1987): 211.

  9. Lloyd, Genevieve. The Man of Reason: "Male" and "Female" in Western Philosophy. 2nd ed. Minneapolis: University of Minnesota Press, 1993.

  10. Wason, Peter C. "On the failure to eliminate hypotheses in a conceptual task." Quarterly Journal of Experimental Psychology 12.3 (1960): 129-140.