Cognitive bias

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment.[1] Individuals create their own "subjective social reality" from their perception of the input. An individual's construction of social reality, not the objective input, may dictate their behaviour in the social world.[2] Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.[3][4][5]

Some cognitive biases are presumably adaptive: in a given context, they may lead to more effective actions.[6] Furthermore, allowing cognitive biases enables faster decisions, which can be desirable when timeliness is more valuable than accuracy, as illustrated in heuristics.[7] Other cognitive biases are a "by-product" of human processing limitations,[8] resulting from a lack of appropriate mental mechanisms (bounded rationality), or simply from a limited capacity for information processing.[9][10]

A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics. Kahneman and Tversky (1996) argue that cognitive biases have practical implications for areas including clinical judgment, entrepreneurship, finance, and management.[11][12]

Overview

Bias arises from various processes that are sometimes difficult to distinguish. These include:

  • information-processing shortcuts (heuristics)[13]
  • noisy information processing (distortions in the process of storage in and retrieval from memory)[14]
  • the brain's limited information processing capacity[15]
  • emotional and moral motivations[16]
  • social influence[17]

The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972[18] and grew out of their experience of people's innumeracy, or inability to reason intuitively with greater orders of magnitude. Tversky, Kahneman and colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. Tversky and Kahneman explained human differences in judgment and decision making in terms of heuristics. Heuristics involve mental shortcuts which provide swift estimates about the likelihood of uncertain occurrences.[19] Heuristics are simple for the brain to compute but sometimes introduce "severe and systematic errors."[7]

For example, the representativeness heuristic is defined as the tendency to "judge the frequency or likelihood" of an occurrence by the extent to which the event "resembles the typical case".[19] The "Linda Problem" illustrates the representativeness heuristic (Tversky & Kahneman, 1983[20]). Participants were given a description of "Linda" that suggests Linda might well be a feminist (e.g., she is said to be concerned about discrimination and social justice issues). They were then asked whether they thought Linda was more likely to be a "(a) bank teller" or a "(b) bank teller and active in the feminist movement". A majority chose answer (b). This error (mathematically, answer (b) cannot be more likely than answer (a)) is an example of the "conjunction fallacy"; Tversky and Kahneman argued that respondents chose (b) because it seemed more "representative" or typical of persons who might fit the description of Linda. The representativeness heuristic may lead to errors such as activating stereotypes and inaccurate judgments of others (Haselton et al., 2005, p. 726).
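The arithmetic behind the conjunction fallacy is worth spelling out: the probability of a conjunction can never exceed the probability of either of its conjuncts. A minimal sketch in Python, using invented probabilities purely for illustration (the numbers are not from the study), makes the point:

    # Toy illustration of the conjunction rule behind the "Linda problem".
    # The probabilities below are invented for illustration only.
    p_teller = 0.05                   # P(Linda is a bank teller)
    p_feminist_given_teller = 0.95    # P(feminist | bank teller); even if very high...
    p_teller_and_feminist = p_teller * p_feminist_given_teller
    # ...the conjunction can never be more probable than the single event.
    assert p_teller_and_feminist <= p_teller
    print(p_teller, p_teller_and_feminist)   # 0.05 0.0475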

Alternatively, critics of Kahneman and Tversky such as Gerd Gigerenzer argue that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases, but rather to conceive rationality as an adaptive tool that is not identical to the rules of formal logic or the probability calculus.[21] Nevertheless, experiments such as the "Linda problem" grew into the heuristics and biases research program which spread beyond academic psychology into other disciplines including medicine and political science.

Types

Biases can be distinguished on a number of dimensions. For example:

  • There are biases specific to groups (such as the risky shift) as well as biases at the individual level.
  • Some biases affect decision-making, where the desirability of options has to be considered (e.g., sunk costs fallacy).
  • Others such as illusory correlation affect judgment of how likely something is, or of whether one thing is the cause of another.
  • A distinctive class of biases affect memory,[22] such as consistency bias (remembering one's past attitudes and behavior as more similar to one's present attitudes).

Some biases reflect a subject's motivation,[23] for example, the desire for a positive self-image leading to egocentric bias and the avoidance of unpleasant cognitive dissonance.[24] Other biases are due to the particular way the brain perceives, forms memories and makes judgments. This distinction is sometimes described as "hot cognition" versus "cold cognition", as motivated reasoning can involve a state of arousal.

Among the "cold" biases,

  • some are due to ignoring relevant information (e.g., neglect of probability).
  • some involve a decision or judgment being affected by irrelevant information (for example the framing effect, where the same problem receives different responses depending on how it is described; or the distinction bias, where choices presented together are evaluated differently than choices presented separately).
  • others give excessive weight to an unimportant but salient feature of the problem (e.g., anchoring).

That some biases reflect motivation, in particular the motivation to have positive attitudes toward oneself,[24] accounts for the fact that many biases are self-serving or self-directed (e.g., illusion of asymmetric insight, self-serving bias). There are also biases in how subjects evaluate in-groups or out-groups: in-groups are evaluated as more diverse and "better" in many respects, even when those groups are arbitrarily defined (ingroup bias, outgroup homogeneity bias).

Some cognitive biases belong to the subgroup of attentional biases, which refer to paying increased attention to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests to measure these biases are the Stroop task[25][26] and the dot probe task.

Individuals' susceptibility to some types of cognitive biases can be measured by the Cognitive Reflection Test (CRT) developed by Frederick (2005).[27][28]
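To give a flavour of the test, the best-known CRT item asks: a bat and a ball cost $1.10 in total, and the bat costs $1.00 more than the ball; how much does the ball cost? The intuitive answer of 10 cents is wrong, as the short worked sketch below shows (the variable names are ours for illustration; only the puzzle itself is from Frederick's test):

    # Bat-and-ball item from the Cognitive Reflection Test (Frederick, 2005).
    # total = bat + ball and bat = ball + 1.00, so total = 2*ball + 1.00.
    total, difference = 1.10, 1.00
    ball = (total - difference) / 2   # 0.05, not the intuitive 0.10
    bat = ball + difference           # 1.05
    assert abs((bat + ball) - total) < 1e-9
    print(round(ball, 2), round(bat, 2))   # 0.05 1.05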

List

The following is a list of the more commonly studied cognitive biases:

Fundamental attribution error (FAE): Also known as the correspondence bias,[29] the tendency for people to over-emphasize personality-based explanations for behaviours observed in others while under-emphasizing the role and power of situational influences on the same behaviour. Jones and Harris' (1967)[30] classic study illustrates the FAE: despite being made aware that the target's speech direction (pro-Castro/anti-Castro) was assigned to the writer, participants ignored the situational pressures and attributed pro-Castro attitudes to the writer when the speech represented such attitudes.
Priming bias: The tendency to be influenced by what someone else has said, creating a preconceived idea.
Confirmation bias: The tendency to search for or interpret information in a way that confirms one's preconceptions; individuals may also discredit information that does not support their views.[31] Confirmation bias is related to cognitive dissonance, in that individuals may reduce inconsistency by searching for information that re-confirms their views (Jermias, 2001, p. 146).[32]
Affinity bias: The tendency to be biased toward people like ourselves.
Self-serving bias: The tendency to claim more responsibility for successes than for failures. It may also manifest as a tendency to evaluate ambiguous information in a way beneficial to one's own interests.
Belief bias: When one's evaluation of the logical strength of an argument is biased by belief in the truth or falsity of the conclusion.
Framing: Using a too-narrow approach and description of the situation or issue.
Hindsight bias: Sometimes called the "I-knew-it-all-along" effect, the inclination to see past events as having been predictable.

A 2012 Psychological Bulletin article suggests that at least eight seemingly unrelated biases can be produced by the same information-theoretic generative mechanism.[14] It shows that noisy deviations in the memory-based information processes that convert objective evidence (observations) into subjective estimates (decisions) can produce regressive conservatism, conservatism in belief revision (Bayesian conservatism), illusory correlations, illusory superiority (better-than-average effect) and worse-than-average effect, the subadditivity effect, exaggerated expectation, overconfidence, and the hard–easy effect.
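The flavour of that mechanism can be conveyed with a toy simulation (a deliberately simplified sketch, not Hilbert's actual model): if evidence is stored and retrieved through a noisy channel and estimates must stay within a bounded scale, extreme values are on average pulled toward the middle, which is the regressive conservatism described above. The helper name and noise level below are assumptions made for illustration:

    import random

    def noisy_estimate(true_value, noise_sd=0.15):
        """Pass a probability-like value through a noisy memory channel,
        then clip the recalled value back to the admissible [0, 1] range."""
        recalled = true_value + random.gauss(0.0, noise_sd)
        return min(max(recalled, 0.0), 1.0)

    random.seed(1)
    for true_value in (0.05, 0.50, 0.95):
        estimates = [noisy_estimate(true_value) for _ in range(100_000)]
        mean_estimate = sum(estimates) / len(estimates)
        # Extreme true values (0.05, 0.95) drift toward 0.5; mid-scale values do not.
        print(f"true={true_value:.2f}  mean estimate={mean_estimate:.3f}")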

Practical significance

Many social institutions rely on individuals to make rational judgments.

The securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects.

A fair jury trial, for example, requires that the jury ignore irrelevant features of the case, weigh the relevant features appropriately, consider different possibilities open-mindedly and resist fallacies such as appeal to emotion. The various biases demonstrated in these psychological experiments suggest that people will frequently fail to do all these things.[33] However, they fail in systematic, directional ways that are predictable.[5]

Cognitive biases are also related to the persistence of superstition and to large social issues such as prejudice, and they also hinder the public's acceptance of non-intuitive scientific knowledge.[34]

However, in some academic disciplines the study of bias is very popular. In entrepreneurship, for instance, bias is a widespread and well-studied phenomenon, because most decisions that concern the minds and hearts of entrepreneurs are computationally intractable.[12]

Reducing

Because they cause systematic errors, cognitive biases cannot be compensated for using a wisdom of the crowd technique of averaging answers from several people.[35] Debiasing is the reduction of biases in judgment and decision making through incentives, nudges, and training. Cognitive bias mitigation and cognitive bias modification are forms of debiasing specifically applicable to cognitive biases and their effects. Reference class forecasting is a method for systematically debiasing estimates and decisions, based on what Daniel Kahneman has dubbed the outside view.
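The reason averaging does not help can be made concrete with a small simulation (a sketch with invented numbers, not data from any cited study): averaging many independent answers cancels each respondent's idiosyncratic noise, but a bias shared by every respondent passes through the average untouched.

    import random

    random.seed(0)
    true_value = 100.0
    shared_bias = 15.0    # a systematic bias common to every respondent
    noise_sd = 20.0       # idiosyncratic random error

    answers = [true_value + shared_bias + random.gauss(0.0, noise_sd)
               for _ in range(10_000)]
    crowd_average = sum(answers) / len(answers)

    # Averaging shrinks the random error, but the shared +15 bias remains,
    # so the crowd lands near 115 rather than the true value of 100.
    print(round(crowd_average, 1))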

Similar to Gigerenzer (1996),[36] Haselton et al. (2005) state that the content and direction of cognitive biases are not "arbitrary" (p. 730).[8] Moreover, cognitive biases can be controlled. One debiasing technique aims to decrease biases by encouraging individuals to use controlled processing rather than automatic processing.[29] In relation to reducing the FAE, monetary incentives[37] and informing participants they will be held accountable for their attributions[38] have been linked to increases in accurate attributions. Training has also been shown to reduce cognitive bias. Morewedge and colleagues (2015) found that research participants exposed to one-shot training interventions, such as educational videos and debiasing games that taught mitigating strategies, exhibited significant reductions in their commission of six cognitive biases immediately and up to three months later.[39]

Cognitive bias modification refers to the process of modifying cognitive biases in healthy people and also refers to a growing area of psychological (non-pharmaceutical) therapies for anxiety, depression and addiction called cognitive bias modification therapy (CBMT). CBMT is a sub-group of therapies within a growing area of psychological therapies based on modifying cognitive processes with or without accompanying medication and talk therapy, sometimes referred to as applied cognitive processing therapies (ACPT). Although cognitive bias modification can refer to modifying cognitive processes in healthy individuals, CBMT is a growing area of evidence-based psychological therapy, in which cognitive processes are modified to relieve suffering[40][41] from serious depression,[42] anxiety,[43] and addiction.[44] CBMT techniques are technology-assisted therapies that are delivered via a computer with or without clinician support. CBM combines evidence and theory from the cognitive model of anxiety,[45] cognitive neuroscience,[46] and attentional models.[47]

Common theoretical causes of some cognitive biases

A 2012 Psychological Bulletin article suggested that at least eight seemingly unrelated biases can be produced by the same information-theoretic generative mechanism that assumes noisy information processing during storage and retrieval of information in human memory.[14]

Individual differences in decision making biases

People do appear to have stable individual differences in their susceptibility to decision biases such as overconfidence, temporal discounting, and bias blind spot.[51] That said, these stable levels of bias within individuals can be changed. Participants in experiments who watched training videos and played debiasing games showed medium to large reductions, both immediately and up to three months later, in the extent to which they exhibited susceptibility to six cognitive biases: anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness.[52]

Criticisms

Critics of cognitive bias theories point out that both sides in a debate often claim that the other side's views are a product of human nature and cognitive bias, while presenting their own viewpoint as the correct way to "overcome" cognitive bias. This is not due simply to debate misconduct but is a more fundamental problem: psychology has produced multiple opposed cognitive bias theories that can be used non-falsifiably to explain away any viewpoint.[53][54]

References

  1. ^ Haselton, M. G.; Nettle, D. & Andrews, P. W. (2005). The evolution of cognitive bias (PDF). In D. M. Buss (Ed.), The Handbook of Evolutionary Psychology. Hoboken, NJ, US: John Wiley & Sons Inc. pp. 724–746.
  2. ^ Bless, H.; Fiedler, K. & Strack, F. (2004). Social cognition: How individuals construct social reality. Hove and New York: Psychology Press.
  3. ^ Kahneman, D.; Tversky, A. (1972). "Subjective probability: A judgment of representativeness" (PDF). Cognitive Psychology. 3 (3): 430–454. doi:10.1016/0010-0285(72)90016-3.
  4. ^ Baron, J. (2007). Thinking and Deciding (4th ed.). New York, NY: Cambridge University Press.
  5. ^ a b Ariely, Dan (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions. New York, NY: HarperCollins. ISBN 978-0-06-135323-9.
  6. ^ For instance: Gigerenzer, G.; Goldstein, D. G. (1996). "Reasoning the fast and frugal way: Models of bounded rationality" (PDF). Psychological Review. 103 (4): 650–669. CiteSeerX 10.1.1.174.4404. doi:10.1037/0033-295X.103.4.650. PMID 8888650.
  7. ^ a b Tversky, A. & Kahneman, D. (1974). "Judgement under uncertainty: Heuristics and biases". Science. 185 (4157): 1124–1131. Bibcode:1974Sci...185.1124T. doi:10.1126/science.185.4157.1124. PMID 17835457.
  8. ^ a b Haselton, M. G.; Nettle, D. & Andrews, P. W. (2005). The evolution of cognitive bias. In D. M. Buss (Ed.), The Handbook of Evolutionary Psychology. Hoboken, NJ, US: John Wiley & Sons Inc. pp. 724–746.
  9. ^ Bless, H.; Fiedler, K. & Strack, F. (2004). Social cognition: How individuals construct social reality. Hove and New York: Psychology Press.
  10. ^ Morewedge, Carey K.; Kahneman, Daniel (2010-01-10). "Associative processes in intuitive judgment". Trends in Cognitive Sciences. 14 (10): 435–440. doi:10.1016/j.tics.2010.07.004. ISSN 1364-6613. PMC 5378157. PMID 20696611.
  11. ^ Kahneman, D. & Tversky, A. (1996). "On the reality of cognitive illusions" (PDF). Psychological Review. 103 (3): 582–591. CiteSeerX 10.1.1.174.5117. doi:10.1037/0033-295X.103.3.582. PMID 8759048.
  12. ^ a b S.X. Zhang; J. Cueto (2015). "The Study of Bias in Entrepreneurship". Entrepreneurship Theory and Practice. 41 (3): 419–454. doi:10.1111/etap.12212.
  13. ^ Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment under uncertainty: Heuristics and biases (1st ed.). Cambridge University Press.
  14. ^ a b c Martin Hilbert (2012). "Toward a synthesis of cognitive biases: How noisy information processing can bias human decision making" (PDF). Psychological Bulletin. 138 (2): 211–237. CiteSeerX 10.1.1.432.8763. doi:10.1037/a0025940. PMID 22122235.
  15. ^ Simon, H. A. (1955). "A behavioral model of rational choice". The Quarterly Journal of Economics. 69 (1): 99–118. doi:10.2307/1884852. JSTOR 1884852.
  16. ^ Pfister, H.-R.; Böhm, G. (2008). "The multiplicity of emotions: A framework of emotional functions in decision making". Judgment and Decision Making. 3: 5–17.
  17. ^ Wang, X. T.; Simons, F.; Brédart, S. (2001). "Social cues and verbal framing in risky choice". Journal of Behavioral Decision Making. 14 (1): 1–15. doi:10.1002/1099-0771(200101)14:1<1::AID-BDM361>3.0.CO;2-N.
  18. ^ Kahneman, Daniel; Shane Frederick (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich; Dale Griffin; Daniel Kahneman. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 51–52. ISBN 978-0-521-79679-8.
  19. ^ a b Baumeister, R. F.; Bushman, B. J. (2010). Social psychology and human nature: International Edition. Belmont, USA: Wadsworth. p. 141.
  20. ^ Tversky, A. & Kahneman, D. (1983). "Extensional versus intuitive reasoning: The conjunction fallacy in probability judgement" (PDF). Psychological Review. 90 (4): 293–315. doi:10.1037/0033-295X.90.4.293.
  21. ^ Gigerenzer, G. (2006). "Bounded and Rational". In Stainton, R. J. Contemporary Debates in Cognitive Science. Blackwell. p. 129. ISBN 978-1-4051-1304-5.
  22. ^ Schacter, D.L. (1999). "The Seven Sins of Memory: Insights From Psychology and Cognitive Neuroscience". American Psychologist. 54 (3): 182–203. doi:10.1037/0003-066X.54.3.182. PMID 10199218.
  23. ^ Kunda, Z. (1990). "The Case for Motivated Reasoning" (PDF). Psychological Bulletin. 108 (3): 480–498. doi:10.1037/0033-2909.108.3.480. PMID 2270237.
  24. ^ a b Hoorens, V. (1993). "Self-enhancement and Superiority Biases in Social Comparison". In Stroebe, W.; Hewstone, Miles. European Review of Social Psychology 4. Wiley.
  25. ^ Jensen AR, Rohwer WD (1966). "The Stroop color-word test: a review". Acta Psychologica. 25 (1): 36–93. doi:10.1016/0001-6918(66)90004-7. PMID 5328883.
  26. ^ MacLeod CM (March 1991). "Half a century of research on the Stroop effect: an integrative review". Psychological Bulletin. 109 (2): 163–203. CiteSeerX 10.1.1.475.2563. doi:10.1037/0033-2909.109.2.163. PMID 2034749.
  27. ^ Frederick, Shane (2005). "Cognitive Reflection and Decision Making". Journal of Economic Perspectives. 19 (4): 25–42. doi:10.1257/089533005775196732. ISSN 0895-3309.
  28. ^ Oechssler, Jörg; Roider, Andreas; Schmitz, Patrick W. (2009). "Cognitive abilities and behavioral biases" (PDF). Journal of Economic Behavior & Organization. 72 (1): 147–152. doi:10.1016/j.jebo.2009.04.018. ISSN 0167-2681.
  29. ^ a b Baumeister, R. F.; Bushman, B. J. (2010). Social psychology and human nature: International Edition. Belmont, USA: Wadsworth.
  30. ^ Jones, E. E. & Harris, V. A (1967). "The attribution of attitudes". Journal of Experimental Social Psychology. 3: 1–24. doi:10.1016/0022-1031(67)90034-0.
  31. ^ Mahoney, M. J. (1977). "Publication prejudices: An experimental study of confirmatory bias in the peer review system". Cognitive Therapy and Research. 1 (2): 161–175. doi:10.1007/bf01173636.
  32. ^ Jermias, J. (2001). "Cognitive dissonance and resistance to change: The influence of commitment confirmation and feedback on judgement usefulness of accounting systems". Accounting, Organizations and Society. 26 (2): 141–160. doi:10.1016/s0361-3682(00)00008-8.
  33. ^ Sutherland, Stuart (2007) Irrationality: The Enemy Within Second Edition (First Edition 1994) Pinter & Martin. ISBN 978-1-905177-07-3
  34. ^ Günter Radden; H. Cuyckens (2003). Motivation in language: studies in honor of Günter Radden. John Benjamins. p. 275. ISBN 978-1-58811-426-6.
  35. ^ Marcus Buckingham; Ashley Goodall. "The Feedback Fallacy" (March-April 2019). Harvard Business Review.
  36. ^ Gigerenzer, G. (1996). "On narrow norms and vague heuristics: A reply to Kahneman and Tversky (1996)". Psychological Review. 103 (3): 592–596. CiteSeerX 10.1.1.314.996. doi:10.1037/0033-295x.103.3.592.
  37. ^ Vonk, R. (1999). "Effects of outcome dependency on correspondence bias". Personality and Social Psychology Bulletin. 25 (3): 382–389. doi:10.1177/0146167299025003009.
  38. ^ Tetlock, P. E. (1985). "Accountability: A social check on the fundamental attribution error". Social Psychology Quarterly. 48 (3): 227–236. doi:10.2307/3033683. JSTOR 3033683.
  39. ^ Morewedge, Carey K.; Yoon, Haewon; Scopelliti, Irene; Symborski, Carl W.; Korris, James H.; Kassam, Karim S. (2015-08-13). "Debiasing Decisions Improved Decision Making With a Single Training Intervention". Policy Insights from the Behavioral and Brain Sciences. 2: 129–140. doi:10.1177/2372732215600886. ISSN 2372-7322.
  40. ^ MacLeod, C.; Mathews, A.; Tata, P. (1986). "Attentional Bias in Emotional Disorders". Journal of Abnormal Psychology. 95 (1): 15–20. doi:10.1037/0021-843x.95.1.15. PMID 3700842.
  41. ^ Bar-Haim, Y.; Lamy, D.; Pergamin, L.; Bakermans-Kranenburg, M. J. (2007). "Threat-related attentional bias in anxious and nonanxious individuals: a meta-analytic study". Psychol Bull. 133 (1): 1–24. CiteSeerX 10.1.1.324.4312. doi:10.1037/0033-2909.133.1.1. PMID 17201568.
  42. ^ Holmes, E. A.; Lang, T. J.; Shah, D. M. (2009). "Developing interpretation bias modification as a "cognitive vaccine" for depressed mood: imagining positive events makes you feel better than thinking about them verbally". J Abnorm Psychol. 118 (1): 76–88. doi:10.1037/a0012590. PMID 19222316.
  43. ^ Hakamata, Y.; Lissek, S.; Bar-Haim, Y.; Britton, J. C.; Fox, N. A.; Leibenluft, E.; Pine, D. S. (2010). "Attention bias modification treatment: a meta-analysis toward the establishment of novel treatment for anxiety". Biol Psychiatry. 68 (11): 982–990. doi:10.1016/j.biopsych.2010.07.021. PMC 3296778. PMID 20887977.
  44. ^ Eberl, C.; Wiers, R. W.; Pawelczack, S.; Rinck, M.; Becker, E. S.; Lindenmeyer, J. (2013). "Approach bias modification in alcohol dependence: Do clinical effects replicate and for whom does it work best?". Developmental Cognitive Neuroscience. 4: 38–51. doi:10.1016/j.dcn.2012.11.002. PMID 23218805.
  45. ^ Clark, D. A., & Beck, A. T. (2009). Cognitive Therapy of Anxiety Disorders: Science and Practice. London: Guildford.
  46. ^ Browning, M.; Holmes, E. A.; Murphy, S. E.; Goodwin, G. M.; Harmer, C. J. (2010). "Lateral prefrontal cortex mediates the cognitive modification of attentional bias". Biol Psychiatry. 67 (10): 919–925. doi:10.1016/j.biopsych.2009.10.031. PMC 2866253. PMID 20034617.
  47. ^ Eysenck, M. W.; Derakshan, N.; Santos, R.; Calvo, M. G. (2007). "Anxiety and cognitive performance: Attentional control theory". Emotion. 7 (2): 336–353. CiteSeerX 10.1.1.453.3592. doi:10.1037/1528-3542.7.2.336. PMID 17516812.
  48. ^ Kahneman, Daniel; Shane Frederick (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich; Dale Griffin; Daniel Kahneman. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 49–81. ISBN 978-0-521-79679-8. OCLC 47364085.
  49. ^ a b Tversky, Amos; Daniel Kahneman (September 27, 1974). "Judgment under Uncertainty: Heuristics and Biases". Science. 185 (4157): 1124–1131. Bibcode:1974Sci...185.1124T. doi:10.1126/science.185.4157.1124. PMID 17835457.
  50. ^ Slovic, Paul; Melissa Finucane; Ellen Peters; Donald G. MacGregor (2002). "The Affect Heuristic". In Thomas Gilovich; Dale Griffin; Daniel Kahneman. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press. pp. 397–420. ISBN 978-0-521-79679-8.
  51. ^ Scopelliti, Irene; Morewedge, Carey K.; McCormick, Erin; Min, H. Lauren; Lebrecht, Sophie; Kassam, Karim S. (2015-04-24). "Bias Blind Spot: Structure, Measurement, and Consequences". Management Science. 61 (10): 2468–2486. doi:10.1287/mnsc.2014.2096.
  52. ^ Morewedge, Carey K.; Yoon, Haewon; Scopelliti, Irene; Symborski, Carl W.; Korris, James H.; Kassam, Karim S. (2015-10-01). "Debiasing Decisions Improved Decision Making With a Single Training Intervention". Policy Insights from the Behavioral and Brain Sciences. 2 (1): 129–140. doi:10.1177/2372732215600886. ISSN 2372-7322.
  53. ^ Popper, Karl, Conjectures and Refutations: The Growth of Scientific Knowledge
  54. ^ "Surely You're Joking, Mr. Feynman!": Adventures of a Curious Character, 1985, Richard Feynman

Further reading

  • Eiser, J.R. and Joop van der Pligt (1988) Attitudes and Decisions London: Routledge. ISBN 978-0-415-01112-9
  • Fine, Cordelia (2006) A Mind of its Own: How your brain distorts and deceives Cambridge, UK: Icon Books. ISBN 1-84046-678-2
  • Gilovich, Thomas (1993). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York: The Free Press. ISBN 0-02-911706-2
  • Haselton, M.G., Nettle, D. & Andrews, P.W. (2005). The evolution of cognitive bias. In D.M. Buss (Ed.), Handbook of Evolutionary Psychology, (pp. 724–746). Hoboken: Wiley. Full text
  • Heuer, Richards J. Jr. (1999). Psychology of Intelligence Analysis. Central Intelligence Agency.
  • Young, S. (2007) Micromessaging - Why Great Leadership Is Beyond Words New York: McGraw-Hill. ISBN 978-0-07-146757-5
  • Kahneman D., Slovic P., and Tversky, A. (Eds.) (1982) Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press ISBN 978-0-521-28414-1
  • Kahneman, Daniel (2011) Thinking, Fast and Slow. New York: Farrar, Straus and Giroux ISBN 978-0-374-27563-1
  • Kida, Thomas (2006) Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking New York: Prometheus. ISBN 978-1-59102-408-8
  • Nisbett, R., and Ross, L. (1980) Human Inference: Strategies and shortcomings of human judgement. Englewood Cliffs, NJ: Prentice-Hall ISBN 978-0-13-445130-5
  • Piatelli-Palmarini, Massimo (1994) Inevitable Illusions: How Mistakes of Reason Rule Our Minds New York: John Wiley & Sons. ISBN 0-471-15962-X
  • Stanovich, Keith (2009). What Intelligence Tests Miss: The Psychology of Rational Thought. New Haven (CT): Yale University Press. ISBN 978-0-300-12385-2.
  • Sutherland, Stuart (2007) Irrationality: The Enemy Within Second Edition (First Edition 1994) Pinter & Martin. ISBN 978-1-905177-07-3
  • Tavris, Carol and Elliot Aronson (2007) Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts Orlando, Florida: Harcourt Books. ISBN 978-0-15-101098-1
  • Funder, David C.; Joachim I. Krueger (June 2004). "Towards a balanced social psychology: Causes, consequences, and cures for the problem-seeking approach to social behavior and cognition" (PDF). Behavioral and Brain Sciences. 27 (3): 313–376. doi:10.1017/s0140525x04000081. PMID 15736870. Archived from the original (PDF) on 2014-02-22. Retrieved 3 May 2011.

Adaptive bias

Adaptive bias is the idea that the human brain has evolved to reason adaptively, rather than truthfully or even rationally, and that cognitive bias may have evolved as a mechanism to reduce the overall cost of cognitive errors as opposed to merely reducing the number of cognitive errors, when faced with making a decision under conditions of uncertainty.

Bias blind spot

The bias blind spot is the cognitive bias of recognizing the impact of biases on the judgment of others, while failing to see the impact of biases on one's own judgment. The term was created by Emily Pronin, a social psychologist from Princeton University's Department of Psychology, with colleagues Daniel Lin and Lee Ross. The bias blind spot is named after the visual blind spot. Most people appear to exhibit the bias blind spot. In a sample of more than 600 residents of the United States, more than 85% believed they were less biased than the average American. Only one participant believed that he or she was more biased than the average American. People do vary with regard to the extent to which they exhibit the bias blind spot. It appears to be a stable individual difference that is measurable (for a scale, see Scopelliti et al. 2015). The bias blind spot appears to be a true blind spot in that it is unrelated to actual decision making ability. Performance on indices of decision making competence is not related to individual differences in bias blind spot. In other words, everyone seems to think they are less biased than other people, regardless of their actual decision making ability.

Cognitive bias in animals

Cognitive bias in animals is a pattern of deviation in judgment, whereby inferences about other animals and situations may be affected by irrelevant information or emotional states. It is sometimes said that animals create their own "subjective social reality" from their perception of the input. In humans, for example, an optimistic or pessimistic bias might affect one's answer to the question "Is the glass half empty or half full?"

To explore cognitive bias, one might train an animal to expect that a positive event follows one stimulus and that a negative event follows another stimulus. For example, on many trials, if the animal presses lever A after a 20 Hz tone it gets a highly desired food, but a press on lever B after a 10 Hz tone yields bland food. The animal is then offered both levers after an intermediate test stimulus, e.g. a 15 Hz tone. The hypothesis is that the animal's "mood" will bias the choice of levers after the test stimulus: if positive, it will tend to choose lever A; if negative, it will tend to choose lever B. The hypothesis is tested by manipulating factors that might affect mood, for example the type of housing the animal is kept in. Cognitive biases have been shown in a wide range of species including rats, dogs, rhesus macaques, sheep, chicks, starlings and honeybees.
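The logic of such a judgment-bias test can also be written down as a toy choice model (the function, parameter values and housing labels below are hypothetical illustrations, not any particular published experiment): the probability of choosing the "optimistic" lever is a logistic function of the test tone, and "mood" shifts the curve.

    import math

    def p_choose_lever_a(tone_hz, mood_shift=0.0, trained_a=20.0, trained_b=10.0, slope=1.0):
        """Toy choice rule for a judgment-bias test: probability of pressing the
        lever trained on the 'good' 20 Hz tone, as a logistic function of the test
        tone; a positive mood_shift moves the curve toward more optimistic choices."""
        midpoint = (trained_a + trained_b) / 2.0 - mood_shift
        return 1.0 / (1.0 + math.exp(-slope * (tone_hz - midpoint)))

    for mood_shift, housing in ((+1.0, "enriched housing"), (-1.0, "barren housing")):
        p = p_choose_lever_a(15.0, mood_shift=mood_shift)
        print(f"{housing}: P(choose lever A at the ambiguous 15 Hz tone) = {p:.2f}")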

Cognitive bias mitigation

Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.

Coherent, comprehensive theories of cognitive bias mitigation are lacking. This article describes debiasing tools, methods, proposals and other initiatives, in academic and professional disciplines concerned with the efficacy of human reasoning, associated with the concept of cognitive bias mitigation; most address mitigation tacitly rather than explicitly.

A long-standing debate regarding human decision making bears on the development of a theory and practice of bias mitigation. This debate contrasts the rational economic agent standard for decision making versus one grounded in human social needs and motivations. The debate also contrasts the methods used to analyze and predict human decision making, i.e. formal analysis emphasizing intellectual capacities versus heuristics emphasizing emotional states. This article identifies elements relevant to this debate.

Cognitive bias modification

Cognitive bias modification (CBM) refers to the process of modifying cognitive biases in healthy people and also refers to a growing area of psychological (non-pharmaceutical) therapies for anxiety, depression and addiction called cognitive bias modification therapy (CBMT). CBMT is a sub-group of therapies within a growing area of psychological therapies based on modifying cognitive processes with or without accompanying medication and talk therapy, sometimes referred to as applied cognitive processing therapies (ACPT). Other ACPTs include attention training, interpretation modification, approach/avoid training, imagery modification training, and eye movement desensitization and reprocessing therapy for PTSD.

According to Yiend et al. (2013), writing in the journal Cognitive Therapy and Research, "CBM treatments are more convenient and flexible than other modes of treatment because they do not require meetings with a therapist. They offer the potential for delivery using modern technologies (e.g. internet or mobile phone) and require minimal supervision. They could therefore become highly cost effective and widely accessible. CBM methods are also less demanding and more acceptable to patients than traditional therapies. This is because personal thoughts and beliefs are not directly interrogated, and there is no need for social interaction or stigmatizing visits to outpatient clinics. Similarly, patient insight is not required because CBM seeks to target the underlying maintaining cognitive bias directly; therefore, patient engagement is likely to be easier. In sum, CBM methods offer a high gain, low cost treatment option because they can circumvent many of the practical and psychological requirements that disadvantage competing psychological interventions." CBMT techniques are technology-assisted therapies that are delivered via a computer with or without clinician support. CBM combines evidence and theory from the cognitive model of anxiety, cognitive neuroscience and attentional models. CBM can be seen as one version of attentional retraining. It has been described as a "cognitive vaccine".

Congruence bias

Congruence bias is a type of cognitive bias similar to confirmation bias. Congruence bias occurs due to people's overreliance on directly testing a given hypothesis as well as neglecting indirect testing.

Dog intelligence

Dog intelligence or dog cognition is the process in dogs of acquiring information and conceptual skills, storing them in memory, retrieving, combining and comparing them, and using them in new situations. Studies have shown that dogs display many behaviors associated with intelligence. They have advanced memory skills, and are able to read and react appropriately to human body language such as gesturing and pointing, and to understand human voice commands. Dogs demonstrate a theory of mind by engaging in deception.

Dunning–Kruger effect

In the field of psychology, the Dunning–Kruger effect is a cognitive bias in which people of low ability have illusory superiority and mistakenly assess their cognitive ability as greater than it is. The cognitive bias of illusory superiority comes from the inability of low-ability people to recognize their lack of ability. Without the self-awareness of metacognition, low-ability people cannot objectively evaluate their competence or incompetence. As described by social psychologists David Dunning and Justin Kruger, the cognitive bias of illusory superiority results from an internal illusion in people of low ability and from an external misperception in people of high ability; that is, "the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others."

End-of-the-day betting effect

The end-of-the-day betting effect is a cognitive bias reflected in the tendency for bettors to take gambles with higher risk and higher reward at the end of their betting session to try to make up for losses. William McGlothlin (1956) and Mukhtar Ali (1977) first discovered this effect after observing the shift in betting patterns at horse race tracks. McGlothlin and Ali noticed that people are significantly more likely to prefer longshots to conservative bets on the last race of the day. They found that the movement towards longshots, and away from favorites, is so pronounced that some studies show that conservatively betting on the favorite to show (to finish first, second, or third) in the last race is a profitable bet despite the track's take.

False consensus effect

In psychology, the false-consensus effect or false-consensus bias is an attributional type of cognitive bias whereby people tend to overestimate the extent to which their opinions, beliefs, preferences, values, and habits are normal and typical of those of others (i.e., that others also think the same way that they do). This cognitive bias tends to lead to the perception of a consensus that does not exist, a "false consensus".

This false consensus is significant because it increases self-esteem (the overconfidence effect) and fosters the belief that everyone shares one's own knowledge. It can be derived from a desire to conform and be liked by others in a social environment. This bias is especially prevalent in group settings where one thinks the collective opinion of their own group matches that of the larger population. Since the members of a group reach a consensus and rarely encounter those who dispute it, they tend to believe that everybody thinks the same way. The false-consensus effect is not restricted to cases where people believe that their values are shared by the majority, but it still manifests as an overestimate of the extent of their belief.

Additionally, when confronted with evidence that a consensus does not exist, people often assume that those who do not agree with them are defective in some way. There is no single cause for this cognitive bias; the availability heuristic, self-serving bias, and naïve realism have been suggested as at least partial underlying factors. Maintenance of this cognitive bias may be related to the tendency to make decisions with relatively little information. When faced with uncertainty and a limited sample from which to make decisions, people often "project" themselves onto the situation. When this personal knowledge is used as input to make generalizations, it often results in the false sense of being part of the majority. The false-consensus effect can be contrasted with pluralistic ignorance, an error in which people privately disapprove but publicly support what seems to be the majority view.

Horn effect

The horn effect, closely related to the halo effect, is a form of cognitive bias that causes one's perception of another to be unduly influenced by a single negative trait. An example of the horn effect may be that an observer is more likely to assume a physically unattractive person is morally inferior to an attractive person, despite the lack of relationship between morality and physical appearance.

Information bias (psychology)

Information bias is a cognitive bias to seek information when it does not affect action. People can often make better predictions or choices with less information: more information is not always better. An example of information bias is believing that the more information that can be acquired to make a decision, the better, even if that extra information is irrelevant for the decision.

List of cognitive biases

Cognitive biases are systematic patterns of deviation from norm or rationality in judgment, and are often studied in psychology and behavioral economics.

Although the reality of these biases is confirmed by replicable research, there are often controversies about how to classify these biases or how to explain them. Some are effects of information-processing rules (i.e., mental shortcuts), called heuristics, that the brain uses to produce decisions or judgments. Biases have a variety of forms and appear as cognitive ("cold") bias, such as mental noise, or motivational ("hot") bias, such as when beliefs are distorted by wishful thinking. Both effects can be present at the same time.

There are also controversies over some of these biases as to whether they count as useless or irrational, or whether they result in useful attitudes or behavior. For example, when getting to know others, people tend to ask leading questions which seem biased towards confirming their assumptions about the person. However, this kind of confirmation bias has also been argued to be an example of social skill: a way to establish a connection with the other person.

Although this research overwhelmingly involves human subjects, some findings that demonstrate bias have been found in non-human animals as well. For example, hyperbolic discounting has been observed in rats, pigeons, and monkeys.

Name calling

Name calling is a form of verbal abuse in which insulting or demeaning labels are directed at an individual or group. This phenomenon is studied by a variety of academic disciplines such as anthropology, child psychology, and politics. It is also studied by rhetoricians, and a variety of other disciplines that study propaganda techniques and their causes and effects. The technique is most frequently employed within political discourse and school systems, in an attempt to negatively impact an opponent.

Observer effect

Observer effect may refer to:

  • Hawthorne effect, a form of reactivity in which subjects modify an aspect of their behavior in response to their knowing that they are being studied
  • Heisenbug of computer programming, where a software bug seems to disappear or alter its behavior when one attempts to study it
  • Observer effect (information technology), the impact of observing a process while it is running
  • Observer effect (physics), the impact of observing a physical system
  • Probe effect, the effect on a physical system of adding measurement devices, such as the probes of electronic test equipment
  • Observer-expectancy effect, a form of reactivity in which a researcher's cognitive bias causes them to unconsciously influence the participants of an experiment

It may also refer to:

  • "Observer Effect" (Star Trek: Enterprise), an episode of Star Trek: Enterprise, named after this effect

Omission bias

The omission bias is an alleged type of cognitive bias: the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions), because actions are more obvious than inactions. It is contentious whether this represents a systematic error in thinking or is supported by a substantive moral theory. For a consequentialist, judging harmful actions as worse than inaction would indeed be inconsistent, but deontological ethics may, and normally does, draw a moral distinction between doing and allowing. The bias is usually showcased through the trolley problem.

Precision bias

Precision bias is a form of cognitive bias in which an evaluator of information commits a logical fallacy as the result of confusing accuracy and precision. More particularly, in assessing the merits of an argument, a measurement, or a report, an observer or assessor falls prey to precision bias when he or she believes that greater precision implies greater accuracy (i.e., that simply because a statement is precise, it is also true); the observer or assessor is said to provide false precision.

Precision bias, whether called by that phrase or another, is addressed in fields such as economics, in which there is a significant danger that a seemingly impressive quantity of statistics may be collected even though these statistics may be of little value for demonstrating any particular truth.

It is also called the numeracy bias, or the range estimate aversion.

The clustering illusion and the Texas sharpshooter fallacy may both be treated as relatives of precision bias. In these fallacies, precision is mistakenly considered evidence of causation, when in fact the clustered information may actually be the result of randomness.

Rosy retrospection

Rosy retrospection refers to the psychological phenomenon of people sometimes judging the past disproportionately more positively than they judge the present. The Romans occasionally referred to this phenomenon with the Latin phrase "memoria praeteritorum bonorum", which translates into English roughly as "the past is always well remembered". Rosy retrospection is very closely related to the concept of nostalgia. The difference between the terms is that rosy retrospection is a cognitive bias, whereas the broader phenomenon of nostalgia is not necessarily based on a biased perspective.

Although rosy retrospection is a cognitive bias, and distorts a person's view of reality to some extent, some people theorize that it may in part serve a useful purpose in increasing self-esteem and a person's overall sense of well-being. For example, Terence Mitchell and Leigh Thompson mention this possibility in a chapter entitled "A Theory of Temporal Adjustments of the Evaluation of Events" in a book of collected research reports from various authors entitled "Advances in Managerial Cognition and Organizational Information Processing". Simplifications and exaggerations of memories (such as occur in rosy retrospection) may also make it easier for people's brains to store long-term memories, as removing details may reduce the burden of those memories on the brain and make the brain require fewer neural connections to form and engrain memories. Mnemonics, psychological chunking, and subconscious distortions of memories may in part serve a similar purpose: memory compression by way of simplification. Data compression in computers works on similar principles: compression algorithms tend to either (1) remove unnecessary details from data or (2) reframe the details in a simpler way from which the data can subsequently be reconstructed as needed, or (3) both. Much the same can be said of human memories and the human brain's own process of memorization.
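The point about compression algorithms can be illustrated with the simplest possible scheme, run-length encoding, which reframes repeated detail in a shorter form from which the original can be reconstructed exactly (a toy analogy only; no claim is made that memory literally works this way):

    from itertools import groupby

    def rle_encode(text):
        """Run-length encode: 'aaabbc' -> [('a', 3), ('b', 2), ('c', 1)]."""
        return [(ch, len(list(run))) for ch, run in groupby(text)]

    def rle_decode(pairs):
        return "".join(ch * count for ch, count in pairs)

    original = "aaaaaabbbbbbbbcc"
    packed = rle_encode(original)
    assert rle_decode(packed) == original   # lossless: details reframed, not lost
    print(packed)                           # [('a', 6), ('b', 8), ('c', 2)]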

In English, the idiom "rose-colored glasses" or "rose-tinted glasses" is also sometimes used to refer to the phenomenon of rosy retrospection. Usually this idiom occurs as some variation of the phrase "seeing things through rose-tinted glasses" or some other roughly similar phrasing.

Rosy retrospection is also related to the concept of declinism.

Subjective validation

Subjective validation, sometimes called personal validation effect, is a cognitive bias by which a person will consider a statement or another piece of information to be correct if it has any personal meaning or significance to them. In other words, a person whose opinion is affected by subjective validation will perceive two unrelated events (i.e., a coincidence) to be related because their personal belief demands that they be related. Closely related to the Forer effect, subjective validation is an important element in cold reading. It is considered to be the main reason behind most reports of paranormal phenomena. According to Bob Carroll, psychologist Ray Hyman is considered to be the foremost expert on subjective validation and cold reading. The term subjective validation first appeared in the 1980 book The Psychology of the Psychic by David F. Marks and Richard Kammann.

