A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective social reality" from their perception of the input. An individual's construction of social reality, not the objective input, may dictate their behaviour in the social world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.
Some cognitive biases are presumably adaptive. Cognitive biases may lead to more effective actions in a given context. Furthermore, cognitive biases enable faster decisions, which can be desirable when timeliness is more valuable than accuracy, as illustrated by heuristics. Other cognitive biases are a "by-product" of human processing limitations, resulting from a lack of appropriate mental mechanisms (bounded rationality) or simply from a limited capacity for information processing.
A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics. Kahneman and Tversky (1996) argue that cognitive biases have important practical implications for areas including clinical judgment, entrepreneurship, finance, and management.
Bias arises from various processes that are sometimes difficult to distinguish. These include
The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972 and grew out of their experience of people's innumeracy, or inability to reason intuitively with greater orders of magnitude. Tversky, Kahneman, and colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. Tversky and Kahneman explained these differences in judgement and decision making in terms of heuristics: mental shortcuts that provide swift estimates of the probability of uncertain occurrences. Heuristics are simple for the brain to compute but sometimes introduce "severe and systematic errors."
For example, the representativeness heuristic is defined as the tendency to "judge the frequency or likelihood" of an occurrence by the extent to which the event "resembles the typical case". The "Linda Problem" illustrates the representativeness heuristic (Tversky & Kahneman, 1983). Participants were given a description of "Linda" that suggests Linda might well be a feminist (e.g., she is said to be concerned about discrimination and social justice issues). They were then asked whether they thought Linda was more likely to be a "(a) bank teller" or a "(b) bank teller and active in the feminist movement". A majority chose answer (b). This error (mathematically, answer (b) cannot be more likely than answer (a)) is an example of the "conjunction fallacy"; Tversky and Kahneman argued that respondents chose (b) because it seemed more "representative" or typical of persons who might fit the description of Linda. The representativeness heuristic may lead to errors such as activating stereotypes and inaccurate judgments of others (Haselton et al., 2005, p. 726).
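The arithmetic behind the conjunction fallacy is elementary probability: a conjunction can never be more probable than either of its conjuncts. A minimal sketch (the probabilities below are invented for illustration, not taken from the study):

```python
# Conjunction rule: P(A and B) <= P(A), for any events A and B.
# Hypothetical probabilities chosen only to illustrate the Linda problem.
p_teller = 0.05                  # P(Linda is a bank teller) -- assumed
p_feminist_given_teller = 0.30   # P(feminist | bank teller) -- assumed

# P(teller and feminist) = P(teller) * P(feminist | teller)
p_both = p_teller * p_feminist_given_teller

assert p_both <= p_teller        # option (b) can never beat option (a)
print(round(p_both, 3))          # 0.015 -- smaller than 0.05
```

Whatever values are plugged in, multiplying by a conditional probability of at most 1 can only shrink the result, which is why a majority choosing (b) constitutes a systematic error.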
Alternatively, critics of Kahneman and Tversky such as Gerd Gigerenzer argue that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases, but rather to conceive rationality as an adaptive tool that is not identical to the rules of formal logic or the probability calculus. Nevertheless, experiments such as the "Linda problem" grew into the heuristics and biases research program which spread beyond academic psychology into other disciplines including medicine and political science.
Biases can be distinguished on a number of dimensions. For example,
Some biases reflect a subject's motivation, for example, the desire for a positive self-image leading to egocentric bias and the avoidance of unpleasant cognitive dissonance. Other biases are due to the particular way the brain perceives, forms memories and makes judgments. This distinction is sometimes described as "hot cognition" versus "cold cognition", as motivated reasoning can involve a state of arousal.
Among the "cold" biases,
The fact that some biases reflect motivation, in particular the motivation to hold positive attitudes toward oneself, accounts for the fact that many biases are self-serving or self-directed (e.g., illusion of asymmetric insight, self-serving bias). There are also biases in how subjects evaluate in-groups or out-groups: people evaluate in-groups as more diverse and "better" in many respects, even when those groups are arbitrarily defined (ingroup bias, outgroup homogeneity bias).
Some cognitive biases belong to the subgroup of attentional biases which refer to the paying of increased attention to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests to measure those biases are the Stroop task and the dot probe task.
The following is a list of the more commonly studied cognitive biases:
|Fundamental attribution error (FAE)||Also known as the correspondence bias, the tendency for people to over-emphasize personality-based explanations for behaviours observed in others while under-emphasizing the role and power of situational influences on the same behaviour. Jones and Harris' (1967) classic study illustrates the FAE. Despite being made aware that the target's speech direction (pro-Castro/anti-Castro) was assigned to the writer, participants ignored the situational pressures and attributed pro-Castro attitudes to the writer when the speech represented such attitudes.|
|Priming bias||The tendency to be influenced by what someone else has said to create a preconceived idea.|
|Confirmation bias||The tendency to search for or interpret information in a way that confirms one's preconceptions. In addition, individuals may discredit information that does not support their views. Confirmation bias is related to the concept of cognitive dissonance, whereby individuals may reduce inconsistency by searching for information that reconfirms their views (Jermias, 2001, p. 146).|
|Affinity bias||The tendency to be biased toward people like ourselves.|
|Self-serving bias||The tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests.|
|Belief bias||When one's evaluation of the logical strength of an argument is biased by their belief in the truth or falsity of the conclusion.|
|Framing||Using a too-narrow approach and description of the situation or issue.|
|Hindsight bias||Sometimes called the "I-knew-it-all-along" effect, the inclination to see past events as being predictable.|
A 2012 Psychological Bulletin article suggests that at least eight seemingly unrelated biases can be produced by the same information-theoretic generative mechanism. It is shown that noisy deviations in the memory-based information processes that convert objective evidence (observations) into subjective estimates (decisions) can produce regressive conservatism, belief revision (Bayesian conservatism), illusory correlations, illusory superiority (better-than-average effect) and worse-than-average effect, the subadditivity effect, exaggerated expectation, overconfidence, and the hard–easy effect.
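The flavor of such a noise-based account can be conveyed with a toy simulation (the noise model and its magnitude here are assumptions for illustration, not the article's actual model): when objective values near the ends of a bounded scale are stored and retrieved with random noise, the resulting subjective estimates regress toward the middle of the scale, producing conservatism without any motivated distortion.

```python
import random

random.seed(0)
NOISE_SD = 0.2  # assumed magnitude of memory noise

def subjective_estimate(true_value):
    # Objective evidence is stored/retrieved with additive Gaussian noise;
    # the reported estimate is clamped to the valid [0, 1] range.
    noisy = true_value + random.gauss(0.0, NOISE_SD)
    return min(1.0, max(0.0, noisy))

for true_value in (0.05, 0.50, 0.95):
    mean_est = sum(subjective_estimate(true_value) for _ in range(100_000)) / 100_000
    print(true_value, round(mean_est, 3))
# Extreme true values are pulled toward the middle of the scale,
# while the midpoint is reproduced almost exactly.
```

The regression arises purely from the interaction of noise with the bounded response scale, which is the general shape of the "noisy information processing" explanation.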
Many social institutions rely on individuals to make rational judgments.
The securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects.
A fair jury trial, for example, requires that the jury ignore irrelevant features of the case, weigh the relevant features appropriately, consider different possibilities open-mindedly and resist fallacies such as appeal to emotion. The various biases demonstrated in these psychological experiments suggest that people will frequently fail to do all these things. However, these failures are systematic and directional rather than random, and are therefore predictable.
Cognitive biases are also related to the persistence of superstition and to large social issues such as prejudice, and they also hinder public acceptance of non-intuitive scientific knowledge.
However, in some academic disciplines the study of bias is very popular. In entrepreneurship research, for instance, bias is a widespread and well-studied phenomenon, because most decisions that matter to entrepreneurs are computationally intractable.
Because they cause systematic errors, cognitive biases cannot be compensated for using a wisdom of the crowd technique of averaging answers from several people. Debiasing is the reduction of biases in judgment and decision making through incentives, nudges, and training. Cognitive bias mitigation and cognitive bias modification are forms of debiasing specifically applicable to cognitive biases and their effects. Reference class forecasting is a method for systematically debiasing estimates and decisions, based on what Daniel Kahneman has dubbed the outside view.
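The first point above can be made concrete with a small simulation (the numbers are arbitrary): averaging many judgments drives independent random error toward zero, but an error component shared by every judge survives the average untouched.

```python
import random

random.seed(1)
TRUE_VALUE = 100.0
SHARED_BIAS = 15.0   # systematic error common to all judges -- assumed
NOISE_SD = 20.0      # independent random error per judge -- assumed

judgments = [TRUE_VALUE + SHARED_BIAS + random.gauss(0.0, NOISE_SD)
             for _ in range(10_000)]
crowd_average = sum(judgments) / len(judgments)

# The random errors cancel, so the average lands near
# TRUE_VALUE + SHARED_BIAS (115), not near TRUE_VALUE (100).
print(round(crowd_average, 1))
```

This is why debiasing has to change the judgments themselves (via incentives, nudges, or training) rather than merely aggregating them.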
Similar to Gigerenzer (1996), Haselton et al. (2005) state that the content and direction of cognitive biases are not "arbitrary" (p. 730). Moreover, cognitive biases can be controlled. One debiasing technique aims to decrease biases by encouraging individuals to use controlled processing rather than automatic processing. In relation to reducing the FAE, monetary incentives and informing participants that they will be held accountable for their attributions have been linked to an increase in accurate attributions. Training has also been shown to reduce cognitive bias. Morewedge and colleagues (2015) found that research participants exposed to one-shot training interventions, such as educational videos and debiasing games that taught mitigating strategies, exhibited significant reductions in their commission of six cognitive biases immediately and up to three months later.
Cognitive bias modification refers to the process of modifying cognitive biases in healthy people and also to a growing area of psychological (non-pharmaceutical) therapies for anxiety, depression, and addiction called cognitive bias modification therapy (CBMT). CBMT is a sub-group of therapies within a growing area of psychological therapies based on modifying cognitive processes with or without accompanying medication and talk therapy, sometimes referred to as applied cognitive processing therapies (ACPT). CBMT techniques are evidence-based, technology-assisted therapies delivered via a computer, with or without clinician support, in which cognitive processes are modified to relieve suffering from serious depression, anxiety, and addiction. CBM combines evidence and theory from the cognitive model of anxiety, cognitive neuroscience, and attentional models.
A 2012 Psychological Bulletin article suggested that at least eight seemingly unrelated biases can be produced by the same information-theoretic generative mechanism that assumes noisy information processing during storage and retrieval of information in human memory.
People do appear to have stable individual differences in their susceptibility to decision biases such as overconfidence, temporal discounting, and bias blind spot. That said, even these stable levels of bias within individuals can be changed. Participants in experiments who watched training videos and played debiasing games showed medium to large reductions, both immediately and up to three months later, in the extent to which they exhibited susceptibility to six cognitive biases: anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness.
There are criticisms against theories of cognitive biases based on the fact that both sides in a debate often claim each other's thoughts to be the result of human nature and cognitive bias, while claiming their own viewpoint to be the correct way to "overcome" cognitive bias. This is not due simply to debate misconduct but is a more fundamental problem stemming from psychology's proliferation of mutually opposed cognitive bias theories that can be used, non-falsifiably, to explain away any viewpoint.
Adaptive bias is the idea that the human brain has evolved to reason adaptively, rather than truthfully or even rationally, and that cognitive bias may have evolved as a mechanism to reduce the overall cost of cognitive errors, as opposed to merely reducing the number of cognitive errors, when faced with making a decision under conditions of uncertainty.

Bias blind spot
The bias blind spot is the cognitive bias of recognizing the impact of biases on the judgment of others, while failing to see the impact of biases on one's own judgment. The term was created by Emily Pronin, a social psychologist from Princeton University's Department of Psychology, with colleagues Daniel Lin and Lee Ross. The bias blind spot is named after the visual blind spot. Most people appear to exhibit the bias blind spot. In a sample of more than 600 residents of the United States, more than 85% believed they were less biased than the average American. Only one participant believed that he or she was more biased than the average American. People do vary with regard to the extent to which they exhibit the bias blind spot. It appears to be a stable individual difference that is measurable (for a scale, see Scopelliti et al. 2015). The bias blind spot appears to be a true blind spot in that it is unrelated to actual decision-making ability. Performance on indices of decision-making competence is not related to individual differences in bias blind spot. In other words, everyone seems to think they are less biased than other people, regardless of their actual decision-making ability.

Cognitive bias in animals
Cognitive bias in animals is a pattern of deviation in judgment, whereby inferences about other animals and situations may be affected by irrelevant information or emotional states. It is sometimes said that animals create their own "subjective social reality" from their perception of the input. In humans, for example, an optimistic or pessimistic bias might affect one's answer to the question "Is the glass half empty or half full?"
To explore cognitive bias, one might train an animal to expect that a positive event follows one stimulus and that a negative event follows another stimulus. For example, on many trials, if the animal presses lever A after a 20 Hz tone it gets a highly desired food, but a press on lever B after a 10 Hz tone yields bland food. The animal is then offered both levers after an intermediate test stimulus, e.g. a 15 Hz tone. The hypothesis is that the animal's "mood" will bias the choice of levers after the test stimulus: if positive, it will tend to choose lever A; if negative, it will tend to choose lever B. The hypothesis is tested by manipulating factors that might affect mood – for example, the type of housing the animal is kept in. Cognitive biases have been shown in a wide range of species including rats, dogs, rhesus macaques, sheep, chicks, starlings and honeybees.

Cognitive bias mitigation
Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.
Coherent, comprehensive theories of cognitive bias mitigation are lacking. This article describes debiasing tools, methods, proposals and other initiatives, in academic and professional disciplines concerned with the efficacy of human reasoning, associated with the concept of cognitive bias mitigation; most address mitigation tacitly rather than explicitly.
A long-standing debate regarding human decision making bears on the development of a theory and practice of bias mitigation. This debate contrasts the rational economic agent standard for decision making with one grounded in human social needs and motivations. It also contrasts the methods used to analyze and predict human decision making: formal analysis emphasizing intellectual capacities versus heuristics emphasizing emotional states. This article identifies elements relevant to this debate.

Cognitive bias modification
Cognitive bias modification (CBM) refers to the process of modifying cognitive biases in healthy people and also to a growing area of psychological (non-pharmaceutical) therapies for anxiety, depression and addiction called cognitive bias modification therapy (CBMT). CBMT is a sub-group of therapies within a growing area of psychological therapies based on modifying cognitive processes with or without accompanying medication and talk therapy, sometimes referred to as applied cognitive processing therapies (ACPT). Other ACPTs include attention training, interpretation modification, approach/avoid training, imagery modification training, and eye movement desensitization and reprocessing therapy for PTSD.
According to Yiend et al. (2013), in an article in the journal Cognitive Therapy and Research, "CBM treatments are more convenient and flexible than other modes of treatment because they do not require meetings with a therapist. They offer the potential for delivery using modern technologies (e.g. internet or mobile phone) and require minimal supervision. They could therefore become highly cost effective and widely accessible. CBM methods are also less demanding and more acceptable to patients than traditional therapies. This is because personal thoughts and beliefs are not directly interrogated, and there is no need for social interaction or stigmatizing visits to outpatient clinics. Similarly, patient insight is not required because CBM seeks to target the underlying maintaining cognitive bias directly; therefore, patient engagement is likely to be easier. In sum, CBM methods offer a high gain, low cost treatment option because they can circumvent many of the practical and psychological requirements that disadvantage competing psychological interventions." CBMT techniques are technology-assisted therapies that are delivered via a computer with or without clinician support. CBM combines evidence and theory from the cognitive model of anxiety, cognitive neuroscience and attentional models. CBM can be seen as one version of attentional retraining. It has been described as a 'cognitive vaccine'.

Congruence bias
Congruence bias is a type of cognitive bias similar to confirmation bias. Congruence bias occurs because of people's overreliance on directly testing a given hypothesis while neglecting indirect testing.

Dog intelligence
Dog intelligence or dog cognition is the ability of dogs to acquire information and conceptual skills, store them in memory, retrieve them, and combine, compare, and use them in new situations. Studies have shown that dogs display many behaviors associated with intelligence. They have advanced memory skills, are able to read and react appropriately to human body language such as gesturing and pointing, and can understand human voice commands. Dogs demonstrate a theory of mind by engaging in deception.

Dunning–Kruger effect
In the field of psychology, the Dunning–Kruger effect is a cognitive bias in which people of low ability have illusory superiority and mistakenly assess their cognitive ability as greater than it is. The cognitive bias of illusory superiority comes from the inability of low-ability people to recognize their lack of ability. Without the self-awareness of metacognition, low-ability people cannot objectively evaluate their competence or incompetence. As described by social psychologists David Dunning and Justin Kruger, the cognitive bias of illusory superiority results from an internal illusion in people of low ability and from an external misperception in people of high ability; that is, "the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others."

End-of-the-day betting effect
The end-of-the-day betting effect is a cognitive bias reflected in the tendency for bettors to take gambles with higher risk and higher reward at the end of their betting session to try to make up for losses. William McGlothlin (1956) and Mukhtar Ali (1977) first identified this effect after observing the shift in betting patterns at horse-race tracks. McGlothlin and Ali noticed that people are significantly more likely to prefer longshots to conservative bets on the last race of the day. They found that the movement towards longshots, and away from favorites, is so pronounced that some studies show that conservatively betting on the favorite to show (to finish first, second, or third) in the last race is a profitable bet despite the track's take.

False consensus effect
In psychology, the false-consensus effect or false-consensus bias is an attributional type of cognitive bias whereby people tend to overestimate the extent to which their opinions, beliefs, preferences, values, and habits are normal and typical of those of others (i.e., that others also think the same way that they do). This cognitive bias tends to lead to the perception of a consensus that does not exist, a "false consensus".
This false consensus is significant because it increases self-esteem (the overconfidence effect) and the belief that everyone shares one's own knowledge. It can be derived from a desire to conform and be liked by others in a social environment. This bias is especially prevalent in group settings where one thinks the collective opinion of their own group matches that of the larger population. Since the members of a group reach a consensus and rarely encounter those who dispute it, they tend to believe that everybody thinks the same way. The false-consensus effect is not restricted to cases where people believe that their values are shared by the majority; it still manifests as an overestimate of the extent of their belief.
Additionally, when confronted with evidence that a consensus does not exist, people often assume that those who do not agree with them are defective in some way. There is no single cause for this cognitive bias; the availability heuristic, self-serving bias, and naïve realism have been suggested as at least partial underlying factors. Maintenance of this cognitive bias may be related to the tendency to make decisions with relatively little information. When faced with uncertainty and a limited sample from which to make decisions, people often "project" themselves onto the situation. When this personal knowledge is used as input to make generalizations, it often results in the false sense of being part of the majority. The false-consensus effect can be contrasted with pluralistic ignorance, an error in which people privately disapprove but publicly support what seems to be the majority view.

Horn effect
The horn effect, closely related to the halo effect, is a form of cognitive bias that causes one's perception of another to be unduly influenced by a single negative trait. An example of the horn effect may be that an observer is more likely to assume a physically unattractive person is morally inferior to an attractive person, despite the lack of relationship between morality and physical appearance.

Information bias (psychology)
Information bias is a cognitive bias to seek information when it does not affect action. People can often make better predictions or choices with less information: more information is not always better. An example of information bias is believing that the more information that can be acquired to make a decision, the better, even if that extra information is irrelevant for the decision.

List of cognitive biases
Cognitive biases are systematic patterns of deviation from norm or rationality in judgment, and are often studied in psychology and behavioral economics. Although the reality of these biases is confirmed by replicable research, there are often controversies about how to classify these biases or how to explain them. Some are effects of information-processing rules (i.e., mental shortcuts), called heuristics, that the brain uses to produce decisions or judgments. Biases have a variety of forms and appear as cognitive ("cold") bias, such as mental noise, or motivational ("hot") bias, such as when beliefs are distorted by wishful thinking. Both effects can be present at the same time. There are also controversies over some of these biases as to whether they count as useless or irrational, or whether they result in useful attitudes or behavior. For example, when getting to know others, people tend to ask leading questions which seem biased towards confirming their assumptions about the person. However, this kind of confirmation bias has also been argued to be an example of social skill: a way to establish a connection with the other person. Although this research overwhelmingly involves human subjects, some findings that demonstrate bias have been found in non-human animals as well. For example, hyperbolic discounting has been observed in rats, pigeons, and monkeys.

Name calling
Name calling is a form of verbal abuse in which insulting or demeaning labels are directed at an individual or group. This phenomenon is studied by a variety of academic disciplines such as anthropology, child psychology, and politics, as well as by rhetoricians and other disciplines that study propaganda techniques and their causes and effects. The technique is most frequently employed within political discourse and school systems, in an attempt to negatively characterize an opponent.

Observer effect
Observer effect may refer to:
Hawthorne effect, a form of reactivity in which subjects modify an aspect of their behavior, in response to their knowing that they are being studied
Heisenbug of computer programming, where a software bug seems to disappear or alter its behavior when one attempts to study it
Observer effect (information technology), the impact of observing a process while it is running
Observer effect (physics), the impact of observing a physical system
Probe effect, the effect on a physical system of adding measurement devices, such as the probes of electronic test equipment
Observer-expectancy effect, a form of reactivity in which a researcher's cognitive bias causes them to unconsciously influence the participants of an experiment
It may also refer to:
"Observer Effect" (Star Trek: Enterprise), an episode of Star Trek: Enterprise, named after this effect

Omission bias
The omission bias is an alleged type of cognitive bias: the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions), because actions are more obvious than inactions. It is contentious whether this represents a systematic error in thinking or is supported by a substantive moral theory. For a consequentialist, judging harmful actions as worse than inaction would indeed be inconsistent, but deontological ethics may, and normally does, draw a moral distinction between doing and allowing. The bias is usually showcased through the trolley problem.

Precision bias
Precision bias is a form of cognitive bias in which an evaluator of information commits a logical fallacy as the result of confusing accuracy and precision. More particularly, in assessing the merits of an argument, a measurement, or a report, an observer or assessor falls prey to precision bias when he or she believes that greater precision implies greater accuracy (i.e., that simply because a statement is precise, it is also true); the observer or assessor is then said to provide false precision.
Precision bias, whether called by that phrase or another, is addressed in fields such as economics, in which there is a significant danger that a seemingly impressive quantity of statistics may be collected even though these statistics may be of little value for demonstrating any particular truth.
It is also called the numeracy bias, or the range estimate aversion.
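The accuracy/precision confusion at the heart of this bias can be shown with invented numbers: a tightly clustered, many-digit series can still be far from the truth, while a scattered series can be centred on it.

```python
true_value = 10.0

# Tightly clustered and reported to four decimal places, yet far off:
precise_but_inaccurate = [12.3456, 12.3461, 12.3459]
# Widely scattered, yet centred near the true value:
accurate_but_imprecise = [9.1, 11.2, 9.8]

def mean(xs):
    return sum(xs) / len(xs)

def spread(xs):
    # Maximum deviation from the mean -- a crude measure of (im)precision.
    m = mean(xs)
    return max(abs(x - m) for x in xs)

# Small spread (high precision) does not imply small error (high accuracy).
print(spread(precise_but_inaccurate), abs(mean(precise_but_inaccurate) - true_value))
print(spread(accurate_but_imprecise), abs(mean(accurate_but_imprecise) - true_value))
```

The first series wins on precision by several orders of magnitude yet loses badly on accuracy, which is exactly the inference the precision-biased observer gets wrong.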
The clustering illusion and the Texas sharpshooter fallacy may both be treated as relatives of precision bias. In these fallacies, precision is mistakenly considered evidence of causation, when in fact the clustered information may actually be the result of randomness.

Rosy retrospection
Rosy retrospection refers to the psychological phenomenon of people sometimes judging the past disproportionately more positively than they judge the present. The Romans occasionally referred to this phenomenon with the Latin phrase "memoria praeteritorum bonorum", which translates into English roughly as "the past is always well remembered". Rosy retrospection is very closely related to the concept of nostalgia. The difference between the terms is that rosy retrospection is a cognitive bias, whereas the broader phenomenon of nostalgia is not necessarily based on a biased perspective.
Although rosy retrospection is a cognitive bias and distorts a person's view of reality to some extent, some people theorize that it may in part serve a useful purpose in increasing self-esteem and a person's overall sense of well-being. For example, Terence Mitchell and Leigh Thompson mention this possibility in a chapter entitled "A Theory of Temporal Adjustments of the Evaluation of Events" in a book of collected research reports from various authors entitled "Advances in Managerial Cognition and Organizational Information Processing".

Simplifications and exaggerations of memories (such as occur in rosy retrospection) may also make it easier for people's brains to store long-term memories, as removing details may reduce the burden of those memories on the brain and make the brain require fewer neural connections to form and ingrain memories. Mnemonics, psychological chunking, and subconscious distortions of memories may in part serve a similar purpose: memory compression by way of simplification. Data compression in computers works on similar principles: compression algorithms tend to either (1) remove unnecessary details from data, (2) reframe the details in a simpler way from which the data can subsequently be reconstructed as needed, or (3) both. Much the same can be said of human memories and the human brain's own process of memorization.
In English, the idiom "rose-colored glasses" or "rose-tinted glasses" is also sometimes used to refer to the phenomenon of rosy retrospection. Usually this idiom occurs as some variation of the phrase "seeing things through rose-tinted glasses" or some other roughly similar phrasing.
Rosy retrospection is also related to the concept of declinism.

Subjective validation
Subjective validation, sometimes called personal validation effect, is a cognitive bias by which a person will consider a statement or another piece of information to be correct if it has any personal meaning or significance to them. In other words, a person whose opinion is affected by subjective validation will perceive two unrelated events (i.e., a coincidence) to be related because their personal belief demands that they be related. Closely related to the Forer effect, subjective validation is an important element in cold reading. It is considered to be the main reason behind most reports of paranormal phenomena. According to Bob Carroll, psychologist Ray Hyman is considered to be the foremost expert on subjective validation and cold reading. The term subjective validation first appeared in the 1980 book The Psychology of the Psychic by David F. Marks and Richard Kammann.