It’s widely known that the human brain often takes shortcuts when analyzing situations, especially ones that unfold quickly or don’t require conscious attention. These shortcuts are called biases, and unfortunately there are hundreds of them, loosely clustered into a few categories.
That’s why a piece of research surfacing a different, deeper way of analyzing biases is particularly interesting:
> We argue that different biases could be traced back to the same underlying fundamental beliefs and outline why at least some of these fundamental beliefs are likely held widely among humans.
>
> A. Oeberst, R. Imhoff (2023), Toward Parsimony in Bias Research
They extract a set of six fundamental beliefs that can help categorize a number of different biases. From the paper:
- My experience is a reasonable reference.
  - Spotlight effect — Overestimating the extent to which (an aspect of) oneself is noticed by others
  - Illusion of transparency — Overestimating the extent to which one’s own inner states are noticed by others
  - Illusory transparency of intention — Overestimating the extent to which an intention behind an ambiguous utterance (that is clear to oneself) is clear to others
  - False consensus — Overestimation of the extent to which one’s opinions, beliefs, etc., are shared
  - Social projection — Tendency to judge others as similar to oneself
- I make correct assessments of the world.
  - Bias blind spot — Being convinced that mainly others succumb to biased information processing
  - Hostile media bias — Partisans perceiving media reports as biased toward the other side
- I am good.
  - Better-than-average effect — Overestimating one’s performance in relation to the performance of others
  - Self-serving bias — Attributing one’s failures externally but one’s successes internally
- My group is a reasonable reference.
  - Ethnocentric bias — Giving precedence to one’s own group (not preference)
  - In-group projection — Perceiving one’s group (vs. other groups) as more typical of a shared superordinate identity
- My group (members) is (are) good.
  - In-group bias/partisan bias — Seeing one’s own group in a more favorable light than other groups (e.g., morally superior, less responsible for harm)
  - Ultimate attribution error — External (vs. internal) attribution for negative (vs. positive) behaviors of in-group members; reverse pattern for out-group members
  - Linguistic intergroup bias — Using more abstract (vs. concrete) words when describing positive (vs. negative) behavior of in-group members and the reverse pattern for out-group members
  - Intergroup sensitivity effect — Criticisms evaluated less defensively when made by an in-group (vs. out-group) member
- People’s attributes (not context) shape outcomes.
  - Fundamental attribution error/correspondence bias — Preference for dispositional (vs. situational) attribution with regard to others
  - Outcome bias — Evaluation of the quality of a decision as a function of the outcome (valence)
While the paper goes much more in-depth, I feel this summary list of six fundamental beliefs is more useful for personal self-analysis and improvement, as well as for discussions with people who are exhibiting certain biases. Generally speaking, counter-arguments against a bias that also go against a fundamental belief aren’t likely to succeed, or if they do, they require a lot of effort on both the arguing and the receiving side. This means that good strategies of engagement might require a deeper understanding of the underlying belief to be successful.
Some of these strategies are obvious: imagine, for example, a counter-argument that generalizes about the group a person feels they belong to, arguing that the whole group is mistaken. That is unlikely to work, because that group is their reference point, the network of people they call friends, and so on: disrupting a belief by attacking the group as a whole goes directly against the fundamental belief. A different strategy that corrects the specific belief without breaking their sense of belonging to the group might be much more effective.