A compendium of cognitive biases we've inherited from our ancestors that distort our thinking and frequently crop up in discussions of AI/AGI.
Anthropomorphism
Projecting human characteristics onto non-human entities or objects.
Availability Cascade
The self-reinforcing dynamic in which a belief gains plausibility through its increasing repetition in the public sphere.
Bias Blind Spot
Assuming that one is less biased than others, or identifying cognitive biases more readily in others than in oneself.
Confirmation Bias
Noticing and remembering information that supports one's opinions and filtering out information that contradicts them.
Congruence Bias
Considering only evidence that directly supports one's hypothesis, rather than evidence that indirectly supports or disproves it, while disregarding alternative hypotheses.
Expectation Bias
Believing and promoting data that agree with one's expectations while disbelieving or disregarding data that conflict with them.
Framing Effect
Drawing different conclusions from the same data depending on the manner in which the data is presented.
Gambler's Fallacy
Assuming the probability of future events is altered by past events that actually have no probabilistic connection.
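The independence of such events can be checked empirically. A minimal simulation sketch (the streak length and flip count are arbitrary choices): after a run of three tails, a fair coin is still 50/50.

```python
import random

random.seed(0)

# Simulate fair coin flips; after each run of 3 consecutive tails,
# record the outcome of the next flip. If past flips influenced
# future ones, heads would appear more (or less) than ~50% here.
flips = [random.choice("HT") for _ in range(1_000_000)]

after_streak = [
    flips[i]
    for i in range(3, len(flips))
    if flips[i - 3 : i] == ["T", "T", "T"]
]

p_heads = after_streak.count("H") / len(after_streak)
print(f"P(heads after three tails) = {p_heads:.3f}")  # close to 0.5: the streak has no effect
```

Varying the streak length leaves the result unchanged; only the number of qualifying positions shrinks.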
Groupthink
Dysfunctional reasoning that results when members of a group suppress or ignore alternative viewpoints that conflict with the group consensus.
Halo Effect
Perceiving a person's positive or negative traits in one area as applicable to an unrelated area.
Hostile Attribution Bias
Interpreting the behaviors or opinions of others to be hostile in nature when no hostility is objectively present.
Illusion of Control
Overestimating one's ability to influence external events.
Illusory Correlation
Perceiving an illusory relationship between unrelated events.
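Small samples of unrelated data produce seemingly strong correlations surprisingly often, which helps explain why illusory relationships are so easy to "see." A minimal sketch (the sample size and the |r| > 0.5 threshold are arbitrary choices):

```python
import random
import statistics

random.seed(1)

def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Draw many pairs of short, completely independent random samples.
# Even with zero true relationship, small samples routinely produce
# correlations that look "meaningful" to the eye.
strong = 0
trials = 10_000
for _ in range(trials):
    xs = [random.random() for _ in range(10)]
    ys = [random.random() for _ in range(10)]
    if abs(corr(xs, ys)) > 0.5:
        strong += 1

print(f"{strong / trials:.1%} of independent 10-point samples show |r| > 0.5")
```

Increasing the sample size per pair makes such spurious correlations much rarer, which is why anecdote-sized evidence is especially prone to this bias.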
Illusory Truth Effect
Belief in statements based on how easy they are to understand or how often they've been promoted rather than the underlying evidence supporting them.
Just-World Hypothesis
Rationalizing otherwise inexplicable injustice or negative outcomes based on a general belief that society or the universe is fundamentally just.
Law of the Instrument
Over-reliance on familiar tools and methods rather than potentially more appropriate alternatives. Related to the maxim "If all you have is a hammer, everything looks like a nail."
Naïve Realism
Believing that one perceives reality objectively and without bias, that other rational people will perceive it in the same way, and that anyone who perceives it differently is biased, uninformed, or otherwise irrational.
Neglect of Probability
Statements or conclusions that disregard the probability of a premise and of its alternatives.
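One common form is neglecting base rates. A worked Bayes'-theorem sketch with hypothetical numbers (a 99%-accurate test for a condition affecting 1 in 1,000 people):

```python
# Base rates matter: all numbers below are assumed for illustration.
p_condition = 0.001          # prior probability of the condition (assumed)
p_pos_given_cond = 0.99      # test sensitivity (assumed)
p_pos_given_healthy = 0.01   # false-positive rate (assumed)

# Bayes' theorem: P(condition | positive test)
p_positive = (p_pos_given_cond * p_condition
              + p_pos_given_healthy * (1 - p_condition))
p_cond_given_pos = p_pos_given_cond * p_condition / p_positive

print(f"P(condition | positive) = {p_cond_given_pos:.1%}")  # about 9%, not 99%
```

Intuition that ignores the 1-in-1,000 prior lands near 99%; the actual posterior is roughly 9%, because false positives from the large healthy population swamp the true positives.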
Normalcy Bias
The tendency to disregard the possibility or severity of a disaster that has never happened before.
Observer-Expectancy Effect
Unconsciously manipulating or misinterpreting experiments or data to better fit an expected result.
Omission Bias
Judging harmful actions as worse than equally harmful inaction.
Optimism Bias
Overestimating the likelihood of favorable or desired outcomes.
Overconfidence Effect
The tendency to have more confidence in one's opinions or abilities than is objectively warranted.
Parkinson's Law of Triviality
Giving disproportionate weight to trivial but tractable issues rather than important but complex issues.
Pessimism Bias
Overestimating the likelihood of unfavorable or undesired outcomes.
Planning Fallacy
Underestimating the time it will take to reach a goal.
Pro-Innovation Bias
Excessive optimism about an innovation's benefit to society without consideration of its potential negative impacts.
Projection Bias
Overestimating how much other people, now or in the future, share one's current beliefs and behaviors, or how much one's own future self will share them.
Certainty in a conclusion despite insufficient evidence to support that certainty. Frequently builds upon itself with each successive stage of a multi-stage premise. Related to the Unproven Basis logical fallacy.
Reactive Devaluation
Dismissing or devaluing a premise or conclusion only because it's believed to have originated with an adversary.
Salience Bias
Giving undue weight to data or behavior that is abnormal or emotionally striking but scarce while disregarding or devaluing that which is unremarkable but abundant.
Selective Perception
Perceiving that something is true or that connections exist between unrelated events based on one's beliefs rather than on objective evidence.
Self-Serving Bias
Disproportionately recognizing one's successes over one's failures, perceiving ambiguous outcomes as successes, and interpreting ambiguous information as supporting one's opinions and conclusions rather than contradicting them or being neutral.
Shared Information Bias
The tendency of groups to concentrate on information with which all members are already familiar and spend little time or effort focusing on information with which few members are already familiar.
Surrogation
Mistaking the measurement of progress toward a goal for the goal itself.
Zero-Sum Bias
Perceiving situations as a zero-sum game between participants regardless of whether that is objectively the case.