My friend and colleague, Patricia Rogers, says of cognitive bias, “It would be good to think through these in terms of systematic evaluation approaches and the extent to which they address these.” This was in response to an article on cognitive bias. The article says that the human brain is capable of 10 to the 16th power (a big number) processes per second. Despite being faster than a speeding bullet, etc., the human brain has “annoying glitches (that) cause us to make questionable decisions and reach erroneous conclusions.”

Bias is something that evaluators deal with all the time. There is desired response bias, non-response bias, recency and immediacy bias, measurement bias, and…need I say more? Isn’t evaluation, and aren’t evaluators, supposed to be “objective”? Aren’t we as evaluators supposed to behave in an ethical manner? To have dealt with potential bias and conflicts of interest? That is where cognitive biases appear. And you might not know it at all.

In one of my favorite books (Fitzpatrick, Sanders, & Worthen, 2011), the authors state unequivocally that, “…the possibility of human beings rendering completely unbiased judgments is very slight.” “In fact,” they continue, “…evaluators could actually be more susceptible to bias simply because they believe that…they are objective and unbiased.”

George Dvorsky, a Canadian bioethicist, transhumanist, and futurist, in his blog post that is mentioned above, says that “…it’s important to distinguish between cognitive biases and logical fallacies. A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).”

Dvorsky lists these 12 cognitive biases, which affect all individuals, including evaluators (there are others!):

  1. Confirmation Bias (the often unconscious act of referencing only those perspectives that fuel our pre-existing views, while at the same time ignoring or dismissing opinions that threaten our world view);
  2. In-group Bias (a manifestation of our innate tribalistic tendencies that causes us to overestimate the abilities and value of our immediate group at the expense of people we don’t really know);
  3. Gambler’s Fallacy (the tendency to put a tremendous amount of weight on previous events, believing that they’ll somehow influence future outcomes);
  4. Post-Purchase Rationalization (a way of subconsciously justifying expensive purchases);
  5. Neglecting Probability (the inability to grasp a proper sense of peril and risk);
  6. Observational Selection Bias (the effect of suddenly noticing things we didn’t notice much before and wrongly assuming that their frequency has increased);
  7. Status Quo Bias (apprehension of change, which often leads us to make choices that guarantee that things remain the same, or change as little as possible);
  8. Negativity Bias (the tendency to perceive negative news as being more important or profound);
  9. Bandwagon Effect (often causes behaviors, social norms, and memes to propagate among groups of individuals regardless of the evidence or motives in support);
  10. Projection Bias (the tendency to overestimate how typical and normal we are, and to assume that a consensus exists on matters when there may be none);
  11. Current Moment Bias (having a hard time imagining the future and altering our behaviors and expectations accordingly);
  12. Anchoring Effect (aka the relativity trap; the tendency to compare and contrast only a limited set of items).

Fitzpatrick, Sanders, and Worthen suggest that several Program Evaluation Standards (U.1 and U.4 [Utility], P.6 [Propriety], and A.8 [Accuracy]) and Guiding Principles (C.3 and C.4 [Integrity/Honesty]) talk about evaluations (and hence, evaluators) “…being honest, impartial, avoiding conflicts of interest, and conducting evaluations with integrity.” But are we? Or do cognitive biases get in the way of being honest and impartial?

Can you identify where cognitive biases affect what you do?

 

my two cents.

molly.

 
