Action-Oriented Biases in Critical Thinking

  1. Research should be conducted to identify contextual factors, that is, situations in which cognitive biases and heuristics may affect thought and action, and then to develop measures of performance in those situations for use as criteria in studies of how cognitive biases affect performance. The research should consider the characteristics that distinguish contexts in which heuristic, “fast and frugal” decision making is beneficial from contexts in which such thinking is better regarded as biased and likely to produce poor decisions.

This post is Part 2 of a three-part series on biases in decision making. Please click here for Part 1; the third installment will be posted next week.

Research in behavioral economics and social psychology consistently shows that people are irrational when they make decisions. Dan Lovallo, a professor of business strategy, and Olivier Sibony, a director at McKinsey & Co., have explored the most common biases in business and how they create dysfunctional patterns of decision making. The goal is to create a common language: when we are aware of our biases and their impact on our organizations, we have more power to overcome them.

Action-Oriented Bias

What it is:

  • We feel compelled to act, so analysis is cut short in the rush to execute.
  • Excessive optimism and overconfidence: we overestimate the odds of success and our own abilities, and underestimate risks, costs, and obstacles.

How to spot it:

  • Speaking with certainty and confidence is rewarded.
  • Plans are made as if the environment were static and unchanging.

How to overcome it:

  • Map out the risks, uncertainties, and unknowns.
  • Consider a variety of outcomes: best-case, worst-case, most-likely, least-likely, and everything in between (one way to lay these out is sketched after this list).
  • While still in the decision-making phase, embrace uncertainty and encourage dissent.
  • Set up checkpoints and success metrics up-front so you can course correct when you’re off track.
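
As a concrete illustration, here is a minimal Python sketch of laying out that spread of outcomes. The scenario names, probabilities, and payoffs are invented placeholders, not figures from the post:

    # Hypothetical scenario analysis: instead of planning around one
    # confident forecast, spell out a range of outcomes with rough odds.
    scenarios = {
        "best case":         {"probability": 0.10, "payoff": 500_000},
        "most likely case":  {"probability": 0.55, "payoff": 150_000},
        "worst case":        {"probability": 0.25, "payoff": -200_000},
        "least likely case": {"probability": 0.10, "payoff": -400_000},
    }

    # Sanity check: the scenario probabilities should cover all outcomes.
    assert abs(sum(s["probability"] for s in scenarios.values()) - 1.0) < 1e-9

    expected = sum(s["probability"] * s["payoff"] for s in scenarios.values())
    low = min(s["payoff"] for s in scenarios.values())
    high = max(s["payoff"] for s in scenarios.values())

    print(f"Expected payoff: {expected:,.0f}")       # 42,500 with these numbers
    print(f"Range of outcomes: {low:,} to {high:,}")

Seeing the full range next to the single expected number is the point: a plan that only quotes the most likely case hides a worst case that is four times larger than the expected gain.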

Self-Interest Bias

What it is:

  • People are motivated to obtain a favorable outcome for themselves or their unit, at the expense of the organization as a whole.
  • Incentives that reward the wrong behavior; conflicting incentives.
  • Silo thinking: not considering the big picture or other stakeholders.

How to spot it:

  • Different people view the same company goals differently because of their unique role or expertise.
  • Conflict around the correct course of action, though intentions and the desired end result may be similar.
  • Irrational attachment to legacy products or brands.

How to overcome it:

  • Competing interests will always exist—discuss them explicitly and openly.
  • Define the criteria for decision-making up front and stick to them; this lessens the influence of creative debating later on.
  • Build a diverse decision-making team so the interests of one group do not dominate the process.

Pattern-Recognition Bias

What it is:

  • We look for and see patterns where they don’t exist.
  • We give more weight to recent events (recency bias); the sketch after this list shows how strongly this can skew an estimate.
  • We pay more attention to highly memorable events.
  • Confirmation bias: once we have formulated a theory, we pay more attention to evidence that supports it and ignore evidence that contradicts it.
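
To make the recency effect concrete, here is a small Python sketch with invented incident counts, comparing an even-handed average against one that over-weights recent events the way memory tends to:

    # Hypothetical quarterly incident counts, oldest first;
    # the last quarter is a single memorable spike.
    observations = [3, 4, 3, 4, 3, 12]

    uniform_avg = sum(observations) / len(observations)

    # Recency-weighted average: each step back in time counts half as much.
    n = len(observations)
    weights = [0.5 ** (n - 1 - i) for i in range(n)]
    recency_avg = sum(w * x for w, x in zip(weights, observations)) / sum(weights)

    print(f"Even-handed estimate:    {uniform_avg:.1f} per quarter")  # about 4.8
    print(f"Recency-biased estimate: {recency_avg:.1f} per quarter")  # about 7.7

One recent, memorable spike nearly doubles the recency-weighted estimate, even though five of the six quarters look the same.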

How to spot it:

  • Analogies, comparisons, or examples are used to justify a decision.
  • A compelling story is told in an attempt to persuade and influence.
  • Drawing a comparison to a situation that is not quite analogous.

How to overcome it:

  • Look at the facts and evidence from a different perspective.
  • Brainstorm alternative explanations and theories.
  • Encourage out-of-the-box thinking, reframing, role reversal.
  • Ask why. Discuss the past experiences that are influencing today’s thought pattern and decision.

Social Harmony Bias

What it is:

  • We value consensus and social harmony over candid debate, so doubts and dissent go unvoiced.
  • Groupthink: the group converges on what it believes the leader wants rather than weighing alternatives on their merits.

How to spot it:

  • Everyone readily agrees; little to no conflict.
  • People talk about what the leader thinks and wants rather than discussing facts and alternatives.
  • The leader speaks first and makes their views and opinions known.
  • There is a perception that the leader is unlikely to change their mind and is not open to debate or suggestions.

How to overcome it:

  • Build a decision-making team that is diverse in personalities, expertise, and experience.
  • Foster an environment of trust.
  • Don’t make it personal. Make sure conflict is task-oriented. Disagree with ideas and courses of action, not people.
  • Leaders need to see themselves not as the decision-makers, but as the facilitators of a decision-making process.

Stability Bias

What it is:

  • We are comfortable with the status quo, especially when there is no pressure to change.
  • Anchoring bias: the first idea, or the current state, influences the final outcome more than is logical.
  • Loss aversion: we weigh potential risk, cost, and loss far more heavily than equivalent innovation, gain, and profit.
  • Sunk-cost fallacy: holding on to a losing strategy or an outdated method because so much time and effort has already been spent on it.

How to spot it:

  • Business-as-usual attitude despite changing conditions.
  • Budgets look the same, year after year (at least proportionally, if not exactly); the sketch after this list shows one way to test for this.
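
One way to test for that pattern is to compare each unit’s share of the total budget across years. This Python sketch uses made-up allocations to show the idea:

    # Hypothetical budget allocations by unit, in thousands.
    budgets = {
        2022: {"sales": 400, "r_and_d": 250, "ops": 350},
        2023: {"sales": 412, "r_and_d": 258, "ops": 360},
        2024: {"sales": 425, "r_and_d": 265, "ops": 372},
    }

    def shares(year):
        total = sum(budgets[year].values())
        return {unit: amount / total for unit, amount in budgets[year].items()}

    years = sorted(budgets)
    for prev, curr in zip(years, years[1:]):
        drift = max(abs(shares(curr)[u] - shares(prev)[u]) for u in budgets[curr])
        if drift < 0.02:  # shares shifted less than 2 points: possible anchoring
            print(f"{prev} -> {curr}: allocations barely moved (max shift {drift:.2%})")

Totals grow a little each year, but every unit’s slice of the pie stays almost identical, which is exactly the anchored, business-as-usual signature to look for.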

How to overcome it:

  • Start with a blank slate.
  • Decrease budgets to identify true necessities.
  • Get an outside opinion from someone unacquainted with the history.
  • Do the math: calculate the potential losses and gains of each option; a short worked example follows this list.
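
A minimal sketch of doing that math in Python, with invented figures: compare each option on its future costs and probability-weighted gains only, leaving what has already been spent out of the calculation entirely:

    # Hypothetical choice: keep funding a legacy project or switch approaches.
    sunk_cost = 900_000  # already spent; irrelevant to the decision ahead

    options = {
        "stay the course": {"p_success": 0.3, "gain": 600_000, "future_cost": 300_000},
        "switch approach": {"p_success": 0.6, "gain": 600_000, "future_cost": 350_000},
    }

    print(f"(Sunk cost of {sunk_cost:,} is deliberately ignored.)")
    for name, o in options.items():
        expected = o["p_success"] * o["gain"] - o["future_cost"]
        print(f"{name}: expected net outcome {expected:,.0f}")
    # Only future costs and probability-weighted gains drive the choice;
    # here switching wins even though much has already been spent staying put.

With these numbers, staying the course has an expected net outcome of -120,000 while switching comes out at +10,000: the 900,000 already spent should not tip the decision either way.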

Read: 5 Decision-Making Types in our final installment, Part 3.
