Outline: No de-biasing strategy takes into consideration the very first and most crucial bias that characterizes human beings: people have a distorted perception of themselves. How can they improve their decision making if the whole process is built on a biased self-perception? In this article, Marco Esposito, who also wrote “An Eye-Opening Experience Of A Foreign Student In Portugal – Part 1”, shares with the Uduni community how the self-perception bias affects de-biasing strategies!
The Economic Man And His Bounded Rationality
The economic man, or Homo Economicus, is a concept that depicts humans as consistently rational and self-interested agents who pursue their ends optimally. In economic terms, this individual tries to maximize utility as a consumer and profit as a producer (Rittenberg & Tregarthen, 2009). Even though Homo Economicus bases his choices on a consideration of his own personal utility function, his rationality may be limited by several factors, such as limited available information, limited time, or scarce resources. In these cases, the decision maker seeks a satisficing solution rather than an optimal one. When rationality fails, a mismatch between the decision-making environment and the choices of the decision maker occurs, giving rise to what is known as bounded rationality (Simon, 1996). Nonetheless, this does not imply that people and their politics are always irrational. Bounded rationality states that decision makers are intendedly rational: they are goal-oriented and adaptive, but because of human cognitive and emotional architecture, they sometimes fail when taking crucial decisions (Jones, 1999). Recalling the book The Invisible Gorilla by Christopher Chabris and Daniel Simons, the psychologist Daniel Kahneman introduces two striking facts about the human mind: “we can be blind to the obvious, and we can also be blind to our blindness”. He explained that everyone is aware of their limited capacity for attention, and that their social behavior makes allowances for these limitations. However, in some instances, such as when focusing intensely on a task, people can become effectively blind even to stimuli that would normally attract their attention (Kahneman, 2011).
Where Do Heuristics And Biases Come From?
Most of the decisions in one’s everyday life are based on beliefs concerning the probabilities of uncertain events. To reduce the complexity of these choices and speed up the decision-making process, people rely on a small number of heuristic principles. As explained by Tversky and Kahneman (1974), these mental shortcuts are based on data of limited validity, and can therefore lead to severe and systematic errors. One example is the representativeness heuristic: a decision-making shortcut that uses past experiences to guide the decision-making process when an individual is confronted with a new event and needs to make a judgment about that situation (Tversky & Kahneman, 1974). This process can be beneficial and allows for quick conclusions to be reached, but at the cost of accuracy. In judging the probability of an event, the fact that a mental representation of a past experience can be compared to the new situation has no bearing on how likely that representation is to occur in reality (Grether, 1980).
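Grether’s point can be made concrete with a short numerical sketch, loosely modeled on the classic lawyer/engineer problem. All numbers here are invented for illustration: a similarity judgment alone suggests a high probability that the person is an engineer, while Bayes’ rule, which also weighs the base rate, gives a noticeably lower figure.

```python
# Hypothetical illustration of base-rate neglect (numbers invented).
# A person is drawn from a pool that is 70% lawyers and 30% engineers.
# A description "fits" an engineer 90% of the time if the person really
# is one, but also fits 20% of lawyers.

def bayes_posterior(prior, likelihood_true, likelihood_false):
    """P(engineer | description) via Bayes' rule."""
    numerator = prior * likelihood_true
    denominator = numerator + (1 - prior) * likelihood_false
    return numerator / denominator

prior_engineer = 0.30        # base rate: 30% of the pool are engineers
p_desc_given_eng = 0.90      # description fits, given engineer
p_desc_given_lawyer = 0.20   # description fits, given lawyer

posterior = bayes_posterior(prior_engineer, p_desc_given_eng, p_desc_given_lawyer)

# The representativeness heuristic judges by similarity alone (~0.90),
# ignoring the 30% base rate; the normative answer is lower.
print(f"Normative posterior: {posterior:.2f}")  # ≈ 0.66
```

The gap between the intuitive 0.90 and the Bayesian 0.66 is exactly the kind of systematic error Grether (1980) documented experimentally.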
When heuristics produce an incorrect judgment, the result can be a cognitive bias: a systematic pattern of deviation from rationality. These anomalies may lead to perceptual distortion, inaccurate judgment, or illogical interpretation (Ariely, 2008). By contrast, a decision becomes emotionally biased when the cognitive process has been influenced by feelings, affects, or moods. The distinguishing feature is that the cause lies in one’s desires or fears rather than in one’s reasoning (Angie et al., 2011). Neuroscience experiments have shown that emotions and cognition, which are located in different areas of the brain, interfere with each other in the decision-making process, most often resulting in a primacy of emotions over reasoning (MacMullen, 2003). Everyone is susceptible to biases, especially when fatigued, stressed, or multitasking. In those situations, people are mentally, emotionally, and physically spent, and therefore tend to rely even more heavily on intuitive judgments (System 1) and less on careful reasoning (System 2) (Kahneman, 2011). Consequently, decision making becomes faster and simpler, but its quality often suffers (Soll, Milkman, & Payne, 2015).
Some Examples Of De-biasing Strategies
There is no straightforward solution. Many researchers have tried to formulate a general, practical solution that could be implemented by all, but with poor results. One of the first scientists to embark on this crusade was Baruch Fischhoff. In his main research (1982), he reviewed four straightforward strategies for reducing bias:
- Warning subjects about the potential for bias;
- Describing the likely direction of bias;
- Illustrating biases to the subject;
- Providing extended training, feedback, coaching, and other interventions.
Fischhoff concluded his study by stating that these methods for addressing biases had only limited applicability and delivered only short-term improvements in decision making (Bazerman & Moore, 2008). In the 25 years that followed, not much was added to Fischhoff’s conclusions. However, more involved methods, such as replacing intuitive decision making with analysis, can prove more effective (Merkhofer, 2014).
Another way of addressing biases requires decision makers to commit themselves to a new development path, since techniques alone will not improve the quality of their decisions. More specifically, according to a McKinsey Quarterly article (Lovallo & Sibony, 2010), in order to adopt behavioral strategies individuals should follow four steps: 1. Deciding which decisions warrant the effort; 2. Identifying the biases most likely to affect critical decisions; 3. Selecting practices and tools to counter the most relevant biases; 4. Embedding practices in formal processes. This journey requires huge effort and commitment. Nevertheless, the possible increase in the quality of decisions makes it one of the most valuable strategic investments an organization can make. Unlike in fields such as finance and marketing, where executives can use psychology to minimize biases residing in others, in strategic decision making leaders must recognize their own biases.
Another solution would be to delegate and fight bias externally, using choice architecture to modify the environment in which decisions are made. However, delegation often turns out to be inappropriate or infeasible, and the burden rests entirely on one person. In these cases, one should try to outsmart one’s own biases, starting by understanding where they come from: excessive reliance on intuition, defective reasoning, or both (Soll, Milkman, & Payne, 2015). Other, more involved methods to minimize biases include encouraging subjects to put more effort into forming judgments, increasing accountability for decisions, or training people in biases so that they understand the cause-and-effect relationship these have on their own lives (Merkhofer, 2014).
The methods outlined above are just some of the de-biasing mechanisms that the literature offers nowadays. All of them may require time, adaptation skills, and sometimes also a profound cultural change. Improving strategic decision making therefore requires not only trying to limit one’s biases, but also orchestrating a decision-making process that will confront different biases and limit their impact (Lovallo & Sibony, 2010).
Do These Strategies Really Make Sense?
Instead of focusing on the biases to be removed from decision-making processes, I have deepened my research into the biases that lie behind de-biasing strategies. In my opinion, there exists a gap between one’s perception of oneself and how one really is, a distorted self-perception, and this undermines all existing de-biasing strategies. As Larrick (2004) stated: people’s view of themselves is distorted. This gap makes it difficult to detect true mistakes in one’s reasoning, and even harder to make an individual accept these errors in order to further improve that reasoning. So, how is it possible to look at problems, before solutions, as they truly are, if the perception that people have of reality (themselves) is distorted? This gap between perception and reality can affect one’s ability to function and to fulfill one’s potential (McGregor, 2015). The misperception can be rooted in ego-syntonic traits: behaviors, values, and feelings that are in harmony with the needs and goals of one’s ego, or consistent with one’s ideal self-image. The term was introduced in 1914 by Sigmund Freud in his essay On Narcissism, in which he saw a psychic conflict arising when “the original lagging instincts come into conflict with the ego”.
Many studies (Milkman, Chugh, & Bazerman, 2008; Larrick, 2004; Soll, Milkman, & Payne, 2013; Lovallo & Sibony, 2010; Merkhofer, 2014) have shown that de-biasing techniques positively impact decision-making performance. Undoubtedly, this helps people and organizations make better decisions. The reduction of systematic errors in the decision-making process has long been the subject of experiments, and both psychologists and economists have developed many techniques to achieve it. However, sometimes so much focus is devoted to the discovery of a solution that the basic assumptions of these tests are overlooked and not fully considered. As in the case discussed in this article, a single assumption or bias has the potential to make the whole de-biasing process meaningless, yielding wrong results. It is therefore vital to carefully consider all the assumptions and biases lying at the base of de-biasing strategies when aiming to reduce distortions in the decision-making process. However, it is extremely difficult, both for oneself and for others, to understand if and when one’s perception actually matches reality. “I beseech you, in the bowels of Christ, think it possible you may be mistaken” (Carlyle, 1855).
- Angie, A. D., Connelly, S., Weaples, E. P., & Kligyte, V. (2011). The influence of discrete emotions on judgement and decision-making: a meta-analytic review. Cognition and Emotion, 1393-1422.
- Ariely, D. (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions. New York: HarperCollins.
- Bazerman, M. H., & Moore, D. (2008). Judgment in Managerial Decision Making. Hoboken, N.J.: Wiley.
- Carlyle, T. (1855). Oliver Cromwell’s Letters and Speeches. New York: Harper.
- De Martino, B., Kumaran, D., Seymour, B., & Dolan, R. J. (2006). Frames, Biases, and Rational Decision-Making in the Human Brain. Science, 684-687.
- Fairchild, R. (2014). Emotions in the Financial Markets, in Investor Behavior: The Psychology of Financial Planning and Investing. Hoboken, NJ, USA: John Wiley & Sons.
- Fischhoff, B. (1982). For those condemned to study the past: Reflections on historical judgment. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press.
- Gavetti, G., & Rivkin, J. W. (2005). How Strategists Really Think: Tapping the Power of Analogy. Harvard Business Review.
- Greene, J., & Haidt, J. (2002). How (and where) does moral judgment work? Trends in Cognitive Sciences, 517-523.
- Grether, D. M. (1980). Bayes Rule as a Descriptive Model: The Representativeness Heuristic. The Quarterly Journal of Economics, 537-557.
- Hagberg Consulting Group. (2015, December 10). Organisation: Corporate Culture: The Distorted View From The Top. Retrieved from Leader Values: http://www.leader-values.com/article.php?aid=423
- Hayes, A. (2015, February 20). How Cognitive Bias Affects Your Business. Retrieved from Investopedia: http://www.investopedia.com/articles/investing/022015/how-cognitive-bias-affects-your-business.asp
- Jones, B. D. (1999). Bounded Rationality. Annual Review of Political Science, 297-321.
- Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
- Kahneman, D., & Lovallo, D. (1993). Timid Choices and Bold Forecasts: A Cognitive Perspective on Risk Taking. Management Science, 17-31.
- Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185, 1124-1131.
- Tversky, A., & Kahneman, D. (1986). Rational Choice and the Framing of Decisions. The Journal of Business, 251-278.
- Larrick, R. P. (2004). Debiasing: Blackwell Handbook of Judgment and Decision Making. Oxford, UK: Blackwell Publishing.
- Lilienfeld, S. O., Ammirati, R., & Landfield, K. (2009). Giving Debiasing Away. Perspectives on Psychological Science, 390-398.
- Lovallo, D., & Sibony, O. (2010). The case for behavioral strategy. McKinsey Quarterly. Retrieved from http://www.mckinsey.com/insights/strategy/the_case_for_behavioral_strategy
- MacMullen, R. (2003). Feeling in History. Claremont, CA: Regina Books.
- McGregor, H. (2015, December 03). Why companies should take a “corporate selfie”. Retrieved from Management Today: http://www.managementtoday.co.uk/opinion/1374143/why-companies-corporate-selfie/
- Merkhofer, L. (2014). Errors and Biases in Judgment. 24-31.
- Milkman, K. L., Chugh, D., & Bazerman, M. H. (2008). How Can Decision Making Be Improved? Perspectives on Psychological Science, 379-383.
- Rittenberg, L., & Tregarthen, T. (2009). Principles of Microeconomics. Nyack, NY: Flat World Knowledge.
- Simon, H. A. (1996). The Sciences of the Artificial. Cambridge, MA: MIT Press.
- Soll, J. B., Milkman, K. L., & Payne, J. W. (2015). Outsmart Your Own Biases. Harvard Business Review.
- Soll, J. B., Milkman, K. L., & Payne, J. W. (2013). A User’s Guide to Debiasing. Handbook of judgment and decision making.
- Taylor, J. (2013, May 20). Cognitive Biases Are Bad for Business. Retrieved from Psychology Today: https://www.psychologytoday.com/blog/the-power-prime/201305/cognitive-biases-are-bad-business
Author: Marco Esposito, current CEMS MIM student @ NovaSBE, studied in Italy, US, Portugal, India & Denmark, business strategist and business developer, former professional alpine skier, PC gamer. You can also reach Marco via Facebook.