Think Twice: Understanding the Psychology of Irrational Decision-Making


In 2003, the world was stunned to learn that Mike Tyson, one of the greatest boxers in history, had filed for bankruptcy despite earning over $400 million during his career. Tyson had fame, wealth, skill, and widespread respect. However, he prioritised short-term pleasures over long-term stability, spent impulsively, trusted the wrong people, and struggled to control his emotions.

When reporters asked him why, Tyson was brutally honest: “I made emotional decisions.” Millions were left in shock. How could a man trained for discipline, strategy, and precision inside the ring turn out to be so undisciplined outside it? The answer lies at the core of human psychology.

We frequently make the wrong decision even when we know what is right. Tyson’s story is a window into the science of decision-making, showing how the human brain is wired to prioritise short-term gains over long-term ones, and how emotions can overwhelm reason. His downfall highlights a hard truth: intelligence does not shield us from illogical choices. The science of decision-making starts right here.

Every day, humans make hundreds of decisions, from simple ones like choosing breakfast to life-changing ones such as selecting a degree programme, a career path, a property investment, or a spouse. We like to believe our decisions are based on logic, reason, and careful analysis.

However, research in psychology, neuroscience, and behavioural economics reveals that although humans are intelligent, they are prone to irrationality. Good decisions rest on a clear integration of scientific evidence and human priorities, which is best achieved through decision analysis.

“Life is a chess match. Every decision you make has consequences.” ~ P. K. Subban. (Image AI-generated by the author)

For most of the 20th century, economics rested on the idea that people make completely rational decisions to maximise their own benefit, an idealised figure called Homo economicus. In the 1970s, psychologists Daniel Kahneman and Amos Tversky challenged this idea with their ground-breaking Prospect Theory. They showed that people do not always decide so as to maximise benefit; instead, emotions, mental shortcuts, and personal biases strongly influence their choices.

People experience losses more intensely than equivalent gains. Consequently, even when taking a chance makes sense, people frequently avoid it. Behavioural economics, which blends psychology with economic decision-making, emerged from this insight and transformed conventional economics.
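This asymmetry can be made concrete. A minimal sketch of the prospect-theory value function is shown below; the parameter values (α = β = 0.88, λ = 2.25) come from Tversky and Kahneman’s later empirical estimates and are used here purely for illustration, not taken from the article above.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a monetary gain or loss of size x,
    per the prospect-theory value function."""
    if x >= 0:
        return x ** alpha            # gains: diminishing sensitivity
    return -lam * ((-x) ** beta)     # losses: steeper curve, scaled by lambda

# Losing 100 "feels" more than twice as bad as gaining 100 feels good:
gain = prospect_value(100)           # roughly 57.5
loss = prospect_value(-100)          # roughly -129.5
print(abs(loss) / gain)              # 2.25 -- the loss-aversion coefficient
```

Because the loss branch is multiplied by λ > 1, a loss of any given size always outweighs an equal-sized gain, which is exactly why a mathematically fair coin flip can still feel like a bad bet.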

Because our brains must operate quickly, we rely on mental shortcuts called heuristics to make decisions faster. These shortcuts help us cope with complicated situations, but they also produce cognitive biases, systematic errors in thinking. One such bias is anchoring, which happens when we rely too heavily on the first piece of information we receive.

For example, we might assume a Rs. 2000 handbag is inexpensive because it was initially labelled Rs. 4000. Next is confirmation bias, which leads people to ignore opposing evidence and prefer information that supports their existing beliefs, a tendency reinforced by social media. Prospect theory explains why people resist change: they fear losses more than they value gains.

Lastly, the availability heuristic leads people to judge how likely an event is by how easily examples come to mind. For instance, after hearing about aeroplane crashes, they may overestimate the danger of flying, even though it is statistically much safer than driving.

Traditionally, people believed that emotions hinder clear thinking, but modern brain research reveals that emotions are actually required for making decisions. In his 1994 book Descartes’ Error, neuroscientist Antonio Damasio described patients with damage to the part of the brain that links emotion to reasoning. Even though these patients could still think logically, they found it very difficult to make simple decisions, like deciding what to eat. They could list the advantages and disadvantages of each option, but without any gut feeling to signal the right choice, they remained stuck.

Damasio’s research led to the Somatic Marker Hypothesis, which proposes that emotions act as shortcuts that help the brain quickly judge the likely outcomes of our actions. Put simply, what may look like irrational behaviour is actually the brain combining feelings with real-life context during decision-making.


Besides our internal biases, our decisions are also shaped by the environment and by how information is presented. This idea, called framing, was introduced by Kahneman and Tversky in 1981: people’s choices can change depending on how the same information is worded. For example, a medicine described as 80% effective is judged more favourably than one described as failing in 20% of cases, even though the two statements are logically equivalent.

Likewise, the default effect plays a key role in many of our decisions: we tend to stick with whichever option is already chosen for us. For instance, countries where people are automatically enrolled as organ donors have far higher donor rates than countries where they must opt in. Slight changes in wording, context, and presentation can strongly influence what people choose, showing that our decisions are driven not only by logic but also by how facts and figures are presented to us.

The people around us influence our decisions more than we realise. In 1951, psychologist Solomon Asch demonstrated that many people will agree with a group even when the group is plainly wrong. The need for approval and the desire to belong can push people to go against their own reasoning. A modern example is the influence of shares, likes, and viral trends on social media, which shape not only what people buy but also their political and social views.

Being truly rational does not mean being emotionless or unbiased. Understanding our mental shortcuts helps us pause, question our reactions, and think more carefully. Kahneman described this as thinking slowly: engaging the deliberate, effortful mode of thought instead of the fast, automatic one. Taking enough time to decide, respecting diverse opinions, and grounding choices in facts and data are all part of this practice. Artificial intelligence and other decision aids have also helped reduce human error in many fields.

Emotions, gut feelings, intuitions, and biases are not weaknesses; they are part of human survival in uncertain situations. Making all rational choices is neither possible nor necessary; the real challenge is to balance instinct with awareness for better judgment.

The world is full of information and influence. Recognising our irrational decisions is itself a rational step. Our minds are not built for perfection; they are built for speed and good-enough answers. Amid all the emotional, mental, and social forces shaping our decisions, making smarter choices is real power.

References:

  • Von Winterfeldt, Detlof. “Bridging the gap between science and decision making.” Proceedings of the National Academy of Sciences 110.supplement_3 (2013): 14055-14061.
  • Asch, Solomon E. “Effects of group pressure upon the modification and distortion of judgments.” Organisational influence processes. Routledge, 2016. 295-303.
  • Damasio, Antonio R. Descartes’ Error: Emotion, Reason, and the Human Brain. New York: Grosset/Putnam, 1994.
  • Johnson, Eric J., and Daniel Goldstein. “Do defaults save lives?” Science 302.5649 (2003): 1338-1339.
  • Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
  • Kahneman, Daniel, and Amos Tversky. “Prospect theory: An analysis of decision under risk.” Handbook of the fundamentals of financial decision making: Part I. 2013. 99-127.
  • Nickerson, Raymond S. “Confirmation bias: A ubiquitous phenomenon in many guises.” Review of General Psychology 2.2 (1998): 175-220.
  • Tversky, Amos, and Daniel Kahneman. “Judgment under Uncertainty: Heuristics and Biases: Biases in judgments reveal some heuristics of thinking under uncertainty.” Science 185.4157 (1974): 1124-1131.
  • Tversky, Amos, and Daniel Kahneman. “The framing of decisions and the psychology of choice.” Science 211.4481 (1981): 453-458.

