What is Decision Making?
Let's define decision making. Decision making is just what it sounds like: the action or process of making decisions. Sometimes we make logical decisions, but there are many times when we make emotional, irrational, and confusing choices. This page covers why we make poor decisions and discusses useful frameworks to expand your decision-making toolbox.
Why We Make Poor Decisions
I like to think of myself as a rational person, but I’m not one. The good news is it’s not just me — or you. We are all irrational. For a long time, researchers and economists believed that humans made logical, well-considered decisions. In recent decades, however, researchers have uncovered a wide range of mental errors that derail our thinking. The articles below outline where we often go wrong and what to do about it.
5 Common Mental Errors That Sway You From Making Good Decisions
We are all irrational, and we all make mental errors. Psychologists and behavioral researchers love to geek out about these different mental mistakes. There are dozens of them, and they all have fancy names like “mere exposure effect” or “narrative fallacy.”
Here are five common mental errors that sway you from making good decisions.
1. Survivorship Bias.
Nearly every popular online media outlet is filled with survivorship bias these days. Anywhere you see articles with titles like “8 Things Successful People Do Everyday” or “The Best Advice Richard Branson Ever Received” or “How LeBron James Trains in the Off-Season” you are seeing survivorship bias in action.
Example: “Richard Branson, Bill Gates, and Mark Zuckerberg all dropped out of school and became billionaires! You don’t need school to succeed. Entrepreneurs just need to stop wasting time in class and get started.”
When the winners are remembered and the losers are forgotten it becomes very difficult to say if a particular strategy leads to success.
2. Loss Aversion.
Loss aversion refers to our tendency to strongly prefer avoiding losses over acquiring gains. Research has shown that if someone gives you $10 you will experience a small boost in satisfaction, but if you lose $10 you will experience a dramatically larger drop in satisfaction. Yes, the responses are opposite, but they are not equal in magnitude (see Amos Tversky and Daniel Kahneman, “Loss aversion in riskless choice: A reference-dependent model,” The Quarterly Journal of Economics).
Our tendency to avoid losses causes us to make silly decisions and change our behavior simply to keep the things that we already own. We are wired to feel protective of the things we own, and that can lead us to overvalue these items compared with the alternatives.
For example, if you buy a new pair of shoes it may provide a small boost in pleasure. However, even if you never wear the shoes, giving them away a few months later might be incredibly painful. You never use them, but for some reason you just can't stand parting with them. Loss aversion.
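This asymmetry can be sketched with the value function from prospect theory. The parameter values below (a loss-aversion coefficient of about 2.25 and a curvature of about 0.88) are estimates Tversky and Kahneman reported in later work, used here purely for illustration, not as exact laws:

```python
# Prospect-theory-style value function (illustrative sketch).
# lam (loss aversion) ~2.25 and alpha (curvature) ~0.88 are
# reported estimates, assumed here only to show the asymmetry.
def value(x, alpha=0.88, lam=2.25):
    if x >= 0:
        return x ** alpha          # gains feel good...
    return -lam * ((-x) ** alpha)  # ...but equal-sized losses hurt more

gain = value(10)    # satisfaction from gaining $10
loss = value(-10)   # dissatisfaction from losing $10
print(round(gain, 2), round(loss, 2))
# With the same alpha for gains and losses, the loss is lam times
# larger in magnitude than the gain -- more than twice as painful.
```

The exact numbers are not the point; any reasonable parameters produce the same shape, where the pain of losing $10 outweighs the pleasure of gaining it.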
3. The Availability Heuristic.
The Availability Heuristic refers to a common mistake that our brains make: assuming that the examples that come to mind easily are also the most important or prevalent.
For example, research by Steven Pinker at Harvard University has shown that we are currently living in the least violent time in history. There are more people living in peace right now than ever before. The rates of homicide, rape, sexual assault, and child abuse are all falling.
The overall percentage of dangerous events is decreasing, but the likelihood that you hear about one of them (or many of them) is increasing. And because these events are readily available in our mind, our brains assume that they happen with greater frequency than they actually do.
We overvalue and overestimate the impact of things that we can remember and we undervalue and underestimate the prevalence of the events we hear nothing about (see Amos Tversky and Daniel Kahneman, “Availability: A heuristic for judging frequency and probability”).
4. Anchoring.
There is a burger joint close to my hometown that is known for gourmet burgers and cheeses. On the menu, they very boldly state, “LIMIT 6 TYPES OF CHEESE PER BURGER.”
My first thought: This is absurd. Who gets six types of cheese on a burger?
My second thought: Which six am I going to get?
I didn't realize how brilliant the restaurant owners were until I learned about anchoring. You see, normally I would just pick one type of cheese on my burger, but when I read “LIMIT 6 TYPES OF CHEESE” on the menu, my mind was anchored at a much higher number than usual.
In one research study, volunteers were asked to guess the percentage of African nations in the United Nations. Before they guessed, however, they had to spin a wheel that would land on either the number 10 or the number 65. When volunteers landed on 65, the average guess was around 45 percent. When volunteers landed on 10, the average estimate was around 25 percent. This 20-point swing was simply a result of anchoring the guess with a higher or lower number immediately beforehand.
5. Confirmation Bias.
The Grandaddy of Them All. Confirmation bias refers to our tendency to search for and favor information that confirms our beliefs while simultaneously ignoring or devaluing information that contradicts our beliefs.
For example, Person A believes climate change is a serious issue and they only search out and read stories about environmental conservation, climate change, and renewable energy. As a result, Person A continues to confirm and support their current beliefs.
Changing your mind is harder than it looks. The more you believe you know something, the more you filter and ignore all information to the contrary. It is not natural for us to formulate a hypothesis and then test various ways to prove it false. Instead, it is far more likely that we will form one hypothesis, assume it is true, and only seek out and believe information that supports it.
Human beings have been blaming strange behavior on the full moon for centuries. In the Middle Ages, for example, people claimed that a full moon could turn humans into werewolves. In the 1700s, it was common to believe that a full moon could cause epilepsy or feverish temperatures. We even changed our language to match our beliefs. The word lunatic comes from the Latin root luna, which means moon.
Today, we have (mostly) come to our senses. While we no longer blame sickness and disease on the phases of the moon, you will hear people use it as a casual explanation for crazy behavior. For example, a common story in medical circles is that during a chaotic evening at the hospital one of the nurses will often say, “Must be a full moon tonight.”
There is little evidence that a full moon actually impacts our behaviors. A complete analysis of more than 30 peer-reviewed studies found no correlation between a full moon and hospital admissions, casino payouts, suicides, traffic accidents, crime rates, and many other common events.
But here's the interesting thing: Even though the research says otherwise, a 2005 study revealed that 7 out of 10 nurses still believed that “a full moon led to more chaos and patients that night.”
How We Fool Ourselves Without Realizing It
An illusory correlation happens when we mistakenly over-emphasize one outcome and ignore the others. For example, let's say you visit New York City and someone cuts you off as you're boarding the subway train. Then, you go to a restaurant and the waiter is rude to you. Finally, you ask someone on the street for directions and they blow you off.
When you think back on your trip to New York it is easy to remember these experiences and conclude that “people from New York are rude” or “people in big cities are rude.”
How to Spot an Illusory Correlation
There is a simple strategy you can use to spot your hidden assumptions and prevent yourself from making an illusory correlation. It's called a contingency table and it forces you to recognize the non-events that are easy to ignore in daily life.
Let's break down the possibilities for having a full moon and a crazy night of hospital admissions.
- Cell A: Full moon and a busy night. This is a very memorable combination and is over-emphasized in our memory because it is easy to recall.
- Cell B: Full moon, but nothing happens. This is a non-event and is under-emphasized in our memory because nothing really happened. It is hard to remember something not happening and we tend to ignore this cell.
- Cell C: No full moon, but it is a busy night. This is easy to dismiss as a “crazy day at work.”
- Cell D: No full moon and a normal night. Nothing memorable happens on either end, so these events are easy to ignore as well.
This contingency table helps reveal what is happening inside the minds of nurses during a full moon. The nurses quickly remember the one time when there was a full moon and the hospital was overflowing, but simply forget the many times there was a full moon and the patient load was normal. Because they can easily retrieve a memory of a full moon and a crazy night, they incorrectly assume that the two events are related.
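Counting all four cells makes the check mechanical. The tallies below are made up for illustration; the point is that once you include the easy-to-forget cells B and D, the rate of busy nights turns out to be the same with or without a full moon:

```python
# Hypothetical tallies for a year of hospital shifts (illustrative only).
counts = {
    "A": 3,    # full moon, busy night   (memorable, over-weighted)
    "B": 9,    # full moon, quiet night  (a non-event, easily forgotten)
    "C": 88,   # no full moon, busy night ("just a crazy day at work")
    "D": 265,  # no full moon, quiet night (also easily forgotten)
}

# Compare the rate of busy nights with and without a full moon.
busy_given_full = counts["A"] / (counts["A"] + counts["B"])
busy_given_none = counts["C"] / (counts["C"] + counts["D"])

print(f"P(busy | full moon)    = {busy_given_full:.2f}")
print(f"P(busy | no full moon) = {busy_given_none:.2f}")
# If the two rates are roughly equal, the full moon tells you nothing,
# even though cell A alone feels like compelling evidence.
```

Cell A is the only cell that drives the intuition; writing out the table forces the non-events into view before you conclude anything.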
How to Fix Your Misguided Thinking
We make illusory correlations in many areas of life:
- You hear about Bill Gates or Mark Zuckerberg dropping out of college to start a billion-dollar business and you over-value that story in your head. Meanwhile, you never hear about all of the college dropouts that fail to start a successful company. You only hear about the hits and never hear about the misses even though the misses far outnumber the hits.
- You see someone of a particular ethnic or racial background getting arrested and so you assume all people with that background are more likely to be involved in crime. You never hear about the 99 percent of people who don't get arrested because it is a non-event.
- You hear about a shark attack on the news and refuse to go into the ocean during your next beach vacation. The odds of a shark attack have not increased since you went in the ocean last time, but you never hear about the millions of people swimming safely each day. The news is never going to run a story titled, “Millions of Tourists Float in the Ocean Each Day.” You over-emphasize the story you hear on the news and make an illusory correlation.
Most of us are unaware of how our selective memory of events influences the beliefs we carry around with us on a daily basis. We are incredibly poor at remembering things that do not happen. If we don't see it, we assume it has no impact or rarely happens.
How to Use Mental Models for Smart Decision Making
The smartest way to improve your decision making skills is to learn mental models. A mental model is a framework or theory that helps to explain why the world works the way it does. Each mental model is a concept that helps us make sense of the world and offers a way of looking at the problems of life.
Hopefully, these mental models will help you make better decisions.