Think about it. Our brains are fabulous tools, the very thing that makes us human, yet they can also hold us back from being all we can be. Sometimes we are fooled, and sometimes we fool ourselves.
We are seemingly pre-programmed to make certain kinds of mistakes, over and over again. Here are six of the most common:
We tend to like people who think like we do. If we agree with them, we’re more likely to be friends with them.
While this makes sense, it also builds a wall around us, behind which we remain willfully ignorant of, and sometimes even hostile to, beliefs and people outside our own circle.
This tendency to see things only “our” way and no other is the root of intolerance and the cause of much strife in the world.
For centuries, the need to confirm existing biases held scientific progress back, as observations of the real world were denied or twisted to fit cherished beliefs.
A study from Ohio State showed that people spend 36% more time reading essays that align with their opinion.
When a person’s opinions are intertwined with their self-image, any challenge to them feels like a personal assault, so the person avoids, or lashes out at, anyone who questions them.
What is the solution?
Lighten up. Realize that you and your beliefs are separate things. You have value just by being you, and beliefs ought to be challenged every now and then.
Without the ability to change beliefs, the Wright brothers would have never gotten off the ground.
If you flip a coin 10 times and each time it comes up heads, are the chances higher or lower of the next flip being heads?
The real answer, as most people know, is that the chances are exactly the same: 50/50. Yet despite knowing this, human behavior seems to pay it no mind.
The bright lights of the Las Vegas Strip are testament that believing “our luck will change” is a permanent human condition.
The problem is that we place too much emphasis on past events, and confuse our memory with how the world actually works.
Our brains are programmed to seek out patterns, and though this has been a powerful tool ensuring our survival, it can also trick us into seeing patterns where none exist.
Expecting a positive result “because” of past negative results is usually a mistake.
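The independence of each flip can be checked directly. Here is a minimal sketch (the function name and trial counts are invented for illustration): it simulates many coin-flip sequences, waits for a streak of heads, and then records what the very next flip does.

```python
import random

# Each flip is independent, so a streak of heads does not change the
# probability of heads on the next flip. This simulation estimates that
# probability empirically.
random.seed(1)

def heads_after_streak(streak_len, trials=20_000):
    """Estimate P(heads) on the flip that immediately follows
    `streak_len` consecutive heads."""
    heads_next = 0
    for _ in range(trials):
        streak = 0
        # Flip until we have seen `streak_len` heads in a row...
        while streak < streak_len:
            streak = streak + 1 if random.random() < 0.5 else 0
        # ...then record the very next flip.
        if random.random() < 0.5:
            heads_next += 1
    return heads_next / trials

print(heads_after_streak(5))  # stays close to 0.5, streak or no streak
```

However long the streak, the estimate hovers around 0.5; the coin has no memory.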
But some gambles are still worth taking. In love, for example, you only need to win once, and you can’t win if you’re not in the game.
We tend to see the world not as it is, but as we are. Have you ever bought a new car, and suddenly noticed other people driving the same car wherever you go?
It seems like there are more of them, but there aren’t. The same thing happens with pregnant women, or people walking their dogs. We become attuned to seeing the world as we are.
This tendency is quite harmless for noticing things like cars or pregnancies, but can become a kind of prejudice when it extends to other people.
In one study, groups of librarians and real-estate agents were invited to a party. Also at the party was an actress named Jane.
In some situations, Jane acted outgoing and extroverted, while in other situations she acted quiet and introspective.
After the party, the real-estate agents were asked if Jane would also be a good real-estate agent. They remembered her as extroverted and said yes. When asked if she would make a good librarian, they said no.
But when the librarians were asked if Jane would make a good real-estate agent, they remembered her as being introverted, and said no. They thought she would be better as a librarian.
Each group saw Jane as they themselves were, and disregarded pieces that didn’t fit what they wanted to see.
When our actions and beliefs differ, it causes discomfort.
For example, we may believe that we are “helpful,” but if we see someone fall down and do not help them, it causes an inner conflict.
In order to appease the discomfort, we seek excuses or reasons to protect the belief, such as: that person looked like a criminal, someone else will be along soon, I was late for an appointment, etc.
The reasons may or may not be true, but the need to have them is very real.
In everyday life, these kinds of rationalizations are most common in shopping malls and car showrooms.
An emotional sale has already taken place when we see the dress or car on display, and then the mind goes to work making up reasons to fulfill the belief.
The rationalization often continues after a purchase, especially if the object does not live up to expectations.
Much as in Stockholm Syndrome, in which hostages come to sympathize with their captors, we become prisoners of our own decisions.
The tricky thing about avoiding this mistake is that we generally act before we think. But awareness of the pattern can help us anticipate it before we take action.
When making decisions, we like to compare the available options, and then choose the best one for our needs. Or so we believe.
Consider the following experiment, which shows how a bad option can change people’s decision making.
An ad for subscriptions to a magazine was shown to 100 MIT students. The options were:

Option A: Online-only subscription (the cheapest)
Option B: Print-only subscription for $125
Option C: Print and Online subscription for $125
Obviously, Option B is the worst one. Why would anyone get the Print only version when they could get Print and Online for the same price?
None of the MIT students chose Option B, while 86% of them chose Option C.
Next the study tested another 100 MIT students with the ‘useless’ option removed. Their options were:

Option A: Online-only subscription (the cheapest)
Option C: Print and Online subscription for $125
This time, the majority chose the cheaper, web-only version.
So even though nobody wanted the bad-value $125 print-only option, it wasn’t actually a useless choice. In fact, it influenced people to spend more money by thinking they were getting a bargain.
Eliminating ‘useless’ options ourselves, before we make decisions, can help us choose more wisely.
The human mind is so wedded to stereotypes and so distracted by vivid descriptions that it will seize upon them, even when they defy logic.
In a 1983 study, people were asked to read the following description:
Sally is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.
The researchers then asked people to answer this question: Which alternative is more probable?

1. Sally is a bank teller.
2. Sally is a bank teller and is active in the feminist movement.
Now here is where it gets a bit tricky, because the description of Sally has nothing to do with answering the question.
If answer #2 is true, then answer #1 must also be true, so regardless of how Sally is described, only answer #1 can be the more probable one.
Yet 85% of people in the study chose option #2 as the answer.
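The trap here is purely arithmetic: a conjunction of two conditions can never be more probable than either condition alone, because everyone who satisfies both also satisfies each one. A minimal sketch in Python (the population and its probabilities are invented for illustration):

```python
import random

# In ANY population, people who satisfy both of two conditions form a
# subset of those who satisfy either condition alone, so the conjunction
# can never be more probable than a single condition.
random.seed(0)

population = [
    {"a": random.random() < 0.3, "b": random.random() < 0.6}
    for _ in range(10_000)
]

p_a = sum(p["a"] for p in population) / len(population)
p_a_and_b = sum(p["a"] and p["b"] for p in population) / len(population)

print(p_a_and_b <= p_a)  # prints True, by simple set inclusion
```

Whatever numbers are used for the two conditions, the comparison always comes out the same way; the vivid description in the study simply distracts people from this.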
Stereotypes and details can cloud our judgment without our even knowing it. A simple choice of words can change everything.
It would be wise to realize that people who seek to influence our judgment and decision-making, like advertisers and politicians, are well aware of this tendency.
Being aware of this can help us avoid making false choices, or at least try to sift through emotional details so we can make rational decisions.
The lesson from these common errors, and others like them, is that we shouldn’t always believe what we think without question.