- Understand biases of ourselves and others that can affect the interpretation of information and negatively impact decision-making
- Learn strategies for overcoming those biases
How we feel about a topic, news item, or piece of information may color our assessment of its validity or relevance. It is therefore important to understand how we feel about topics and statements before we consider their validity. It is also important to understand the sources of our biases, their consequences, and strategies to overcome them.
- Several cognitive reactions can distort our thinking. Awareness of these reactions, their consequences, and strategies to overcome them can help strengthen one's thinking.
- Even if we already know about these reactions, they are easy to forget, so it is good to review them frequently.
Dueling Parts of the Human Brain
At the core of the human brain is a reptile brain. This brain is reactive and reflexive. It is useful for quick responses, such as reacting to immediate danger or fighting. Many of our base emotions reside in this layer.
On top of the reptile brain is a mammal brain. The mammal brain is much more socially aware. Our emotions such as love come from this layer. Our ability to interact in groups comes from this layer as well.
Your immediate judgement of a situation will be based upon your instincts and biases.
Your initial reactions will tend to be negative and uninformed. You will be easily manipulated by others.
Wait until you are calm to make key decisions or interpret important facts.
The action-rationality paradox is related to the dual existence of the reptile and mammal brains. To motivate ourselves, we need to activate our reptile brain: we boost our confidence and tell ourselves we can "do anything" and "beat any odds." This can raise our adrenaline levels and increase our chances of success during battle. Unfortunately, much of this reptilian "hype" is nonsense. To think rationally, such as to plan a battle, we need to use our mammalian brain.
Here is the paradox: if we use our reptile brain to plan a battle, we decrease our chances of victory. If we use our mammalian brain to fight a battle, we react more slowly and less forcefully. So we cannot use the same part of the brain to both plan and fight.
Think rationally about goals and plans, then tell yourself a lot of nonsense when you go into battle (or give a presentation, or interview for a job).
We tend to believe things that confirm what we already believe; this is known as confirmation bias.
We become more and more convinced of something regardless of what most of the external evidence is telling us.
- Identify what you want to believe before interpreting facts.
- Be especially critical of and triple-check any facts that confirm your existing biases or make you feel good.
Due to the immediate reaction of our reptile brain, we tend to reward or punish the deliverer of a message based upon the favorability of its content. This is known as kill the messenger syndrome.
Because of this syndrome, people are hesitant to deliver bad news, so we may not receive important information.
- For the message recipient: remind oneself to save one's reaction to the message for the party responsible for its contents.
- For the message deliverer: try to take measures to mitigate the impact of any negative content. Be ready with solutions.
Concepts of Perception of Time
We humans tend to think of time in terms of "now," "soon," and "infinity."
This makes it difficult for people to think about and plan for the "medium term" (i.e., the time between soon and infinity). Sustainability efforts are often targeted at infinity, or 10,000+ years, yet we live our lives in real time. The Earth's ecosystem may be saved or destroyed within the next 50 years (or less), not right now or in 1,000 years.
- Bad stuff may happen before we get to infinity.
- In real systems, infinity = equilibrium = death
Write out timelines and schedule milestones. Don't just keep them in your head, because the human brain does not have enough "time bins." Also, remind yourself that solutions designed for excessively long periods of time tend to be inflexible.
Coldness of Math
Most people distrust math and quantitative reasoning.
As a result, many people may be hurt to save a few, and impractical options are acted upon.
Do the math. Calculate the odds as best as possible. Multiply them by the value of the impact of each potential outcome. Every number should either be a ratio or expressed in units.
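The "do the math" advice above amounts to a simple expected-value calculation: weight each potential outcome's impact by its probability and sum. A minimal sketch in Python, where the option names, probabilities, and dollar impacts are all hypothetical illustrations:

```python
# Expected value = sum over outcomes of (probability * impact).
# Every number is either a probability (a ratio) or an impact in dollars,
# per the "ratio or units" rule above. All figures here are made up.

def expected_value(outcomes):
    """outcomes: list of (probability, impact) pairs; impacts in dollars."""
    total_p = sum(p for p, _ in outcomes)
    assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * impact for p, impact in outcomes)

# Two hypothetical options, compared by expected impact:
option_a = [(0.9, 1_000), (0.1, -5_000)]   # likely small gain, rare big loss
option_b = [(0.5, 2_000), (0.5, -500)]     # coin flip between gain and loss

ev_a = expected_value(option_a)  # 0.9*1000 + 0.1*(-5000) = 400
ev_b = expected_value(option_b)  # 0.5*2000 + 0.5*(-500) = 750
print(f"Option A: ${ev_a:,.0f}  Option B: ${ev_b:,.0f}")
```

Note that Option A "feels" safer (it usually wins), yet Option B has the higher expected value; this is exactly the kind of conclusion that cold math surfaces and gut feeling hides.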
Bias Towards the Local
We often favor what we know and can control. We have the most information about and influence over local phenomena, from which we can get the most immediate, certain benefits. This also tends to be our comfort zone. We distrust big things, especially those we know little about, and so we often miss the big picture.
We overlook system effects and interdependencies. We neglect the big picture, which may be far more significant.
Take the effort to see the big picture, and the impact of the local and global pictures on each other.
There is often the belief that we cannot change anything, or at least not the big things. Although this can sound disempowering, it has the "benefit" that we absolve ourselves of any responsibility for positive change.
Likewise, this can result in a false idealism: the feeling that since we cannot effect change anyway, we should adopt the most idealistic position possible, regardless of its practicality.
Remember that effort typically pays off, even if not in ways expected.