This is a list of what I think are the 10 most common, and perhaps most harmful, cognitive biases we bring to discussions and debates. They constantly derail productive discourse and prevent us from thinking rationally and reaching truthful conclusions. Oh, and we all have them.
Here are the top 10 cognitive biases, starting with the mother of all biases:
1. Confirmation bias: the tendency to search for, interpret, and favor evidence that confirms your existing beliefs. It also includes the tendency to be much more skeptical of evidence that disagrees with them.
Example:
- When we're looking for data to back up our views, the data points that support them stand out as if they're blinking, while the ones that don't get ignored. It's so much easier for me to brush off disconfirming evidence and come up with easy justifications for it.
How to fix it:
- Be more skeptical about data that supports your views. Since your views rely on that data, you should do extra work to ensure it is accurate. A few years ago, when Chinese scientists claimed a mathematical proof that the universe came into existence spontaneously from nothing, I didn't accept it as proof despite my desire to do so. I made sure the evidence stood the test of time first.
- Seek out data that is critical of your own view. I look for criticism of atheism all the time. I look for criticism of my political views all the time.
2. Sunk-cost bias: the tendency to keep believing in something because of the cost sunk into that belief. (Hanging onto losing stocks, unsuccessful relationships, etc.)
Example:
- Religious people holding onto creationism to the point of absurdity because they've believed it for so long.
- I held onto my own belief in free will for years largely because I had believed it for so long and it had become such a deep part of my identity.
How to fix it:
- The amount of time you've believed something should bear no weight on whether or not the view is true.
- Consider that the things you've believed for a longer amount of time might even be less likely to be true, since you were likely younger and less knowledgeable when you started believing them.
3. Anchoring bias: the tendency to rely too heavily on a past reference or on one piece of information when making decisions.
Example:
- We all have the tendency to refer to one piece of information that caught our attention because knowing all the pertinent information is just too difficult.
- Scientific studies in health or medicine that get a lot of attention but are later refuted are still used by people as the basis of their views.
How to fix it:
- If you're relying on a single data point to assess an issue or to come to a conclusion about it, make sure that data point is accurate and representative of the subject matter.
- Don't base your views on a single data point, or let it too strongly influence your assessment. Read up on other studies.
- Recognize that you will likely estimate a value from whatever starting figure is suggested to you, and that such a figure may be offered deliberately to bias you in a particular direction.
4. Framing effects: the tendency to draw different conclusions based on how data are presented.
Example:
- A CNBC poll from 4 years ago surveyed two different groups: one was asked whether they opposed "Obamacare," and the other whether they opposed the "Affordable Care Act." 46% of the group asked about "Obamacare" opposed the law, while only 37% of the group asked about the "Affordable Care Act" opposed it.
- At the same time, more people supported "Obamacare" (29%) than supported the ACA (22%). In other words, having "Obama" in the name "raises the positives and the negatives," as CNBC put it.
How to fix it:
- Just as the peer-review process withholds the names of the author and the reviewer to help eliminate this bias, you should sometimes withhold the names of people or organizations when making a case.
- Assess the merit of the data on its own rather than dismissing it entirely based on its source or whose name is attached to it. I will recognize, for example, when my political opponents are correct.
5. Hindsight bias: the tendency to reconstruct the past to fit with present knowledge. (Similar to the Consistency bias: Incorrectly remembering one's past attitudes and behavior as resembling present attitudes and behavior.)
Example:
- Donald Trump pretending he was against the war in Iraq when he wasn't.
How to fix it:
- This is a particularly difficult bias to overcome. If you're on record saying something that contradicts what you say now, own up to it. Admitting you were wrong is extremely difficult for many people, including me. Just being aware of this bias, and that you're susceptible to it, is a first step.
6. Backfire effect: The reaction to disconfirming evidence by strengthening one's previous beliefs.
Example:
- Criticizing people's views can make them more steadfast in those views; it is possible that firebrand atheism does this to theists.
How to fix it:
- Consider a method that doesn't involve criticizing someone's view. Try nudging them closer to your view instead of trying to pull them violently. Avoid name-calling. Try promoting an opposing view instead of critiquing theirs.
7. Authority bias: The tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and to be more influenced by that opinion.
Example:
- Justifying a person's opinion by saying he or she is in a position of authority like a police officer, teacher, government official, or expert in a field.
- The argument from authority: person X has a degree in biology and is a creationist, therefore creationism is scientific.
- The Milgram experiment showed how easy it was to get people to harm others when instructed by someone presented as an authority figure.
How to fix it:
- Be wary of the opinions of so-called authorities, and be especially wary when they are not experts in the subject matter at hand. Many experts get paid large sums of money to lobby for certain interests.
- Expertise is also not always transferable to other fields. Example: Ben Carson.
- We should care about arguments and evidence, not who they come from. In fact, given the framing effect, it's probably better not to attach anyone's name to the evidence at all, since that could bias you for or against it.
8. Attribution bias: the tendency to attribute different causes for our own beliefs and actions than that of others.
Example:
- We often think that our own beliefs are formed by rational introspection while others' beliefs are driven by emotion. Many times this is not the case.
How to fix it:
- To truly be a critical thinker you must critically examine all of your views and how you arrived at them. Debating them with others who disagree with you is one of the best ways to examine the process by which you got there.
9. Ingroup bias: The tendency for people to give preferential treatment to others they perceive to be members of their own groups.
Example:
- We will believe what other people in our group say and often be very skeptical of anyone outside.
- In politics this usually translates to believing what people in our own political party say without any fact-checking, and denying what people in other parties say even when presented with good evidence.
How to fix it:
- Recognize that groupthink can let bad ideas flourish, and you will be susceptible to them if you don't critically examine them. Encourage the critical examination of ideas within your own in-group, and criticize those who discourage it.
And lastly, one of the most harmful biases is thinking that others have biases while you are immune:
10. Bias blind spot: the tendency to recognize the power of cognitive biases in other people but to be blind to their influence upon our own beliefs.
Example:
- It is much easier to spot biases in others while failing to spot them in yourself, just as it's often hard to notice your own bad breath but easy to notice someone else's.
How to fix it:
- Recognize that you too are biased, because you are a human being and all human beings are biased.
The bottom line is this: for those of us who were never taught about our biases growing up (which is almost all of us), the best thing we can do to become the best critical thinkers possible is to recognize what our biases are, recognize that we're all susceptible to them, and then work to change our behavior to be less susceptible. We will likely never achieve perfection, but that doesn't mean we shouldn't strive toward more productive, rational discourse.
Given these natural tendencies that so easily prevent us from reasoning as well as we can, I recommend teaching about cognitive biases in school, starting at least in high school and definitely in college, to help prevent this in future generations. This should be mandatory training. We need to equip a new generation with the tools of critical thinking.
In addition to our biases, some other fallacies and illusions are worth acknowledging:
Gambler's fallacy: The tendency to think that future probabilities are altered by past events, when in reality they are unchanged. The fallacy arises from an erroneous conceptualization of the law of large numbers. For example, "I've flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads."
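If you want to convince yourself that the probabilities really are unchanged, a quick simulation makes the point. Below is a minimal sketch in Python (the trial count and seed are arbitrary choices for illustration): it looks at runs that begin with five straight heads and checks how often the sixth flip lands heads. Independence means the answer stays near 50%, no matter how long the streak.

```python
import random

random.seed(42)  # arbitrary seed, just to make the run repeatable
TRIALS = 1_000_000

streaks = 0          # sequences that started with five heads
heads_on_sixth = 0   # of those, how many landed heads on flip six

for _ in range(TRIALS):
    flips = [random.random() < 0.5 for _ in range(6)]
    if all(flips[:5]):        # first five flips were all heads
        streaks += 1
        heads_on_sixth += flips[5]

# Hovers around 0.5: the streak tells the coin nothing.
print(f"P(heads on 6th | 5 heads) = {heads_on_sixth / streaks:.3f}")
```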
Money illusion: The tendency to concentrate on the nominal value (face value) of money rather than its value in terms of purchasing power.
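A few lines of arithmetic show why the nominal figure misleads. The salary and rates in this sketch are hypothetical, purely to illustrate deflating a nominal amount into purchasing-power terms:

```python
salary = 50_000      # hypothetical nominal salary
raise_rate = 0.03    # a 3% nominal raise
inflation = 0.05     # 5% inflation over the same period

new_nominal = salary * (1 + raise_rate)
# Deflate the new salary back into today's purchasing power.
new_real = new_nominal / (1 + inflation)

print(f"Nominal salary: ${new_nominal:,.0f}")  # $51,500 -- feels like a gain
print(f"Real salary:    ${new_real:,.0f}")     # ~$49,048 -- an actual loss
```

The raise looks like a gain at face value, but once prices are taken into account it buys less than the old salary did.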