The Main Barriers to Critical Thinking

Mockingbird

Well-known member
I have written about critical thinking and studied it for several years. In that time I have found that a few recurring issues are the primary obstacles to overcome when trying to encourage people to use critical thinking.



The simple fact is that if you give people a definition of critical thinking, the almost universal response is some variation of "I am smart / a naturally good critical thinker; other people need to study this. I don't have that problem, they do."



The fact is that critical thinking is a subject, and it requires a lot of study, questioning and hard work to even begin to learn. It is less like a trait such as strength or speed, where people differ greatly in natural endowment, and more like a martial art, which requires discipline, practice, concentration, devotion and gradual development.



People usually treat it as a trait that they naturally have in high degree, and that others have in low degree, especially others they disagree with and members of groups that oppose their beliefs.



But the obvious question is: why?



There are several factors that contribute to this result, and I want to take them on one at a time, starting with the building blocks that combine to create the overall effect.



I think the fundamental grouping we need to take on is folk psychology: the ideas and assumptions about human behavior and minds that we pick up from parents, ourselves, peer groups and society overall. Many of these ideas are, frankly, wrong.



We all hear stories from people like our parents and teachers that describe personal character, and they are not all accurate for understanding human beings. We usually don't closely inspect our fundamental assumptions, or the metaphors that frame how we think of everything else. As children, our critical and independent thinking is so poorly developed that we are extremely vulnerable to indoctrination, so we unthinkingly accept the ideas our parents, teachers and peers give us, and we usually don't realize the drawbacks of hanging onto these beliefs and unexamined assumptions as adults.



So, how do we correct this? In part the answer is individual and shaped by our own life experiences.



I think a good starting point is to examine cognitive biases because they affect all of us and are not well laid out in folk psychology. Several contradict folk psychology, so learning them can undo false ideas and replace them with true relevant information.



I want to use quotes from an article to lay a foundation on cognitive biases for us to examine. A good understanding of the subject at this basic level can serve you well for evaluating the topic in other contexts.



50 COGNITIVE BIASES TO BE AWARE OF SO YOU CAN BE THE VERY BEST VERSION OF YOU
By Mighty Max
The human brain is pretty tricky: While we think we know things, there’s a whole list of cognitive biases that can be gumming up the works. We’ve found 50 types of cognitive bias that come up nearly every day, in petty Facebook arguments, in horoscopes, and on the global stage. Along with their definitions, these are real-life examples of cognitive bias, from the subtle groupthink sabotaging your management meetings to the pull of anchoring making you spend way too much money at a store during a sale. Knowing about this list of biases can help you make more informed decisions and realize when you’re way off the mark.

WHAT IS COGNITIVE BIAS?
Let’s start off with a basic cognitive bias definition: It is a systematic error in cognitive processes (like thinking, perceiving, and memory) diverging from rationality, which can affect judgments. If we think of the human brain as a computer, cognitive bias basically is an error in the code, making us perceive the input differently or come up with an output that’s illogical.
But there are other types of bias as well that aren’t necessarily cognitive; for example, there’s the theory of social proofing, which is one of the more popular social psychological biases. Also, there can be cognitive theories that aren’t necessarily considered biases, or rather, they’re more like a network of common biases tangled together, like cognitive dissonance, which causes mental discomfort when we hold conflicting ideas or beliefs in our minds. Then, there’s the world-famous placebo effect, which can actually result in physiological changes.
End quote

So, let's focus on just the biases that directly make it hard for us to see that we need to improve our own critical thinking, and that other people are not entirely the problem.
I will quote the definitions from the article below and then comment. I am going to take on one category of biases and break it down into two groups. Here are the biases I have chosen to start with.
The first group is ten biases that are essential to seeing my side as right: my group, my peers, celebrities and historical figures I admire, and authorities I agree with. The same biases help me see people I dislike, don't admire, or see as different from me and my groups as wrong.

  1. Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
  2. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
  3. Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
  4. Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
  5. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
  6. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
  7. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
  8. False Consensus: We believe more people agree with us than is actually the case.
  9. Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
  10. Authority Bias: We trust and are more often influenced by the opinions of authority figures.
Blind Spot Bias is an obvious one to start with. We are naturally blinded by our own biases, and we usually learn about biases only by observing them in others and realizing that, as human beings, we must have them too, without the benefit of directly observing them in ourselves. A lot of research on psychological priming and biases provides good evidence that we all have biases and are profoundly unaware of them in the direct observation of our daily lives. They are well hidden.

We have a vast array of biases that work to make us see ourselves and our peers as rational, correct, and individually distinct, while making "them" seem both wrong and more similar to each other than they really are.

The Fundamental Attribution Error helps us attribute the actions of others to general traits while seeing our own actions as differentiated responses to different situations.

Stereotyping reinforces this, and Outgroup Homogeneity Bias bolsters it further: together they help us see our own group members as varied and out-group members as all holding the same beliefs, which makes it easier to see ALL of them as having incorrect beliefs. It also makes it easier to dismiss evidence that members of our group are wrong, because we think of our group as having variation, so some people in it can be wrong without it lowering our opinion of the whole group.

In-Group Favoritism further helps, as we give our individual group members, and the group overall, the benefit of the doubt whenever possible. In other words, we interpret ambiguous or gray-area information favorably for our group members, while we don't give the benefit of the doubt to out-group members; we interpret ambiguous information unfavorably for them.

The Bandwagon Effect compounds this, since we see ourselves and our group as better than outsiders, including better at evaluating the truth. So many of our people believing an idea counts, to us, as rock-solid proof of our accuracy, while many people outside our group believing an idea we disagree with serves to prove how wrong, stupid, irrational, evil, backward and so on the others are. A lot of us believing functions as proof we must be right, but a lot of them disagreeing with our beliefs shows how consistently wrong they are.

Groupthink encourages us to conform to group norms and go along with the most accepted ideas in the group, regardless of their truth or importance. An idea being believed and strongly embraced by the group substitutes for our examining and embracing it ourselves.

False Consensus can boost the effect of many other group related biases as we think everyone else who is sensible believes something and everyone who is dead wrong disagrees.

The Availability Cascade, especially with red feeds and blue feeds tailored by algorithms that select the content we are most likely to agree with, serves up memes, articles, programs and videos that reinforce our beliefs rather than challenge them.
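As a rough illustration of how an engagement-driven feed can produce this effect, here is a toy sketch in Python. This is my own invention, not any real recommender system; the "lean" scores and the engagement function are made up purely for illustration.

```python
import random

random.seed(1)

def engagement(user_lean, item_lean):
    # Toy assumption: people engage more with content that matches
    # their existing lean (-1 = one camp, +1 = the other).
    return 1.0 - abs(user_lean - item_lean)

def build_feed(user_lean, items, k=5):
    # Rank candidate items by predicted engagement and show only the top k.
    return sorted(items, key=lambda lean: -engagement(user_lean, lean))[:k]

# 100 random pieces of content spread across the spectrum...
items = [random.uniform(-1, 1) for _ in range(100)]
# ...but a strongly leaning user only ever sees the closest matches.
feed = build_feed(user_lean=0.8, items=items)
print(feed)
```

Every item in `feed` sits close to the user's own lean; content that would challenge the user never ranks high enough to be shown. That is the mechanical core of the red-feed/blue-feed effect.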

The Authority Bias helps us see our beliefs and groups as proven right, because we accept authorities who agree with us and our peers as valid while dismissing authorities who disagree with us as invalid. Simply put, we use In-Group Favoritism to select the authorities that fit the needs of the group, based on agreement, and the other biases, such as Groupthink and the Bandwagon Effect, can work together when the authority and the group agree without dissent. In free groups that are allowed to be open-minded and to develop and express differing views this is blunted to a degree, but in high-control groups, aka authoritarian groups, aka cults or destructive cults, the power of ALL these biases can combine, as obedience to authority plus conformity to group norms without dissent creates a potent combination.

I wanted to zero in on these biases because they are crucial to understanding why, when critical thinking is brought up, we so easily see our group as right and other groups as wrong, or as more wrong. Seeing the others as more wrong helps us see them as having the problem and needing to improve their critical thinking first, while we usually never get around to it ourselves.

So, we can start with Blind Spot Bias, which hides all the other biases. Then we can see how the group of biases that includes (but is not limited to) the Fundamental Attribution Error, Stereotyping, Outgroup Homogeneity Bias, In-Group Favoritism, the Bandwagon Effect, Groupthink, False Consensus, the Availability Cascade, and the Authority Bias works together: it keeps us from seeing our own biases, leads us to see our side, our peers, our authorities and ourselves as right simply because they are ours, and leads us to see outsiders who disagree with us as wrong. It may seem complicated, but I hope the descriptions I gave help, and you can always think of fictional and real examples. This is usually easiest with people and groups you strongly and passionately disagree with.

Just seeing how biases work and "compound" in anyone is difficult, and a good start, BUT if you don't carry it through to see how your authorities, your peers and you yourself also do it, then you have just reinforced the biases with half-understood justifications.

It's easy to see that the "opposition" is wrong; most of us do. It is harder to see how WE are wrong too. If you cannot see that we are wrong too, and dig into the details of when, how and why, then I frankly don't think you are going to be practicing critical thinking. Critical thinking expert Richard Paul, in some lectures, described pseudo critical thinkers who adopt a piece of critical thinking knowledge but don't have enough understanding of the subject in general, or of the information they do have, to apply it correctly. This doesn't mean you need a PhD in critical thinking to apply it. It means some of the ideas and techniques are more complicated than a phrase or sentence or paragraph and require more knowledge to apply properly.

In one lecture Richard Paul described just applying something to support your own beliefs or argument as sophistry and pseudo critical thinking. Sometimes people think winning arguments or getting opponents to withdraw from debates in confusion is critical thinking but it isn't, not even close.

A person can learn about biases, logical fallacies, propaganda techniques and rhetoric, then face an opponent less educated in these ideas and bury them in terms they have to struggle through and in unfamiliar aspects of debate, such as the burden of proof falling on the claimant. They can use flawed arguments and claims themselves while tearing apart the same flaws in their opponent's arguments and claims. This is sophistry, insincere debate, and attorneys are notorious for it. Also politicians.

It is a kind of intellectual cheating in which you insist that the opposition follow rules for carefully seeking the truth while you yourself only try to win, with no regard for honesty or the truth.

There was a break in schools of philosophy a few thousand years ago and the sophists sought victory in debate and persuasion at any cost. Victory was their only goal. Many other philosophers saw both what they were doing and how they were doing it and condemned the sophists as intellectual frauds. And they were right to do so.

It took a lot of work by a lot of people, probably an unimaginable number, to reach the understanding of our biases and of critical thinking that we can study today. I think it is well worth the time and effort, and worth being honest with you about it too.





50 COGNITIVE BIASES TO BE AWARE OF SO YOU CAN BE THE VERY BEST VERSION OF YOU
By Mighty Max
Let’s go into some common cognitive bias examples to really see how they work!

50 TYPES OF COMMON COGNITIVE BIASES
  1. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
  2. Self-Serving Bias: Our failures are situational, but our successes are our responsibility.
  3. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
  4. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
  5. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
  6. Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)
  7. Moral Luck: Better moral standing happens due to a positive outcome; worse moral standing happens due to a negative outcome.
  8. False Consensus: We believe more people agree with us than is actually the case.
  9. Curse of Knowledge: Once we know something, we assume everyone else knows it, too.
  10. Spotlight Effect: We overestimate how much people are paying attention to our behavior and appearance.
  11. Availability Heuristic: We rely on immediate examples that come to mind while making judgments.
  12. Defensive Attribution: As a witness who secretly fears being vulnerable to a serious mishap, we will blame the victim less if we relate to the victim.
  13. Just-World Hypothesis: We tend to believe the world is just; therefore, we assume acts of injustice are deserved.
  14. Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.
  15. Naïve Cynicism: We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do in their intentions/actions.
  16. Forer Effect (aka Barnum Effect): We easily attribute our personalities to vague statements, even if they can apply to a wide range of people.
  17. Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.
  18. Anchoring: We rely heavily on the first piece of information introduced when making decisions.
  19. Automation Bias: We rely on automated systems, sometimes trusting too much in the automated correction of actually correct decisions.
  20. Google Effect (aka Digital Amnesia): We tend to forget information that’s easily looked up in search engines.
  21. Reactance: We do the opposite of what we’re told, especially when we perceive threats to personal freedoms.
  22. Confirmation Bias: We tend to find and remember information that confirms our perceptions.
  23. Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.
  24. Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.
  25. Belief Bias: We judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.
  26. Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
  27. Declinism: We tend to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.
  28. Status Quo Bias: We tend to prefer things to stay the same; changes from the baseline are considered to be a loss.
  29. Sunk Cost Fallacy (aka Escalation of Commitment): We invest more in things that have cost us something rather than altering our investments, even if we face negative outcomes.
  30. Gambler’s Fallacy: We think future possibilities are affected by past events.
  31. Zero-Risk Bias: We prefer to reduce small risks to zero, even if we can reduce more risk overall with another option.
  32. Framing Effect: We often draw different conclusions from the same information depending on how it’s presented.
  33. Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
  34. Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
  35. Authority Bias: We trust and are more often influenced by the opinions of authority figures.
  36. Placebo Effect: If we believe a treatment will work, it often will have a small physiological effect.
  37. Survivorship Bias: We tend to focus on those things that survived a process and overlook ones that failed.
  38. Tachypsychia: Our perceptions of time shift depending on trauma, drug use, and physical exertion.
  39. Law of Triviality (aka “Bike-Shedding”): We give disproportionate weight to trivial issues, often while avoiding more complex issues.
  40. Zeigarnik Effect: We remember incomplete tasks more than completed ones.
  41. IKEA Effect: We place higher value on things we partially created ourselves.
  42. Ben Franklin Effect: We like doing favors; we are more likely to do another favor for someone if we’ve already done a favor for them than if we had received a favor from that person.
  43. Bystander Effect: The more other people are around, the less likely we are to help a victim.
  44. Suggestibility: We, especially children, sometimes mistake ideas suggested by a questioner for memories.
  45. False Memory: We mistake imagination for real memories.
  46. Cryptomnesia: We mistake real memories for imagination.
  47. Clustering Illusion: We find patterns and “clusters” in random data.
  48. Pessimism Bias: We sometimes overestimate the likelihood of bad outcomes.
  49. Optimism Bias: We sometimes are over-optimistic about good outcomes.
  50. Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
Use our cognitive bias infographic as inspiration for becoming better and knowing more! You can even print it out and use it as a cognitive bias poster to encourage others to do the same.
End Quote
 

Mockingbird

Well-known member
Here is a list of pro-self biases.

The following group of cognitive biases serves to make us prone to fail to see our own biases, to see biases (or flaws in thinking and character) in others, to see our own ideas as better than they are or as unbiased, and to see the ideas of others (especially those who disagree with or oppose us) as worse than they are or as wrong. (Note: all definitions are quoted from "50 Cognitive Biases to Be Aware of So You Can Be the Very Best Version of You" by Mighty Max.)


  1. Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
  2. Self-Serving Bias: Our failures are situational, but our successes are our responsibility.
  3. Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)
  4. Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.
  5. Naïve Cynicism: We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do in their intentions/actions.
  6. Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.
  7. Confirmation Bias: We tend to find and remember information that confirms our perceptions.
  8. Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.
  9. Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.
  10. Belief Bias: We judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.
As in the first group, Blind Spot Bias is the obvious one to start with: it hides all the other biases from us, and we usually only learn about biases by observing them in others and realizing that we must have them too.

Regarding Self-Serving Bias: seeing OUR failures as caused by circumstances and the failures of OTHER PEOPLE as caused by their character biases us to see ourselves as both more moral and better at decision making than others. Seeing ourselves through rose-colored glasses and others through a dark tint gives us an undeserved sense of superiority and competence.

The Halo Effect delivers a one-two punch with the Self-Serving Bias, because it lets us treat any alleged flaw in someone as "proof" that they are wrong, stupid, irrational, and so on in their claims and beliefs, conveniently whenever they disagree with us! This is irrational: a person who is usually or generally wrong can have a valid claim or idea, and a person who is generally right, or intelligent, or who agrees with us, can have an incorrect claim or idea.

The Halo Effect is one of the most frequently abused biases, as politicians, media and people online frequently comment "that guy was SO WRONG about this other entirely unrelated topic! No one should EVER listen to anything he says!" or "That person complains about an issue and its consequences, but they are not perfect in their own past behavior and conduct! So they have no right to bring up the issue until they go back in time and live a life that cannot be criticized in any way, because a flawed person only creates flawed arguments and claims!" Well, I may have exaggerated the last one a little to show that demanding someone be perfect, or have impeccable character and conduct, before being listened to is just silly, since no human being has behavior beyond all criticism.

When it is convenient, we pivot to thinking of flaws in people, especially people who disagree with us, and of good traits in people who agree with us, as if these unrelated and irrelevant traits could support or disprove claims they have nothing to do with.

You can find thousands of comments online every day that follow this pattern. A politician proposes a policy, and many people attack or praise the policy based on their general attitude toward the politician or party.

Naive Realism has been written about a fair amount by psychologists and is well worth considering. It primes us to be overconfident in our own perception, memory and thinking, and to arrogantly assume WE have a better grasp on reality than others.

Naive Cynicism together with Naive Realism sets us up to see OTHER people as needing to learn critical thinking, scientific method and other subjects and to not see that we ourselves are part of the problem.

The Dunning-Kruger Effect is a monster. Research has shown that we are not built to dig into most subjects in depth; we lack the time, and we work together in groups.

So we usually learn only a few subjects in depth, and we are prone to think we understand the others far better than we really do. Even if we understand one or several subjects well, we don't understand the rest, and we and others can mistake knowledge or expertise in one area for the same thing in another.

I have seen people with degrees, even master's degrees and PhDs, in some subjects give incredibly wrong and uneducated opinions in other subjects, and I have seen that most people assume a doctor or professor is an expert in everything, when a little digging into the subject can show that they don't know what they're talking about.

This doesn't mean experts are worthless and always wrong. It means they are just as vulnerable to the Dunning-Kruger Effect outside their areas of expertise as the rest of us. The Dunning-Kruger Effect is not about stupidity; it is about expertise.

I have seen thousands of comments that incorrectly summarize the Dunning-Kruger Effect as "stupid people don't know that they are stupid." That is not what it is, so if you say that, you are claiming something incorrect.

Confirmation Bias is easy to understand when we see it in others, but we have to work hard to monitor, acknowledge and fight it in ourselves. Fighting confirmation bias goes against the easiest way of dealing with information; it takes willpower, self-discipline and a lot of work to go against our first nature.
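To make that concrete, here is a toy model of my own (not from the article, and not a real psychological model): two observers update a belief from identical, perfectly mixed evidence, but one of them merely underweights disconfirming observations. All the numbers are invented for illustration.

```python
import math

def update(belief, heads, weight_disconfirm=1.0):
    # Log-odds update for the belief "this coin favors heads".
    # A biased observer shrinks the step taken on disconfirming evidence.
    odds = math.log(belief / (1 - belief))
    odds += 0.5 if heads else -0.5 * weight_disconfirm
    return 1 / (1 + math.exp(-odds))

evidence = [True, False] * 10  # perfectly mixed flips: no real signal

fair = biased = 0.5
for heads in evidence:
    fair = update(fair, heads)                             # weighs both sides
    biased = update(biased, heads, weight_disconfirm=0.4)  # discounts tails

print(round(fair, 2), round(biased, 2))  # → 0.5 0.95
```

The unbiased observer ends exactly where it started, at 0.5, because the evidence is balanced. The observer who merely underweights, without even ignoring, disconfirming evidence ends up nearly certain. That is confirmation bias doing its quiet work.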

The ideas from A Theory of Cognitive Dissonance by Leon Festinger and On Liberty by John Stuart Mill go a long way toward dealing with the basic problem. Thinking, Fast and Slow by Daniel Kahneman is a superb complement to those ideas.

The Backfire Effect is a neat little cognitive trick in which we convince ourselves we are right even when the evidence shows we are not. It undermines the deficit model of information, which held that the availability of true information was the main barrier to people knowing the truth. You can provide true information and good evidence all day, and it doesn't always help.

Third Person Effect is summed up by a little cartoon showing a half dozen or so stick figure people together. One thought bubble is above them all with a thought like "look at those poor fools, robots controlled by propaganda. Poor devils, I am the only one who is thinking for myself." We tend to see ourselves as having unrestrained free will, regardless of evidence, and others as being influenced by all sorts of things we ourselves somehow escaped.

Belief Bias is very subtle and hard to get people to understand, because it operates below conscious awareness; it took a lot of experiments and research in psychology and social psychology to reveal that the subconscious has tremendous unnoticed influence on our beliefs and behavior. The book Subliminal by Leonard Mlodinow gives a terrific description of how this works, with great evidence to support the claims.

Beliefs that feel good, or that don't threaten us, our identity and our prior beliefs, are much easier to perceive, accept, think of and remember than beliefs that conflict with them. This is especially true for beliefs with deep emotional impact, for our fundamental values about ourselves and our groups, and for whatever deep underlying assumptions frame and define everything else for us. These can be emotional, psychological, religious or of other types, depending on what has strong emotional associations for us as individuals.

Belief Bias is closely related to the logical fallacy of personal incredulity. If I won't consider the evidence and arguments for a claim because I cannot stand to think about it, whether from a strong compulsion to reject it or a need to protect an idea, belief or value it could throw into doubt, then belief bias is at play. It can have a strong emotional or intellectual component, or both simultaneously, but one way or another, considering the idea is unacceptable. Yet sometimes unthinkable ideas are true.


This grouping is sadly not complete, but it is meant to give a taste of why "I think that I am not wrong and others are" exists, and how it lives, unfortunately, in each of us. This tendency is a primary obstacle to education on critical thinking: everyone who hears about it knows it is lacking in society, but they also "know" that someone else needs to learn it and that they themselves are doing just fine.
 

programmer_guy

True ex-Scientologist
The main problem is an inability to correct some "backpropagation errors to the model" (a feedback loop) in the brain.

(I am still reading articles, and watching videos, about TensorFlow.)

Most of the model in our brain is developed during childhood from our parents and the surrounding society.
Confirmation bias is a mental mechanism for protecting the major parts of the model.
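That analogy can be sketched numerically. The toy code below is my own illustration, not actual TensorFlow: a "learner" that rejects error signals above a comfort threshold can never correct a badly wrong starting model, which is one way to picture confirmation bias as model protection.

```python
def train(initial_w, target, steps=100, lr=0.1, reject_above=None):
    # Gradient-descent-style feedback loop on a single parameter w.
    w = initial_w
    for _ in range(steps):
        error = target - w  # the "backpropagated" correction signal
        if reject_above is not None and abs(error) > reject_above:
            continue  # "that can't be right" -- threatening feedback discarded
        w += lr * error
    return w

open_minded = train(initial_w=0.0, target=5.0)                  # converges to ~5
defensive = train(initial_w=0.0, target=5.0, reject_above=1.0)  # stuck at 0
print(round(open_minded, 2), defensive)  # → 5.0 0.0
```

The open learner accepts uncomfortable corrections and converges on the target; the defensive learner discards every signal that says it is badly wrong, so the major parts of its model stay protected forever.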
 

Riddick

I clap to no man
I have written on critical thinking and studied it for several years. I have discovered that some issues are the primary obstacles to overcome when trying to encourage people to use critical thinking. …
I'm a rhetoric guy,

It's not just critical thinking (logos) that persuades us, but also emotion (pathos) and authority or status (ethos).

Humans are composed of all three. Robots think only in logic.
 

Mockingbird

Well-known member
Classic rhetoric has a lot of excellent ideas that I see as a necessary foundation of critical thinking.

I included a post on rhetoric in a series that I posted on critical thinking.

Human beings have many aspects to our existence and unfortunately we need to learn many of them to get a fraction of an educated opinion.

Here is a link to the blog post Cornerstones of Critical Thinking

 

Mockingbird

Well-known member
The main problem is an inability to correct "back-propagation errors to the model" (a feedback loop) in the brain.

(I am still reading about, and watching videos on, TensorFlow.)

Most of the model in our brain is developed during childhood, from our parents and surrounding society.
Confirmation bias is a mental mechanism for protecting the major parts of that model.
If you are saying that ALL cognitive biases are COMPLETELY caused by one issue, and you are ABSOLUTELY CERTAIN about that, then I would need a tremendous amount of evidence to support such an astounding claim.

If you are only referring to confirmation bias, it would still require evidence, but not as much.

Given everything about human thought and behavior that we don't know, or suspect is possible but have not well established, I don't think a definite and proven answer covering every cognitive bias in fine detail exists or is likely to be found in my lifetime.

Several books by neuroscientists, such as A Mind So Rare by Merlin Donald, explore the limits of our well-supported understanding of the mind. In plain English: we don't know a lot, and while we can have hypotheses on many issues, we should not mistake them for facts.
 

Riddick

I clap to no man
Classic rhetoric has a lot of excellent ideas that I see as a necessary foundation of critical thinking. …

When a baby, a human being, is crying, that human being is saying something is wrong, and that baby is using an emotion.
 

ILove2Lurk

Lisbeth Salander
I have written on critical thinking and studied it for several years. I have discovered that some issues are the primary obstacles to overcome when trying to encourage people to use critical thinking.
I like that chart. Is there a book that it's based on? Something for people
with simple minds, like me? Perhaps a "critical thinking for dummies"
book that you'd recommend.

Also, has all your study on critical thinking actually helped you? What has
it done for you? How has it changed your life? I'm sincerely curious.

My solution to all the craziness and crazy ideas in the world is to leave the
TV off and don't go on the Internet much anymore for things out of my
expertise zone. And everything is just fine in my life. But I'm being a bit
extreme, LOL.

I've studied Photoshop and photography constantly for years and years
and gotten pretty good at both. So I do like to do very deep dives on
different subjects.

Cheers
 

Riddick

I clap to no man
I have written on critical thinking and studied it for several years. I have discovered that some issues are the primary obstacles to overcome when trying to encourage people to use critical thinking.



The simple fact is that if you give people a definition of critical thinking the almost universal response is a variation of "I am smart/ a naturally good critical thinker, other people need to study this. I don't have that problem, they do."



The fact is that critical thinking is a subject and requires a lot of study, questioning and hard work to even begin to learn. It is more like a martial art that requires lots of discipline, practice, concentration, devotion and gradual development than a trait like strength or speed that people have greatly varying degrees of natural endowment in.



People usually treat it as a trait that they have a high degree of naturally and that others, especially others they disagree with and in groups that oppose their beliefs have a low degree of.



But the obvious question is why ?



There are several factors that contribute to this result and I want to take on several of them, starting with some that are the building blocks that combine to create the overall effect.



I think that the fundamental grouping that we need to take on is folk psychology. We have ideas and assumptions about human behavior and minds that we get from parents, ourselves, peer groups and society overall and many of these ideas are, frankly, wrong.



We all hear stories from people like our parents and teachers that describe personal character and they are not all accurate for understanding human beings. We usually don't closely inspect the fundamental assumptions we have or the metaphors that frame how we think of everything else. As children our critical and independent thinking is so poorly developed that we are extremely vulnerable to indoctrination and so we unthinkingly accept the ideas our parents, teachers and peers give us and we usually don't realize the drawbacks of hanging onto these beliefs and unexamined assumptions as adults.



So, how do we correct this ? In part the answer is individual and shaped by our own life experiences.



I think a good starting point is to examine cognitive biases because they affect all of us and are not well laid out in folk psychology. Several contradict folk psychology, so learning them can undo false ideas and replace them with true relevant information.



I want to use quotes from an article to give a foundation on cognitive biases to examine. If you have a good understanding of the subject at the base level it can serve you well for evaluating the topic in other contexts.

















50 COGNITIVE BIASES TO BE AWARE OF SO YOU CAN BE THE VERY BEST VERSION OF YOU
By Mighty Max
The human brain is pretty tricky: While we think we know things, there’s a whole list of cognitive biases that can be gumming up the works. We’ve found 50 types of cognitive bias that come up nearly every day, in petty Facebook arguments, in horoscopes, and on the global stage. Along with their definitions, these are real-life examples of cognitive bias, from the subtle groupthink sabotaging your management meetings to the pull of anchoring making you spend way too much money at a store during a sale. Knowing about this list of biases can help you make more informed decisions and realize when you’re way off the mark.

WHAT IS COGNITIVE BIAS?
Let’s start off with a basic cognitive bias definition: It is a systematic error in cognitive processes (like thinking, perceiving, and memory) diverging from rationality, which can affect judgments. If we think of the human brain as a computer, cognitive bias basically is an error in the code, making us perceive the input differently or come up with an output that’s illogical.
But there are other types of bias as well that aren’t necessarily cognitive; for example, there’s the theory of social proofing, which is one of the more popular social psychological biases. Also, there can be cognitive theories that aren’t necessarily considered biases, or rather, they’re more like a network of common biases tangled together, like cognitive dissonance, which causes mental discomfort when we hold conflicting ideas or beliefs in our minds. Then, there’s the world-famous placebo effect, which can actually result in physiological changes.
End quote

So, let's focus on just the biases that directly make it hard for us to understand that we need to improve our own critical thinking and that other people are not entirely the problem.
I will quote the definitions from the article below and then comment. I am going to take on a category of biases and break it down into two groups. Here are the biases that I have chosen to start with.
The first group is ten biases that are essential to seeing my side as right, including my group, my peers, celebrities and historical figures I admire and authorities who I agree with.
It also helps me to see such people who I dislike or don't admire and see as different from me and my groups as wrong.

  1. Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
  2. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
  3. Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
  4. Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
  5. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
  6. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
  7. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
  8. False Consensus: We believe more people agree with us than is actually the case.
  9. Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
  10. Authority Bias: We trust and are more often influenced by the opinions of authority figures.
Blind Spot Bias is an obvious one to start with, we naturally are blinded by our biases and usually only learn about biases by observing them in others and realizing that we as human beings must also have them too, without the benefit of direct observations of the biases in ourselves. A lot of research on psychological priming and biases has good evidence to support the idea that we all have biases and are profoundly unaware of them in our direct observations of our daily lives. They are well hidden.

We have the vast array of biases that work to make us see ourselves and our peers as both rational and correct and as individuals with differences between us and them as both wrong and more similar to each other than they really are.

The Fundamental Attribution Error helps us to attribute actions of others to general traits and to see our actions as differentiated responses to different situations.

This is reinforced by Stereotyping as it is in agreement with it and further bolstered by Out Group Homogeneity Bias and helps us to see our group members as holding differences and out group members as holding the same beliefs, making seeing ALL of them as having incorrect beliefs easier. It also makes it easier to dismiss evidence that members of our group are wrong because we think of our group as having variations and so some people in it can be wrong without it lowering our opinion of the whole group.

In-Group Favoritism further helps this as we give our individual group members and the group overall the benefit of the doubt whenever possible. In other words we interpret ambiguous or gray area information favorably for our group members and don't give the benefit of the doubt to out-group members. We interpret ambiguous information unfavorably for them.

The Bandwagon Effect compounds this, as we see ourselves and our group as better than outsiders, including better at evaluating the truth. So a lot of our people believing an idea counts as rock-solid proof of our accuracy, while a lot of people outside our group believing an idea we disagree with only proves to us how wrong, stupid, irrational, evil, or backward the others are. Many of us believing functions as proof we must be right, but many of them disagreeing with our beliefs shows how consistently wrong they are.

Groupthink encourages us to conform to group norms and to go along with the most accepted ideas in the group, regardless of their truth or importance. Being believed and strongly embraced by the group serves as a substitute for being believed and embraced by ourselves.

False Consensus can boost the effect of many other group related biases as we think everyone else who is sensible believes something and everyone who is dead wrong disagrees.

The Availability Cascade, especially with red feeds and blue feeds tailored by algorithms that select the content we are most likely to agree with, serves to give us memes, articles, programs and videos that reinforce our beliefs rather than challenge them.
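As a toy illustration of that feed mechanism (no real platform's ranking algorithm is this simple, and the item names and stance scores below are invented), a feed that ranks items by predicted agreement with the user will naturally bury dissenting content:

```python
# Toy sketch of an agreement-driven feed, purely illustrative.
# Each item has a stance score in [-1, 1]; so does the user.
# The feed ranks items by closeness to the user's stance,
# so content the user is likely to disagree with sinks.

def rank_feed(items, user_stance):
    """Rank items so those closest to the user's stance come first."""
    return sorted(items, key=lambda item: abs(item["stance"] - user_stance))

items = [
    {"title": "Article A", "stance": -0.9},
    {"title": "Article B", "stance": 0.1},
    {"title": "Article C", "stance": 0.8},
]

# A user with stance +0.7 sees the agreeable article first.
feed = rank_feed(items, user_stance=0.7)
print([item["title"] for item in feed])  # ['Article C', 'Article B', 'Article A']
```

Run repeatedly over time, a loop like this only ever surfaces what the user already believes, which is exactly the repetition the Availability Cascade feeds on.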

The Authority Bias helps us see our beliefs and groups as proven right, because we recognize authorities who agree with us and our peers as valid but see authorities who disagree with us as invalid. Simply put, we use In-Group Favoritism to select the authorities that fit the needs of the group, based on agreement, and the other biases, such as Groupthink and the Bandwagon Effect, can work together when the authority and group agree without dissent. In free groups that are allowed to be open-minded and to develop and express differing views this is blunted to a degree, but in high-control groups, aka authoritarian groups, aka cults or destructive cults, the power of ALL these biases can combine, as obedience to authority and conformity to group norms without dissent creates a potent combination.

I wanted to zero in on these biases because they are crucial for understanding why, when critical thinking is brought up, we so easily see our group as right and other groups as wrong, or as more wrong. Seeing the others as more wrong helps us see them as having the problem and needing to improve their critical thinking first, while we usually never get around to it.

So, we can start with Blind Spot Bias, as it hides all the other biases. Then we can see how the group of biases that includes (but is not limited to) the Fundamental Attribution Error, Stereotyping, Out-Group Homogeneity Bias, In-Group Favoritism, the Bandwagon Effect, Groupthink, False Consensus, the Availability Cascade, and the Authority Bias works together: these biases keep us from seeing our own biases, lead us to see our side, our peers, our authorities, and ourselves as right because we are in our group, and lead us to see others outside our group who disagree with us as wrong. It may seem complicated, but I hope the descriptions I gave help, and you can always think of fictional and real examples. This is usually easiest with people and groups you strongly and passionately disagree with.

Just seeing how biases work and "compound" in anyone is difficult and a good start, BUT if you don't carry it through to see how your authorities, your peers and you yourself also do it, then you have just reinforced the biases with half-understood justifications.

It's easy to see that the "opposition" is wrong. Most of us do. It is harder to see how WE are wrong too. If you cannot see that we are wrong too, and dig into the details of when, how and why, then I frankly don't think you are going to be practicing critical thinking. Critical thinking expert Richard Paul, in some lectures, described pseudo critical thinkers who adopt a piece of critical thinking knowledge but don't understand the subject well enough to apply it correctly. This doesn't mean you need a PhD in critical thinking to apply it. It means some of the ideas and techniques are more complicated than a phrase or sentence or paragraph and require more knowledge to apply properly.

In one lecture Richard Paul described just applying something to support your own beliefs or argument as sophistry and pseudo critical thinking. Sometimes people think winning arguments or getting opponents to withdraw from debates in confusion is critical thinking but it isn't, not even close.

A person can learn about biases, logical fallacies, propaganda techniques and rhetoric, then face an opponent less educated in these ideas and bury them in terms they have to struggle through and unfamiliar aspects of debate, such as the burden of proof falling on the claimant. They can use flawed arguments and claims while tearing apart the same flaws in their opponent's arguments and claims. This is sophistry, insincere debate, and attorneys are notorious for it. Also politicians.

It is a kind of intellectual cheating in which you insist that the opposition follow the rules for carefully seeking the truth while you yourself only try to win, with no regard for honesty or the truth.

There was a break in the schools of philosophy a few thousand years ago: the sophists sought victory in debate and persuasion at any cost. Victory was their only goal. Many other philosophers saw both what they were doing and how they were doing it, and condemned the sophists as intellectual frauds. And they were right to do so.

It took a lot of work by a lot of people, probably an unimaginable number, to get us the understanding of our biases and of critical thinking that we can study today. I think it is well worth the time and effort, and worth being honest with you about it too.


50 COGNITIVE BIASES TO BE AWARE OF SO YOU CAN BE THE VERY BEST VERSION OF YOU
By Mighty Max
The human brain is pretty tricky: While we think we know things, there’s a whole list of cognitive biases that can be gumming up the works. We’ve found 50 types of cognitive bias that come up nearly every day, in petty Facebook arguments, in horoscopes, and on the global stage. Along with their definitions, these are real-life examples of cognitive bias, from the subtle groupthink sabotaging your management meetings to the pull of anchoring making you spend way too much money at a store during a sale. Knowing about this list of biases can help you make more informed decisions and realize when you’re way off the mark.

WHAT IS COGNITIVE BIAS?
Let’s start off with a basic cognitive bias definition: It is a systematic error in cognitive processes (like thinking, perceiving, and memory) diverging from rationality, which can affect judgments. If we think of the human brain as a computer, cognitive bias basically is an error in the code, making us perceive the input differently or come up with an output that’s illogical.
But there are other types of bias as well that aren’t necessarily cognitive; for example, there’s the theory of social proofing, which is one of the more popular social psychological biases. Also, there can be cognitive theories that aren’t necessarily considered biases, or rather, they’re more like a network of common biases tangled together, like cognitive dissonance, which causes mental discomfort when we hold conflicting ideas or beliefs in our minds. Then, there’s the world-famous placebo effect, which can actually result in physiological changes.
Let’s go into some common cognitive bias examples to really see how they work!

50 TYPES OF COMMON COGNITIVE BIASES
  1. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
  2. Self-Serving Bias: Our failures are situational, but our successes are our responsibility.
  3. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
  4. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
  5. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
  6. Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)
  7. Moral Luck: Better moral standing happens due to a positive outcome; worse moral standing happens due to a negative outcome.
  8. False Consensus: We believe more people agree with us than is actually the case.
  9. Curse of Knowledge: Once we know something, we assume everyone else knows it, too.
  10. Spotlight Effect: We overestimate how much people are paying attention to our behavior and appearance.
  11. Availability Heuristic: We rely on immediate examples that come to mind while making judgments.
  12. Defensive Attribution: As a witness who secretly fears being vulnerable to a serious mishap, we will blame the victim less if we relate to the victim.
  13. Just-World Hypothesis: We tend to believe the world is just; therefore, we assume acts of injustice are deserved.
  14. Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.
  15. Naïve Cynicism: We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do in their intentions/actions.
  16. Forer Effect (aka Barnum Effect): We easily attribute our personalities to vague statements, even if they can apply to a wide range of people.
  17. Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.
  18. Anchoring: We rely heavily on the first piece of information introduced when making decisions.
  19. Automation Bias: We rely on automated systems, sometimes trusting automated corrections so much that they override decisions that were actually correct.
  20. Google Effect (aka Digital Amnesia): We tend to forget information that’s easily looked up in search engines.
  21. Reactance: We do the opposite of what we’re told, especially when we perceive threats to personal freedoms.
  22. Confirmation Bias: We tend to find and remember information that confirms our perceptions.
  23. Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.
  24. Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.
  25. Belief Bias: We judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.
  26. Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
  27. Declinism: We tend to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.
  28. Status Quo Bias: We tend to prefer things to stay the same; changes from the baseline are considered to be a loss.
  29. Sunk Cost Fallacy (aka Escalation of Commitment): We invest more in things that have cost us something rather than altering our investments, even if we face negative outcomes.
  30. Gambler’s Fallacy: We think future possibilities are affected by past events.
  31. Zero-Risk Bias: We prefer to reduce small risks to zero, even if we can reduce more risk overall with another option.
  32. Framing Effect: We often draw different conclusions from the same information depending on how it’s presented.
  33. Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
  34. Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
  35. Authority Bias: We trust and are more often influenced by the opinions of authority figures.
  36. Placebo Effect: If we believe a treatment will work, it often will have a small physiological effect.
  37. Survivorship Bias: We tend to focus on those things that survived a process and overlook ones that failed.
  38. Tachypsychia: Our perceptions of time shift depending on trauma, drug use, and physical exertion.
  39. Law of Triviality (aka “Bike-Shedding”): We give disproportionate weight to trivial issues, often while avoiding more complex issues.
  40. Zeigarnik Effect: We remember incomplete tasks more than completed ones.
  41. IKEA Effect: We place higher value on things we partially created ourselves.
  42. Ben Franklin Effect: We like doing favors; we are more likely to do another favor for someone if we’ve already done a favor for them than if we had received a favor from that person.
  43. Bystander Effect: The more other people are around, the less likely we are to help a victim.
  44. Suggestibility: We, especially children, sometimes mistake ideas suggested by a questioner for memories.
  45. False Memory: We mistake imagination for real memories.
  46. Cryptomnesia: We mistake real memories for imagination.
  47. Clustering Illusion: We find patterns and “clusters” in random data.
  48. Pessimism Bias: We sometimes overestimate the likelihood of bad outcomes.
  49. Optimism Bias: We sometimes are over-optimistic about good outcomes.
  50. Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
Use our cognitive bias infographic as inspiration for becoming better and knowing more! You can even print it out and use it as a cognitive bias poster to encourage others to do the same.
End Quote
You can think it's all logic and that the solution is just to study logic, but it's not.

 

Mockingbird

Well-known member
I like that chart. Is there a book that it's based on? Something for people
with simple minds, like me? Perhaps, a "critical thinking for dummies"
book that you'd recommend.

Also, has all your study on critical thinking actually helped you? What has
it done for you? How has it changed your life? I'm sincerely curious.

My solution to all the craziness and crazy ideas in the world is to leave the
TV off and don't go on the Internet much anymore for things out of my
expertise zone. And everything is just fine in my life. But I'm being a bit
extreme, LOL.

I've studied Photoshop and photography constantly for years and years
and gotten pretty good at both. So I do like to do very deep dives on
different subjects.

Cheers
It has helped me a lot. I can see errors in my thinking and try to reduce them but understand that ALWAYS seeing all of them or eliminating all of them is impossible.

I am much calmer in many situations because I have doubts that cause me to pause and think through more possibilities than I would have before and to understand that I can always be wrong about anything no matter how certain I am that I am right.

It gives an order to things and encourages reflective thought, looking at issues from different perspectives, and trying to be humble and compassionate. If you don't see your ignorance as infinite and your knowledge as finite, and accept that you yourself hold thousands of incorrect beliefs you are not aware of, then in my opinion you aren't actually doing critical thinking, at least not beyond a very initial level. You have to accept your ignorance, your incorrect ideas, and your flawed perception and memory as a given to be using critical thinking, in my opinion. It encourages you to accept your flawed nature and then be sympathetic to others, for they have the same nature; if they hold incorrect ideas, it is because they have the same flawed nature you do, not because they are especially stupid, lazy or evil.

In my experience, reading dozens of books on psychology and critical thinking leads to seeing how you and others together have an imperfect nature: we can strive to be correct, strive to be decent, strive to learn truth, strive to only do good, and we are ultimately built by our very nature to be imperfect and to fail to some degree at each of these things.

If this knowledge shows for example how smart, sane, good people can fall for Scientology it also shows how these people can fall for other things like conspiracy theories or government propaganda or political or religious leaders who are unethical and dishonest and on and on and on.


It can strip away pretense to know that no amount of degrees or study or self-confidence can overcome our shared flawed nature.

If we were to accurately represent human intelligence and awareness of truth on a graph with many of us plotted, I think we would almost all fall on a straight or virtually straight line of nearly equal ratings, especially if rated somehow against being actually correct.

I think we just are much more similar on average than we are different, with possibly some notable exceptions.

The best expert on critical thinking I found is Richard Paul. He and his wife developed a critical thinking curriculum that is superb to me.

I detailed much of it at Mockingbird's Nest blog on Scientology and he has many superb YouTube videos.
 

Mockingbird

Well-known member

Harden Long

OSA no esta hermOSA
This is the only time I will endorse Occam's Razor as the ultimate salvation! Jeezus!!!
 

Mockingbird

Well-known member
This is the only time I will endorse Occam's Razor as the ultimate salvation! Jeezus!!!
You bring up an important point. Lots of intellectuals like Steven Pinker outright reject the immense, overwhelming majority of evidence that we are not purely rational actors: the stacks of experiments and research on psychological priming and influence, cognitive dissonance, cognitive biases, logical fallacies, and on and on. They claim that we are obviously rational, so all the evidence can be dismissed without consideration, about as unscientific an approach as possible. It's the personal incredulity fallacy - "I don't believe something, so it can't be true."

Unfortunately a simple explanation regarding cognitive biases doesn't address the reality that they both exist and impair our thinking.

If you don't want to deal with them that is an option but Occam's razor is not a scientific principle that negates a problem, even a problem with a lot of components.
 

Karakorum

Well-known member
You bring up an important point. Lots of intellectuals like Steven Pinker outright reject the immense, overwhelming majority of evidence that we are not purely rational actors: the stacks of experiments and research on psychological priming and influence, cognitive dissonance, cognitive biases, logical fallacies, and on and on. They claim that we are obviously rational, so all the evidence can be dismissed without consideration, about as unscientific an approach as possible. It's the personal incredulity fallacy - "I don't believe something, so it can't be true."

Unfortunately a simple explanation regarding cognitive biases doesn't address the reality that they both exist and impair our thinking.

If you don't want to deal with them that is an option but Occam's razor is not a scientific principle that negates a problem, even a problem with a lot of components.
There is also the cognitive self-reference problem, observed as far back as ancient Greek skepticism. I'll try to express it in concise modern English without all the usual frills and whistles:

"The list of cognitive biases is limited to those biases that we may not observe in ourselves, but we can observe in others.
By necessity it will not contain those cognitive biases whose very existence prevents us from noticing them - neither in ourselves, nor in others".
 

Mockingbird

Well-known member
The cognitive self-reference problem, in my opinion, is a problem of systemic causation. Numerous scholars have remarked on our difficulty in perceiving and understanding systemic causation. To understand behavior we must understand many factors that combine to influence it; to understand weather and then climate we must understand many factors, such as the types of wind and precipitation that occur, evaporation, and the effects of human beings and industries on the environment, and so on.

In his book Behave, Robert Sapolsky explains how genetics, childhood, culture and other factors, like trauma, diet and the environment, including air, water and poverty, interact to influence behavior.

I have taken on the problem as a systemic one. We know that with some techniques it is easier to find and confirm biases in others than in ourselves.

So, for example, there were priming experiments in which Asian women were primed either by considering their cultural background, to prime thinking of themselves as Asian, or by considering issues as women, to prime thinking of themselves as women, or were not primed at all, to provide a neutral baseline. They then took math tests, and on average the subjects primed to think of themselves as Asian did better, the subjects primed to think of themselves as women did worse, and the neutral group was in the middle.

This is considered evidence that they had internalized the stereotypes that Asians are good at math and women are bad at math, and that when primed, these stereotypes influenced their self-images and behavior.
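The pattern the experiment reports can be sketched with made-up numbers. To be clear, the scores and condition labels below are invented for illustration; they are not the study's actual data:

```python
# Illustrative sketch of the reported priming pattern.
# Scores and condition names are hypothetical, NOT the study data.
from statistics import mean

# Hypothetical math-test scores (percent correct) per priming condition.
scores = {
    "asian_identity_primed":  [82, 78, 85, 80, 79],
    "neutral_control":        [76, 74, 80, 75, 73],
    "female_identity_primed": [70, 72, 68, 74, 69],
}

means = {condition: mean(vals) for condition, vals in scores.items()}

# Reported ordering: Asian-primed > neutral control > female-primed.
for condition, m in sorted(means.items(), key=lambda kv: -kv[1]):
    print(f"{condition}: {m:.1f}")
```

The point of the comparison is the ordering of the group averages, not the individual scores: the same people, given different self-images to hold in mind, land in different places relative to the neutral baseline.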

Here is a link to one study on these priming effects.



So, with numerous experiments and studies, evidence for specific biases can be found. I have examined many of the techniques and results for these and feel there is sufficient evidence for a plausible hypothesis that biases exist regarding gender, race, age, wealth, cultural background and many other subjects.

Additionally, the studies described in the book Blindspot make a strong case for other biases that operate below our conscious awareness. The book The Political Brain explains how these subconscious prejudices can be activated and manipulated below our conscious awareness.

If we first see biases in others, we can then understand that they are active but hidden in ourselves, just as others cannot see them directly in themselves.

I feel it is a matter of getting an exterior view of a bias. In research on how people decide whom to promote, it has been found that we think we promote based on education or past performance, and will say so, but in reality we promote men, including incompetent and inexperienced men, much more than women, no matter how we justify it to ourselves. We also select people for leadership based on height and attractiveness instead of competence.

Numerous studies have found this, such as those detailed in Why Do So Many Incompetent Men Become Leaders? (And How to Fix It), a book by Tomas Chamorro-Premuzic.
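The general method behind such findings, auditing outcomes instead of trusting people's self-reports, can be sketched with invented records. Everything here (the groups "A" and "B", the scores, the outcomes) is hypothetical:

```python
# Toy audit: detecting a bias from outcomes rather than from what
# decision-makers say about themselves. All records are invented.

records = [
    # (group, qualification_score, promoted)
    ("A", 7, True), ("A", 5, True), ("A", 6, False), ("A", 8, True),
    ("B", 8, False), ("B", 7, False), ("B", 6, True), ("B", 9, True),
]

def mean_qualification(records, group):
    """Average qualification score of candidates in `group`."""
    quals = [q for g, q, _ in records if g == group]
    return sum(quals) / len(quals)

def promotion_rate(records, group):
    """Fraction of candidates in `group` who were promoted."""
    outcomes = [promoted for g, _, promoted in records if g == group]
    return sum(outcomes) / len(outcomes)

# Group B is, if anything, better qualified on paper...
print(mean_qualification(records, "A"))  # 6.5
print(mean_qualification(records, "B"))  # 7.5
# ...yet gets promoted less often. The bias leaves a trail in the
# data even though nobody reports it in their own reasoning.
print(promotion_rate(records, "A"))  # 0.75
print(promotion_rate(records, "B"))  # 0.5
```

That mismatch between stated criteria and measured outcomes is the "exterior view" of a bias: it is visible in the aggregate data even when it is invisible to every individual decision-maker.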

By seeing the strong evidence that OTHER PEOPLE, in situations we are not in, demonstrate bias, we can understand that if they are frequently or consistently biased regarding the separate components of life, such as gender, age, race, wealth, in-groups and out-groups, religion, politics, and on and on, then it becomes obvious that we must either see ourselves as outside of humanity or as humans who are subject to all these biases too.
 
Last edited:

Karakorum

Well-known member
The cognitive self-reference problem, in my opinion, is a problem of systemic causation. Numerous scholars have remarked on our difficulty in perceiving and understanding systemic causation. To understand behavior we must understand many factors that combine to influence it; to understand weather and then climate we must understand many factors, such as the types of wind and precipitation that occur, evaporation, and the effects of human beings and industries on the environment, and so on.

In his book Behave, Robert Sapolsky explains how genetics, childhood, culture and other factors, like trauma, diet and the environment, including air, water and poverty, interact to influence behavior.

I have taken on the problem as a systemic one. We know that with some techniques it is easier to find and confirm biases in others than in ourselves.

So, for example, there were priming experiments in which Asian women were primed either by considering their cultural background, to prime thinking of themselves as Asian, or by considering issues as women, to prime thinking of themselves as women, or were not primed at all, to provide a neutral baseline. They then took math tests, and on average the subjects primed to think of themselves as Asian did better, the subjects primed to think of themselves as women did worse, and the neutral group was in the middle.

This is considered evidence that they had internalized the stereotypes that Asians are good at math and women are bad at math, and that when primed, these stereotypes influenced their self-images and behavior.
I meant something slightly different. By "self-reference" I did not mean a person referring to himself.
I meant it in terms of the bias that self-references itself. So a bias that itself prevents us from noticing this very bias, regardless if we try to observe it in ourselves or in others.

A bias-obscuring-bias is a cognitive catch 22: "To observe and describe bias X (in yourself or others), you must first be free of bias X. No person is free from bias X, hence it will never be observed or defined and it will continue to skew any and all research".

The potential worst thing is that the number of such bias-obscuring-biases might be very high. We will, by their very definition, never know.
 
  • Like
Reactions: M&M

Mockingbird

Well-known member
There is also the cognitive self-reference problem, observed as far back as ancient Greek skepticism. I'll try to express it in concise modern English without all the usual frills and whistles:

"The list of cognitive biases is limited to those biases that we may not observe in ourselves, but we can observe in others.
By necessity it will not contain those cognitive biases whose very existence prevents us from noticing them - neither in ourselves, nor in others".
Regarding biases we don't observe in anyone, again, systemic causation is our best shot. We can sometimes figure out that something must exist, like dark matter or dark energy or black holes, because something is missing, not because we have observed it. Now, I don't know if any of those three things actually exist.

Scientists keep going back and forth with all of them and pointing out problems if they do exist, so maybe they don't exist but SOMETHING does some of the things that dark matter, dark energy and black holes are supposed to do, even if those three things don't actually exist.

So, for example, economics used to hold the idea that people are rational actors. People observed human beings and found that we don't conform to models that cast us as unbiased, purely rational actors. So psychologists looked at people, saw the biases and behavior, and that created behavioral economics.

The book Thinking, Fast and Slow by Daniel Kahneman details this as do many, many others.

So, we can find biases that are difficult to observe, because biases exert influence. The influence, like the gravity of a black hole, or the expansion of some things and the tendency of other things to stay together, can show that something, or maybe several things, seems to do some of the things that dark matter and dark energy are supposed to do.

So, if biases are influential they leave a trail of breadcrumbs to follow and if they exert no influence then they might be inconsequential and leave no trail.
 

Mockingbird

Well-known member
I meant something slightly different. By "self-reference" I did not mean a person referring to himself.
I meant it in terms of the bias that self-references itself. So a bias that itself prevents us from noticing this very bias, regardless if we try to observe it in ourselves or in others.

A bias-obscuring-bias is a cognitive catch 22: "To observe and describe bias X (in yourself or others), you must first be free of bias X. No person is free from bias X, hence it will never be observed or defined and it will continue to skew any and all research".

The potential worst thing is that the number of such bias-obscuring-biases might be very high. We will, by their very definition, never know.
There are biases we all have that are difficult to observe in others.


There is one bias that is difficult to fully consider in others and ourselves that I have written on a bit.

The prejudice to believe in free will is deeply hardwired into everyone.

I call it a problem of infinite regress.

As an example a person, Joe, believes free will doesn't exist at all to any degree.

Joe writes an article criticizing Lou for believing in free will, calling it a foolish choice. But if Joe is correct, Lou has no choice but to believe in free will, because Lou has no free will regarding anything; further, Joe has no free will regarding his criticism of Lou, because no one has any freedom of any kind.

This creates an infinite regress: with each step you take back in perspective, you seem to return to the outside observer perspective. Apparently our natural perspective includes a deep prejudice regarding free will; we instantly perceive it in ourselves and others. Many of the cognitive biases listed above instinctively rely on assigning responsibility based on an assumption of free will that varies across situations.

I have a tremendous amount of difficulty in getting people to understand that just because we assume that free will exists doesn't mean it actually does.
 

Riddick

I clap to no man
I have written on critical thinking and studied it for several years. I have discovered that some issues are the primary obstacles to overcome when trying to encourage people to use critical thinking.



The simple fact is that if you give people a definition of critical thinking the almost universal response is a variation of "I am smart/ a naturally good critical thinker, other people need to study this. I don't have that problem, they do."



The fact is that critical thinking is a subject and requires a lot of study, questioning and hard work to even begin to learn. It is more like a martial art that requires lots of discipline, practice, concentration, devotion and gradual development than a trait like strength or speed that people have greatly varying degrees of natural endowment in.



People usually treat it as a trait that they have a high degree of naturally and that others, especially others they disagree with and in groups that oppose their beliefs have a low degree of.



But the obvious question is why?



There are several factors that contribute to this result and I want to take on several of them, starting with some that are the building blocks that combine to create the overall effect.



I think that the fundamental grouping that we need to take on is folk psychology. We have ideas and assumptions about human behavior and minds that we get from parents, ourselves, peer groups and society overall and many of these ideas are, frankly, wrong.



We all hear stories from people like our parents and teachers that describe personal character and they are not all accurate for understanding human beings. We usually don't closely inspect the fundamental assumptions we have or the metaphors that frame how we think of everything else. As children our critical and independent thinking is so poorly developed that we are extremely vulnerable to indoctrination and so we unthinkingly accept the ideas our parents, teachers and peers give us and we usually don't realize the drawbacks of hanging onto these beliefs and unexamined assumptions as adults.



So, how do we correct this? In part the answer is individual and shaped by our own life experiences.



I think a good starting point is to examine cognitive biases because they affect all of us and are not well laid out in folk psychology. Several contradict folk psychology, so learning them can undo false ideas and replace them with true relevant information.



I want to use quotes from an article to give a foundation on cognitive biases to examine. If you have a good understanding of the subject at the base level it can serve you well for evaluating the topic in other contexts.



50 COGNITIVE BIASES TO BE AWARE OF SO YOU CAN BE THE VERY BEST VERSION OF YOU
By Mighty Max
The human brain is pretty tricky: While we think we know things, there’s a whole list of cognitive biases that can be gumming up the works. We’ve found 50 types of cognitive bias that come up nearly every day, in petty Facebook arguments, in horoscopes, and on the global stage. Along with their definitions, these are real-life examples of cognitive bias, from the subtle groupthink sabotaging your management meetings to the pull of anchoring making you spend way too much money at a store during a sale. Knowing about this list of biases can help you make more informed decisions and realize when you’re way off the mark.

WHAT IS COGNITIVE BIAS?
Let’s start off with a basic cognitive bias definition: It is a systematic error in cognitive processes (like thinking, perceiving, and memory) diverging from rationality, which can affect judgments. If we think of the human brain as a computer, cognitive bias basically is an error in the code, making us perceive the input differently or come up with an output that’s illogical.
But there are other types of bias as well that aren’t necessarily cognitive; for example, there’s the theory of social proofing, which is one of the more popular social psychological biases. Also, there can be cognitive theories that aren’t necessarily considered biases, or rather, they’re more like a network of common biases tangled together, like cognitive dissonance, which causes mental discomfort when we hold conflicting ideas or beliefs in our minds. Then, there’s the world-famous placebo effect, which can actually result in physiological changes.
End quote

So, let's focus on just the biases that directly make it hard for us to see that we need to improve our own critical thinking and that other people are not entirely the problem.
I will quote the definitions from the article below and then comment. I am going to take on a category of biases and break it down into two groups. Here are the biases I have chosen to start with.
The first group is ten biases that are essential to seeing my side as right, including my group, my peers, the celebrities and historical figures I admire, and the authorities I agree with.
It also helps me to see people I dislike, don't admire, or regard as different from me and my groups as wrong.

  1. Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
  2. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
  3. Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
  4. Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
  5. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
  6. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
  7. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
  8. False Consensus: We believe more people agree with us than is actually the case.
  9. Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
  10. Authority Bias: We trust and are more often influenced by the opinions of authority figures.
Blind Spot Bias is an obvious one to start with. We are naturally blinded by our own biases, and we usually learn about biases only by observing them in others and then realizing that we, as human beings, must have them too, without the benefit of directly observing the biases in ourselves. A lot of research on psychological priming and biases provides good evidence that we all have biases and are profoundly unaware of them in our daily lives. They are well hidden.

We have a vast array of biases that work together to make us see ourselves and our peers as both rational and correct, and to see people who differ from us as both wrong and more similar to each other than they really are.

The Fundamental Attribution Error leads us to attribute the actions of others to general character traits while seeing our own actions as differentiated responses to different situations.

This is reinforced by Stereotyping, which works in the same direction, and further bolstered by Outgroup Homogeneity Bias, which helps us see our own group members as varied and out-group members as all holding the same beliefs, making it easier to see ALL of them as having incorrect beliefs. It also makes it easier to dismiss evidence that members of our group are wrong, because we think of our group as having variations, so some people in it can be wrong without lowering our opinion of the whole group.

In-Group Favoritism helps further, as we give our individual group members, and the group overall, the benefit of the doubt whenever possible. In other words, we interpret ambiguous or gray-area information favorably for our group members, while denying that benefit to out-group members and interpreting ambiguous information unfavorably for them.

The Bandwagon Effect compounds this, since we see ourselves and our group as better than those outside it, including better at evaluating the truth. So many of our people believing an idea counts as rock-solid proof of our accuracy, while many people outside our group believing an idea we disagree with serves to prove how wrong, stupid, irrational, evil, or backward the others are. Widespread belief among us functions as proof we must be right, but widespread disagreement among them only shows how consistently wrong they are.

Groupthink encourages us to conform to group norms and to go along with the most accepted ideas in the group, regardless of their truth or importance. An idea's being believed and strongly embraced by the group serves as a substitute for our examining and embracing it ourselves.

False Consensus can boost the effect of many other group-related biases, as we come to think that everyone sensible believes what we believe and that everyone who disagrees is dead wrong.

The Availability Cascade, especially with red feeds and blue feeds tailored by algorithms that select the content we are most likely to agree with, serves up memes, articles, programs and videos that reinforce our beliefs rather than challenge them.

The Authority Bias helps us see our beliefs and groups as proven right, because we recognize authorities who agree with us and our peers as valid but dismiss authorities who disagree with us as invalid. Simply put, we use In-Group Favoritism to select the authorities that fit the needs of the group, based on agreement, and the other biases, such as Groupthink and the Bandwagon Effect, can work together when the authority and the group agree without dissent. In free groups that are allowed to be open-minded and to develop and express differing views this is blunted to a degree, but in high-control groups, also known as authoritarian groups, cults or destructive cults, the power of ALL these biases can combine, as obedience to authority and conformity to group norms without dissent create a potent combination.

I wanted to zero in on these biases because they are crucial to understanding why, when critical thinking is brought up, we so easily see our group as right and other groups as wrong, or as more wrong. Seeing the others as more wrong helps us to see them as having the problem and needing to improve their critical thinking first, while we usually never get around to it ourselves.

So, we can start with Blind Spot Bias, since it hides all the other biases. Then we can see how the group of biases that includes (but is not limited to) the Fundamental Attribution Error, Stereotyping, Outgroup Homogeneity Bias, In-Group Favoritism, the Bandwagon Effect, Groupthink, False Consensus, the Availability Cascade, and the Authority Bias works together: these biases keep us from seeing our own biases, lead us to see our side, our peers, our authorities and ourselves as right because we are in our groups, and lead us to see those outside our groups who disagree with us as wrong. It may seem complicated, but I hope the descriptions I gave help, and you can always think of fictional and real examples. This is usually easiest with people and groups you strongly and passionately disagree with.

Just seeing how biases work and "compound" in anyone is difficult, and a good start, BUT if you don't carry it through to see how your authorities, your peers and you yourself also do it, then you have just reinforced the biases with half-understood justifications.

It's easy to see that the "opposition" is wrong; most of us do. It is harder to see how WE are wrong too. If you cannot see that we are wrong too, and dig into the details of when, how and why, then I frankly don't think you are going to be practicing critical thinking. Critical thinking expert Richard Paul, in some lectures, described pseudo critical thinkers who adopt a piece of critical thinking knowledge but don't have enough understanding of the subject in general, or of the information they do have, to correctly apply it. This doesn't mean you need a PhD in critical thinking to apply it. It means some of the ideas and techniques are more complicated than a phrase, sentence or paragraph and require more knowledge to properly apply.

In one lecture Richard Paul described just applying something to support your own beliefs or argument as sophistry and pseudo critical thinking. Sometimes people think winning arguments or getting opponents to withdraw from debates in confusion is critical thinking but it isn't, not even close.

A person can learn about biases, logical fallacies, propaganda techniques and rhetoric, then face an opponent less educated in these ideas and bury them in terms they have to struggle through and in unfamiliar aspects of debate, such as the burden of proof falling on the claimant. They can use flawed arguments and claims while tearing apart the same flaws in their opponent's arguments and claims. This is sophistry, insincere debate, and attorneys are notorious for it. So are politicians.

It is a kind of intellectual cheating in which you insist that the opposition follow rules for carefully seeking the truth while you yourself only try to win, with no regard for honesty or the truth.

There was a break between schools of philosophy a few thousand years ago: the sophists sought victory in debate and persuasion at any cost. Victory was their only goal. Many other philosophers saw both what they were doing and how they were doing it, and condemned the sophists as intellectual frauds. They were right to do so.

It took a lot of work by a lot of people, probably an unimaginable number, to reach the understanding of our biases and of critical thinking that we can study today. I think it is well worth the time and effort, and worth being honest with you about it too.

50 COGNITIVE BIASES TO BE AWARE OF SO YOU CAN BE THE VERY BEST VERSION OF YOU
By Mighty Max
Let’s go into some common cognitive bias examples to really see how they work!

50 TYPES OF COMMON COGNITIVE BIASES
  1. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
  2. Self-Serving Bias: Our failures are situational, but our successes are our responsibility.
  3. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
  4. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
  5. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
  6. Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)
  7. Moral Luck: Better moral standing happens due to a positive outcome; worse moral standing happens due to a negative outcome.
  8. False Consensus: We believe more people agree with us than is actually the case.
  9. Curse of Knowledge: Once we know something, we assume everyone else knows it, too.
  10. Spotlight Effect: We overestimate how much people are paying attention to our behavior and appearance.
  11. Availability Heuristic: We rely on immediate examples that come to mind while making judgments.
  12. Defensive Attribution: As a witness who secretly fears being vulnerable to a serious mishap, we will blame the victim less if we relate to the victim.
  13. Just-World Hypothesis: We tend to believe the world is just; therefore, we assume acts of injustice are deserved.
  14. Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.
  15. Naïve Cynicism: We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do in their intentions/actions.
  16. Forer Effect (aka Barnum Effect): We easily attribute our personalities to vague statements, even if they can apply to a wide range of people.
  17. Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.
  18. Anchoring: We rely heavily on the first piece of information introduced when making decisions.
  19. Automation Bias: We rely on automated systems, sometimes trusting too much in the automated correction of actually correct decisions.
  20. Google Effect (aka Digital Amnesia): We tend to forget information that’s easily looked up in search engines.
  21. Reactance: We do the opposite of what we’re told, especially when we perceive threats to personal freedoms.
  22. Confirmation Bias: We tend to find and remember information that confirms our perceptions.
  23. Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.
  24. Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.
  25. Belief Bias: We judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.
  26. Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
  27. Declinism: We tend to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.
  28. Status Quo Bias: We tend to prefer things to stay the same; changes from the baseline are considered to be a loss.
  29. Sunk Cost Fallacy (aka Escalation of Commitment): We invest more in things that have cost us something rather than altering our investments, even if we face negative outcomes.
  30. Gambler’s Fallacy: We think future possibilities are affected by past events.
  31. Zero-Risk Bias: We prefer to reduce small risks to zero, even if we can reduce more risk overall with another option.
  32. Framing Effect: We often draw different conclusions from the same information depending on how it’s presented.
  33. Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
  34. Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
  35. Authority Bias: We trust and are more often influenced by the opinions of authority figures.
  36. Placebo Effect: If we believe a treatment will work, it often will have a small physiological effect.
  37. Survivorship Bias: We tend to focus on those things that survived a process and overlook ones that failed.
  38. Tachypsychia: Our perceptions of time shift depending on trauma, drug use, and physical exertion.
  39. Law of Triviality (aka “Bike-Shedding”): We give disproportionate weight to trivial issues, often while avoiding more complex issues.
  40. Zeigarnik Effect: We remember incomplete tasks more than completed ones.
  41. IKEA Effect: We place higher value on things we partially created ourselves.
  42. Ben Franklin Effect: We like doing favors; we are more likely to do another favor for someone if we’ve already done a favor for them than if we had received a favor from that person.
  43. Bystander Effect: The more other people are around, the less likely we are to help a victim.
  44. Suggestibility: We, especially children, sometimes mistake ideas suggested by a questioner for memories.
  45. False Memory: We mistake imagination for real memories.
  46. Cryptomnesia: We mistake real memories for imagination.
  47. Clustering Illusion: We find patterns and “clusters” in random data.
  48. Pessimism Bias: We sometimes overestimate the likelihood of bad outcomes.
  49. Optimism Bias: We sometimes are over-optimistic about good outcomes.
  50. Blind Spot Bias: We don’t think we have bias, and we see it in others more than ourselves.
Use our cognitive bias infographic as inspiration for becoming better and knowing more! You can even print it out and use it as a cognitive bias poster to encourage others to do the same.
End Quote
Let me ask you this question: how do you think Hubbard got people involved in Dianetics and Scientology?
 

Mockingbird

Well-known member
Let me ask you this question: how do you think Hubbard got people involved in Dianetics and Scientology?

Hubbard used a variety of techniques that he pursued for a variety of reasons. Partly his behavior was that of a malignant narcissist, or traumatic narcissist, as described by Daniel Shaw in the book Traumatic Narcissism. Additionally, his behavior follows the pattern a guru displays as described by Robert Jay Lifton in books such as Thought Reform and the Psychology of Totalism and, more recently, in his description of solipsistic reality in his book Losing Reality.

Plainly, the behavior of cult leaders often fits their psychology, and the relationships they pursue follow the pattern described in Terror, Love and Brainwashing by Alexandra Stein.

This explains a significant portion of what Hubbard did. I certainly think there is merit to the hypothesis Jon Atack has put forward: that Hubbard plagiarized hundreds of hypnotic techniques and tried to covertly mentally enslave humanity.

This topic is worth serious and in-depth study of hypnosis and the works of cult experts. In addition to those I already mentioned, there are the works of experts like Steven Hassan in books such as Freedom of Mind, along with Cults Inside Out by Rick Alan Ross, Cults In Our Midst by Margaret Singer, and Take Back Your Life by Janja Lalich.

I have written over four hundred posts on Scientology at the Mockingbird's Nest blog on Scientology.

You and everyone else are certainly welcome to read all the posts if you like.
 