The Knowledge Illusion


This post is on the book The Knowledge Illusion by Steven Sloman (a cognitive scientist and professor at Brown University) and Philip Fernbach (a cognitive scientist and professor of marketing at the University of Colorado's Leeds School of Business).

I have already written on numerous other books on psychology, social psychology, critical thinking, cognitive dissonance theory and related topics, but discovered this one and feel it plays a complementary and much needed role. It helps to explain a huge number of "hows" and "whys" across all of those subjects.

I have generally written primarily about Scientology, after being in Scientology for twenty-five years, and have tried to evaluate information from other subjects both for its own sake and against the experience of being in Scientology, to see what understanding, if any, it can bring to that.

It is a niche, granted, but it has been my niche.

With The Knowledge Illusion I found a book that presents an entire hypothesis and analysis for a concept I kept running into, in bits and variations, in different works and even different subjects: we as individuals have some knowledge, but there is a lot of evidence that it is not remotely a match for what our intuition tells us it is. In fact, our intuition itself, and our habit of judging the accuracy of a belief by our certainty in it, is one of our strongest and best hidden flaws in our efforts to be rational, that is to say accurate, meaning perceiving and understanding things as they truly are.

I kept finding that the truth, as found in a lot of research with real studies and experiments by many people, is that we human beings habitually assign causes to events and behaviors and are generally extremely confident in our judgments on such matters. Looked at closely, it seems that we are terrible at getting it right but undeterred by failure in understanding the motives of others, and equally inaccurate, and equally undeterred by failure, in understanding ourselves.

It is one of the most counterintuitive discoveries in science. It took a lot of work by a lot of people to establish, and more work still to get across that it is true for me and you, regardless of intelligence, education or any other factor.

The Knowledge Illusion explores far more than the "high certainty, low accuracy" blind spot we have regarding motivations in ourselves and others. But that blind spot kept coming up over and over in good books on psychology, neuroscience, critical thinking and other subjects. A lot of roads lead to the same discovery.

Scientology holds, well, the opposite idea from what the research and evidence in all these other subjects supports. In Scientology, certainty is knowledge. Scientology founder Ronald Hubbard promoted the idea that being certain was more important than facts. He also redefined reality as agreement, rather than what actually exists, which is what reality is, or was before Hubbard's redefinition for his own purposes.

The Knowledge Illusion takes on many of the difficult to understand and contradictory aspects of human nature. Scientology attempts to explain the same things, but I think those attempts are failures.

If you are a Scientologist or ex-Scientologist, I invite you to read this and consider the ideas presented here, especially the ones that have scientific evidence and research behind them, in comparison to the ideas from Scientology. Decide for yourself.

If you were never in Scientology and want to understand how people can believe things like Scientology, capable in some ways but not in others, then I hope this helps to answer how and why. And it applies to far, far more than Scientology; in life most of us deal with far more than Scientology.

The authors open the introduction with the story of an error in one of the United States government's fusion bomb tests. The device, called Shrimp and detonated in the test code-named Castle Bravo, was about three times as powerful as the scientists expected. It killed some people, caused radiation sickness and contaminated several inhabited areas.

The authors started out "This story illustrates a fundamental paradox of humankind. The human mind is both genius and pathetic, brilliant and idiotic." They went on "Each of us is error-prone, sometimes irrational, and often ignorant."

"It is incredible that we have developed governance systems and economies that provide the comforts of modern life even though most of us have only a vague sense of how those systems work."

"How is it that people can simultaneously bowl us over with their ingenuity and disappoint us with their ignorance? How have we mastered so much despite how limited our understanding often is?" (page 3)

I found the answers they provide worth serious consideration. I hope you will too.


In the introduction the authors take on the origin of cognitive science and explain how, starting in the 1950s, the field explored how we think and act.

The authors have spent years researching cognitive science and remarked "We have seen directly that the history of cognitive science has not been a steady march toward a conception of how the human mind is capable of amazing feats. Rather, a good chunk of what cognitive science has taught us over the years is what individual humans can't do - what our limitations are." (Page 4)

"The darker side of cognitive science is a series of revelations that human capacity is not all that it seems, that most people are highly constrained in how they work and what they can achieve. There are severe limits on how much information a person can process (that's why we often forget someone's name seconds after being introduced). People often lack skills that seem basic, like evaluating how risky an action is, and it's not clear they can be learned (hence many of us - authors included - are absurdly scared of flying, one of the safest modes of transports available). Perhaps most important, individual knowledge is remarkably shallow, only scratching the true surface of the complexity of the world, and yet we often don't realize how little we understand." (Page 4)

"The result is that we are often overconfident, sure we are right about things we know little about." (Page 4)

Now that is a lot to take in, and it leads to an obvious question, one people have put to me repeatedly when I write about the limitations of the human mind: how do we know so much and have so much technology and art and science if we are so limited? The achievements of humanity, and the confidence an individual can have in their personal knowledge, seem to contradict this, but upon closer examination they don't.

The authors explain one part "The human mind is not like a desktop computer, designed to hold reams of information. The mind is a flexible problem solver that evolved to extract only the most useful information to guide decisions in new situations. As a consequence, individuals store very little detailed information about the world in their heads. In that sense, people are like bees and society a beehive: Our intelligence resides not in individual brains but in the collective mind. To function, individuals rely not only on knowledge stored within our skulls but also on knowledge stored elsewhere: in our bodies, in the environment and especially in other people. When you put it all together, human thought is incredibly impressive. But it is a product of a community, not of any individual alone." (Page 5)

The authors described the Castle Bravo nuclear weapons test of 1954 and how it, much like the Manhattan Project before it, required thousands of people including engineers, physicists, doctors, nurses and so on. No one person understood it all.

As a useful exercise the authors explain that modern planes and cars are too complicated for most of us to understand. I confess I don't know how they work. Not even a little bit. I know that they exist and work but have no clue how.

The authors then demonstrate their point further by asking if we as a reader know how modern toilets work. I again confess I don't know how. They even have a few paragraphs to explain it but they leave out so much that I still don't actually know how a toilet works even after reading the answer.

They go on to explain a complete understanding of toilets would include a lot more and get into things like why people buy certain toilets and economics and on and on.

They comment "Nobody could be a master of every facet of even a single thing. Even the simplest objects require complex webs of knowledge to manufacture and use." (Page 8)

Then they contrast really complicated things like bacteria, trees, hurricanes, love and reproduction. They mention how most of us can't tell how a coffeemaker works, how glue holds paper together or how the focus on a camera works.

If we look around a modern workplace or home there are hundreds of things we usually do not understand. Computers and lights and machines and architecture and on and on.

"Our point is not that people are ignorant. It's that people are more ignorant than they think they are. We all suffer, to a greater or lesser extent, from an illusion of understanding, an illusion that we understand how things work when in fact our understanding is meager." (Page 8)

"We wager that, except for a few areas that you've developed expertise in, your level of knowledge about the causal mechanisms that control not only devices, but the mechanisms that determine how events begin, how they unfold, and how one event leads to another is relatively shallow." (Page 9)

"We can't possibly understand everything, and the sane among us don't even try. We rely on abstract knowledge, vague and unanalyzed. We've all seen the exceptions - people who cherish detail and love to talk about it at great length, sometimes in fascinating ways. And we all have domains in which we are experts, in which we know a lot in exquisite detail. But on most subjects, we connect only abstract bits of information, and what we know is little more than a feeling of understanding we can't really unpack. In fact, most knowledge is little more than a bunch of associations, high-level links between objects or people that aren't broken down into detailed stories.

So why don't we realize the depth of our ignorance? Why do we think we understand things deeply, that we have systemic webs of knowledge that make sense of everything, when the reality is so different? Why do we live in an illusion of understanding?" (Page 10)

These are big questions, profound questions. The authors believe thinking evolved to help us act. Lots of animals take actions and have likely been doing so for far longer than thought has existed.

"Thought allows us to select from among a set of possible actions by predicting the effects of each action and by imagining how the world would be if we had taken different actions in the past. " (Page 11)

"We will see that humans specialize in reasoning about how the world works, about causality. Predicting the effects of action requires reasoning about how causes produce effects, and figuring out why something happened requires reasoning about which causes are likely to have produced an effect. This is what the mind is designed to do. Whether we are thinking about physical objects, social systems, our pet dog - whatever - our expertise is in determining how actions and other causes produce effects. We know that kicking a ball will send it flying, but kicking a dog will cause pain. Our thought processes, our language, and our emotions are all designed to engage causal reasoning to help us act in reasonable ways.

This makes human ignorance all the more surprising. If causality is so critical to selecting the best actions, why do individuals have so little detailed knowledge about how the world works? It's because thought is masterful at extracting only what is needed and filtering out everything else. " (Page 11 - Page 12)

"Your causal understanding is limited to only what you need to know: how to make the thing work (with any luck you've mastered that)." (Page 12)

So the authors see us as individuals as being very limited in our knowledge and usually very unaware of how little we each really know. They see thinking as a tool to help us carry out actions. If we are good at picking which actions to take, and at taking them, we have a better chance of surviving in evolutionary terms. So a function of thinking is modeling the world just well enough to choose and carry out the actions that are better for survival than the other options. Thought is designed to predict possible futures and work toward the more desirable, achievable ones in terms of survival.

The enormous amount of the world we don't know or understand is treated mostly as irrelevant to making the decisions required for action, especially if we have enough knowledge to make decisions.

This is a profound proposal - we don't need the irrelevant and treat much of what we don't know as irrelevant. It is an amazing revelation to me that we as individuals know so very little because we focus on just knowing enough to get by. I know just enough about my car to drive it in good conditions and if it is in good order. I know just enough about my job to do most of what I need to but sometimes have to ask questions. I know just enough about the vast majority of things that I deal with to engage with them on a very superficial level. Wow.


"We would not be such competent thinkers if we had to rely only on the limited knowledge stored in our heads and our facility for causal reasoning. The secret to our success is that we live in a world in which knowledge is all around us. It is in the things we make, in our bodies and workspaces, and in other people. We live in a community of knowledge.

We have access to huge amounts of knowledge that sit in other people's heads: We have our friends and family who each have their little domains of expertise. We have experts that we can contact to, say, fix our dishwasher when it breaks down for the umpteenth time. We have professors and talking heads on television to inform us about events and how things work. We have books, and we have the richest resource of all time at our fingertips, the Internet." (Page 13)

The authors go on to explain how things themselves embody knowledge: many can be taken apart to show us how to fix them, or handled to show us how to use them. Traveling around a city shows you how it is laid out.

Modern life includes what scientists call a division of cognitive labor. You can find information much more easily and don't need to remember as much as before, only where some of it is stored.

The division has always existed. No one person knows, for example, every song, how to play every instrument, how to do everything in construction or every dish that could be cooked. The labor has been divided up, even the cognitive labor within a single profession.

"So we collaborate. That's a major benefit of living in social groups, to make it easier to share our skills and knowledge. It's not surprising that we fail to identify what's in our heads versus what's in others', because we're generally - perhaps always - doing things that involve both." (Page 14)

"Sharing skills and knowledge is more sophisticated than it sounds. Human beings don't merely make individual contributions to a project, like machines operating in an assembly line. Rather, we are able to work together, aware of others and what they are trying to accomplish. We pay attention together and we share goals. In the language of cognitive science, we share intentionality. This is a form of collaboration that you don't see in other animals. We actually enjoy sharing our mind space with others. In one form, it's called playing." (Page 14)

"You now have the background you need to understand the origin of the knowledge illusion. The nature of thought is to seamlessly draw on knowledge wherever it can be found, inside and outside of our own heads. We live under the knowledge illusion because we fail to draw the line between what is inside and outside of our own heads. And we fail because there is no sharp line. So we frequently don't know what we don't know." (Page 15)

"Instead of appreciating complexity, people tend to affiliate with one or another social dogma. Because our knowledge is enmeshed with that of others, the community shapes our beliefs and attitudes. It is so hard to reject an opinion shared by our peers that too often we don't even try to evaluate claims based on the merits. We let our group do our thinking for us. Appreciating the communal nature of knowledge should make us more realistic about what's determining our beliefs and values. " (Page 16)

"Appreciating the communal nature of knowledge can reveal biases in how we see the world. People love heroes. We glorify individual strength, talent and good looks. Our movies and books idolize characters who, like Superman, can save the planet all by themselves. TV dramas present brilliant but understated detectives who both solve the crime and make the climactic final arrest after a flash of insight. Individuals are given credit for major breakthroughs. Marie Curie is treated as if she worked alone to discover radioactivity, Newton as if he discovered the laws of motion in a bubble. All the successes of the Mongols in the twelfth and thirteenth century are attributed to Genghis Khan, and all the evils of Rome during the time of Jesus are often identified with a single person, Pontius Pilate.

The truth is that in the real world, nobody operates in a vacuum. Detectives have teams who attend meetings and think and act as a group. Scientists not only have labs with students who contribute critical ideas, but also have colleagues, friends and nemeses who are doing similar work, thinking similar thoughts, and without whom the scientist would get nowhere. And then there are other scientists who are working on different problems, sometimes in different fields, but nevertheless set the stage through their own findings and ideas. Once we start appreciating that knowledge isn't all in the head, that it's shared within a community, our heroes change. Instead of focusing on the individual, we begin to focus on a larger group. " (Page 17)

" There are other implications too. Because we think communally, we tend to operate in teams. This means that the contributions we make as individuals depend more on our ability to work with others than on our individual horsepower. Individual intelligence is overrated. It also means that we learn best when we're thinking with others. Some of our best teaching techniques at every level of education have students learning as a team. This isn't news to education researchers, but the insight is not implemented in the classroom as it should be. " (Page 18)


In chapter one, What We Know, the authors of The Knowledge Illusion attack "the same illusion that we have all experienced: that we understand how things work even when we don't." (Page 20)

In this chapter they describe the work of Frank Keil, a cognitive scientist at Cornell for many years who moved to Yale in 1998. Keil realized that the theories people hold about how things work are "shallow and incomplete," but he had a hard time finding a scientific way to show people how much they actually know compared to what they think they know. He tried different methods, but they took too long and people just made stuff up.

Then one day the method came to him. He figured out a way to demonstrate what he called the illusion of explanatory depth, the IoED paradigm for short. He teamed up with Leonid Rozenblit and together they asked a series of questions.

Here is an example.

"1. On a scale from 1 to 7, how well do you understand how zippers work?
2. How does a zipper work? Describe in as much detail as you can all the steps

If you're like most of Rozenblit and Keil's participants, you don't work in a zipper factory and you have little to say in answer to the second question. You just don't really know how zippers work. So, when asked this question:
3. Now, on the same 1 to 7 scale, rate your knowledge of how a zipper works again." (Page 21)

They described how most people who do this realize how little they know and lower their knowledge rating by a point or two.
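That before-and-after drop is simple to quantify. Here is a minimal sketch in Python of how an IoED-style study could be scored; the function name and the sample responses are my own hypothetical illustration, not the researchers' actual data:

```python
# Hypothetical sketch of scoring an IoED-style study.
# Each participant rates their understanding of an object from 1 to 7,
# attempts a detailed step-by-step explanation, then rates themselves again.
# The illusion of explanatory depth shows up as a drop in the second rating.

def mean_rating_drop(responses):
    """Average (pre - post) self-rating across all participants."""
    drops = [pre - post for pre, post in responses]
    return sum(drops) / len(drops)

# Made-up example responses: (rating before explaining, rating after)
sample = [(6, 4), (5, 4), (7, 5), (4, 3)]
print(mean_rating_drop(sample))  # 1.5 - roughly the "point or two" drop described
```

A positive mean drop across participants is the signature of the effect the authors describe.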

They found this works with speedometers, piano keys, flush toilets, cylinder locks, helicopters, quartz watches and sewing machines. They found it among grad students and undergrads at elite universities, at a large public school and in a random sampling of Americans on the internet.

They found it not just with objects but with just about everything.

"People overestimate their understanding of political issues like tax policy and foreign relations, of hot-button scientific topics like GMOs and climate change, and even on their own finances. We have been studying psychological phenomena for a long time and it is rare to come across one as robust as the illusion of understanding." (Page 22)

Regarding the subjects and their explanatory depth:

"They realized that they have less knowledge that they can articulate than they thought. This is the essence of the illusion of explanatory depth. Before trying to explain something, people feel they have a reasonable level of understanding; after explaining, they don't." (Page 23)

"According to Rozenblit and Keil, 'many participants reported genuine surprise and new humility at how much less they knew than they originally thought.'" (Page 23)

Further research showing the knowledge illusion was done by Rebecca Lawson, a psychologist at the University of Liverpool. She used a picture of a bicycle that showed the two wheels, the frame, the seat and the handlebars but left out the chain, the pedals and other parts. People asked to fill in the missing parts very frequently drew bikes that wouldn't work and sometimes couldn't turn. Even people who regularly ride bikes get it wrong.

This brings us to a couple of ideas. We demonstrably overestimate how much we know. Well, that prompts the question: how much do we know? Thomas Landauer worked in cognitive science for decades. He worked at Harvard, Stanford and Princeton and spent twenty-five years applying his ideas at Bell Labs.

Folks like Alan Turing and John von Neumann developed computing, and in the sixties scientists thought the mind worked like a computer. Landauer worked in this period.

By the 1980s Landauer was working to estimate the size of human memory. (For comparison, in 2017 a laptop computer typically had about 250 to 500 gigabytes of long-term storage.) Landauer used many techniques to estimate how much knowledge people have. Going by vocabulary, he estimated an average adult holds about half a gigabyte.

He also used a very different technique comparing how quickly people recall information against how quickly they identify a new item. With very precise measurements of the differences, he tried factoring in forgetting and a number of other factors.

He ended up with the conclusion that people likely have one gigabyte of memory on average. Even if he is off by a factor of ten that is a very small amount of memory.

Many computers, phones and videogames have far more memory and they are not sentient. We are able to function in society. We can sit at a table with five other adults and follow the conversation if everyone is gossiping about the neighborhood and understand the different feelings and ideas everyone at the table holds. We can know what is going on if the topic switches to the news or work and we all speak at least one language.

So, how can we know so little but function so well?

The key is how OUR minds work.

"Cognitive scientists don't take the computer metaphor so seriously anymore. There is a place for it; some models of how people think when they're thinking slowly and carefully - when they are deliberating step-by-step as opposed to being intuitive and less careful - look like computer programs. But for the most part these days, cognitive scientists point to how we differ from computers. Deliberation is only a tiny part of what goes on when we think. Most of cognition consists of intuitive thought that occurs below the surface of consciousness. It involves processing huge quantities of information in parallel. " (Page 27)

"More to the point, people are not computers in that we don't just rely on a central processor that reads and writes to a memory to think. As we'll discuss later in the book, people rely on their bodies, on the world around them, and on their minds." (Page 27)

The authors go on next to explain the incredible complexity of the world around us. They use examples like the thirty thousand parts that go into a modern car and that those parts can be designed in thousands of ways that affect how they interact and how the car as a whole functions.

Modern airplanes are so complex no one person understands everything about them. They require multiple teams to design and build. Modern cars are so complex many mechanics see themselves as replacing modules. To understand the basics students work on older engines so they can learn effectively.

Many modern devices like clocks and coffee makers are so complex that most people throw them away and replace them.

And nature is many, many, many times more complex than human inventions. From black holes to gravity to why ice is slippery, much is unknown in science. And when you get to biology it becomes much more complex.

Even cancer cells are largely not understood. And as you move to larger things it only gets more complex. The human brain has an estimated 100 billion neurons and far, far more connections.

The authors describe the brain as far too complex for any one person to understand. They may be right. I have tackled dozens of books on neuroscience and psychology and related fields and am quite certain I know far less than one percent of one percent of what I would need to know to form an educated opinion regarding many, possibly most, questions regarding the human brain.

They introduce other aspects of complexity such as weather and geography and fractals and on and on.

It is a mind-boggling effort to consider how complex the world is.

So, this brings us to the big question: how can we function, and even feel and sound knowledgeable, in such a complex world when we know so little?

"The answer is that we do so by living a lie. We ignore complexity by overestimating how much we know about how things work, by living life in the belief that we know how things work even when we don't. We tell ourselves that we understand what's going on, that our opinions are justified by our knowledge, and that our actions are grounded in justified beliefs even though they are not. We tolerate complexity by failing to recognize it. That's the illusion of understanding." (Page 35)

The authors note how young children can ask a never-ending stream of "whys" and that as adults we don't. We gave up at some point and accepted that we understand enough. And that is a key to understanding how and where we accepted the illusion of knowledge.


In chapter two, Why We Think, the authors of The Knowledge Illusion present their idea of the reason we think. They note a couple of noticeable differences between animals and plants: plants don't have the ability to move the way animals do, and they can't think.

One creature that gives us a clue is the sea squirt. This odd little creature goes through different stages in life. At one point it picks a spot on the bottom of the ocean and attaches itself there for the rest of its life.

Once the sea squirt has picked a spot and attached itself it does something remarkable. It literally absorbs, or eats, its own brain, or rather its primitive brainlike structure. It is amazing to think of an animal having one decision to make and then disposing of its brain permanently. But it has no more decisions to make. It isn't going anywhere and has no way to fight or flee.

The big difference between plants, which lack brains and brainlike structures, and animals, which have them, is organized action. Animals take actions, and so they need senses to gather information and a way to motivate action. Plants rely on a different way of living: they use photosynthesis to produce energy. Animals rely on other methods, and those methods require action.

The more complex the actions required by animals the more developed their brains become. The more neurons present in a brain the more sophisticated the behavior by that brain can be. Human brains have billions of neurons.

Research has shown that horseshoe crabs have very primitive senses and brains compared to humans. Haldan Keffer Hartline won a Nobel Prize in 1967 for studying them. The crabs are not as smart as people, not to brag, and their senses are not as sophisticated. This makes studying their brains, nervous systems and sense organs easier than studying human ones.

In 1982 Robert Barlow, a student of Hartline, led a team that found the crabs' simple eyes could tell differences in shading of light. Male crabs would try to mate with cement castings that resembled female crabs in form and in contrast with the sand; they didn't realize the castings were not in fact female crabs. The importance of reproduction from an evolutionary perspective cannot be overstated. If a species cannot reproduce it cannot survive. Period. So the ability to relatively quickly pick out a female and start mating at the opportune time is of the absolute highest priority for the crab, as for every species.

The crabs have very simple eyes, but the ability to pick out contrast in shades is one they cannot do without, because the action of mating requires it.

Moving up from the relatively primitive crab to humans is a huge leap. Most scientists who study brains estimate that about a third of the human brain is devoted to our primary sense, vision. Many animals have a primary sense with about a third of their brain devoted to it; for dogs it is smell.

Vision is so central to our perception that parts of the brain are devoted to minor details of it. We have one section devoted entirely to recognizing faces. It doesn't just recognize human faces as faces but recognizes particular faces. It does this so well that we can see a particular person, then see them years or decades later with a lot of weight gained or lost, a different hairstyle and clothing, and a different expression, and very often easily recognize them. And we see people from different angles, in different lighting, sometimes from close or far, and still recognize their faces.

As an example the authors discuss seeing a high school photo of Danny DeVito and recognizing the younger version of the actor. Our brains are primed to detect RELEVANT details. In dealing with people it is relevant whether you are dealing with a human being or not. It is relevant whether you are dealing with the exact same human being over and over.

What makes things relevant for us? For one thing, if thinking exists to guide action, then which situations accompany which circumstances becomes extremely relevant.

Our thinking is a kind of prediction machine. It is primitive in that it often treats correlation as causation or close enough for its purposes. It assigns cause to things by association.

The ability to pick out particular faces is useful because it greatly improves predictions of behavior. If you could not tell people apart then predicting their behavior would be much more difficult.

We are built to find certain patterns. We can recognize many songs for example if we hear them even if they are played with a few errors or unfamiliar instruments.

We are designed to find the relevant and to key in on patterns, not to remember or even perceive everything.

The Argentine writer Jorge Luis Borges wrote a short story about Funes the Memorious. Funes could remember everything: all his dreams and every event of every day of his life in fine detail. He could describe any moment of a day in full, and it would take a full day to describe any one day. Funes was supposed to have gained this extraordinary ability after falling off a horse and hitting his head.

Funes was considered a creation of fiction. But in 2006 Elizabeth Parker, Larry Cahill, and James McGaugh of UC Irvine and the University of Southern California published a case study of an extraordinary patient they call AJ.

AJ, like Funes, recalls much more than an ordinary person. With no need for a calendar she can accurately name the day of the week any date in her lifetime fell on, and from a very young age onward she can recall what occurred on any given day in extreme detail.

This condition is called hyperthymesia, also known as highly superior autobiographical memory. Apparently the human brain can store virtually every detail of almost our entire lives. So why doesn't it? Because it isn't designed for that. It is designed to facilitate action, and remembering everything on an absolutely equal basis is not helpful.

Borges understood this and had Funes say:

"I alone have more memories than all mankind has probably had since the world has been the world... My dreams are like you people's waking hours... My memory, sir, is like a garbage heap."

AJ described her memory: "It is nonstop, uncontrollable and totally exhausting. Some people call me the human calendar while others run out of the room in complete fear but the one reaction I get from everyone who eventually finds out about this "gift" is total amazement. They can start throwing dates at me to try to stump me...I haven't been stumped yet. Most have called it a gift but I call it a burden. I run my entire life through my head every day and it drives me crazy!!! " (Page 40)

NPR reported in 2013 that 55 people with hyperthymesia had been identified and that most struggle with depression.

"The reason that most of us are not hyperthymesics is because it would make us less successful at what we evolved to do. The mind is busy trying to choose actions by picking out the most useful stuff and leaving the rest behind. Remembering everything gets in the way of focusing on the deeper principles that allow us to recognize how a new situation resembles past situations and what kinds of actions will be effective." ( Page 47)

We are active, and we need to choose actions effectively and often quickly, carry them out quickly, perceive our progress and our failure or success quickly, and decide which actions to take next with that new information, in a seemingly never ending dash. We are a sort of rapid data collection unit that must instantly sort data, treat it as relevant or irrelevant, and either keep it for further sorting or discard it.

It is go, go, go and move, move, move very often. We think for how we live and we live lives of action.


Well-known member
In chapter three, How We Think, the authors of The Knowledge Illusion take on exactly that. Steven Sloman has a dog named Cassie. Steve points out that he and Cassie have different levels of intelligence. When Cassie wants food she goes to her bowl, where her food is given to her, and she may wait quite a while. Steve goes to his wife, who gives him food.

This is the difference between having an association with the food and knowing the cause of the food being given. As a result Steve isn't sitting by a bowl hoping for food for hours, I hope.

Ivan Pavlov did research on dogs salivating at the sight or smell of food, and on the possibility that a dog could associate another stimulus, like a ringing bell, with food, so that the dogs would come to salivate when they heard the dinner bell even with no food present.

Pavlov won a Nobel prize in 1904 and his associationist theories influenced behaviorist ideas that ruled psychology through the first half of the twentieth century.

In the 1950s psychologist John Garcia did experiments with rats and paired stimuli and found flaws in the ideas from Pavlov.

He found rats could associate some stimuli but not others. They could associate a noisy flashing light with an electric shock, and they could associate drinking sweetened water with a stomach ache, but they could not reverse the pairings and associate a stomach ache with a noisy flashing light or the sweetened water with an electric shock. This showed that a plausible causal link has to be part of the association.

So even rats and dogs are causal thinkers to a degree. Associations have to make a kind of sense as being causally possible.

We are primed to think of how things work in a framework based on cause and effect. We know lighting a match makes fire. And we know how a causal relationship can be disturbed or interrupted: if the match is wet, or struck too softly or too quickly, it may not light, and we know this.

We know hundreds of things like this. Thinking causally is first nature to us. We know yelling at someone who is too far away won't get the effect of yelling at them close enough to hear us. We know how to cause effects and why they won't occur if things are out of sequence or a part is missing or too weak.

Other kinds of reasoning are hard for us. Figuring out cube roots of four digit numbers, quantum mechanics and lots of other things are difficult and may go against our intuition.

We still suffer from the knowledge illusion but we are built to see causes. We think of A then B as a routine action constantly.

If I want to get in the building I need to open the door. If I want some food I need to pick something from the menu, wait in line, order it, pay and wait. After that I can eat it there or take it and go.

Sometimes we get these things wrong and assume that A always leads to B, or that if "if A then B" is true then "if B then A" must also be true. Sometimes we get them right.

If A then B could involve some cause that always makes an effect like if it rains the ground will be wet but the ground could be wet without rain having occurred. Water could leak from a pipe or kids could play with a hose.
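The rain example can be made concrete with a tiny enumeration. This is my own sketch, not from the book; the scenario names are invented. It shows that "if A then B" holding does not make "if B then A" hold:

```python
# Hypothetical scenarios (invented for illustration): several different
# causes can each produce the same effect, a wet patch of ground.
scenarios = [
    {"rain": True,  "burst_pipe": False, "sprinkler": False},
    {"rain": False, "burst_pipe": True,  "sprinkler": False},
    {"rain": False, "burst_pipe": False, "sprinkler": True},
    {"rain": False, "burst_pipe": False, "sprinkler": False},
]

def ground_is_wet(s):
    # Any one of the causes is enough to produce the effect.
    return s["rain"] or s["burst_pipe"] or s["sprinkler"]

# "If A then B" holds: every rainy scenario has wet ground.
assert all(ground_is_wet(s) for s in scenarios if s["rain"])

# "If B then A" fails: some wet-ground scenarios involve no rain at all.
wet_without_rain = [s for s in scenarios if ground_is_wet(s) and not s["rain"]]
print(len(wet_without_rain))  # 2 scenarios are wet without any rain
```

Reasoning backwards from "the ground is wet" to "it rained" quietly drops the pipe and the sprinkler, which is exactly the error described above.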

Much of the time we know automatically there are multiple reasons for a result. It is often the case that we consider other causes for effects.

"We may make inferences all the time, but those inferences are not based on textbook logic; they are based on the logic of causality. Just as people don't think only associatively (as Pavlov thought we do), people do not reason via logical deduction. We reason by causal analysis." (Page 56)

That is a lot to take on. Let's look at some definitions to see what is what here.

Deductive reasoning is something we can define.

Here are a few excerpts from Wikipedia on deductive reasoning.

"Deductive reasoning, also deductive logic, is the process of reasoning from one or more statements (premises) to reach a logically certain conclusion."

Deductive reasoning goes in the same direction as that of the conditionals, and links premises with conclusions. If all premises are true, the terms are clear, and the rules of deductive logic are followed, then the conclusion reached is necessarily true.

An example of an argument using deductive reasoning:
  1. All men are mortal. (First premise)
  2. Socrates is a man. (Second premise)
  3. Therefore, Socrates is mortal. (Conclusion)
The first premise states that all objects classified as "men" have the attribute "mortal." The second premise states that "Socrates" is classified as a "man" – a member of the set "men." The conclusion then states that "Socrates" must be "mortal" because he inherits this attribute from his classification as a "man." End quote
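The set-membership reading in that last paragraph can be sketched in a few lines. This is my own toy illustration, not from the book or from Wikipedia:

```python
# Toy rendering of the classic syllogism as set membership:
# the conclusion follows mechanically from the two premises.
men = {"Socrates", "Plato", "Aristotle"}   # things classified as "men"

# First premise: all men are mortal -- every member of "men"
# inherits the attribute "mortal".
mortal = set(men)

# Second premise: Socrates is classified as a man.
assert "Socrates" in men

# Conclusion: therefore Socrates is mortal.
assert "Socrates" in mortal
print("Socrates is mortal")
```

The point of the toy is that nothing causal is involved: the conclusion is forced by the structure of the premises alone, which is precisely what makes this style of reasoning feel unnatural compared to causal thinking.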

So deductive reasoning starts with premises, ideas assumed to be true. These are usually general statements that first describe a broad category, then describe a quality of the category, then place someone or something in the category, and conclude by saying the particular person or thing has that quality.

There are lots of variations and rules for this in formal logic, but that is the simplest explanation. When people talk about a detective using deduction they mean the detective putting things like people in the category of suspect and eliminating them using premises about when and how a crime likely occurred: dropping suspects who have believable alibis or who lack the ability, like a suspect who is disabled and unable to stand when it is accepted that the killer stood up, strangled the victim and carried the body.

The important thing to understand is that this is not how we usually think. It takes a lot of work and practice even to start following the rules of formal deduction. I have whole books on it and it certainly isn't first nature for anyone I know.

Causal thinking is our regular routine. We think of what to do to get results all the time and adjust to changing circumstances. I have to sometimes adjust my route to get to work or home due to construction, accidents and bad weather. It is a breeze compared to formal logic.
Our long term planning also is centered on causal thinking. We think of consequences way down the road. That is part of why we do many things like saving toys for grandchildren who may not be born for years or set aside money for retirement.

We even reason causally about the minds of others. If your spouse isn't talking to you, you might wonder if you said or did something that upset them, or forgot an anniversary or birthday, or if they are mad because of something you did or said that you are not remembering.

You act like a detective and think through clues. Clues of causation.

We engage in this analysis at every social encounter. We decide why people are behaving the way they are based on the clues we collect.

The authors note that we reason far better from causes to effects than backwards from effects to causes. It is easier for a doctor to predict a symptom that a particular illness may cause than to determine the cause of a symptom, especially a symptom that many illnesses can produce.

Perhaps only humans reason backwards from effects to causes. In reasoning forward we simulate the results of actions. We think up little movies in our minds of the results of actions.

The authors point out a crucial error we make when using predictive reasoning. They pointed out how when we reason from cause to effect we create a mental simulation and leave no room for the possibility of alternative causes.

That is a big error. If we find an effect like a burned down house we are aware of several possibilities. Someone could have set the fire, lightning could have hit it, chemicals could have reacted, even more unlikely possibilities exist. But if we saw someone the day before playing with fireworks we might run our simulation and be sure they set the fire.

Storytelling is our way of making sense of the world in causal terms. We have many stories that link causes to effects. If we believe in values we find stories that demonstrate the merits of those values.

We even think about the effects of causes that don't exist. We think of stories of "what if" something unlikely or impossible happened, what would be the results ? We have science fiction and horror and fantasies of being rich or famous or handsome or other things that may never happen.

Some people come up with brilliant ideas when imagining the impossible. Einstein did thought experiments, imagining riding a beam of light and what he would see, and that helped him develop relativity.

Stories are extremely important to understanding who we are and how we think. George Lakoff has written extensively on how we think about and understand the world in metaphor, stories working on a deeply subconscious level.

Stories create associations of the good with the good. Good people do good things and bad people do bad things in stories. Bad values have bad results and good values have good results in stories.

The sacred is sacred because of stories and the absurd is absurd because of stories as are the disgusting and the ordinary. Stories make the world make sense and give things their meaning.

We need to be able to think well enough to understand stories, to listen to them and to tell them. Some psychologists think our complex social development, including telling stories, is why our cognitive development is so far past that of other mammals, even other primates. An author must understand characters: their thoughts, feelings, actions and reactions. The author must also understand how to tell the reader, and that telling certain things certain ways is more likely to be enjoyed as the reader imagines the story the way the author presents it. That takes a lot of causal reasoning.


Well-known member
In chapter four, Why We Think What Isn't So, the authors take on inaccuracies in our thinking. They take on how we can incorrectly guess the paths that objects will take if for example they are attached to a rope that is spinning in a circle and the rope is cut.

Our causal reasoning is often good but not perfect. We have trouble with many tasks like adjusting brakes on a bike and setting clocks on ovens. I have struggled to get my seat back in position after getting the interior done at the car wash for years and resetting the sensor for when my oil change is due is always a challenge.

Andrea diSessa, a researcher at the University of California, Berkeley, has found we often have faulty intuition. We make poor guesses about objects moving on round surfaces, possibly because we are used to objects moving on flat surfaces. We also make poor guesses about electricity, possibly because we think of it as flowing like water. We set a thermostat way above the temperature we want.

Most mechanistic phenomena are too small for us to directly observe so we use things that seem similar to guide our intuition. Much of this thought is usually not considered in fine detail so it goes along as uninspected assumptions and becomes habit.

We just are not naturally good enough at direct observation to easily overcome this limitation and so much of the time our kind of close approximation is good enough.

Our understanding of the physical world is mostly shallow and guided by intuition. It's the same for the emotional world. We gather what information we can on a superficial level and make a lot of assumptions and follow our intuition.

The authors point out that deeper understanding is sometimes required, despite shallow understanding being habitual and passing for accurate understanding much of the time.

Two situations that require deeper understanding are given as examples. If a con man is trying to take advantage of someone then being able to figure out their true intentions is crucial.

Also, if a loved one is acting erratic or upset, then figuring out what to do may take a deeper understanding.

It is sad to say that very few people are good at spotting con men (though most of us assume we are) or at dealing with upset people.

We have two ways we work out what causes things. One is quick and routine and largely unconscious and instinctive.

The other is much slower and more deliberate.

Daniel Kahneman described the differences in his excellent book Thinking, Fast and Slow.

The two methods go by a variety of names and we can call them deliberative and intuitive.

If I ask for an animal whose name starts with "E" most of us just think "Elephant" quite easily. That is intuitive.

If I ask you what word the anagram "vaeertidebli" can be made into, it takes a lot of work for most of us to get the answer.

I will write it in a few lines from here so you can work it out if you like.

Aristotle wrote on how hard it is to overcome ingrained intuitions as the authors pointed out:

"Now if arguments were in themselves enough to make men good, they would justly...have won very great rewards...But as things are...they are not able to encourage the many to nobility and goodness...What argument would remold such people? It is hard, if not impossible, to remove by argument the traits that have long since been incorporated into the character."
Aristotle, Nicomachean Ethics, 1179. (Page 77)

The anagram can be reformed into "deliberative" .

Plato also gave a quote on this which appears in an abbreviated version:

"Let us, then, liken the soul to the natural union of a team of winged horses and their charioteer. One of the horses is a lover of honor and guided by verbal commands alone; the other is companion to wild boasts and indecency, and barely yields to the goad." Plato, Phaedrus, 246 and 253 (Page 78)

What Plato calls "reason" is what Aristotle calls "argument" and we call "deliberative": careful, conscious thinking done to solve problems and make the best decisions, especially when we take time to think things through.

The authors point out the important fact that intuitions are not, strictly speaking, passions. They are instinctive conclusions, not desires.

For example they point out that hearing a person say "about" a certain way can prompt us to think they are Canadian.

Although intuition can inspire desire. We can see a box that reminds us of cake and imagine a delicious cake. Seeing a desirable car can make us imagine driving it and seeing a desirable house can make us desire living in it.

So, passions are associated with intuitions. But not all intuitions are associated with desires.

Our causal reasoning can give us very different answers to problems depending on whether we look to intuition or deliberation. Intuitive answers come quicker and they are often not supported by good reason or facts but we have an amazing ability to make up explanations that seem logical to ourselves to support these answers.

Deliberation is harder and has a lot more "I don't know" and "I am not sure" involved.

People are not able to use intuition together, but we can deliberate together and use reason to work things out. That is a crucial difference, and it gives us a strong incentive to work together.

"Recall the illusion of explanatory depth, the finding that people think they understand causal systems better than they in fact do. The illusion is a product of the intuitive mind; we think about how things work automatically and effortlessly. But when we deliberate about our knowledge the illusion is shattered. This helps to explain why not everybody falls for the illusion." (Page 80)

Yale marketing professor Shane Frederick developed a three question test to see if someone is more intuitive or deliberate.

I am going to confess that I have seen this test and the answers about a half dozen times in reading various books on psychology, neuroscience, critical thinking and so on.

I know it is easy to get quick answers to these questions.

Here they are:

"A bat and ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?"

The quick answer most people give (even people at Ivy League schools) is 10 cents. But if the ball cost ten cents the bat by itself would cost one dollar and ten cents, so the two together would cost a dollar and twenty cents.

If you check carefully you can realize this, and whether you use algebra or just keep guessing, the only answer that works is five cents. The ball costs five cents, the bat at a dollar more costs a dollar and five cents, and adding the two gives the dollar and ten cents they combine to make.

"In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake? "

Most people guess "24" as the answer. In fairness I have run into variations on the "doubling" question and was prepared because my wife explained it to me on one of them. If it doubles in size every day and on the 48th day fully covers the lake, then one day earlier, the 47th day, it half covered the lake.

Last question:

"If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?" ( Page 82)

The quick response of 100 minutes is wrong. Since 5 machines make 5 widgets in 5 minutes, each machine makes one widget per 5 minutes. So a hundred machines likewise need only 5 minutes to make 100 widgets.
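Since each of the three questions can be settled with a line or two of arithmetic, here is a quick sanity check of all three deliberate answers. This is my own sketch, not from the book:

```python
# 1. Bat and ball: the ball plus a bat costing a dollar more must total $1.10.
ball = 0.05
bat = ball + 1.00
assert abs((ball + bat) - 1.10) < 1e-9              # 5 cents works
assert abs((0.10 + (0.10 + 1.00)) - 1.10) > 1e-9    # intuitive 10 cents totals $1.20

# 2. Lily pads: patch size 1 on day 0, doubling daily; the lake is the
# size the patch reaches on day 48. Find the day it covers half the lake.
lake = 2 ** 48
day, patch = 0, 1
while patch < lake / 2:
    patch *= 2
    day += 1
print(day)  # 47: one day before full coverage

# 3. Widgets: 5 machines make 5 widgets in 5 minutes, so each machine
# takes 5 minutes per widget; 100 machines make 100 widgets in parallel.
minutes_per_widget_per_machine = 5
machines, widgets = 100, 100
minutes_needed = minutes_per_widget_per_machine * widgets / machines
print(minutes_needed)  # 5.0 minutes, not 100
```

The floating point comparisons use a small tolerance rather than `==` because decimal dollar amounts are not represented exactly in binary.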

I have to admit that until I ran into these kinds of problems over and over I got a lot wrong. Books on critical thinking sometimes have them to show that you can develop the self checking habit and improve your accuracy but it takes a lot of work and practice.

The most important part is checking your answers and seeing where the quick guess is wrong. Most people don't.

Less than 20 percent of Americans get the whole test right. At the Massachusetts Institute of Technology 48 percent got all three right; at Princeton, 26 percent.

The people who get all three answers right are more deliberative than most of us. They think things through and don't just grab the first idea they think of. If they do get that intuitive answer they check to verify it.

By studying them, researchers have found interesting tendencies. Deliberative people tend to be better at spotting a collection of words that is meant to sound impressive but holds no meaning. They are less impulsive than others, and they are more willing to wait, or take a chance, to get a bigger reward. They are also less likely to believe in God.

But regarding our focus, more deliberative people suffer less from the illusion of explanatory depth. In research they were aware of their lack of understanding of unfamiliar or little known things; they knew they didn't understand them without having it revealed to them.

"Intuition gives us a simplified, coarse, and usually good enough analysis, and this gives us the illusion that we know a fair amount. But when we deliberate, we come to appreciate how complex things actually are, and this reveals to us how little we actually know." (Page 83)

One possible explanation for the difference between the people who do better on the Cognitive Reflection Test (CRT) and the rest of us is that they crave more detail. Research found that they tend to like products with detailed advertisements more than most people do.

There are people who love to get answers for things and to explain things in fine detail. Other research describes people who like "large cognitive loads" and read books about things in depth: books on psychology, how the brain works, critical thinking or, well, anything in depth.

Different people prefer different cognitive loads. Some people are not interested in the reasons why things are certain ways. Others want to know and think through them in great depth.

You might accuse someone of over-thinking or going down a rabbit hole when they are just doing what comes naturally for them. It is important to understand that what may seem unpleasant or even non-productive to one person is enjoyable to another.

I know people who have not read a single book since high school and hate reading and others who feel something is wrong if they can't read a book a week, or write an article or two per week. Some are content to use Facebook every day and read a few short articles a day but would hate to read books, no matter how useful the books could be.

The authors note that the illusion of explanatory depth is intuitive, we reflexively FEEL like we understand things when they are brought up. But when deliberation about the details and in depth explanation is required we can see the illusion shatter.

I have several times exposed the illusion myself. In dealing with ex Scientologists on several occasions they have proclaimed knowing a lot about both Scientology and the subject of hypnosis. I have followed in the tradition of Jon Atack and many, many other Scientology critics and extensively documented evidence that Scientology founder Ronald Hubbard plagiarized a vast amount of the techniques and methods in Dianetics and Scientology from hypnosis.

Very often Scientologists and ex Scientologists tell me that is absurd and impossible. I usually just ask them about their opinion on the methods of hypnotic induction. And what they think about the techniques of mimicry, paradox aka confusion, repetition, vivid imagery and attention fixation. And what they believe a stage hypnotist does to pick out which of the available volunteers to bring on stage.

They usually concede that they actually don't know much about hypnosis. If you don't know these things then, unless you studied a very unusual school or style of hypnosis, you don't know much about it.

It's fine to not know about something. It can be a disaster to think that you know all about something you actually don't know anything about.


Well-known member
In chapter five, Thinking with Our Bodies and the World, the authors of The Knowledge Illusion explain how those two factors help us in thought.

The authors contrast cognitive science as the study of human intelligence against artificial intelligence which is the study of how to build a machine that can behave in intelligent ways.

From the 1940s to 1980s there was a focus on individual computers. A lot of information and processing abilities were put into computers so they could rapidly come up with answers to many questions.

A funny thing happened on the way to superintelligent computers: we never got there. In 2003 Marvin Minsky, one of the cofounders of the AI lab at MIT, remarked on how we had never developed computer intelligence with common sense.

We have made computers that are good at taking in some kinds of information and performing calculations, but not good at other things. We have computers that work well as cash registers and calculators, that play chess, and that even help doctors. I have read a little about computers that compare a patient's scan against a vast library of verified scans to help a doctor determine, for example, whether a growth is cancerous.

But we are nowhere near where we thought we would be.

The authors give the example of how a human being, probably the vast majority of adult humans, can hear a statement or the poem Casey at the Bat and know what the poem means. To do this you need some understanding of the rules of baseball, and you need to understand that if he gets a hit Casey will go to a base or possibly around all the bases. You need to understand that the hometown crowd will get excited and cheer, while the fans of the other team will not, and neither will the guys selling peanuts.

A lot of things affect each other and affect the emotions of participants and spectators. To understand this a computer would need to have all this information and somehow in its algorithms also have the emotional associations attached to concepts like a human does. As humans our emotions and thoughts are linked and prompt each other largely in a hidden and automatic way through subconscious processes.

But those processes help us to understand other people because we often see them as similar to our own without a conscious thought.

Computer programmers don't even know how to start programming something that handles or simulates this. We simulate all the people, emotions and possible reactions from just a couple of lines of the poem.

Another problem is that a human being can go for a walk in the woods, on uneven terrain with sticks and stones and changing angles, no problem. For a computer to run a robot that can do this would require a tremendous number of sensors, with calculations for each step taken, an evaluation of the result, then a new series of calculations, and all this would take a tremendous amount of memory and capacity.

Rodney Brooks worked as a computer science professor at MIT in the 1980s. He was part of a team that tried to simulate the way nature builds animals. It builds one part for a function and builds others for other functions and adds to them over time.

He and his team built a robot that was pretty good at walking. It had tiny processors in each limb to calculate the right action for that limb; no central computer handled every calculation.

A similar robot whose separate, competent parts combine to function well is the Roomba. It has parts for propulsion, a sensor that makes it turn when it collides with something, and a vacuum. It is capable of vacuuming a floor with no big plan.

This kind of intelligence is called embodied intelligence. The environment provides the information to guide the robots.

We similarly get information from the environment, and we cheat in a way. Research following eye movements has shown that we build only a partial model of our environment: if we focus in on a series of words in a text, for example, we largely assume the rest of the world is staying the same. We might hear or feel something that startles us, but for the most part assuming the world is still there works out.

"This assumption that the world is behaving normally gives people a giant crutch. It means that we don't have to remember everything because the information is stored in the world. If I need to know something, all I have to do is look at it. If I need to know what the sentence was at the top of the page, I don't have to remember it, I just have to look at the top of the page. As one of the researchers doing these experiments said, "the visual environment functions as a sort of outside memory store"" (Page 95)

We assume we are constantly perceiving most of our environment, but in reality we focus on a tiny fraction of it at any one time and create a composite picture from memory. Wherever you look, things conform to expectations when looked at.

One experiment the authors suggest is closing your eyes and reconstructing the world around you then adding everything above your normal line of vision.

I got to have some experience with this because my wife is about a half foot shorter than me. I noticed that she doesn't seem to notice things more than about two inches above her eyes. We have a dog that will eat or tear up lots of things if she can get them so we have a lot of things we have moved to my height or higher. My wife forgets about bananas that are on top of a cabinet for example and has to throw out the snack she bought for herself. The change from eye level or out on a table to above eye level renders things invisible for her.

We use the world to help us observe and remember it. We also use it to guide us. In trying to catch a fly ball we don't get out a pad and run calculations; instead we move so that the angle of the ball keeps increasing as it comes toward us, moving toward or away from it as needed, and if we move quickly enough we get to the ball before it lands and attempt to catch it. That is why you see an outfielder run into the wall: he is trying to move to intersect the ball.

Actual experiments have confirmed this is how professional baseball players catch the ball. They move to increase the angle of the ball between themselves and the ground.

We also use the world to help us navigate as when we drive if both sides of a lane are passing us at the same rate we are staying in the middle. We do the same thing with doorways.

Actual experiments with driving simulators and virtual reality have shown that making one side of the lane move faster or one side of a doorway move faster makes people go off course and go into one side. We use little tricks that have focused attention with assumptions about the environment to make it so we can do a lot of things by just focusing on a little bit of the world to guide us.

We also use our bodies to guide us. In one experiment, people were shown pictures of items like watering cans and utensils and asked to press a button to indicate whether the object was upright. Sounds simple. The researchers varied just a couple of elements: participants sometimes had to respond with the right hand and sometimes the left, and the object's handle was sometimes on the right side and sometimes the left. Responses were quicker when the handle and the responding hand were on the same side. This suggests we automatically and subconsciously prepare to use objects, aligning hands with handles. The delay means that when you are ready to grab something with a particular hand and it is upright but its handle is not positioned for that hand, it takes a little longer to decide how to proceed.

Another study showed that acting out a scene is a more effective way to remember it than other techniques.

It has been found that arithmetic goes better with an external aid like paper or a blackboard. Our number system is based on ten, which matches the number of fingers a person usually has available to count on.

Many activities are easier with a physical aid. It is easier to work out how to write or spell something with pen and paper than without it and easier to work out how to play a guitar with one than without it.

"We even use emotional reactions as a kind of memory. When we react with pleasure, pain, or fear to an event, we discover what to pay attention to and what to avoid. Antonio Damasio, a neuroscientist at the University of Southern California, called these reactions somatic markers from the Greek soma, meaning body. Our bodies produce feelings to make us aware and warn us. When an option is pleasing, we have a positive affective reaction - a good feeling. That's why we feel good in a French patisserie. Our bodies are trying to draw our attention to all the delectables within sight." (Page 103)

"When an option is displeasing, we have a negative reaction like disgust or fear. The reaction tells us to avoid the option because it might be infectious, dangerous in some other way, or just annoying. A well-placed disgust response tells us to get away from whatever caused it." (Page 103)

Now, sometimes emotional responses are good. We panic and catch a child that is about to walk into traffic, or slam on the brakes and avoid hitting a car or deer. We might be absent-mindedly walking down a street at night and realize someone is behind us because a flash of panic brings our attention to it.

But emotion guiding us isn't always beneficial. We might feel fear or disgust about people and not use good judgment in dealing with them.

Many historians note that the portrayal of people to be scapegoats includes depicting them as disgusting, like vermin and filth, and as something of an existential threat to be feared.

The path to genocide quite often has a phase in which propaganda portrays entire races, religions, sexual identities and other broad groups as both disgusting and something to be feared. This propaganda often relies on outright lies and impossible stories. The fact that it lacks credible evidence and contradicts good sense is irrelevant; this kind of information is meant to appeal to emotions, not to withstand rigorous intellectual scrutiny.

Robert Jay Lifton recognized this as the dispensing of existence in his eight criteria for thought reform.

Dr. Robert J. Lifton's Eight Criteria for Thought Reform

  1. Milieu Control. This involves the control of information and communication both within the environment and, ultimately, within the individual, resulting in a significant degree of isolation from society at large.
  2. Mystical Manipulation. There is manipulation of experiences that appear spontaneous but in fact were planned and orchestrated by the group or its leaders in order to demonstrate divine authority or spiritual advancement or some special gift or talent that will then allow the leader to reinterpret events, scripture, and experiences as he or she wishes.
  3. Demand for Purity. The world is viewed as black and white and the members are constantly exhorted to conform to the ideology of the group and strive for perfection. The induction of guilt and/or shame is a powerful control device used here.
  4. Confession. Sins, as defined by the group, are to be confessed either to a personal monitor or publicly to the group. There is no confidentiality; members' "sins," "attitudes," and "faults" are discussed and exploited by the leaders.
  5. Sacred Science. The group's doctrine or ideology is considered to be the ultimate Truth, beyond all questioning or dispute. Truth is not to be found outside the group. The leader, as the spokesperson for God or for all humanity, is likewise above criticism.
  6. Loading the Language. The group interprets or uses words and phrases in new ways so that often the outside world does not understand. This jargon consists of thought-terminating clichés, which serve to alter members' thought processes to conform to the group's way of thinking.
  7. Doctrine over person. Members' personal experiences are subordinated to the sacred science and any contrary experiences must be denied or reinterpreted to fit the ideology of the group.
  8. Dispensing of existence. The group has the prerogative to decide who has the right to exist and who does not. This is usually not literal but means that those in the outside world are not saved, unenlightened, unconscious and they must be converted to the group's ideology. If they do not join the group or are critical of the group, then they must be rejected by the members. Thus, the outside world loses all credibility. In conjunction, should any member leave the group, he or she must be rejected also. (Lifton, 1989)

Here is a further quote on dispensing of existence from a blog at Freedom of Mind.

Finally, the eighth, and perhaps the most general and significant of these characteristics is what I call the “dispensing of existence.” This principle is usually metaphorical. But if one has an absolute or totalistic vision of truth, then those who have not seen the light—have not embraced that truth, are in some way in the shadows—are bound up with evil, tainted, and do not have the right to exist.
Robert Jay Lifton

Robert Jay Lifton made it clear the ultimate expression of this is genocide.

Genocide Watch lists ten stages of genocide.

Here is dehumanization.

DEHUMANIZATION: One group denies the humanity of the other group. Members of it are equated with animals, vermin, insects or diseases. Dehumanization overcomes the normal human revulsion against murder. At this stage, hate propaganda in print and on hate radios is used to vilify the victim group. In combating this dehumanization, incitement to genocide should not be confused with protected speech. Genocidal societies lack constitutional protection for countervailing speech, and should be treated differently than democracies. Local and international leaders should condemn the use of hate speech and make it culturally unacceptable. Leaders who incite genocide should be banned from international travel and have their foreign finances frozen. Hate radio stations should be shut down, and hate propaganda banned. Hate crimes and atrocities should be promptly punished. End quote

As we can easily see, genocide uses disgust (something Jon Atack pointed out to me) and fear as key ingredients.

We use our brains, bodies, the environment around us and our feelings all together to guide us.

Part of the mystery of how individuals navigate through the same environment while being so ignorant is explained. Our individual knowledge is limited but we have many things to help us.


Well-known member
In chapter six of The Knowledge Illusion, Thinking with Other People, the authors take on another aspect of how we can function despite our extreme limitations as individuals.

"The world serves as a memory and is part of the thought process. But a single thinker can only do so much. In nature we often see complex behavior arise through the coordination of multiple individuals. When multiple cognitive systems work together, group intelligence can emerge that goes beyond what each individual is capable of." ( Page 107)

The authors elaborate on the example of bees: many different kinds of bees do many jobs, and each knows only its own. They also elaborate on the communal bison hunt, in which many specialists were needed to fill specific roles and a shaman planned and guided the group. Other specialists made weapons, used spears, butchered the animals and made fire.

Finally, the authors give the example of modern homes which require specialized knowledge in a number of trades.

"These examples illustrate one of the key properties of the mind: It did not evolve in the context of individuals sitting alone solving problems. It evolved in the context of group collaboration, and our thinking evolved independently, to operate in conjunction with the thinking of others. Much like a beehive, when each individual is master of a domain, the group intelligence that emerges is more than the sum of its parts." (Page 111)

There have been competing hypotheses about why our brain size is about three times that of our evolutionary predecessors, a growth believed to have occurred very rapidly in evolutionary terms.

One hypothesis, the ecological hypothesis, supposes that being better at dealing with the environment prompted the sudden growth.

Another hypothesis, the social brain hypothesis, considers that the need for more complex and sophisticated social relationships and thinking prompted the growth.

The authors noted that Anthropologist Robin Dunbar did research on brain sizes and numerous other factors for primates and found that brain size and group size are closely related. Primate species that live in large groups have large brain sizes, they noted. Other factors were found to be unrelated.

Primatologist Robert Sapolsky has spent decades studying various primates and has stated the same idea in his work.

Our social complexity has demanded a level of language and communication far beyond that of any other creature.

Bees can communicate by how they fly and sort of dance around and release pheromones but we are many, many degrees more advanced than that.

The authors went back to the communal hunt example and laid out how, if we are hunting as a team and I see you raise your bow, I have to reason from that gesture to understand you are about to shoot at the bison.

It comes naturally to us but if you break it down it requires a lot of understanding on my part. I have to know you are going to share to be comfortable letting you get the prey and going off to do another task like telling others or getting a fire ready at another location.

We can share our attention with others, find common ground and share intentions. The Russian psychologist Lev Vygotsky developed the idea in the early twentieth century that the mind is a social entity.

He argued that it is not individual brainpower that distinguishes humans, but that we can learn through others and engage in collaborative activities, the authors noted.

The authors described some research by Michael Tomasello on the difference between children and chimpanzees. I have also seen a presentation by neuroscientist David Eagleman on this and it has been consistently found that by a certain age, around seven to twelve years old, human children understand the thinking and intentions of other people far, far better than any of our chimpanzee and other primate relatives.

We have a level of understanding of each other that is unmatched.

I have seen a few experiments on this, and even young children cooperate with humans in ways chimpanzees simply never do.

Our ability to share intentions is key to one of our most important traits. We can share knowledge over generations. This is seen as necessary for our culture. We transmit knowledge and intentions across generations and can influence the future long after we are gone.

The way large projects are actually done shows that no one person, or even a dozen people, is fully responsible for a monumental achievement nowadays. The Manhattan Project took countless physicists, engineers and scientists of all types, along with a virtual army of other people.

Similarly, the collider at CERN was seeking the Higgs boson, and nearly three thousand people are listed as authors on the papers reporting the research.

Modern hospitals use teams of specialists, and the function of the initial staff is to route you to what is probably the right specialist and then see whether that is the right fit.

Researchers Toni Giuliano and Daniel Wegner found that couples tend to divide up who should remember what based on their interests and specialties.

The authors noted that we tend to remember what we need to make our own contributions to the cognitive labor and leave the rest to experts.

It has been found that in collaborative effort we can blur the lines on what we know and what others know and which ideas we come up with and which others do.

We don't have giant blackboards with our names on them listing everything we think and say. We remember imperfectly and can believe a good idea is our own when someone else actually said it first in a meeting.

In another experiment, subjects were told that scientists had discovered a new kind of glowing green rock and understood it quite well.

Other subjects were told the same story, except that the scientists did not understand the rock well.

When asked how well they themselves understood the rock, the people who were told the scientists understood it reported feeling that they understood it too.

Crucially the authors said: "It's as if people just cannot distinguish their own understanding from what others know." (Page 124)

If we don't actually have knowledge but feel we have access to it, we feel as though we have it. Merely being told that someone, somewhere whom we can access has the information prompts us to feel we personally have it.

We already saw the requirements of being able to share intentions, attention and goals.

"Another requirement has to do with how we store information. Communal knowledge is distributed across a group of people. No one person has it all. So what I as an individual know has to connect to the knowledge that other people have. My knowledge has to be full of pointers and placeholders rather than just facts." (Page 125)

"The knowledge illusion occurs because we live in a community of knowledge and we fail to distinguish the knowledge that is in our heads from the knowledge outside of it." (Page 127)

"The world and our community house much of our knowledge base. A lot of human understanding consists simply of awareness that the knowledge is out there. Sophisticated understanding usually consists of knowing where to find it. Only the truly erudite actually have the knowledge available in their memories." (Page 128)

The curse of knowledge is what economists call our bias of mistakenly believing everyone knows or should know whatever we know.

The authors point out that with the curse of knowledge we assume others know what is in our heads and with the knowledge illusion we think that what is in others' heads is in our own. Either way we confuse who knows what.

As we live in a hive mind we get away with superficial and specialized knowledge, because everyone else does as well.

The tendency to take for granted that we know more than we really do leaves us both ignorant of how little we really know and vulnerable to other problems.


Well-known member
In the seventh chapter of The Knowledge Illusion, Thinking with Technology, the authors take on technology as an extension of thought.

The authors caution about the possibility of AI reaching the hypothetical singularity. In technology the term describes a point at which AI surpasses human intelligence and then, incredibly rapidly, goes from being our equal or slight superior to hundreds and then thousands of times superior to us, in a matter of weeks or hours.

Let us consider technology as an extension of thought.

"According to Ian Tattersall, curator emeritus with the American Museum of Natural History in New York, "cognitive capacity and technology reinforced each other" as civilization developed. Genetic evolution and technological change have run in tandem throughout our evolutionary history. As brains increased in size from one hominid species to its descendants, tools became more sophisticated and more common. " (Page 133)

The progress from using rocks with edges to using fire to stone axes and knives then nets, hooks, traps and bows and arrows then eventually farming was accompanied at each step by changes in the culture and genes of our ancestors that made this progress possible.

One thing the authors note is our adaptability to different situations. We can use tools of many different types and styles as comfortably as our hands. We can use a hammer or knife or broom or a hundred different tools and rapidly feel comfortable with them.

One thing that is making modern technology less comfortable is its changing features. If I use a hammer or pen or fork I reasonably expect it to work or break and to hold few surprises.

But if I use a computer or device like a modern phone with a computer in it, it may do things that I do not anticipate or even understand. And modern technology may monitor me in ways I do not anticipate and do things with information from and about me I would never have dreamed of, and didn't give knowing consent to.

"One consequence of these developments is that we are starting to treat our technology more and more like people, like full participants in the community of knowledge. The Internet is a great example. Just as we store understanding in other people, we store understanding in the internet. We have seen that having knowledge available in other people's heads leads us to overrate our own understanding. Because we live in a community that shares knowledge, each of us individually can fail to distinguish whether knowledge is stored in our own head or someone else's. This leads to the illusion of explanatory depth: I think I understand things better than I do because I incorporate other people's understanding into my assessment of my own understanding." (Page 136)

Two different research groups found we have "confusion at the frontier" regarding searching the internet. Adrian Ward, a psychologist at the University of Texas, found that using internet searches increases our cognitive self-esteem, our sense of being able to remember and process information. Additionally, it was found that people who search the internet for facts they don't know can later misremember and report that they knew them all along, when they actually had to look them up.

I know that I often use a couple of search terms instead of searching my memory. Through practice I have learned that a celebrity name and a phrase is often enough to retrieve a song with all its lyrics, or a movie or television show with all the actors, writers, directors, episodes and so on related to it.

In another group of studies, Matthew Fisher, then a PhD student at Yale working with Frank Keil (one of the original discoverers of the illusion of explanatory depth), had students answer questions like "How does a zipper work?" Some people were allowed to use the internet to find information to confirm their answers and some were not.

The people who were allowed to use the internet subsequently felt greater confidence in their abilities to answer other, unrelated questions. The conclusion was that using the internet caused participants to feel they knew the answers to other questions, even though they hadn't researched them yet.

The authors give the example of a person searching the internet to plan out a trip. We bring some ideas and a timeframe, a destination and some priorities but get a lot of information from the internet. At the end we feel like we personally planned the trip and don't usually say "I came up with these seven ideas and got these five from the internet."

I have written about a lot of things and can confirm that sharply separating what I brought to the table from what the internet provided is hard.

"This has some worrying consequences. The Internet's knowledge is so accessible and so vast that we may be fashioning a society where everyone with a smartphone and a Wi-Fi connection becomes a self-appointed expert in multiple domains." (Page 138)

One study by the authors and Adrian Ward found that doctors and nurses say patients who search websites like WebMD don't know much more than other people but think they do, and often doubt or reject diagnoses. In another experiment they asked "What is a stock share?" and had people play an investment game. People who looked the answer up online bet more in the game, but they didn't do any better and so earned less money.

They identified the problem: looking up medical or financial information for a few minutes is nowhere near equal to a real medical or financial education, but access makes it feel like we have knowledge and understanding. It is not the same thing.

As of right now machines don't have intentionality. A GPS can map out a route but it doesn't have desires that choose one. It also doesn't independently decide to pass down its information over generations, so it lacks culture.

Without shared intentions and desires we don't truly collaborate with machines. We use them. To share intentions you have to be able to reflect on your desires and the desires of others and conclude which are in agreement or are not and even work to make them agree.

No machine can do that. We don't know how to program a machine to do that and that is why certain kinds of artificial intelligence simply have not been successfully developed.

We are, as the authors point out, at an awkward moment. We depend on machines for knowledge and see our own knowledge as increased by greater access to the information the machines contain, but they do not think and understand as we do. They primarily store information, like libraries. We act more educated because these vast libraries exist with easy access, yet we really don't know or understand the vast majority of the information stored and accessed this way.

We now face a paradox: automation leads to greater dependence on machines and greater ignorance of that dependence, coupled with a growing assumption, or illusion, of knowledge simply because the machines hold it.

We feel safer, but because machines don't have our intentions, thoughts, feelings and understanding, when they fail we can be surprised at the outcome, because we would have acted differently.

The authors pointed out several relevant examples. Airplanes stall when the plane's airspeed is not enough to generate lift. It means that the plane needs to go at a minimum speed to keep flying or it won't fly. Simple.

Pilots learn that a good way to increase speed and save yourself from a very unpleasant landing (a crash) is to point the nose of the plane down and dive, increasing speed quickly; then, as the airspeed comes up enough, you pull out of the dive and you are back to flying and can hopefully continue safely on your way. It is a very basic idea that pilots learn early in training, and it has probably saved lives thousands of times.

In 2009 Air France Flight 447 crashed, tragically killing 228 people. The Airbus A330 had entered a stall, and the black box revealed that the copilot tried to pull the nose up rather than down. French accident investigators issued a report concluding that the pilots were too reliant on the technology and, as a result, lacked basic flying skills. The report concluded the flight crew did not recognize that the plane could even stall (something all kinds of planes have been doing for a very long time) and did not understand how to interpret the complex signals from the equipment.

They died in a situation in which an older crew, using earlier technology and especially earlier training, would very likely have survived and had a scary but not catastrophic story to tell.

It makes a difference if you know what to do and live as a result or don't know and die as a consequence.

There have been similar issues with reliance on GPS devices and cars driving off docks or ships getting stuck on shore.

How serious could an error in relying on machines for accurate information be? Believe it or not, the world has been in danger of being destroyed, and only human judgment, specifically a decision to disobey orders, has saved it.

A name most people don't know but probably everyone should is Stanislav Petrov. In 1983 he was an officer in the Soviet military when he got an alert that an American missile had been launched; under his orders he was supposed to report the launch as genuine, which would almost certainly have triggered retaliation. That would have resulted in the Americans and their allies launching everything in response, and the Russians and likely the Chinese launching everything as well. Probably over five thousand nuclear weapons of various types would have been used in the next few hours.

Stanislav Petrov correctly decided it was more likely that his machine was in error than that the Americans had launched only one missile. His system then reported four more American missiles, but he still felt it was unlikely the Americans would launch a handful of missiles when they had thousands of nuclear weapons and a real attack would involve thousands.

Fortunately for anyone who likes the human race surviving past 1983, Stanislav Petrov was suspicious of the new system that notified him of the launch and disobeyed his orders, and the human race survived.

It is only one of several such incidents; Noam Chomsky has remarked that around a hundred such near-launch events have occurred, making our survival up to now a minor miracle.

Maybe, just maybe, our huge stockpiles of nuclear weapons, with military personnel instructed to launch everything at a moment's notice, are not a good idea. I personally think that if we had treaties limiting every member to, say, two hundred modern nuclear weapons, countries like China, Russia and the United States would save many billions of dollars on nuclear weapons; for the United States and Russia it is closer to trillions over decades. Billions that could be used on education and healthcare and eradicating poverty. And if they got down to two hundred weapons, a treaty to go down to a hundred could be introduced. A hundred nuclear weapons could deter any attack. No country would be sustainable after such an attack, but a hundred is not the thousands we now have, and just that reduction would increase the odds of the human race surviving a nuclear war. Something to consider.

As we can see, computers lack intelligence and cannot truly share knowledge, but there is a way technology helps us do so. With crowdsourcing applications people help each other and combine knowledge. As the authors point out, crowdsourcing is a critical provider of information to sites that integrate knowledge from different experiences, locations and knowledge bases.

People can share all kinds of information this way and you can answer a question on Reddit or Quora or get a recipe or get a traffic map. Or get a lot of restaurant reviews.

Crowdsourcing works best when it connects people with a need or interest with the right experts. You really need to know about construction or an accident blocking your path when you go to a map or driving app and you really need someone who knows what they are talking about when you ask about something highly specialized like Scientology.

People using crowdsourcing need to find incentives to get good experts. Money is one incentive and is sometimes used. Feeling right or important is another. Many people contribute tremendous content to Wikipedia and are never paid. The Oxford English Dictionary also has volunteers contribute content. But importantly the right experts are needed for this to work.

Several years ago Pallokerho-35, a Finnish soccer club, invited fans to participate in decisions regarding recruiting, training and game tactics by letting them vote. Of course this was a disaster. The team did poorly, the coach was fired and the experiment was halted.

This shows an important lesson: for crowdsourcing to work, people need expertise in the relevant fields. Enthusiasm alone doesn't guarantee success.

Similarly with certain items the reviews regular people give are actually not helpful as experts know better how to rate and compare them. Things like digital cameras and kitchen appliances are better understood by experts.

The authors point out something several books on psychology have noted: crowdsourcing for some things has been successful. In 1907 Francis Galton wrote a paper entitled Vox Populi (The Wisdom of Crowds) describing a contest in which 787 people guessed the weight of a fat ox to win a prize. All kinds of people entered, including butchers and farmers but also plenty of people with no expertise regarding livestock. The average guess was reported to be within one percent of the 1,198 pounds the ox actually weighed. So, in some circumstances the average of a crowd, even an uneducated crowd, can be accurate.
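Why averaging works here can be sketched in a few lines. The key assumption, which is mine and not Galton's data, is that individual errors are large but independent and not biased in one shared direction; the noise level below is invented for illustration.

```python
import random

random.seed(1907)      # year of Galton's paper, fixed for reproducibility

TRUE_WEIGHT = 1198     # pounds, the ox's reported weight
N_GUESSERS = 787       # number of entrants in Galton's contest

# Hypothetical crowd: each entrant is individually quite wrong (errors
# with a ~120 lb standard deviation) but the group shares no bias.
guesses = [random.gauss(TRUE_WEIGHT, 120) for _ in range(N_GUESSERS)]

crowd_estimate = sum(guesses) / len(guesses)
crowd_error_pct = 100 * abs(crowd_estimate - TRUE_WEIGHT) / TRUE_WEIGHT
typical_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)
```

Independent errors largely cancel: the expected error of the mean shrinks like one over the square root of 787, so the crowd lands within a fraction of a percent even though a typical individual is off by roughly a hundred pounds. If everyone shared a bias (say, all anchoring low), the cancellation would fail, which is one reason crowds are only wise under the right conditions.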

The evolution of technology continues as web developers are at work trying to create platforms that use experts to solve specific problems. They have to work out how to get the experts and how to get the experts to work together on the right problems but the potential for success has people hard at work.

The authors suggest that crowdsourcing and future collaborative platforms are where actual superintelligence is going to be found. People sharing knowledge and finding better ways to work out problems together will outproduce and outwork the machines of today.

The authors warn that as our own systems get more complex our understanding will be less and less but our own illusion of understanding will become greater.

They warn that our dependence on experts will grow, especially when technology and our own meager understanding fail. We are more like cogs in a great machine now than masters of our own domain. Crucially, they point out that this means we have to be even more vigilant and remind ourselves that we really don't know what is going on. That is the most important lesson of this chapter for me.

They also point out two advantages: countless benefits from technology itself, like increased safety, reduced effort and increased efficiency; and greater access to experts, which improves our own knowledge.


Well-known member
In chapter eight, Thinking About Science, the authors take on our knowledge and understanding of science (gulp). The authors start by pointing out that people have long had a strain of apprehension and suspicion regarding science and technology.
There is a story of a man named Ned Ludd who allegedly smashed a knitting machine with a hammer. The term Luddite (or neo-Luddite) has since expanded to embody an attitude of opposition to technology.

The authors point out that a reasonable skepticism towards science and technology is probably healthy for society but antiscientific thinking can be dangerous when it goes too far. I tend to concur.

The authors give several topics as examples including climate change, genetic engineering, and vaccination.

They then introduced a bit on the history of scientific education of the public. For a long time the assumption was that people who didn't know the accepted science on a topic simply were not exposed to the information. This is called the deficit model.

The model is based on the hypothesis that antiscientific thinking is due to a knowledge deficit and will disappear as soon as the deficit is filled.

The Bodmer Report was an effort to see how well scientific education has been doing. A group called the National Science Board came up with a quiz of twelve basic questions about science, such as whether the Earth goes around the sun or the sun goes around the Earth, plus several true-or-false questions on topics such as evolution and whether electrons are smaller than atoms.

People have also been asked their feelings about topics like genetic engineering and other scientific issues. People with more positive attitudes toward those topics do a little better on the quiz, on average.

Americans get about 51% or fewer of the answers right on six of the questions, 61-73% right on three of the questions, and 80-84% right on the remaining three. Not great for very simple, basic science, and since most are true-false questions, just guessing gets 50% correct on average. In case you think it is particularly damning for Americans, people from Russia, China, the EU, India, and Japan do no better.

"But here's the real problem with the deficit model. Decades of attempts to educate people about science have been ineffective at achieving the aspirations of the Bodmer Report: to promote positive views about science throughout society by fostering scientific literacy. Despite all the effort and energy that has gone into promoting public understanding of science, the millions and millions of dollars spent on research, curriculum design, outreach, and communication, we just do not seem to be making headway on that goal. Antiscientific beliefs are still pervasive and strong, and education does not seem to be helping."
(Page 159)

The issue of vaccines was used as an example. Parents were given information including specific information on the terrible effects of failing to vaccinate children, including emotional descriptions. They were also given information debunking the link between vaccines and autism.

"None of the information made people more likely to say they would vaccinate. In fact, some of the information backfired. After seeing images of sick children, parents expressed an increased belief in the vaccine-autism connection, and after reading the emotional story, parents were more likely to believe in serious vaccine side effects." (Page 159)

People spent a lot of time and a lot of effort, including research in many fields to find the answer to this. I think it is one of the most important answers that has been found regarding human knowledge and behavior.

"The answer that has dominated thinking recently is that nothing has gone wrong. Scientific attitudes are not based on rational evaluation of evidence, and therefore providing information does not change them. Attitudes are determined instead by a host of contextual and cultural factors that make them largely immune to change."

"One of the leading voices in promoting this new perspective is Dan Kahan, a law professor from Yale. Our attitudes are not based on a rational, detached evaluation of the evidence, Kahan argues. This is because our beliefs are not isolated pieces of data that we can take and discard at will. Instead, beliefs are deeply intertwined with other beliefs, shared cultural values, and our identities. To discard a belief often means discarding a whole host of other beliefs, forsaking our communities, going against those we trust and love, and in short, challenging our identities." (Page 160)

So the authors explain WHY giving people a little information regarding GMOs, vaccines, evolution, or global warming does little to change their views.

Now, I can say I completely understand the difficulty in rejecting a pseudoscientific belief, or in my case an entire belief system, as I rejected Scientology after twenty-five years and found it one of the most difficult things that I ever had to do. It involved rejecting my justification for years of neglecting and rejecting the people in my life who I should have been there for the most.

When I rejected the false justification Scientology provided, it was like living your life thinking an autobiography of you would portray a good or even heroic person, then getting the big twist of realizing you had been the villain the whole time and had only fooled yourself, waking up to the dark truth in the third act. It has been called mind shattering and heart crushing for good reason. You get a big reversal and find out that you have done almost incomprehensible evil while thinking you were righteous all along. It is understandable why people avoid such a revelation and often never accept it, no matter how much evidence there is or how strong that evidence is.

The authors give the example of a guy they call Science Mike. Mike McHargue is a podcaster and blogger who goes by that name. He grew up in Tallahassee, Florida as a fundamentalist Christian. Like many fundamentalists he believed in Young Earth Creationism, denied evolution, and believed prayer could heal people.

In his thirties, Mike read about evolution, paleontology, biology, the physics of the universe, and studies on the effectiveness of prayer. Mike went through a long journey, and today he is a Christian but has rejected the fundamentalist antiscientific beliefs of his old church.

Science Mike discusses both science and his faith on his podcast. In one episode he had to answer a caller who realized his beliefs no longer matched the fundamentalist Christian faith he was brought up in, and Mike told him he would need to find people who share his current beliefs to worship with. Mike was honest about the hardship this creates and the lost relationships it can bring. For Mike, his roots were in one community and faith. When he began questioning that, his whole world was turned upside down. The relationships that were most important to him were dramatically changed.

"That is the power of culture. Our beliefs are not our own. They are shared with the community. And this makes them really hard to change.

Science Mike's experience gives us a feel for where the knowledge illusion comes from. We typically don't know enough individually to form knowledgeable, nuanced views about new technologies and scientific developments. We simply have no choice but to adopt the positions of those we trust. Our attitudes and those of the people around us thus become mutually reinforcing. And the fact that we have a strong opinion makes us think that there must be a firm basis for our opinion, so we think we know a lot, more than in fact we do." (Page 162)

This is chilling. But supported by a lot of good research and evidence in my opinion. I have written numerous blog posts on many books and studies that support this.

We tend to take on beliefs with only the slimmest, most superficial understanding of them, based almost entirely on how they fit with our prior beliefs and the beliefs the people in our social groups embrace. And we think we have knowledge far beyond what we actually do: we mistake our confidence in our beliefs, which is often high, for both our understanding of those beliefs, which is routinely low, and the soundness of their basis, which is also routinely poor. Gulp.

Regarding the science quiz described earlier, people had high confidence they did well, regardless of whether they actually did.

"There was no relationship at all between science literacy and people's evaluations of their own knowledge; those who got many answers wrong reported knowing just as much about the technologies as those who did well." (Page 162)

"All of this should sound familiar by now. People tend to have limited understanding of complex issues and they have trouble absorbing details (like answers to factual questions). They also tend not to have a good sense of how much they know, and they lean heavily on their community of knowledge as a basis for their beliefs. The outcome is passionate, polarized attitudes that are hard to change." (Page 163)

To complement the information presented here, a long list of experiments and research from many books could be added, such as Influence by Robert Cialdini, A Theory Of Cognitive Dissonance by Leon Festinger, and Subliminal by Leonard Mlodinow. I have written numerous posts at my blog that often describe the very research I am referring to.

The authors point out that as humans we are poor at remembering details and instead think in causal models. These models often substitute something that seems similar for the real thing, as when we think of electricity as flowing like water or of drugs as fuel in a car.

These inaccurate folk understandings lead to incorrect beliefs, for example about how a drug's effects are changed by vigorous activity, or how electricity moves or is stored.

A false idea of what genetic modification does to food can lead to scary beliefs about what it can do to us. Many concerns about genetic engineering of food don't hold up if you understand how it actually works.

The case is similar with many arguments against vaccines.

"Beliefs are hard to change because they are wrapped up with our values and identities, and they are shared with our community. Moreover, what is actually in our heads - our causal models - are sparse and often wrong. This explains why false beliefs are so hard to weed out. Sometimes communities get the science wrong, usually in ways that are supported by our causal models. And the knowledge illusion means that we don't check our understanding often or deeply enough. This is a recipe for antiscientific thinking. " (Page 169)

For several years, Michael Ranney, a psychologist at the University of California, Berkeley, has been researching how to educate people so they will understand global warming.

He found the unsurprising fact that people have very little understanding of how it works. He asked a couple hundred people in parks in San Diego a series of questions and found that only 12 percent had even a partial understanding; no one he asked could give a full and accurate description.

Next he tried to inform people, showing them a four-hundred-word primer on global warming. He is also making a website with videos of varying length, from fifty-two seconds to five minutes.

Preliminary results are promising, but only time will tell if this will work in the long term.

"The first step to correcting false beliefs is opening people's minds to the idea that they and their community might have the science wrong. No one wants to be wrong." (Page 170)

That is a huge barrier to overcome. To get someone to realize that they and their community, including religious leaders, parents, friends, politicians they support, and on and on, could all be wrong? That is a lot to ask. It's downright critical thinking!


In chapter nine, Thinking About Politics, the authors of The Knowledge Illusion take on the dreaded topic of politics, or more exactly how we think about politics.

The authors point out that we as voters are often unaware of the topics we are voting on or the reality involved. As an example of low awareness, they cite the Affordable Care Act: 40 percent of Americans in 2013 were unaware it was law.

In 2012 the Supreme Court ruled to uphold portions of the law. People were asked by Pew Research if they supported the ruling: 36 percent were in favor, 40 percent opposed, and 24 percent had no opinion. Pew also asked what the court had ruled. Only 55 percent got that right; 15 percent said the court ruled against the act, and 30 percent had no idea.

There are many, many other examples. Americans who most strongly supported military intervention in Ukraine couldn't identify Ukraine on a map.

A survey by Oklahoma State University's Department of Agricultural Economics asked if we should have a law for mandatory labeling of food containing GMOs. 80 percent of people supported the law. They were also asked if we should have a law requiring a label for food containing DNA, and 80 percent of people supported that as law too. So, according to these people, all food should be labeled to warn us it has DNA.

"How seriously should we take the vote to label genetically modified foods if it comes from the same people who believe we should label all foods that contain DNA? It does seem to reduce their credibility. Apparently, the fact that a strong majority of people has some preference does not mean that their opinion is informed. As a rule, strong opinions on issues do not emerge from deep understanding. They often emerge in the absence of understanding or, as the great philosopher and political activist Bertrand Russell said, "The opinions that are held with passion are always those for which no good ground exists." Clint Eastwood was more blunt: "Extremism is so easy. You've got your position and that's it. It doesn't take much thought.""
(Page 172)

Socrates is quoted giving his reaction to a "political expert."

"I reasoned to myself, as I left him, like this - "I am actually wiser than this person; likely neither of us knows anything of importance, but he thinks he knows something when he doesn't, whereas just as I don't know anything, so I don't think I do either. So, I appear to be wiser, at least than him, in just this one small respect: that when I don't know things, I don't think that I do either." (Plato, Apology, 21d; trans. Christopher Rowe) (Page 173)

"In general, we don't appreciate how little we know; the tiniest bit of knowledge makes us feel like experts. Once we feel like an expert, we start talking like an expert. And it turns out that the people we talk to don't know much, either. So, relative to them, we are experts. That enhances our feeling of expertise." (Page 173)

"This is how a community of knowledge can become dangerous. The people we talk to are influenced by us - and truth be told - we are influenced by them. When group members don't know much but share a position, members of the group can reinforce one another's sense of understanding, leading everyone to feel like their position is justified and their mission is clear, even when there is no real expertise to give it solid support. Everyone sees everyone else as justifying their view so that opinion rests on a mirage. Members of the group provide intellectual support for one another, but there's nothing supporting the group." (Page 173)

The authors point out that when people of like minds discuss an issue together they become more polarized. The internet has made this problem much, much worse. Companies often use stories intended to arouse the strongest emotions, and the views of the most extreme members get represented, sometimes exaggerated into, well, lies, to generate strong emotions.

Psychologist Jonathan Haidt has written several books, including The Righteous Mind. Through a lot of research he found that many of the views presented online and in media represent less than ten percent of conservatives on the right and less than ten percent of liberals on the left; the vast majority are either not represented or very poorly represented.

Social isolation via the internet and the choice of a single type of media is a dangerous combination. The authors point out that Socrates died because ancient Athenians didn't want him contaminating their thinking; the same reason is given regarding Jesus and the Romans, and so on with the first Crusades and the Spanish Inquisition, Stalin's purges, Mao's Great Leap Forward, and the incarcerations and death camps of Nazi Germany.

The authors note that each of these events had multifaceted causes, but that they share a common justification: they were all done to satisfy a need for ideological purity, to enable a society to follow the one true path into the future. It is safe to say that none of the leaders who thought they were protecting the glorious truth were correct. They were all suffering from an illusion of understanding, and the consequences were terrible.

The authors asked people a variety of questions regarding policy and noted the responses. Questions included whether there should be a flat tax, whether we should have a cap and trade program, and whether we should have single-payer healthcare.

As before, they asked people to rate their understanding of an issue from 1 to 7. They then asked them to explain the policy and its effects in explicit detail. As expected, people are terrible at explaining policies, especially ones they don't understand, and after this step they rated their understanding lower.

The authors also wanted to know if having this ignorance exposed would lessen the polarization people show.

So, they also had people rate how firmly they support or oppose a position on the 1 to 7 scale before being questioned and after.

"We found that attempting to explain how a policy worked not only reduced our participants' sense of understanding, it also reduced the extremity of their position. If we consider the whole group together, the fact that people were on average less extreme means that the group as a whole was less polarized after the explanation exercise. The attempt to explain caused their positions to converge. " (Page 172)

The authors point out a vital contrast. Usually when people in a group think about their positions and discuss them, they become more certain. They rehearse the reasons for their beliefs in their own minds and reinforce them.

In this exercise people are forced to think of policies in causal terms involving the steps and details and sequence and results. Causal explanation gets us out of our beliefs and into the real world. It also exposes gaps in our knowledge and how little we really know.

The authors use the example of a law limiting use of water to ten gallons a day per person. With causal reasoning you have to ask: What would be the short-term effects? The long-term effects? How would you bathe? Wash dishes? What would you drink? What about lawns?

They point out that we can't understand a policy by thinking about how we would feel. We need to think about the causes and effects it would bring into being.

"Getting people to think beyond their own interests and experiences may be necessary for reducing their hubris and thereby reducing polarization. Causal explanation may be the only form of thinking that will shatter the illusion of explanatory depth and change people's attitudes." (Page 179)

The authors ran another experiment in which they again used the 1 to 7 system and asked people their level of understanding of issues and their positions, but this time asked them to give the reasons they had for their positions and didn't ask any causal questions.

The participants didn't discover how shallow their understanding was or moderate their positions. They finished as certain of their knowledge as when they started.

Just finding reasons is easy for people. Coming up with details of what causes what regarding a policy is hard, especially when you don't know those details.

"Pondering your reasons for your position will do nothing but reinforce what you already believe. What you have to do is think about the issue on its own terms, think about exactly what policy you want to implement and what the direct consequences of that policy would be and what the consequences of those consequences would be in turn. You have to think more deeply about how things work than most people do." (Page 180)

The authors ran another experiment in which they tested two groups. One gave causal explanations and the other gave reasons as in the earlier experiment. In this experiment instead of ratings they had participants choose between four options. They could donate money to an advocacy group that favored their position, they could donate money to a group that opposes their position, they could keep the money or they could turn down the money. Very few people gave to a group that opposes their position and very few turned down the money.

The people who gave reasons had high certainty and people who had a strong position initially were more likely to donate. The moderates were less likely to donate. In the group that gave causal explanations the people who started with extreme positions donated no more than the moderates. This suggests that exposing how little we know can inspire more moderate behavior.

A very important exception was found regarding this.

"But it's important to recognize another critical driver of people's opinions: There are certain values that we hold sacred, and no amount of discussion is going to change them.

Jonathan Haidt argues that moral conclusions are rarely based on much reasoning but come instead from intuitions and feelings. His strongest evidence for saying this comes from cases that he calls moral dumbfounding. To demonstrate this, he offered the following scenario (beware it's designed to generate discomfort) " (Page 181)

The authors then present a story of a brother and sister traveling together who make love; the sister is on birth control and the brother uses a condom as well.

Most people are disgusted by the story and say the brother and sister did something wrong. (I tend to agree)

But the interesting thing is that many people have trouble explaining WHY it is wrong. They say incest is wrong but have trouble giving reasons. Several reasons are given as examples, and counterarguments are presented for each one.

I personally would take on aspects of the counterarguments in a debate because I have my own opinion but that's not the point. Most people don't have well thought out arguments ready for thorough scrutiny on this issue.

"Apparently, strong moral reactions don't require reasons. Strong political opinions don't either. Sometimes whether or not we understand the consequences of a policy is irrelevant. Such attitudes are not based on causal analysis. We don't care whether the policy will produce good results or bad outcomes. What matters are the values enshrined by policy." (Page 182)

The authors point out that attitudes about some topics are often settled by nothing but a value; no statements about costs or practicality or studies will budge them. They give several examples, including pro-choice and pro-life positions regarding abortion. Generally people have a value they have embraced, and that is the beginning and end regarding abortion for them. That makes abortion a sacred-values issue.

It is similar with assisted suicide. Some people have a pro or con position and nothing you tell them affects it. Some see ALL suicide as a sin or as cowardly and so never support it, while others see being forced to live in suffering as immoral and believe assisted suicide should be treated as a fundamental right. There is not much room between those positions.

The authors noted this.

"But if people's positions are not consequentialist but based on sacred values, then shattering the illusion won't matter."
(Page 184)

The authors tried asking people about these two issues, abortion and assisted suicide, and found no illusion of explanatory depth. People felt the same before and after giving causal explanations. Their positions were not moderated by learning more. For some topics, apparently, facts don't matter to people.

This brings up something extremely important. The authors point out that people are open to causal explanations only for policies they think of in terms of consequences, meaning outcomes or results; the technique is not effective on issues involving sacred values. We tend to feel that values are settled and not open for debate or change, consequences be damned.

They point out that most issues are consequentialist for most people, which leaves a lot of room for using causal explanations. Most people want the best outcomes regarding education, healthcare, power and many other topics.

"Proponents of political positions often cast policies that most people see as consequentialist in values-based terms in order to hide their ignorance, prevent moderation of opinion, and block compromise. The health care debate is a perfect example of this. Most people just want the best health care for the most people at the most affordable price. " (Page 184)

The authors point out that politicians often take a sacred-values position on a topic that has clear facts, in order to hide the truth. Often they support an alleged value over a preferable outcome because it fits an agenda, the ideology they promote, or the values their big-money donors support.

"The secret that people who are practiced in the art of persuasion have learned over millennia is that when an attitude is based on a sacred value, consequences don't matter."
(Page 185)

This is probably one of the most important facts in propaganda analysis. If you want a position to be adamant, unchanging, uncompromising, and irreversible, tie it to a sacred value. Make it about honor, duty, loyalty, or some other sacred value.

If you want people to be open to change or compromise on a policy, make it a practical matter: about expenses and savings, efficient allocation of resources, getting a job done in a timely manner, anything that is not a sacred value.

The authors give the example of gay marriage in America. According to Pew Research, 60% of Americans opposed gay marriage in 2004, with 31% supporting it. By 2015, 55% of Americans supported it while only 39% opposed it.

What happened? Jonathan Haidt believes exposure to gay marriage made people realize it didn't negatively impact their lives. For most people in America, your spouse is the same, your job is the same, and your life is not really different. So people found that the consequences are acceptable.

"Whether we frame issues in terms of consequences or sacred values also influences the likelihood of achieving compromise in negotiation. " (Page 186)

The authors give the sad example of the Israeli-Palestinian conflict. Both sides would benefit from a long-term peaceful solution. But tragically, the authors note, both sides hold sacred values as grievances, and the idea of compromise is unacceptable.

One thing I learned rapidly in trying to understand persuasion and propaganda is that you are in very deep trouble if, while trying to discuss facts and evidence, causes and effects, you stumble into a sacred value someone holds.

People can be very certain, and any disagreement with, or even questioning of, a sacred value can derail both a conversation and reason itself. This can manifest in different circumstances, depending on whom you talk with and their own values.

Some Americans tolerate no criticism of any act by the United States military and say things like "support the troops" and "better to fight the war over there than over here" and "they all want to blow us up anyway." They are unwilling to consider any consequences or evidence against the United States government in this regard.

Similarly, I have spoken to American Jews and found a minority who support any action by Israel and refuse to consider that its government could be doing anything wrong in any way. Period. Case closed.

Sacred values also have the benefit of letting you skip all that fancy causal analysis. Causal analysis requires a lot of thinking and if done well requires a lot of knowledge.

In my own journey out of Scientology, I noted that the "sacred science," as Robert Jay Lifton described it, eliminated causal analysis, or any analysis, and assumed Scientology was far too sacred to be doubted, questioned, criticized, or ridiculed. But I realized that anything that cannot be doubted, questioned, criticized, or even ridiculed has been taken out of the category of critical thinking. Maybe something else is being done with it, but critical thinking isn't.

This brings us to something the authors pointed out: sacred values have their place and are worth standing up for. Basic human rights are often considered sacred values, and that certainly makes sense to me.

The authors point out that we can have sacred values but they shouldn't stop causal reasoning about the consequences of social policy.

They point out a simple reality about most political discourse: it is remarkably shallow. Citizens, commentators, and politicians often take a stand before engaging in a serious analysis of the pros and cons of proposed legislation.

They point out that much of television consists of shouting matches, often just talking points repeated in competition. The authors assert that the public deserves some actual analysis. I tend to concur.

Often networks just let politicians and their representatives say whatever they want and call it balance to have a Republican and a Democrat each say whatever they want.

But this is often a disservice. Politicians who wear a flag pin every day and make values-based speeches about supporting veterans and the military often have a track record of consistently opposing bills that would increase benefits or pay for the military, supporting bills that cut food stamps to military families, and opposing programs to help homeless veterans. But they can put their hands over their hearts and stand tall on camera in front of a flag. So patriotic.

Showing their voting record and listing bills would be much more useful.

Similarly, topics like raising the minimum wage have a history, and so little of it is presented that people get away with outright lying about the effects of raising it, because no one thinks to check the truth. People go on television and write articles claiming that every time the minimum wage is raised, job numbers decline. Every single time. But it has a history. History is facts, causes and effects, policies and outcomes. And guess what? In the United States the federal minimum wage has been raised twenty-two times, and 68% of the time employment grew in the following year rather than declining. That is a fact. It should be pointed out EVERY TIME the false claim linking raised wages to declining jobs is presented. EVERY SINGLE TIME.

(Source regarding the minimum wage study: Paul Constant at the website Civic Skunk Works.)

This is not a pro-raising-the-minimum-wage argument. It is a pro-learning argument: learn enough about the history and facts, not values, of a topic so you cannot easily be fooled. Howard Zinn was quoted as saying, "History is important. If you don't know history, it's as if you were born yesterday. And if you were born yesterday, anybody up there in a position of power can tell you anything, and you have no way of checking up on it."

Now, I must admit I have dug deeper on some topics than most people would like to. That's fair enough. But with most topics if you don't dig a bit you have no way of checking anything.

The authors recognize that we can't all become experts on everything. They advise finding the best experts and having them advise us on our options. We can check the history of institutions and individuals and look at their recommendations. It is certainly easier than becoming an expert on everything.

The authors point out that having citizens decide issues via ballot measures has a liability: citizens can be unaware of consequences, for example that lowering taxes may have disastrous effects on a local government.

They noted two quotes from Winston Churchill:

"The best argument against democracy is a five-minute conversation with the average voter." Ouch.

"Democracy is the worst form of government, except for all the others. " (Page 191)

Now, I must admit that Winston Churchill, despite being known for leading England through its resistance to the Nazis, isn't exactly my favorite historical figure. His attitudes on race, tolerance, and genocide were not the kindest, to put it mildly. But his deep character flaws aside, he had a way with words.

I must admit that until I left Scientology I hadn't read the Constitution or begun learning how the United States Senate and House of Representatives make laws, which the president may sign or in some cases veto, how the two bodies can try for a large enough majority to override a veto, and how the courts as a third branch must ensure the laws are followed.

So, I understand the first comment. Many Americans I speak with in person admit privately that they don't know how laws are made. Or how taxes are allocated.

Most people don't know the federal budget is now around 4.7 trillion dollars a year, with a deficit around one trillion dollars a year and a debt of over twenty-two trillion dollars. It is usually not known how the budget is disbursed.

According to the Center on Budget and Policy Priorities the United States federal budget for 2017 was allocated:

Defense and international security assistance 15%
Social Security 24%
Medicare, Medicaid, CHIP, and marketplace subsidies 26%
Safety net programs 9%
Interest on debt 7%
Benefits for federal retirees and veterans 8%
Transportation infrastructure 2%
Education 3%
Science and medical research 2%
Non-security international 1%
All other 4%

You can note facts like foreign aid usually being one or two percent of the budget, yet sometimes politicians and reporters act like it is a lot of money by citing a figure in the billions with no context. Similarly, they can complain about food stamps. But according to several sources, including the book Age of Propaganda, the actual taxes a family that makes fifty thousand dollars a year pays toward food stamps total thirty-six dollars a year, while in contrast the same family pays around four thousand dollars a year in taxes for corporate subsidies, money given to corporations. The authors of Age of Propaganda pointed out the usefulness of an itemized tax bill for every American, so everyone could know the truth about where their taxes go. We can find the information, but it takes a lot of digging. It shouldn't.

I am not preaching that every person become a budget expert, but we should know enough to be aware that defense, Social Security, medical programs and safety net programs together take up roughly three quarters of the budget, and that things like education, science, infrastructure and foreign aid get just a few percent each, hardly anything. And that we have a huge deficit with no solution in sight.

There are other important facts, like the share of federal taxes paid by corporations: about seven percent today versus about thirty-five percent in the fifties, even as they make enormous profits.

The lesson isn't meant as an insult but as a guard against propaganda.

The authors discovered something else: people can get upset by having their ignorance exposed. They found that asking people for details about something they don't really understand frequently led to those people no longer wanting to talk to them.

"We had hoped that shattering the illusion of understanding would make people more curious and more open to new information about the topic at hand. This is not what we have found. If anything, people are less inclined to seek new information after finding out that they were wrong. Causal explanation is an effective way to shatter the illusion, but people don't like having their illusion shattered. In the words of Voltaire: "Illusion is the first of all pleasures." Shattering an illusion can cause people to disengage. People like to feel successful, not incompetent." (Page 192)

Now I must admit to similar experiences. When I left Scientology and studied hypnosis and critical thinking, I realized many ex Scientologists didn't want someone pointing out things they didn't know about hypnosis or Scientology, and many certainly don't want anyone to point out logical fallacies in anything they write.

I ran into one woman who goes on Facebook for maybe a couple hours a day and comments on various issues she has with Scientology. She asks lots of questions online about Scientology. I foolishly pointed out a dozen books on cults and psychology to answer her questions. She got extremely angry and wrote that no one has time for that. But, she has a couple hours a day to rehash her experiences in Scientology over and over. Okay. She made several comments and I, perhaps foolishly, pointed out several logical fallacies she was using, because, hey, now that we are out of Scientology it would be great if we left behind the bad habits Scientology instilled in us.

She of course said she had learned about all that in her first semester in college and knows it all, so I shouldn't point it out. Umm, think about that. She was in essence saying she knows what logical fallacies are, knows that they are wrong (poor critical thinking with faulty logic), and that they should never be pointed out. It frankly makes no sense. If you know all the fallacies, then why would you use them over and over in your statements?

A much more plausible explanation is that she felt called out when I recommended the books and tried to save face by saying she knew all about fallacies from her first year in college, as if she were years past the point where you learn and talk about fallacies. Bottom line: she didn't want to be exposed as ignorant.

I didn't want to overwhelm her and make her look bad. That wasn't my intent. I was hoping she would read some books and find something that helped her recover from Scientology, and I foolishly had hoped that pointing out the fallacies Scientology instilled would help somebody, several somebodies, see the harm these habits create and work to improve their thinking and lives. But lots of people, both in and out of ex Scientologists' groups, have shown that if you expose the fallacies they are using, they just double and triple down.

When you point out that logical fallacies, such as red herrings like ad hominem and the genetic fallacy, are irrelevant to the truth of claims, they dive in and use them more.

How to get this information out in a way that people accept is a difficult challenge. In my opinion people would greatly benefit from this knowledge, but accepting it requires admitting a profound ignorance, and most people, much of the time, aren't ready for that. To me the gateway subject for many people can be critical thinking or cognitive science or psychology. These subjects are so deep, and so unexplored in earlier education, that a serious student of any of them should quickly realize they have a lot to learn. This approach, in my opinion, is needed for other subjects like hypnosis and cults, but one thing at a time.

The authors point out that a good leader must make people aware of their ignorance but not make them feel stupid. I must confess that it is challenging.

In my own experience, in my forties I discovered virtually everything I believed in was a fraud, and further discovered that decades of attempting to understand and practice Scientology, and even extreme confidence and certainty that I understood it, didn't make me correct, not even a little bit.

I got to have my whole world shattered and discovered I had delusions of grandeur and delusions of competence with no basis in reality. That intensely stunning and demoralizing experience left me willing to explore how someone could be so ignorant but feel enlightened, be so wrong but be sure they are so right.

But most people don't find recorded evidence that the person they thought was the savior of humanity was just a pathological liar and conman, scheming his way through life, spinning a web of lies. I discovered plenty, and realized I needed to learn much, much more than I knew.

How can people whose religious, political and scientific beliefs have not been so completely debunked come to see something like this?


In chapter ten, The New Definition of Smart, the authors take on this concept in light of everything they have discovered. In this chapter they discuss the fact that we achieve things together. From Martin Luther King Jr. and the thousands and thousands of activists in the civil rights movement to the large number of people who wrote the twenty thousand pages of the Affordable Care Act, many things that the history books credit to one person involve and require many, many more people.

We think of individuals as entire movements and mentally substitute individuals for complicated groups routinely. We discuss the Kennedy administration or the Eisenhower administration when in reality many people and groups and ideas and social circumstances all were involved.

The government involves millions of people in making many millions of decisions and the leader is merely a symbol for the vast majority of these decisions.

We don't just elevate people in politics. We also practice hero worship in entertainment. In fiction super competent geniuses are great lovers, nearly superhuman fighters, superb drivers and pilots and experts in every kind of engineering and theoretical science, and know a half dozen languages.

The great man myth (most often a man, and very, very often a white man, even when no white men lived in a region when the events occurred) presents a lone individual doing every great deed, winning every battle, coming up with every idea and invention, as if from pure talent he birthed these things alone, with no help.

The authors give the examples of Socrates, Copernicus and Galileo. Copernicus built on ideas from the ancient Greeks and used ideas from Ptolemy. Einstein gave us special and general relativity. But I wrote about it once and had a scientist give me examples of about a half dozen ideas, foundational to these theories, that other scientists had first. Einstein didn't dream up his model from scratch or as a direct sequel to Newtonian gravity; it looks impossible to get directly from one to the other. It didn't take a moment of unique genius. It took years and years of slight, gradual intermediate steps. It was like having a template of a solution to a puzzle that can't be right, because pieces are left over and there are empty spaces you can't fill, and then someone gives you clue after clue about how the pieces could go together. The clues were presented by dozens of scientists, many brilliant in their own right, who studied the ideas of others and worked for years to advance the subject themselves.

To take the great man myth apart even further, it is possible that without the specific person we credit, in many cases someone else would have made the same discoveries. With the periodic table, for example, Dmitri Mendeleev gets a lot of credit, but many other scientists came up with many parts of it and could have discovered what he found, given time, if he hadn't been available. Much of what he put in the table appears in letters and papers by others, much of it before he published his findings.

Simultaneous multiple discoveries in science are surprisingly frequent. Right now there is a conflict over who should get a patent involving CRISPR DNA editing technology as two groups of scientists developed it simultaneously and independent of one another.

The authors contend that science in real life, unlike science fiction and comic books, advances because conditions are right, meaning the information, evidence and technology are advanced enough that a step, possibly a tiny incremental step, is now possible. This is in contrast to the super genius in fiction who can build a dimensional portal, spaceship, time machine or sentient robot or supercomputer from scratch at whatever speed the plot requires.

The community of scientists functions together and often progresses together. Scientists have conversations and combine and refine ideas together.

"Human memory is finite and human reasoning is limited. Students of history can understand only so much. As a result, we tend to simplify. One way we simplify is through hero worship, by conflating significant individuals with the community of knowledge they represent. Instead of understanding the enormous complexity that goes along with multiple people pursuing multiple aims and trying to remember all of it - an impossible task - we wrap events up into a tiny little ball and associate them with a single individual. Not only does that allow us to ignore vast amounts of gory detail, but also it allows us to tell a story." (Page 200)

We use a story of a great individual to substitute for the complex combination of relationships and events that make up a community and a time for that community. We as the authors point out substitute a story in politics, religion, entertainment, science and history. We use stories to replace the impossibly detailed and complex truth.

The authors pointed out our tendency to form an initial impression of a person and then to try to find ways to support it. If we see someone as successful we often assume they are intelligent. The authors point out that significant success requires more than individual intelligence.

"One common and relatively old distinction is between fluid and crystallized intelligence. Fluid intelligence is what we're thinking when we say someone is "smart." The person has the ability to come to conclusions quickly whatever the topic and is able to figure new things out. Crystallized intelligence refers to how much information one has at one's disposal stored in memory. It includes the size of one's vocabulary and one's access to general knowledge." (Page 202)

The authors point out several ways to define intelligence and ways it gets broken down, such as with language or math. There are other descriptions and categories, many others.

Intelligence has a long history of psychologists attempting to measure it well. Many people from the early 1900s on have developed various intelligence tests. Something that has been discovered is that if you do well or poorly on one intelligence test, you tend to do about the same on another; your results, especially your average results, are nearly the same across all of them. This was discovered by Charles Spearman in 1904.

He developed a technique called factor analysis. He figured out a way to take all the results a person gets on several intelligence tests and extract the underlying result, a way to reflect actual intelligence more accurately than any of the tests alone would. His result, g, for general intelligence, is something psychologists like, because they like having a way to measure things, and a measure of intelligence that doesn't vary from test to test is useful. Psychologists also like it because it correlates with success both in school and in employment. People who get a higher g, on average, do better at school and work.
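Spearman's idea can be illustrated with a toy simulation. This is only a sketch of the concept, not his actual procedure or data: the five simulated "tests" below all share a hidden general ability, and extracting the first factor of their correlation matrix recovers that ability better than any single test does.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 500 people: a hidden general ability "g" plus
# test-specific noise on each of five invented tests.
n = 500
g = rng.normal(size=n)
tests = np.column_stack([g + rng.normal(scale=0.8, size=n) for _ in range(5)])

# One-factor extraction via the first principal component of the
# correlation matrix (a simplified stand-in for factor analysis).
z = (tests - tests.mean(axis=0)) / tests.std(axis=0)
corr = np.corrcoef(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)   # eigenvalues in ascending order
first = eigvecs[:, -1]                    # direction of the largest one
scores = z @ first                        # each person's estimated factor

# The estimated factor tracks the hidden g better than any single test.
r_factor = abs(np.corrcoef(scores, g)[0, 1])
r_single = abs(np.corrcoef(tests[:, 0], g)[0, 1])
print(round(r_factor, 2), round(r_single, 2))
```

Because the five tests all lean on the same latent ability, their shared component is a less noisy estimate of it than any one score, which is exactly why a single g extracted from many tests is more stable than the tests themselves.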

One report looked at 127 tests of twenty thousand people and found this result. The authors caution against overvaluing test results for any individual. A person may be influenced by personal events, such as a girlfriend leaving them, having a fight the day before the test and not sleeping well, having too little or too much coffee, or worrying about a bill they can't pay.

Remember, these results are on average for large groups, so an outlier may occur. Don't feel damned if you had average or poor results on such a test and don't feel invincible and destined for greatness if you did well.

The authors recognize that the higher a person's intelligence, the better they tend to do, but they also stand by their discoveries regarding our knowledge being something shared by the community.

In this light they looked at intelligence as how much someone contributes to the community. That's a very different take than a lone genius coming up with discoveries and brilliant plans in isolation. The authors state that if thinking is a social entity that takes place in a group and involves teams, then intelligence resides in the team and not just in individuals. That is a completely different way to look at the subject.

They argue that the best way to assess intelligence is by assessing how much an individual contributes to a group's success. They state that an individual contributes to a team, and it is the team that matters, because it is the team that gets things done. An individual's intelligence reflects how critical that individual is to the team.

They go on to say that if we think this way, intelligence is no longer a person's ability to reason and solve problems but how much a person contributes to a group's reasoning and problem solving process. They include more than individual information processing abilities and add the ability to understand the perspective of others, to take turns effectively, to understand emotional responses, and to listen. They propose that a community needs people who together can play a variety of complementary roles.

They say we don't need a lot of people who get high g scores but instead need people who have a variety of different skills.

"A team with complementary skills are more likely to satisfy all the demands made by the division of cognitive labor. Therefore, when you're picking people to be part of that team, each person's ability to contribute to the group is more important than his or her g score. Instead of measuring intelligence by testing individuals alone in a room, we need to test teams of people working in groups." (Page 207)

That is a radically different approach. It has precedents in other fields. Sometimes you need to rate a team, reward a team and advance a team based on team performance. It is obvious in many sports. A person who does well at American football or baseball doesn't play for a championship without their entire team getting there.

They use the analogy of a mind participating in a community and the parts of a car working together to provide transportation. They point out that we could measure and examine the individual parts that make up a car and look for higher or lower quality parts. But what actually matters? The qualities of the whole car, such as reliability, durability, longevity and performance. Having high quality individual parts is a good idea because they are likely to perform well together.

"To perform most tasks, you want people who make different contributions. To run a company, you need people who are cautious and others who are risk takers, some who are good with numbers and others who are good with people. It might even be a liability for someone who interacts with people to be really good at numbers; customers will be more comfortable with a salesperson who doesn't make them feel stupid by doing fancy calculations that they are unable to follow." (Page 209)

This complements research presented by Avi Tuschman in his book Our Political Nature. He presented evidence that within families, different children tend to end up with different political affiliations, regardless of those chosen by their parents.

This supports a wide body of work that shows having dissenting voices in a group can enable members to realize a majority or especially a unanimous opinion can be wrong.

Here is an excerpt from a relevant post at this blog -
Scientologists, ex Scientologists and Watchers - Starting Out:

In the book Sway: The Irresistible Pull of Irrational Behavior, authors (and brothers) Ori Brafman (MBA, Stanford Business School) and Rom Brafman (PhD, Psychology) described experiments on dissent.

Solomon Asch did one of the most famous experiments in social psychology. In one experiment a subject was told they were being tested for visual acuity. They were placed in a group with several other people. The group was shown three straight lines of greatly varying lengths and a fourth line and asked which of the three lines the new one matched. The lines were intentionally different enough that the answer was meant to be obvious.

But there was a hidden element, as there usually is in a social psychology experiment: every person except one was an actor. The actors were all instructed to give the same answer before the actual subject responded. They all gave the same wrong answer.

Now there were several rounds of being presented lines and answering. And when everyone else gave the same obviously wrong answer 75% of subjects ALSO gave that answer in at least one of the rounds.

Asch found unanimity gave the experiment its full persuasive power. It's hard to be a lone dissenting voice.

He did something I have found people often do with good experiments. He repeated it with a slight variation to test an idea. He had the same set up with one crucial alteration: he had one actor give the right answer while the others gave the same wrong answer.

He found that having even one person give the true and easily observable answer made it so the test subjects felt free and confident enough to also give the correct answer, almost every single time.

The authors wrote, "The really interesting thing, though, is that the dissenting actor didn't even need to give the correct response; all it took to break the sway was for someone to give an answer that was different from the majority." (Page 155 Sway)

To really drive home this point with evidence another clever experiment is described. Psychologist Vernon Allen conducted it.

In this one a subject was asked to do a self-assessment survey alone. After five minutes a researcher knocked on the door and asked the subject to share the room due to a lack of space.

The new subject was of course an actor. The new subject had special extra, extra thick glasses intentionally designed to give the impression of him being nearly blind without them. Super coke bottle glasses.

To step it up a notch, the researcher and actor even had a script. The actor said, "Excuse me, but does this test require long-distance vision?" The researcher confirmed it and the subject responded, "I have very limited eyesight" and "I can only see up-close objects."

They even acted out a scene of the researcher asking the coke-bottle-wearing actor to read an easily legible sign on the wall. The actor of course strained and found the sign impossible to make out, driving home the point that he was practically blind at long distances.

The researcher explained that he needed five people for the testing apparatus to work, so it was okay for the nearly blind seeming subject to, "Just sit in anyway, since you won't be able to see the questions, answer any way you want; randomly, maybe. I won't record your answers."

But even with the coke bottle glasses and the blind-as-a-bat routine, the actor was able to reduce conformity significantly. 97% of participants conformed when agreement was unanimous, but that dropped to 64% with the coke-bottle-wearing actor present, even if he gave an incorrect answer, as long as it was different from the majority's.

That is astounding. Conformity with the group's incorrect answer can be broken for 33% of people, even by an obviously incorrect answer from an obviously unreliable source!

It's truly worth considering. Imagine yourself being like 97% of us and conforming with the crowd in denying what you see before your eyes, and yet one out of three of us actually will see and acknowledge the truth if anyone, no matter how unlikely or wrong or obviously unqualified, simply disagrees and breaks the unanimous opinion.

I think dissenting views shouldn't just be accepted or even merely suggested for important decisions that time permits careful consideration of; frankly, they should be required! End quote.

The authors described how a team led by Tepper School of Business professor Anita Woolley tested groups performing tasks together: a test of spatial reasoning, a moral reasoning problem, a shopping trip planning task, and a group typing task.

Remember how it has been found that if an individual does well on one type of general intelligence test, they tend to do equally well on the others? That established the g, or general intelligence, factor for an individual.

The collective intelligence hypothesis states that a similar correlation should exist for group intelligence and performance. In other words, groups that perform well as a group on one test should perform similarly on other tests, especially on average over many tests.

They actually found a degree of correlation. If one group does well at any of the tasks, on average it tends to do well on the other tasks. With some tasks the correlation is low, but it exists. Maybe further research can find types of tasks in which the correlation is higher or lower, and we can learn more about how groups work together, or don't, and about how they succeed or fail.
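The kind of cross-task correlation they found can be sketched with made-up numbers. Here each simulated group has a hidden collective ability, and its score on each of two unrelated tasks is that ability plus task-specific noise; the groups then show exactly the kind of moderate correlation described above. None of these figures come from Woolley's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# 200 simulated groups, each with a hidden collective ability "c"
n_groups = 200
c = rng.normal(size=n_groups)

# Scores on two different tasks: shared ability plus task-specific noise
task_a = c + rng.normal(scale=1.0, size=n_groups)
task_b = c + rng.normal(scale=1.0, size=n_groups)

# Groups that do well on one task tend to do well on the other,
# but the correlation is moderate, not perfect.
r = np.corrcoef(task_a, task_b)[0, 1]
print(round(r, 2))
```

The point of the sketch is that a single shared factor produces correlated performance across otherwise unrelated tasks, which is what makes a "c" for groups a meaningful measure at all.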

They came up with the c factor, for collective intelligence of a group.

With this established, they discovered some other useful information. They tried seeing whether individual intelligence, g, or collective intelligence, c, was the better predictor on a group task of playing checkers against a computer. C was a useful predictor, while g, the general intelligence of individuals, was useless.

They determined that factors like cooperation can determine group success far better than individual intelligence. I have worked on projects with intelligent individuals who didn't cooperate at all that were disasters and projects with less intelligent people who did cooperate that were successful.

The authors use the analogy of being better off with a group of semiskilled workers who work together to renovate a kitchen than you would be with a group of prima donnas who do separate tasks extremely well but don't align the cupboards and counter.

Many people who follow team sports have seen talented players who are not interested in coordinating their efforts with the team: they take plays off when they are not featured, fail to pass to teammates, or fail to block in American football when appropriate. They infuriate coaches and often find themselves off teams just as quickly as they found themselves on them. Talent may get them second and third chances, especially with teams like the Raiders, who pride themselves on turning other teams' rejects into champions. But even those risk-taking teams will eventually need a reward or the risks will end.

But the original question remains. What exactly are we measuring with c? They wanted to find what makes a group effective. They found that some things have no correlation: group cohesion, motivation and satisfaction didn't matter.

What predicted success was social sensitivity, how often groups took turns, and the proportion of females in the group. They found having more females made the group more socially sensitive and more successful.

"Nevertheless, data are starting to come in that suggest that the success of a group is not predominantly a function of the intelligence of its individual members. It's determined by how well they work together." (Page 211)

"The notion of intelligence has fostered a deep confusion: We think of intelligent acts as performed by individuals even when communities are really responsible." (Page 211)

We buy into the great man myth and think a lone genius through intellect and persistence makes things happen but it's not true. Comic books tell us Lex Luthor or Reed Richards or Tony Stark alone can build a city or energy reactor or time machine from scratch, even if they have to master several sciences and invent a few more.

In the real world Mark Zuckerberg didn't make Facebook alone and Steve Jobs didn't make Apple succeed alone. People who succeed as venture capitalists don't back an idea or an individual. They back successful teams.

The company Y Combinator backs successful teams. They look for teams that can divide up tasks well and distribute individual labor effectively. They avoid single founders.

The authors propose that for humans we have been looking in the wrong place for intelligence in individuals. Individual variations in intelligence certainly exist but as success is what we are actually after and not good scores on tests, we should look at groups.

They propose considering how an individual contributes across many groups, to see how a group does with and without them, much like hockey's plus-minus statistic, which shows that the team scores more goals and allows fewer when you are on the ice. That way, even if you don't personally score goals, your contribution is still recognized with a measurable stat. If you are a great player who distracts the goalie and often passes to teammates, setting them up to score, you will have a measurable way to demonstrate your efforts.
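A minimal sketch of that plus-minus bookkeeping, with invented players and goal events: each player on the ice is credited +1 when their team scores and -1 when it is scored on, so a player who sets up goals without scoring any still accumulates a positive number.

```python
# Invented shift records: the players on the ice for each goal, and
# +1 if their team scored, -1 if the other team did.
goal_events = [
    ({"ana", "ben", "cam"}, +1),
    ({"ana", "ben", "dev"}, +1),
    ({"ben", "cam", "dev"}, -1),
    ({"ana", "cam", "dev"}, +1),
    ({"ben", "dev", "eli"}, -1),
]

def plus_minus(events):
    """Credit every player on the ice with the goal differential."""
    totals = {}
    for on_ice, delta in events:
        for player in on_ice:
            totals[player] = totals.get(player, 0) + delta
    return totals

# "ana" was on the ice for three goals scored and none allowed,
# so her plus-minus is +3 even if she never scored herself.
print(plus_minus(goal_events))
```

The same idea generalizes to the authors' proposal: track group outcomes while a person is and isn't involved, and the differential becomes a measure of their contribution.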

The authors propose looking at how often the groups a person contributes to succeed. Not to be too insulting, but I have actually turned down "help" from people who offered to assist in tasks when I knew from past experience they made success more difficult; in other words, they were definite liabilities.

There can be problems in applying this, as two people who work together often may succeed because of one more than the other, or fail in a similar way. They pointed out that some people are bright and can appear competent, but if their projects consistently fail, they are not really successful.

"The question an employer should ask is whether the projects that the employee is involved in tend to be successful or not relative to other employees." (Page 213)

I think we can see lots of applications and variations of this. In American football, two people usually have the win-loss record as a statistic: the head coach and the starting quarterback. As the people seen as ultimately responsible, their overall performance is equated with victory or defeat.

Most players cannot alone cause this. I could be a great guard and succeed in blocking on every play and it may not create victory. It can help, but if the quarterback throws five interceptions and the defense allows fifty points, there is nothing I can do.

Even if I was a superb defensive player and made a dozen tackles and caused two turnovers and got a sack my team could still lose if either the offense or special teams part of the team play poorly.

That is why football teams try to put the best available player for the position on the field and rate most positions relative to success at their tasks and not team success.

In this way a project is limited to very specific criteria, like blocking or tackling the man in front of you, getting free and making a catch, or running six feet through a line of men while hanging onto the ball.

It may be a new way to think of the positions below the top, but it has merits. If you could hire a team of people to, say, sell cars for your dealership, wouldn't you prefer a team that sells a lot of cars and brings in a lot of profit to a team that is impressive in an interview or on an individual intelligence test but doesn't sell very many cars?

And if we stop giving all the credit to unrealistic great men and give a lot more credit to everyone involved, isn't that better, if it's actually true and leads us to more successful efforts in the future? Because then we can better understand everything and everyone that succeeded in the past.


In chapter eleven, Making People Smart, the authors point out that in a comparison of poor kids in Brazil, who worked as street merchants to survive the hyperinflation and extreme poverty the country experienced during the 1980s, the street merchants were better at addition and subtraction than children of the same age educated in the Brazilian school system.

They claim that people are designed primarily for action, not for listening to lectures and memorizing facts. They propose an element of action be included with learning even if reflection and classroom work is also required.

They comment on how students have difficulty understanding how little they absorb from reading, for example, when there is no element of application for what they read.

This is such a problem it has a name: the illusion of comprehension, meaning people confuse understanding with familiarity or recognition. I run into this with people who have read a lot of statements regarding hypnosis, for example, by people who either misrepresent it to a degree, like Ronald Hubbard, or who don't believe it exists, so the reader believes they understand it when they really don't.

I also have found it regarding critical thinking, logical fallacies and numerous other subjects. A person can read about something and feel that they understand it, yet not be capable of explaining how it works in step by step cause and effect fashion and not be able to use the information when appropriate; but because they feel familiar with it, they are sure they understand it well.

This has been found to be true even if we run into information years later. In a test, psychologist Paul Kolers found people who read inverted text (with all the letters upside down) could read it again a year later faster than inverted text they hadn't read before. The mind retains some degree of memory or familiarity.

We can mistake familiarity for understanding. Both have accompanying emotions or sensations, and we probably rarely analyze the cause and influence of our internal sensations.

The authors point out how many of us through repetition can recite the Pledge of Allegiance, but the understanding of it is not developed, so sometimes people make odd word substitutions.

"Comprehension requires processing text with some care and effort, in a deliberate manner. It requires thinking about the author's intention. This apparently isn't obvious to everyone. Many students confuse studying with light reading.

So the conclusion we have come to in previous chapters - that people are more superficial than they realize, that we suffer from a knowledge illusion - extends to education as well: Learning requires breaking common habits by processing information more deeply.

Knowing What You Don't Know

We also suffer from the knowledge illusion because we confuse what experts know with what we ourselves know. The fact that I can access someone else's knowledge makes me feel like I already know what I'm talking about. The same phenomenon occurs in the classroom: Children suffer from an illusion of comprehension because they can access the knowledge they need. It is in their textbook and in the heads of their teacher and the better students. Humans aren't built to become masters of all subjects; humans are built to participate in a community (another point suggested many years ago by the great John Dewey). " (Page 218)

The authors believe it is a mistake to see ourselves as going to school to become independent thinkers. Now, I personally place an extremely high value on independent and critical thinking.

They have observed that we tend to think of education as solely preparing one to operate completely independently: to fix cars if you study to be a mechanic, or to know a lot of facts about the past if you want to be a historian.

"These ideas aren't wrong so much as incomplete. The idea education should increase intellectual independence is a very narrow view of learning. It ignores the fact that knowledge depends on others." (Page 219)

The mechanic and the historian in these examples need to know things beyond their immediate specialty. The mechanic needs to know where to get parts, how designs of cars change and many other things. The historian needs to know about other times, regions, people and countries so his knowledge will fit into a context. Either one needs to know something about economics, as the mechanic likely wants to make a profit and the historian needs to know how economic circumstances impact events.

The authors point out that when you have a skeletal understanding you need to know how to get more information and who to go to. Then you are using the community of knowledge.

"A real education includes learning that you don't know certain things (a lot of things). Instead of looking at in at the knowledge you do have, you learn to look out at the knowledge you don't have. To do this, you have to let go of some hubris; you have to accept that you don't know what you don't know. Learning what you don't know is just a matter of looking at the frontiers of your knowledge and wondering what is out there beyond that border. It's about asking why. Instead of learning to ask about events that occurred in Spain, it's learning to ask what you don't know, like why long division works.

As individuals we know little. There's not too much we can do about that; there's too much to know. Obviously we can learn some facts and theories, and we can develop skills. But we also have to learn how to make use of others' knowledge and skills. In fact, that's the key to success, because the vast majority of the knowledge and skills that we have access to reside in other people. In a community of knowledge, an individual is like a single piece in a jigsaw puzzle. Understanding where you fit requires understanding not only what you know but also what others know that you don't. Learning your place in a community of knowledge requires becoming aware of the knowledge outside yourself, what you don't know that touches on what you do know. " (Page 220 - 221)

The authors point out a course called Ignorance at Columbia University. Scientists are invited to come and speak about what they don't know, so the frontiers of their subjects are exposed and explored.

The authors consider that one good way to expose what isn't known in a field is to do what the people in the field actually do. I have spoken with doctors and scientists and realized they often don't have access to information that fictional depictions of their professions routinely show. This is certainly true for police officers and attorneys. They have far less information than television shows and movies suggest and often get the information they do get at a much, much slower rate.

With science there is said to be a tremendous problem of little exposure to scientific methods and a lot of emphasis on memorizing facts. I developed a very strong appreciation for scientific research in reading dozens of books on psychology and neuroscience and seeing explanations of how studies and experiments are constructed, carried out, observed, then analyzed, and how they can be repeated and varied to gain further understanding. They also can inspire other studies and experiments to debunk or strengthen any hypothesis that results from observing the results.

This includes learning about sample sizes, correlation not meaning causation and many other factors.

This is unlikely to occur unless you read a tremendous amount specifically on the design of experiments and about hundreds of experiments and studies or unless you get actual experience with the scientific method via application.

They explain that actual science involves a lot of division of cognitive labor. People focus on their specialties and leave the work of others to other people. The scientist relies on the community.

The authors explain how some conclusions come from direct observations, some come from inference, but most come from authority.

The authority is part of the community of knowledge and we rely on it. The authors consider it more important to know what is known and justified by others than to know the facts and justifications themselves. It is good to know what you personally concluded, how and why and what you are taking from an authority and why. You can even consider them both tentative and open to revisions or debunking.

The authors point out how scientists rely on the knowledge of the community and how all of us routinely operate things we don't understand including cars as modern cars are extremely complicated.

Much of what modern scientists operate on involves faith. It is faith that other scientists have done good research, used good methods and been honest and correct in their observations and conclusions. The authors point out this faith is different from religious faith. It has at its core the central feature of science: observation. Science is the study of nature.

This study includes observation and has the feature of verification. Someone can check a scientific claim. Scientific claims get tested every day and often get debunked by observations of evidence that shows that they aren't true. The authors point out that if a scientific claim is false, eventually someone can test it and observe that it isn't true. It doesn't reflect reality.

"Teaching science requires more than teaching scientific theory and facts. It also requires that students pay attention to the limits of their knowledge and learn how to fill in the gaps by working within a community. This entails learning about who to trust and where the real expertise is. When someone makes a scientific claim, should we believe that person? " (Page 226)

As much as I try to reject the appeal to authority (because authorities can be and have been wrong before) and the genetic fallacy (judging a claim by its genesis or origin rather than the claim itself), and try to examine claims on their own merits whenever practical, we just don't have the ability, resources and time to personally verify or debunk every important claim. It pains me to admit it, but it is true.

I can't become a medical expert and a legal expert and a political expert and a hundred other experts whose advice and help I might need. I have a car and I don't plan to become a mechanic, I have a body and don't plan to become a doctor, I have a mind and don't plan to get a PhD in psychology or neuroscience or a dozen related fields. It's just not practical.

"There are many situations in which obtaining expert advice is the only sane thing to do: when you can't identify a weird flat discoloration on your skin; when the brakes on your car are smoking; when you're considering spending your life savings to purchase stock in an exciting new company (or a bridge in Brooklyn); or when you're thinking about mixing Diet Coke and hydrochloric acid to clean the rust off your cutlery.

How do you know when the advice you're getting is coming from an expert? If you understand the science behind the claim, then you're golden. You can evaluate the claim directly. But usually you don't have the necessary knowledge. Then you can ask if the claim is based on replicable evidence or if it's wisdom that comes from a friend of a friend. Was it published in a peer-reviewed scientific journal, in the New York Times, or a supermarket tabloid? Learning about the nature of science - about the scientific process, about cases of scientific fraud, about the nature of peer review, and about scientific change and uncertainty - is critical to obtain the skills to evaluate scientific claims. " (Page 227)

I think a good understanding of how the scientific method has been developed and how good experiments are conducted is a foundation for being able to look at ideas and beliefs and see which are intuitive or untested assumptions and which are well supported based on observations of reality, observations designed to be accurate and reliable. It's different when you observe to find out what something really does as opposed to assuming it does what you expect and making a story to fit your beliefs rather than observations to test or debunk your beliefs.

If we understood many factors including sample sizes, isolating factors, and many other things that affect the actual research, we could ask the media to provide evidence and descriptions, not just conclusions.

"One goal of education should be to allow nonscientists to also be critical of what they see in the media. If enough of their audience was critical, news organizations might make a more concerted effort to get it right.

An important part of education consists of learning whether a claim is plausible, who would know, and whether this person is likely to tell the truth. There's no simple answer to making any of these judgements, but an educated person should be better at them than an uneducated one. This isn't just true in science; it's true of everything we teach, be it the law, history, geography, literature, philosophy or anything else. "(Page 228)

The authors embrace the idea of using this knowledge by having students learn as a group. Education researcher Ann Brown has a program called Fostering Communities of Learners. In this approach students in a grade school are presented with a topic. They are divided into research groups, each focusing on a separate component of the issue. The students might examine an animal and have one group focus on its defense mechanisms, another on mating, another on food gathering. Then the students, with minimal direction, research the topics with experts, computers and written materials and learn as much as they can about their specialty.

They then gather and have one student from each team teach the other groups about their research. This encourages cooperation and fosters teamwork. No one knows it all, so everyone has recognizable value.

The authors encourage this communal learning far beyond grade school.

In real jobs, teams that can work together and succeed at projects are what we desire, so why not make that second nature?

The authors encourage playing to people's strengths, giving everyone the foundation of a liberal education including critical thinking, and teaching skills like empathy and the ability to listen.

It's remarkable that many things they emphasize have been treated as personal characteristics rather than subjects in their own right. I can't tell you the number of people I have encountered who told me they "naturally" have good critical thinking skills, empathy and are good listeners.

Some people may be better without training than others, but a knack is no substitute for an education.


Well-known member
In chapter twelve of The Knowledge Illusion, Making Smarter Decisions, the authors look at how everything covered in the book so far can be used to make smarter decisions.

They go over information on savings and compound interest and the fact that very few of us save in a way that is smart for retirement. Now, there are many criticisms of this, as most people are not making enough to cover every expense even without saving for retirement.

Similarly many of us carry debt in a way that adds significant cost as interest on the initial principal. That means we pay for things many, many times over as we pay them off gradually.

It's true for credit cards, car loans, mortgages, medical debt and school loans.
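
To make the "pay many times over" point concrete, here is a minimal sketch of what carrying a balance with fixed payments actually costs. The balance, rate and payment are illustrative numbers I chose, not figures from the book:

```python
# Hypothetical illustration: carrying $5,000 at 20% APR and paying
# a fixed $100/month until the balance is gone.
balance = 5000.0
monthly_rate = 0.20 / 12  # interest accrues monthly
payment = 100.0
total_paid = 0.0
months = 0
while balance > 0:
    interest = balance * monthly_rate
    balance = balance + interest - payment
    total_paid += payment
    months += 1
total_paid += balance  # credit back the small overpayment in the final month
print(f"Paid off in {months} months, roughly ${total_paid:,.0f} total on a $5,000 debt")
```

With these assumed numbers the debt takes around nine years to clear and costs more than double the original principal, which is the "paying many times over" effect in miniature.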

A lot of people won't want to hear all this. It doesn't mean it is unimportant or that they are stupid. It's a factor in human beings that varies. Lots of factors vary from person to person. Some are taller or shorter, heavier or lighter, like ice cream or don't like ice cream. We have certain qualities that come in greatly differing degrees.

The authors did research on how much explanatory detail people want by giving them information on Band-Aids.

They have this on some packages: "Bubbles in the padding help cuts heal faster." And they tried going with "Bubbles increase air circulation around the wound, thereby killing bacteria. This causes cuts to heal faster." They found this gave people a sense of causal understanding, but it is actually pretty shallow, and they pointed out it doesn't explain how the bubbles increase air circulation or how that helps with healing.

They added "Bubbles push the padding away from the wound, allowing air to circulate. Oxygen in the air interferes with the metabolic processes of many bacteria, killing them and allowing the wound to heal faster." ( Page 238)

They discovered most people's estimation of the product actually decreased with the third explanation, the detailed and thorough explanation.

"Most of us are explanation foes when it comes to our decisions. We are like Goldilocks. We have a sweet spot for explanatory detail, not too little and not too much. The truth is that we all know a few people who are exceptions. They do try to master all the details before making a choice. They spend days reading everything they can find, learning all the ins and outs of all the new technology. We call such people explanation fiends.

What explains the difference between explanation foes and fiends? The answer is cognitive reflection, discussed in chapter 4. People who get high scores on the Cognitive Reflection Test tend not to fall for trick questions because they naturally mull over how well they understand. Similarly, highly reflective people have a higher threshold for satisfactory explanation. A shallow explanation like the first one and even the second one is not enough. They want to know more. But most people are explanation foes. They are satisfied long before getting to the third explanation. Adding too much detail only makes the product feel more complex. " (Page 237)

We are stuck in a terrible bind: seek too many details and too much depth on everything and take on far too much, or ignore details, reject depth and be vulnerable.

Most advertisers take advantage of this vulnerability. They know that if they tell you to check the other guys regarding prices that most of us won't. They know that if they say to get both sides of a story most of us won't. We usually listen to one side, decide if we agree or not and don't move to other sides. They take advantage of poor critical thinking habits. It's human nature and needs to be discovered, explored and worked on.

The advertising industry takes extreme advantage of people who do not demand explanatory depth. Beer ads often just show attractive, athletic-looking people having fun with friends in clean and pleasant-looking places, often outdoors doing healthy activities like playing sports or riding bikes. Beer doesn't make you healthy, happy, surrounded by friends and fit.

The beauty industry is plagued with advertising that is long on promises and short on scientific evidence or even detailed explanation. A lot of it involves false and exaggerated claims.

Similarly we have discovered just telling people they need to save for retirement isn't sufficient to make people do it. Lots of people can't afford it and many others experience pressures to not do it far exceeding the pressure to do it.

The authors reported that billions of dollars have been poured into financial education programs and that an analysis in 2014 of 201 studies found they had virtually no benefit regarding people saving money for retirement.

Now, regarding saving in particular I think other relevant information exists, but the general principle that people fit into categories of how much explanatory depth they desire, tolerate or dislike is a relevant fact. So is the reality that most people, probably by a large margin, do not generally want enough information to really understand most things, yet will feel they understand them with very little information, far too little to actually understand them.

So, what to do ?

"Here's where we think these efforts have gone wrong: They put all the weight of a decision on the individual. Individuals make decisions, and therefore the individual must be educated to make wise decisions. If things go wrong, the individual is to blame." ( Page 241)

"But this is the same faulty reasoning that we've seen throughout this book. Individuals don't make decisions by themselves. Other people formulate options for them, other people present those options, and other people give them advice. Moreover, people sometimes copy decisions that are made by others (for example, when stock market guru Warren Buffett makes a decision to buy a stock, many people copy him). We should be thinking about decision-making from a communal perspective. The knowledge required for decision-making is not merely in individuals'heads but depends heavily on the community of knowledge." ( Page 241)

The authors give the economy as an example of a hive mind. Lots of us play a tiny role in the economy, but most of us have a superficial understanding at best. In examining human psychology after leaving Scientology to find out the truth about human behavior, I discovered a lot of subjects are less than settled from a scientific perspective. In Scientology evidence of a scientific nature is lacking; people are convinced by other means, deceptive practices. If I had understood scientific method and evidence better as part of critical thinking, I would have been better prepared to correctly evaluate Scientology and to see it is heavy on narrative (stories) but light on evidence.

I looked at other subjects and found out something scientists also knew, some subjects have stories but the facts and history of human behavior contradict the stories, sometimes by a lot.

Behavioral economics sprang from people looking at the ideology that is dogmatic (accepted as valid and beyond question) in economics and realizing it often doesn't fit real individuals, real groups and real behavior. Unfortunately the basic economic models that make claims about supply and demand and many other things are simply contradicted by observing human beings.

Daniel Kahneman took this on in Thinking, Fast and Slow. I read several other good books on behavioral economics and psychology, and they cover a lot of research showing the purely rational actor that the Enlightenment philosophers dreamed of, and generations of intellectuals accepted as a given, doesn't really exist. We are not exactly insane or immoral, but we have biases and flaws in thinking, and quirks in behavior, that pure rational actors wouldn't have. Several of those biases, flaws and quirks are exposed and explored by the book this post is about.

Back to the economy and our way of interacting with it as a group, each playing a limited role. For much of history we based the value of money on a link to something else; until the 1930s in America we used gold. Now we just rely on the idea that other people will treat money as valuable, valuable enough to trade for goods and services, valuable enough to work for.

The economy is especially vulnerable to certain problems because so many people participate in it, are entirely dependent on it and are so ignorant about so much of it. In seventeenth century Holland the price of tulip bulbs exploded; people saw other people paying more and more for tulips and thought they should invest in them. At one point a single bulb could sell for several times the annual income of a middle class household.

Fortunes were lost when the bubble finally burst and people realized that tulip bulbs didn't truly have the value to back up such prices.

Similarly, in the 1990s comic books were being bought as investments by people who had never read or collected them. Special number one issues and big stories like "The Death of Superman" lured in millions of people with no love of comics to buy collectors' special editions and watch the prices on their investments go up by thousands of percent. Once the speculators stepped back, they realized the valuable comics from earlier decades had only a few copies available in good condition, while the new wave of comics had millions of copies available. Further, all the speculators were willing to sell their comics, while old school collectors often hung onto comics out of sentiment, which drove up the price of the old comics but did nothing for the millions of copies of new ones. The bubble again burst and millions of people were left with comic books they had bought by the box and couldn't get a quarter for. The comic book store owners understood what had happened and didn't want backrooms full of worthless comics.

Some people invested millions in garages full of special editions only to end up throwing them out when the bottom fell out of the market.

In 2008 several economic trends intersected. Most of us knew far too little to see what was happening and even if someone told us we probably wouldn't have understood or believed it.

Since the economic growth and relative boom for many middle class Americans in the postwar forties, fifties and even through part of the sixties and seventies, many people had incomes rising to roughly keep pace with inflation. But by the eighties and nineties costs of living started to consistently outpace income.

So, what did people do? Some had both partners go to work. Some had a mom with a part time job take on a full time job. This worked somewhat for some people, but over time it wasn't enough for everyone, and so some people became overtime fiends. I had several years in which I routinely worked sixty hours a week and a handful in which I averaged seventy to seventy five hours a week. That helped some, but overtime isn't guaranteed, and if a business slows down it may dry up.

What else have people done? Many turned to credit. Cars, houses, furniture, appliances and a lot more have been bought on more and more credit so people can survive rising costs and stagnant wages. For the bottom eighty percent of Americans real wages have been nearly flat since about 1980.

Regarding the housing market a discovery was made. Wages for working class Americans were going up about 2.5% per year on average, but housing prices across America were going up closer to 5% per year, and in some places by far more.
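
A quick sketch of how far those two growth rates drift apart. The 2.5% and 5% figures come from the paragraph above; the twenty-year span is my own illustrative assumption:

```python
# Compound 2.5%/yr wage growth against 5%/yr housing price growth,
# both starting from an index of 100, over 20 years.
wage_index = 100.0
house_index = 100.0
for year in range(20):
    wage_index *= 1.025
    house_index *= 1.05
print(f"After 20 years: wages at {wage_index:.0f}, housing at {house_index:.0f}")
# wages roughly 164 vs housing roughly 265 on the same starting index
```

So even a seemingly small gap in annual rates means housing costs pull far ahead of wages within a working lifetime, which is the squeeze the post describes.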

I know someone who had an apartment in New York City, and to get by he would take a loan of ten to thirty thousand dollars a year against the value of his apartment, while the value of the apartment was going up fifty to seventy five thousand dollars a year. He eventually sold his apartment and retired with several million dollars.

Many people weren't so lucky and when the housing bubble burst it ruined the finances of millions of people who counted on the value of their homes rising and the equity in their homes being the foundation of their credit. They built it as a cornerstone of their lives, their biggest financial investment.

Many found that after years of paying tens or hundreds of thousands of dollars they ended up owing far more than their homes were worth, with far more debt than value in their property, and so went from having great credit to being considered a liability. Making this much worse, many people had adjustable rate mortgages and weren't prepared for the rates to skyrocket and drive their payments through the roof. Many didn't understand the extreme risk of the adjustable rate mortgage and took it on advice that it was lower. Initially it was, but finally it wasn't.

Bad decisions can utterly ruin people. Paying thirty or fifty dollars less a month seems good, but when the market turns, paying hundreds and hundreds of dollars more per month is often crushing. Most people would have been far better off with a slightly higher fixed rate mortgage. Many would still have had their credit devastated, but they at least could have avoided defaulting, kept their homes, paid them off with their payments counting towards ownership, and had the chance to rebuild their lives instead of losing their homes, getting nothing for all the money they had put into them and being forced to start over with ruined credit.
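
The standard amortized-payment formula makes the fixed-versus-adjustable tradeoff easy to see. This sketch uses an illustrative $200,000 loan and made-up rates, and it simplifies by recomputing the reset payment over a full 30-year term rather than the remaining balance:

```python
# Standard fixed-payment amortization formula:
# payment = P * r * (1+r)^n / ((1+r)^n - 1), with monthly rate r and n payments.
def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12
    n = years * 12
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

fixed = monthly_payment(200_000, 0.06, 30)        # 6% fixed for the whole term
arm_teaser = monthly_payment(200_000, 0.055, 30)  # ARM at its low initial rate
arm_reset = monthly_payment(200_000, 0.09, 30)    # same ARM after rates jump to 9%
print(f"fixed ${fixed:.0f}/mo, ARM ${arm_teaser:.0f}/mo rising to ${arm_reset:.0f}/mo")
```

Under these assumed rates the ARM starts around sixty dollars a month cheaper than the fixed loan, then after the reset costs around four hundred dollars a month more, which is exactly the "small savings now, crushing increase later" pattern described above.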

So, what can WE do? The University of Chicago economist Richard Thaler and the Harvard legal scholar Cass Sunstein created libertarian paternalism. Paternalism meaning parent-like guidance and libertarian meaning emphasizing freedom.

Lots of great books on psychology and behavioral economics describe dozens of studies on human behavior and have found we are likely in many situations to go with the flow, whether it is good for us or not. We tend not to act when a default option is automatic. In countries where being an organ donor requires filling out special forms or signing something, and not being one is the default, most people don't become organ donors. In countries where it is the default and opting out requires a special act, like going somewhere and filling out a form, almost no one opts out. Some people do for religious reasons, but very, very few.

In libertarian paternalism, whenever a choice is seen as significantly more beneficial for most people, that choice is made the default, but the freedom to opt out without extreme difficulty remains.

Some companies try to do this with retirement savings and may match the contributions of employees to a certain percentage to encourage participation with "free" matching funds.

It's paternal in that options are designed so that if you ignore the issue, on average something that benefits you in the long run is chosen for you, or maybe even encouraged with a reward. It's libertarian in that if you really don't want it for some reason, you are free to reject it. Another example is that food presented earlier in a cafeteria line is more likely to be chosen, so if you present students or employees with a line of items, it's probably a good idea to put the salad and healthier items first. They can still get pizza or fried chicken or bacon if you have to provide it, or maybe you can offer baked chicken or fish, depending on your situation. It's about working with the willingness of your people to fit within a range of options, then encouraging but not requiring the better options on average.

With libertarian paternalism comes the idea that we can be nudged by how options are presented, which are presented first and which are default settings to get more of us to routinely do what is better for us. Of course we still have a responsibility to find out what is best for us and not just beneficial for a decision maker.

Taking all this into account, the authors say the big lesson of the nudge approach is that it is easier and more effective to change the environment than the person, and once we understand what quirks of cognition drive behavior, we can design the environment so those quirks help us instead of hurting us.

They recommend we apply this to our decision making in the community of knowledge. We usually are explanation foes and don't want to get to know all the details and don't have the time either. So, the challenge is to structure the environment so we make good decisions despite our lack of understanding.

They have several suggestions.

Lesson 1: Reduce Complexity

They recommend giving people explanations they can actually understand. They recommend the Reddit forum "Explain Like I'm 5" in which people give extremely simple explanations to questions.

Lesson 2: Simple Decision Rules

Richard Thaler recommends simple rules to help people financially, like invest as much as possible in a 401k, save fifteen percent of your income, or get a fifteen-year mortgage if you are over fifty.

The authors recommend a simple explanation with each rule to strongly encourage people. There are topics, like the benefits of diversified investments (lowered risk), investments with low returns and guaranteed principal (extremely low risk), and the benefits of compound interest, that people can benefit from a simple explanation of, especially at the right time.
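
As an example of the kind of simple explanation the authors have in mind, here is a sketch of compound growth on steady saving. The $200/month contribution and 7% annual return are my own illustrative assumptions, not numbers from the book:

```python
# Grow a monthly contribution at a fixed annual return, compounding monthly,
# with each contribution added at the end of the month.
def future_value(monthly, annual_rate, years):
    r = annual_rate / 12
    total = 0.0
    for _ in range(years * 12):
        total = total * (1 + r) + monthly
    return total

contributed = 200 * 12 * 30           # raw dollars put in over 30 years
grown = future_value(200, 0.07, 30)   # what compounding turns them into
print(f"contributed ${contributed:,}, grown to about ${grown:,.0f}")
```

With these assumptions, $72,000 of contributions grows to well over three times that amount, which is the sort of one-line illustration that can make a savings rule stick.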

Lesson 3: Just-in-Time Education

John G. Lynch Jr., the director of the Center for Research on Consumer Financial Decision Making at the University of Colorado, recommends "just-in-time" financial education.

A lot of people recommend courses on finances in high school. I personally do, because some aspects of finances like credit cards and debt can be warned about before people get into them. Some of the information would stick but much would be remembered to pass a test and then discarded.

So, it is recommended that we get information, even if we got it before, right before we need it. They have the example of losing a job and contemplating taking money from a 401k. I have cashed out a 401k on two separate occasions and was fortunate I understood to set aside about a third to meet tax liabilities. I could have had a huge debt and no way to pay it had I not checked with a financial advisor. For some people with hundreds of thousands or more in retirement accounts, taking the money out would involve a tremendous tax penalty, and they are still going to have to retire. For many people taking out just enough to get by is smarter than cashing out.

Additionally, there are predatory companies that look for people in vulnerable moments, such as right after a layoff, to sell high-risk investments. There are companies that give reverse mortgages and sometimes take property sooner than expected. There are many situations in which you would be better off with a little education right before you decide, like when an estate is settled and an inheritance is finalized, or when you are surprised that a relative made you a beneficiary and you have tens of thousands of dollars for the first time in your life.

Lesson 4: Check Your Understanding

The earlier recommendations were for society dealing with individuals. As individuals, we can be aware of our tendency to be explanation foes and aware of our ignorance. If an area or topic is important enough, take the time and effort to examine it in depth and look for solid evidence. Look at criticism of the ideas as well as the arguments for them. Use your critical thinking and develop critical thinking practices for these situations.

The authors point out how intellectual arrogance leads us to make poor choices as we mistake our ignorance for wisdom.

I certainly was incredibly arrogant about human behavior, psychology, and influence, and I incorrectly assumed that if Scientology was a con I would be immune to it because I was too smart, which made me the perfect recruit. I had already fooled myself before Scientology, making their job a piece of cake.

I played myself. And got burned. If I just hadn't been so arrogant and self-deluded, I could have listened to my wife, simply stayed away from Scientology, and satisfied my curiosity by reading books about Scientology and other cults.

My example of almost unimaginable arrogance is unfortunately not the worst. People fall in with groups that they murder for, like criminal gangs, white supremacists and terrorists like ISIS. And some cult members even kill.

Ray Dalio of Bridgewater Associates, a hedge fund, sees his success as relying on how he deals with what he doesn't know. He looks for people who may disagree with him so he can learn what they know that he doesn't, because he is aware that there is a lot more information than just what he knows, and that he can be wrong.

My situation, losing or wasting much of twenty-five years in Scientology, living a lie, doing harm while thinking I was doing good, becoming a self-absorbed and self-deluded jerk while thinking I was finding profound wisdom and secret enlightenment (that sounds about as pretentious as it gets) when I was really becoming a terrible husband and a person who hid under a bunch of excuses, is a monumental example of how wrong this can go.

There are many who have succeeded in life far more than me by having a little humility: admitting they don't know it all, admitting they don't understand things like Scientology, and, when warned by the most important people in their lives, staying away from Scientology or being wise enough to examine the evidence against it. There are also many people in many endeavors who have been successful thanks to intellectual humility.

I hope this post encourages us to do better; the ideas here highlight how I did the opposite and paid a high price.


Well-known member
In the final chapter, Conclusion: Appraising Ignorance and Illusion, the authors wrap it all up.

"When academics encounter a new idea that doesn't conform to their preconceptions, there's often a sequence of three reactions: first dismiss, then reject, and finally declare it obvious. The initial reaction to an idea that challenges an academic's world view is to ignore it: Assume it's not worthy of one's time and consideration. If that doesn't work, if community pressure forces the idea to be confronted, academics come up with reasons to reject it. Academics are terrific at justifying their opposition to an idea. Finally, if the idea is just too good to reject, if the idea hangs on in the community, academics find reasons to claim they knew it all along because it's self-evident."
(Page 255)

The authors hope readers will jump to the conclusion that these ideas are self-evident. I hope readers will consider the evidence for and against the ideas and if they see them as well-established by evidence then they will accept them.

They feel these ideas, when properly explained, come across as obvious, though they needed a little attention to point out.

"Ignorance is not bliss, but it doesn't have to be misery. For humans, ignorance is inevitable: It's our natural state. There's too much complexity in the world for any individual to master. Ignorance can be frustrating, but the problem is not ignorance per se. It's the trouble we get into by not recognizing it."
(Page 257)

The authors discuss the work of David Dunning, famous for the Dunning-Kruger effect. He has studied people for many years and discovered that the people who know the least about a subject tend to rate their own skills the highest. He has studied expertise, meaning having skills and knowledge regarding a topic, and ignorance, meaning having neither.

One can have skills, like a driver, pilot, or hockey player, or one can have knowledge, like a person who has seen pilots and drivers perform and hockey players play, and who may even know how to evaluate or train them. But different skills bring different kinds of expertise. A person who would not be a good hockey player may be able to train others very well, and a person who can play extremely well may not be able to train others. An ignorant person knows less than either of the others.

Dunning found that those who lack skills also lack the knowledge to know what they are missing.

"The unskilled just don't know what they don't know. And, according to Dunning, it matters because all of us are unskilled in most domains of our lives:

Our ignorance, in general, shapes our lives in ways we do not know about. Put simply, people tend to do what they know and fail to do that which they have no conception of. In that way, ignorance profoundly channels the course we take in life...People fail to reach their potential as professionals, lovers, parents, and people simply because they are not aware of the possible." (Page 258)

I want to point out something important. The Dunning-Kruger effect reflects IGNORANCE, not STUPIDITY. People often have a tendency to dismiss others with different religious and political beliefs and assume they are stupid. In reality we are all ignorant, and when we remain ignorant we are far more likely to get things wrong while thinking we have them all figured out. That doesn't mean we are stupid or that people with knowledge are more intelligent. It doesn't work that way.

Sometimes ignorance doesn't cause a problem: not knowing about something trivial may just mean we don't think or worry about it.

"But ignorance has costs. If we don't know about birth control, then we don't use it. If we remain ignorant about the horrors that are going on next door, we won't do what's necessary to stop them. And if we are ignorant about the dangerous things our children are getting into, disaster can follow." (Page 259)

The authors advise knowing what we as individuals don't know, recognizing where we are ignorant, and respecting the knowledge of others. They also pointed out the deaths from murder and suicide associated with Jim Jones and the Heaven's Gate cult. 909 people died in Jonestown and 39 with Heaven's Gate, and of course leader David Koresh and dozens of other members of the Branch Davidian cult died in a tragic fire. They point out that communities can have an insidious effect on what people believe and on their actions and decisions.

"So we're not championing faith in whatever a community believes or whatever a credentialed expert says. Along with faith must come a healthy dose of skepticism and a keen eye for charlatans and those who are confidently wrong. When your community gives you bad advice, it's your responsibility to not take it. Nazi prison guards are not excused because they were following orders, and terrorists are certainly not excused because they were members of an ideological community."
(Page 260)

The authors point out that we can choose communities that strive to tell the truth, and that most people try to be honest most of the time, which is what makes community possible in the first place.

"We live with the illusion that we understand things better than we do. Is illusion something we necessarily need to dispel? Should we always strive to have beliefs and goals that are as realistic as possible? This is the choice that confronts Neo, Keanu Reeves's character in the film The Matrix: take the red pill and live in the real world or take the blue pill and maintain the comfort of illusion. If he chooses the red pill, he'll have to face the world as it is, including the pain, sorrow, and robot overlords that accompany reality. If he chooses the blue pill, he'll return to the collective delusion of human existence.

By avoiding illusion, you're more likely to be accurate. You'll know what you know and what you don't know, and this can only help you achieve goals. You won't take on projects that are beyond you and you won't disappoint others. You'll be better positioned to deliver on your promises." (Page 261)

Now, this is especially poignant for me because again I reflect on my ignorance about my ignorance and my illusion of knowledge that I foolishly stuck with. I was not educated or prepared for the deception and fraud in the Scientology cult, a closed community of deluded individuals deluding others.

I didn't know what I knew and didn't know and so got in way over my head right from the start. I made a promise to evaluate Scientology, figure out if it was a scam or on the up and up, to avoid being fooled and to respond based on my findings. I was not skilled or knowledgeable enough to fulfill my promises and paid a terrible price for my decisions, decisions made from profound ignorance.

I didn't understand rhetoric, psychology, hypnosis, critical thinking, logical fallacies, and dozens of other subjects that could have made it obvious to me that Scientology is not an honest and legitimate organization. The general ignorance I had on so many things left me vulnerable.

The authors pointed out that we use illusions all the time for happiness. We think of fictional stories and fantasies in novels, television shows, and movies, are entertained by them, and get joy thinking about them. But a crucial difference is that we know those things are not real, and we can engage or disengage from them on that basis.

I think it is unfortunately a fact of life that all too often unscrupulous individuals and groups can combine our tendency to think in terms of stories we find appealing with our ignorance about our ignorance to persuade us with deceptive tactics. Cults rely on this, as do criminals, abusers, and unethical politicians.

But I sincerely hope we can learn about our weaknesses in these areas and reduce the opportunities for exploitation, and learn how to evaluate experts and their claims and to understand the roles we all play in the community of knowledge together.


Veteran of the Psychic Wars
"People overestimate their understanding of political issues like tax policy and foreign relations, of hot-button scientific topics like GMOs and climate change, and even on their own finances. We have been studying psychological phenomena for a long time and it is rare to come across one as robust as the illusion of understanding." (Page 22)
People have limited understanding of complex processes, true.

The problem is that relying on "experts" is not necessarily a solution. It comes down to trust.

When an expert tells me that I should do X instead of Y in order to avoid IRS tax issues, I will go with the expert, trusting that the expert
  1. Is actually knowledgeable in the particular area, and
  2. Is actually operating according to my best interest, rather than for a hidden personal agenda.
In lots of other areas, I may not be able to be certain that the "expert" actually fits the above two criteria.


Veteran of the Psychic Wars
When we get new information, we accept some bits, and reject other bits. Why?

I think Hubbard might have actually had a nugget of useful observation when he talked about "stable data".

Each of us has a bunch of "stable data", which we accept as true, and which we use to evaluate incoming information. Information which aligns with the stable data is accepted, and may even be added to the body of stable data if it is seen as a refinement of the existing stable data. Information which violates the stable data is rejected.

Hence, when you have somebody who is a true-believer Scientologist, who has as a stable datum that LRH is always right, it's very hard to talk to them about the shortcomings of Scientology.

From time to time, we get into situations where we cannot avoid seeing that something we accepted as a stable datum is obviously false. This results in cognitive dissonance.


Well-known member
People have limited understanding of complex processes, true.

The problem is that relying on "experts" is not necessarily a solution. It comes down to trust.

When an expert tells me that I should do X instead of Y in order to avoid IRS tax issues, I will go with the expert, trusting that the expert
  1. Is actually knowledgeable in the particular area, and
  2. Is actually operating according to my best interest, rather than for a hidden personal agenda.
In lots of other areas, I may not be able to be certain that the "expert" actually fits the above two criteria.
So, what do you do? Are you going to learn enough to diagnose and treat cancer? To repair a car? To plan policies for a government? To create a just legal system? To decide every question on every issue? No one lives long enough to even try.


Veteran of the Psychic Wars
So, what do you do? Are you going to learn enough to diagnose and treat cancer? To repair a car? To plan policies for a government? To create a just legal system? To decide every question on every issue? No one lives long enough to even try.
I'm not getting the impression that you read what I wrote.

Here's the viewpoint I tried to put across:
With "experts", it's often necessary to examine the expert, to see if his expert advice can be trusted. A lot of the time, it can be. Sometimes, though, you get people who put across a recommendation, not because it will benefit YOU, but because you following the recommendation will benefit THEM. And you always have to be alert for potential conflicts of interest.

Among your examples, take "to repair a car": it's not uncommon for me to seek a second opinion when a mechanic gives me an estimate that I consider unduly high, or when the problem he was supposed to fix is still a problem.

Similarly, with doctors, it's standard to seek out a second opinion before agreeing to an operation, or when a treatment may determine whether you live or die. It's also standard to inquire about a doctor's record of success.