Why the brain is hardwired to believe falsehoods

Why do some people still believe that the earth is flat, that man never walked on the moon, that the Holocaust never happened, or that the scientific evidence for global warming is fabricated? Given the negative impact of such beliefs on society, it is important to understand why certain groups of people are more vulnerable than others to believing unsupported claims. The fields of psychology and neuroscience can offer insight.

A basic fact about the brain: it takes more mental effort to reject an idea as false than to accept it as true. In other words,

it’s easier to believe than not to.

“This fact is based on a landmark study published in the journal PLOS ONE in 2009, which asked a simple question: how is the brain activated differently during a state of belief compared to a state of disbelief? To test this, participants were asked whether or not they believed a series of statements while their brain activity was imaged by an fMRI scanner. Some sentences were simple and fact-based (California is larger than Rhode Island), while others were more abstract and subjective (God probably does not exist). The results showed the activation of distinct but often overlapping brain areas in the belief and disbelief conditions.”

While these imaging results are complicated to interpret, the activation patterns also revealed something fairly straightforward. Overall, there was greater brain activation, and it persisted for longer, during states of disbelief. Greater activation requires more cognitive resources, of which there is a limited supply. What these findings show is that the mental process of believing is simply less work for the brain, and therefore often favored. The default state of the human brain is to accept what we are told, because doubt takes effort. Belief, on the other hand, comes easily.

This finding makes sense from an evolutionary standpoint.

If children questioned every single fact they were taught, learning would occur at a rate so slow that it would be a hindrance.

This finding makes sense from a developmental standpoint.

“For some children, being taught to suppress critical thinking begins at a very early age. It is the combination of the brain’s vulnerability to believing unsupported claims and aggressive indoctrination that creates the perfect storm for gullibility. Due to the brain’s neuroplasticity, or ability to be sculpted by lived experiences, some of us literally become hardwired to believe far-fetched statements.”

This wiring begins, for example, when children are first taught to accept whatever adults tell them as objective truth and not to question it. Even mystical explanations for natural events train young minds not to demand evidence for beliefs. “As a result, the neural pathways that promote healthy skepticism and rational thought are not properly developed. This inevitably leads to a greater susceptibility to whatever we are told.”

“If we want to combat the brain’s habit of taking the path of least resistance, a habit with destructive downstream consequences for critical thinking, then as a society we must place more value on empirical evidence, and this must be reflected in how we educate our youth. Additionally, we must create an awareness that, for the human mind, believing is more of a reflex than a careful and methodical action.”

From a Psychology Today article:

Why don’t facts matter to our inquiring minds?

“People generally see what they look for and hear what they listen for.” –Harper Lee

Tali Sharot is the author of “The Influential Mind: What the Brain Reveals About Our Power to Change Others.” An associate professor of cognitive neuroscience, she is the director of the Affective Brain Lab at University College London. The information and opinions expressed here are hers.

Why does evidence seem to have little influence on people’s beliefs?

“To many of us who study the human mind, the diminishing influence of evidence is less a puzzle than a prototypical example of how the mind forms beliefs. And the very idea that simply providing people with data would be sufficient to alter their beliefs is doomed to fail.”

“The very first thing we need to realize is that beliefs are like fast cars, designer shoes, chocolate cupcakes and exotic holidays: they affect our well-being and happiness. So just as we aspire to fill our fridge with fresh fare and our wardrobe with nice attire, we try to fill our minds with information that makes us feel strong and right, and to avoid information that makes us confused or insecure.”

“It’s not only in the domain of politics that people cherry-pick news; it is apparent when it comes to our health, wealth and relationships. Many individuals avoid medical screenings in an attempt to evade alarming information.”

Using “. . . non-invasive brain imaging techniques, my colleagues and I have recently gathered evidence that suggests our brain reacts to desirable information as it does to rewarding stimuli like food, and reacts to undesirable information as it does to aversive stimuli like electric shocks. So, just as we are motivated to seek food and avoid shocks, we are also motivated either to seek or avoid incoming information.”

Confirmation Bias

“Of course, we do not always turn away from uncomfortable data. We do undergo medical tests, face our debts and occasionally read columns written by people who hold different political views than ours. But on average we are more likely to seek confirmation of what we believe (or want to believe).”

“Unfortunately, the solution is not as simple as providing people with full and accurate information. When you provide someone with new data, they quickly accept evidence that confirms their preconceived notions and assess counterevidence with a critical eye.”

“For example, my colleagues and I also conducted a study in which we presented information to people who believe that climate change is man-made as well as to people who are skeptical. We found that both groups strengthened their pre-existing beliefs when the new data confirmed their original position, but ignored data that challenged their views.”

“Such effects are examples of the confirmation bias. It is not new. But today, as information is more readily accessible and people are frequently exposed to different opinions and data points, this bias is likely to have an even greater role in shaping people’s beliefs — moving ideological groups to extremes.”

“And while you may assume such biases are a trait of the less intelligent, the opposite is true. Scientists discovered that those with stronger quantitative abilities are more likely to twist data at will. When volunteers in that study were given data about the effectiveness of gun control laws that did not support their views, they used their math skills not to draw more accurate conclusions, but to find fault with the numbers they were given.”

“Why have human beings’ brains evolved to discard perfectly valid information when it does not fit their preferred view? This seems like bad engineering, so why hasn’t this glitch been corrected?”

Confidently held opinions are difficult to change.

“Cognitive scientists have proposed an intriguing answer: our brain assesses new information in light of the knowledge it has already stored, because in most cases that is, in fact, the optimal approach. More likely than not, when you encounter a piece of data that contradicts what you believe with confidence, that piece of data is in fact wrong.”

“Opinions are even more difficult to change once people act on them. Research has shown that immediately after making an overt choice, our conviction strengthens as we tend to rationalize our choices to ourselves and others.”

“So while data is important for uncovering the truth, it is not enough for convincing people of that truth.”

“We should not, however, be discouraged. The solution, I believe, is not to fight the way our brain works, but to go along with it. We should take our biases into account and use them when trying to convey our truth.”