An approximately twelve-minute video on the Cognitive Bias Paradox: knowing about cognitive biases can worsen them, with possible ways to overcome this difficult problem.
Text Articles
Short URL: bit.ly/43CZJXH Long URL: http://wordpress.jmcgowan.com/wp/the-cognitive-bias-paradox/
Short URL: http://bit.ly/wv713 Long URL: http://wordpress.jmcgowan.com/wp/the-worldview-prison/
About Us:
Main Web Site: https://mathematical-software.com/ Censored Search: https://censored-search.com/ A search engine for censored Internet content. Find the answers to your problems censored by advertisers and other powerful interests!
Subscribe to our free Weekly Newsletter for articles and videos on practical mathematics, Internet Censorship, ways to fight back against censorship, and other topics by sending an email to: subscribe [at] mathematical-software.com
Avoid Internet Censorship by Subscribing to Our RSS News Feed: http://wordpress.jmcgowan.com/wp/feed/
John F. McGowan, Ph.D. solves problems using mathematics and mathematical software, including developing gesture recognition for touch devices, video compression and speech recognition technologies. He has extensive experience developing software in C, C++, MATLAB, Python, Visual Basic and many other programming languages. He has been a Visiting Scholar at HP Labs developing computer vision algorithms and software for mobile devices. He has worked as a contractor at NASA Ames Research Center involved in the research and development of image and video processing algorithms and technology. He has published articles on the origin and evolution of life, the exploration of Mars (anticipating the discovery of methane on Mars), and cheap access to space. He has a Ph.D. in physics from the University of Illinois at Urbana-Champaign and a B.S. in physics from the California Institute of Technology (Caltech).
Knowing about cognitive biases can worsen them. Paradoxically, detailed knowledge of cognitive biases such as “confirmation bias” and “cognitive dissonance” provides a powerful set of tools to reinforce these very biases and to dismiss data, evidence, and even direct personal experiences that contradict our preconceived ideas and prejudices.
In practice, one thinks:
I am one of the specially intelligent, educated elite who are well aware of cognitive biases. Knowing the biases, I am able to compensate for them, for example by ‘steelmanning’ my opponent’s arguments. In contrast, the data or evidence from any third party contradicting my evidence-based beliefs is clearly the product of cognitive biases X, Y, and Z, leading to cherry-picking of evidence, faulty statistical methodologies, or other mistakes.
That personal experience that contradicts my evidence-based belief is a special case, a fluke, a coincidence, the product of some perceptual error (such as the well-known phenomenon, cited by skeptics, of misperceiving the rising Moon, Venus, Jupiter, lighthouses, and other conventional objects as a silvery flying saucer), or some other perceptual or cognitive flaw mined from the literature or made up as needed.
This is due in part to the so-called GI Joe Fallacy:
Knowing about one’s biases does not always allow one to overcome those biases — a phenomenon referred to as the G. I. Joe fallacy.
The use of phrases such as “cognitive bias,” “confirmation bias,” and “cognitive dissonance” has grown dramatically in the last twenty years, as shown by Google’s Ngram Viewer above. Indeed, if you follow many political or scientific controversies in recent decades, it is likely you will have heard these phrases used to dismiss the data, evidence, opinions, and even direct personal experiences of the “other side” in these debates.
By most accounts the phrase “cognitive dissonance” entered general use from the publication of the popular science book When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World by Leon Festinger, Henry Riecken, and Stanley Schachter in 1956 and A Theory of Cognitive Dissonance by Leon Festinger in 1957. Awareness of cognitive dissonance and other cognitive biases has soared in the last few decades. The publication of Daniel Kahneman’s popular science book Thinking, Fast and Slow in 2011 is often credited with contributing to recent greater awareness of cognitive biases.
Yet this wider awareness of cognitive bias does not appear to have improved the quality of logical argument or debate. Quite the opposite, if anything: censorship and thought-stopping labels have proliferated, such as “conspiracy theory,” “conspiracist,” “conspiracy theorist,” “conspiracy thinking,” even “conspiracy” used as shorthand for “conspiracy theory” in the pejorative, non-literal sense that has become ubiquitous, as well as “denialism” and “denier” in the Holocaust-denial sense, “fake news,” “misinformation,” “disinformation,” and “malinformation.”
More and more people may simultaneously believe that their knowledge of cognitive biases makes them immune to error while dismissing the views of others as hopelessly wrong due to their unrecognized cognitive biases.
Is there anything we can do about this growing problem?
Paradoxically, simply knowing that detailed knowledge of cognitive biases can actually aggravate these biases is not enough. Knowing is not even half the battle.
It is unclear what will actually work. Modern technologies and systems such as the Internet as a whole, smartphones, and Twitter can bombard us with huge quantities of often emotional, propagandistic content. Scaling back the quantity of this content may be helpful.
Going through the “news” and other content one consumes and systematically striking out the many popular thought-stopping words and phrases (such as “conspiracy theory” and, ironically, especially invocations of “cognitive bias,” “cognitive dissonance,” “confirmation bias,” and related phrases), leaving a small set of alleged “facts” rather than opinion or analysis, may help us drill down to the substance of the content.
Common Thought-Stopping Words and Phrases Today
pseudoscience
climate denial
XXX denial
climate denialism
XXX denialism
anti-vaccine
anti-science
anti-XXX
conspiracy theory
conspiracy thinking
conspiracy theorist
conspiracist
conspiracy
fake news
misinformation
disinformation
malinformation
election interference
far right (more common)
far left (less common)
Russian
Putin
woke
racist
racism
white supremacy
white supremacism
xenophobia
homophobia
Russophobia
XXXphobia
pedo
pedophilia
PARADOXICALLY:
cognitive bias
confirmation bias
cognitive dissonance
cherry picking data/evidence/etc.
The reader can probably list several more from their own experience. It is easier to identify the thought-stopping words and phrases used by those you disagree with.
Edit out thought stopping words and phrases
Edit out other emotional words and phrases
Identify remaining factual or logical claims
Check the facts yourself (don’t rely on so-called fact checkers)
Locate original source or citation
Verify what the original source or citation says in the body of the article — don’t rely on abstracts, summaries, titles, headlines.
Wikipedia relies on secondary sources. Track down the primary (original) sources. Wikipedia is not reliable on “controversial” subjects.
Check sources of funding, possible conflicts of interest or biases of any authors or publishers.
Verify the claimed “facts” through personal experience if possible.
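The striking-out step above can be roughly automated. Here is a minimal Python sketch, assuming a small sample of the phrase list given earlier; the `[STRUCK]` marker and the particular phrases included are illustrative choices, not part of the original procedure:

```python
import re

# A sample of the thought-stopping phrases listed above; extend as needed.
THOUGHT_STOPPERS = [
    "conspiracy theory", "conspiracy theorist", "fake news",
    "misinformation", "disinformation", "pseudoscience",
    "cognitive bias", "confirmation bias", "cognitive dissonance",
]

def strike_thought_stoppers(text, phrases=THOUGHT_STOPPERS):
    """Replace each thought-stopping phrase with a [STRUCK] marker,
    case-insensitively, so the remaining factual claims stand out."""
    # Longest phrases first, so "conspiracy theorist" is not partially
    # consumed by a shorter overlapping phrase.
    for phrase in sorted(phrases, key=len, reverse=True):
        pattern = re.compile(re.escape(phrase), re.IGNORECASE)
        text = pattern.sub("[STRUCK]", text)
    return text

sample = "Critics dismissed the report as a Conspiracy Theory driven by confirmation bias."
print(strike_thought_stoppers(sample))
# Critics dismissed the report as a [STRUCK] driven by [STRUCK].
```

Whatever survives the filter is a candidate “fact” to check yourself, per the checklist above.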
The large majority of people, possibly all, are unable to accept or even process data, even personal experiences, perceived to contradict their worldview. This is common human behavior. It is not a function of intelligence, education, social status, or any other obvious marker. In fact, highly educated, intelligent people may be more prone to it, as they are better equipped, like a good lawyer, to rationalize away obviously contradictory data or experiences.
One should not think of our “worldview” as a rigorous axiomatic system like Euclid’s geometry or the precise, logically consistent rules of games like chess. It is not rational in the usual sense. It appears to be a hodgepodge of elements and can often contain contradictory ones. It is highly social in nature, comprising direct personal experience and the claims of authority figures in groups we identify with, such as our family, tribe or nation, profession, etc. It is closely associated with our sense of group identity.
Many people can and do change their worldview but it usually follows sustained negative direct personal experiences over a period of years, very rarely less than six months. Data alone in the sense of books, newspaper articles, peer reviewed research papers, databases, spreadsheets etc. almost never changes the worldview.
Direct personal experiences, usually strongly negative, are the underlying cause in most cases. Once these have occurred and begin to work their way through the subconscious and conscious mind, the person will begin to seriously engage “data” whereas before they may not have even been able to “see” that very same data: ignoring it, dismissing it out of hand, reacting with great hostility to “obvious” baloney or lies.
It may be that an extreme traumatic event, such as the loss of a loved one to an immediate negative reaction to a vaccine, can cause a rapid change of worldview (less than six months), but this seems quite rare.
Beliefs about “science” and vaccines constitute a worldview or part of a worldview, especially for scientists, engineers and other technical people, for whom “science” constitutes an actual substitute for a religion or other spiritual beliefs.
Other psychological concepts such as “denial” and “cognitive dissonance” overlap with this concept of a rigid, difficult to change worldview. Cognitive biases seem strongest — most difficult to overcome — for elements of the worldview.
This behavior of the worldview in most, possibly all, people suggests that attempts to persuade others where the worldview of some or all may be involved should focus on drawing their attention to their own direct experiences, in-person experiments or demonstrations of contradictory phenomena, and persuading those for whom the issue is not connected to their worldview.
It is difficult, perhaps impossible, to distinguish true belief in a worldview from deliberate deception and fraud including “gaslighting” because the ability to perceive both experiences and “data” is substantially impaired. The “True Believer” genuinely cannot see obviously contradictory evidence that others can easily see.
Nor is there a sharp dividing line between true belief and fraud. Authority figures in the group may engage in Plato’s Noble Lies to protect the worldview (and often their power), not because they don’t largely believe in the worldview themselves, but in the same way that parents simplify, hide, sugar-coat, or flat-out lie to children to protect them from complex or painful issues.
I have changed my worldview a few times (it is a rare occurrence). In all cases, the sequence of negative experiences preceded even conscious doubt of underlying assumptions in my worldview, by at least six months. The whole process took a couple of years in each case from start to finish.
Major scientific and technological breakthroughs frequently involve a change of worldview. The inventors and discoverers usually spend several years failing miserably before the flash of insight, the change of worldview, occurs. The final flip can be quite fast, but it is almost always preceded by long periods of failure. There are, for example, many accounts of the change of worldview, the moment of insight, the “Eureka moment,” happening almost instantaneously on a contemplative walk or other break after a long period of hard but unsuccessful work.
Note that reading about the many failures of previous researchers does not lead the inventor or discoverer to abandon the prevailing wisdom of how to solve the problem. It usually appears to require personal repeated failure as well.
What does this mean both for persuading others and being sure of our own beliefs?
Direct personal experiences are more persuasive. Test your own beliefs if you can. In your own mind, clearly distinguish between the relatively small number of beliefs well founded in repeated clear direct personal experiences and those derived from others, generally external authorities in groups that you identify with.
This is easier said than done. For example, many people, especially scientists and engineers, are taught that it is “obvious” that the Earth is a sphere about 8,000 miles in diameter and believe it is obvious, when in fact it is not unless you have actually circumnavigated the Earth or flown on a spaceship. Try demonstrating to yourself that the Earth is a sphere about 8,000 miles in diameter from only your own personal experience, not by reciting claims by science popularizers such as the late Carl Sagan.
Get others to notice personal contradictory experiences and to test their beliefs first hand. This is not always possible. There is no guarantee of success. Remarkably, we often consider elements of the worldview derived from the words of others (“everyone knows”) just as solid and true as those based on personal experience. Even more reliable in some cases: “How could everyone be wrong?” “How could all the experts be wrong?”
Psychoanalyzing people to their face is usually not persuasive. Most people find it offensive and patronizing. Most highly educated people are well aware of cognitive biases; they are in the news. Many are not aware that there are social psychology studies showing that knowing about cognitive biases does not immunize you against them. Daniel Kahneman effectively retracted one chapter of his book, which contained an obviously statistically underpowered study. Even the experts are demonstrably vulnerable to confirmation bias and other cognitive biases.
Paradoxically, knowledge of cognitive biases including the extreme rigidity of the worldview provides a powerful set of tools to dismiss obviously contradictory experiences, evidence, and data. “I know about these biases and those nitwits over there do not. The data was cherry picked etc. etc.”
One should be cautious about accusing people of deliberate lying or gross stupidity where the worldview is likely involved as highly intelligent, educated “True Believers” are truly unable to see, accept or otherwise process contradictory data or even personal experiences. Nor is this unusual or pathological behavior. Most of us are True Believers in something.
One should also keep in mind the parable of the Blind Men and the Elephant when different worldviews seem grossly incompatible. Each blind man, having never encountered an elephant before and touching a different part of the elephant, describes the elephant as like a “snake”, “a tree”, “a sharp spear”, etc. It may be that each contradictory worldview is substantially incomplete. All have some truth and all are wrong.
In the case of the worldview, in contrast to other beliefs, we are unable to treat it as provisional, as possibly wrong. We are True Believers in our worldview and our group is obviously right.
(C) 2023 by John F. McGowan, Ph.D.