This is the first of two planned posts that I've been mulling over, and sporadically writing notes to myself about, over the last year or so. This first post is a very brief overview of some important concepts and well-established results from psychology that help me think about my own thinking. The second post is about valuing scientific expertise and why our default position as non-experts should be to believe scientific consensuses, or at the very least to be comfortable saying, "I don't know". Both include a lot of links to further reading and learning if you're so inclined.
Since the ascendancy of Donald Trump to the White House, and now with the coronavirus pandemic, I've noticed a lot of weird beliefs and thinking taking hold across social media. Common to these groups is the idea that "the media" and other non-specific, all-powerful groups or people (the "deep state", Bill Gates) are manipulating us with their "fake news", false narratives and sinister motives; that scientists and so-called "experts" can't be trusted. Unlike me, this new group of free and critical thinkers sees the truth; the scales have fallen from their eyes (not that they are being manipulated by an authoritarian dictator and a disinformation campaign). I, on the other hand, as someone who tends to take experts and journalists with a proven track record at face value, am one of the "sheeple".
Arguing with people who have fallen down the YouTube and Facebook rabbit hole about their specific beliefs and claims is exhausting and likely to backfire. As such, I thought I'd take a different tack and discuss how I think about critical thinking vs non-critical and pseudo-critical thinking, and the importance of valuing expertise.
If I have sent you a link to this post, it's because I know I won't convince you about whatever it is we're disagreeing about. I'm not even going to try. Forget about whatever it was we were discussing, as what follows isn't specific to it. What I hope to do is show how I think about any claim, how I try to reflect on my own thinking (I'm not always successful), and in another post, why I think we should all value and respect expertise.
Critical thinking vs non-critical and pseudo-critical thinking
First, I think it's important to distinguish between the three different modes of thinking that to some degree we all use to come to a point of view — critical thinking, non-critical thinking (see Kahneman's slow and fast thinking) and pseudo-critical thinking.
Critical thinking isn't a "thing" to be defined, but rather a mindset linked to a particular set of knowledge and skills that can be honed and developed over time. A key characteristic of critical thinking is its goal, which is not to "win" an argument at all costs, but to "seek the truth". Truth seeking involves a habit of mind (a disinterested search for truth), a set of knowledge and intellectual skills, and criteria by which we can feel confident in our knowledge claims.
Non-critical thinking is unconscious and passive, typically involving "going with my gut" and "common sense", as well as regurgitation of beliefs that are accepted a priori, without justification. Non-critical thinking doesn't really have an intellectual goal; it's just what we do most of the time in most situations to make everyday decisions and go about our lives. It's only an issue when it is applied to claims that we would like other people to believe or when making important decisions. Asserting that we like the smell of coffee does not require justification, whereas stating that coffee causes cancer does.
Pseudo-critical thinking dresses itself in many of the characteristics of critical thinking, using most of the techniques, but the goal is reversed. That is, the goal is to "win" at all costs and critical thinking knowledge and skills are employed to this end.
Critical thinking requires some understanding of informal (and to a lesser extent formal) logical fallacies and applying them to our own (and other people’s) reasoning. Non-critical thinking typically means not even knowing about fallacies and unconsciously employing (or falling for) them. Pseudo-critical thinking occurs when one knows about fallacies and deliberately employs them in order to "win" an argument.
We don't need to know a list of fallacies off by heart to be a critical thinker, but we do need to scrutinise our own arguments and avoid making non-sequiturs. We should also try to apply the principle of charity when evaluating the arguments of others, rather than immediately trying to tear them down (I'm the first to admit this is easier said than done).
Neuropsychological humility and cognitive biases
Critical thinking requires an understanding of cognitive biases and all the ways our brains can fool us, and applying this understanding to our own beliefs and experiences. For example, understanding the sunk cost fallacy can help us develop the self-control not to finish an entire meal when we're full, even though we've already paid for it.
Non-critical thinking often involves a naïve belief about how our brains and senses work, for example, taking an interpretation of an experience or an eye-witness recollection of an event at face value, as if the brain were a video recorder. In fact, our memory is incredibly unreliable.
Pseudo-critical thinking deliberately uses cognitive biases against us. For example, the ingroup bias is used by politicians and the media, and the anchoring effect by marketers, to manipulate us all.
How much do you know about your (and my) cognitive biases? My personal favourite, that I regularly catch myself in the act of doing, is the fundamental attribution error.
Metacognition vs motivated reasoning
Critical thinking requires disinterested reasoning; avoiding, as far as humanly possible, becoming invested in a particular belief or conclusion. As seekers after truth and critical thinkers, we should invest in the process of truth seeking, logic and reason, well-established scientific evidence, and intellectual honesty, not in any specific answer. To do this we must reflect on our own reasoning and try to ensure we are "married to the process", even seeking evidence to disconfirm our favoured idea and celebrating the times we do change our minds.
Non-critical and pseudo-critical thinking involve being "married" to a predetermined answer, consciously or unconsciously seeking confirming evidence and dismissing counter-evidence to maintain the favoured narrative. This typically occurs when we have an ideological position or belief, where we engage in non-critical, unconscious motivated reasoning and confirmation bias in order to minimise cognitive dissonance — the treacherous trio of biases. Pseudo-critical thinking is conscious motivated reasoning, typically in defence of, or in attacks on, an ideological belief system, especially in professions such as politics and law. The predetermined conclusion is all that matters.
A good trick to test our own motivated reasoning is to employ a substitution, e.g. substitute "Obama" for "Trump" and think about how we'd react. This can quickly reveal that we're holding a position not because of evidence or logic, but rather because of the nature of the conclusion or who is making the argument. Another technique is to be our own Devil's advocate: deliberately take the opposing side, be skeptical of our own position, and try to find flaws in our own reasoning.
If you're looking for a place to start, besides www.skepticsfieldguide.net there are yourlogicalfallacyis.com and www.fallacyfiles.org; and for a bit of a deeper dive, there's my (shameless self-promotion) book and podcast, Hunting Humbug 101.
Two good places to start are The Cognitive Biases Tricking Your Brain and yourbias.is. And this quiz can provide a little insight into underlying beliefs that help shape your positions on a number of cultural and societal issues.
If you want to go deeper into how bad our brains are at thinking, I highly recommend the following books:
· Thinking, Fast and Slow, by Nobel prize winner Daniel Kahneman
· Predictably Irrational, by Dan Ariely
· How We Know What Isn't So, by Thomas Gilovich
My No. 1 recommendation
The Skeptics' Guide to the Universe: How to Know What's Really Real in a World Increasingly Full of Fake, which covers all of the above and more in great detail.
If you really want to test yourself and minimise erroneous thinking, check out this free course offered by the University of Michigan — Mindware: Critical Thinking for the Information Age. This course gets you to apply basic concepts of statistics, probability, science and psychology to everyday life. You will learn how to critique reports of scientific findings in the media, and about the most pervasive and important cognitive biases, which often produce erroneous judgments.