This is the second of two posts that go hand-in-hand. The first post is a brief overview of some important concepts and well established results of psychology that help me think about my thinking. This second post is about valuing scientific expertise and why our non-expert default position should be to believe scientific consensuses, or at the very least, to be comfortable in saying, "I don't know".
You are not an expert (and neither am I)
As Socrates is said to have said, wisdom is knowing that you don't know. A big part of critical thinking is intellectual humility. While it is possible to become an expert at something, it takes years and years of training in a discipline. Think of something this applies to for you. Think about all the time, hard work, practice and dedication it took to earn that expertise. Think about all the mistakes you learned from and the pitfalls you now know how to avoid. On that one thing you're probably pretty safe talking at length about all that you know. And when you meet other experts, they can easily recognise that you know what you're doing or talking about.
Now think about non-experts and novices trying to talk to you about it. How much credit do you give their views? Well, that's you (and me) when it comes to every other field of human endeavour. It's the ultimate form of arrogant overconfidence to assume your opinion on a subject in which you have no expertise has much, if any, merit. And when you dismiss expert consensus out of hand, by definition, your dismissal of something you know nothing about is worthless. In fact, it says everything about you and nothing about the experts or their view(s). Imagine a non-expert novice dismissing your views out of hand on a subject you are an expert in.
Scientific (and expert) consensus
A scientific consensus is more than scientists within a field being surveyed for their beliefs or tallying their views in research papers. It relates to what the researchers in a field are interested in studying and why. When there is a question to be answered or a problem to be solved, they'll be researching it and arguing about it; i.e. doing science. But when a question is answered to sufficient satisfaction, e.g. anthropogenic climate change, evolution by natural selection or the second law of thermodynamics, there is little or no incentive to keep researching this question — it's done. Scientists operate within well-established paradigms for a reason — they're well established based on the convergence of theory with data.
Now, on occasion "settled science" is accused of being wrong, as if "science" itself has failed. People who make this claim basically know nothing about the history and nature of science, and typically exaggerate the conflict between old and new findings. Modern science is a collaborative endeavour, "far too complex to be understood, let alone experimentally verified, by any one person". Yes, paradigms can shift, but it's through doing more science, not thanks to some "maverick researcher" who thinks they're the next Galileo or Tesla.
Most of the time, especially in the last 50 years or so, you don't tend to get "wrong" science but rather incomplete science: theories and predictions that are not detailed, accurate or precise enough, or that oversimplify a phenomenon. Over time, however, more data, better models and better technology improve our understanding (think climate change), to the point that previous predictions or recommendations can even be overturned.
There have been cases where science has gotten it completely wrong. But we only find this out by doing more science. To get to the level of expertise to be able to do this requires dedication and hard work. So, unless you have dedicated yourself to doing this hard work, your opinion carries little weight. Thus a critical thinker's default position is to believe the experts. They're the ones who are smart enough and dedicated enough to do the hard work that you're not able or prepared to do. Here's a nice overview of what it takes to develop that expertise.
Another way of thinking about this is that you almost certainly do align with the scientific consensus on many topics, so why accept those but not others? If, for example, your views align with the scientific consensus on anthropogenic climate change, but not the consensus on the safety of genetically modified organisms (GMOs), vaccine efficacy and safety, health effects of mobile phone radiation, or even how to handle a worldwide pandemic, ask yourself, "Why is that the case?" I'm fairly confident in assuming that you lack the expertise to evaluate any of these scientific fields directly. I know I don't have the expertise, and I have a science degree and taught high school physics and maths. (You can see the gap between scientists and the general public on a number of scientific questions here.)
So why would you take a position that agrees with one consensus, but not another? Often the rejoinder to this point is some variant of, "No, I do believe in science, but this is a special case where this field is hopelessly compromised", e.g. "But research into GMOs is beholden to corporate interests / evil Monsanto and can't be trusted". Well, for whatever field you do believe the scientific-expert consensus, there are very likely people who use the exact same argument to try to deny it, e.g. "Climate change research is beholden to green/leftist ideology and can't be trusted".
Why is your special case special to you and not to others, and vice versa? An obvious reason is an ideological belief that puts the conclusion first (see motivated reasoning). This was me, by the way, when it came to GMOs back in the late nineties and early noughties (thanks David Suzuki). But because I care more about the process of science than a predetermined conclusion, I eventually changed my mind. I also recognised the fundamental inconsistency in believing climate scientists but not believing GMO researchers. (Finding out that the details of all the most famous anti-GMO claims were outright lies or deliberately distorted propaganda helped move me on pretty quickly too.)
This isn't to say you can't or shouldn't have opinions about stuff, but the default should be to stick with the expert and scientific consensus (even if you believe it for the wrong reasons, such as common sense). And if you can't bring yourself to do that, then your position should be, "I don't know enough about that topic". None of us is immune to thinking we're more knowledgeable than we really are (yes, me included). Given this, we should do our best to maintain some intellectual humility, lest we fall victim to the Dunning-Kruger effect.