We’ve all heard the phrase “you’re entitled to your own opinion, but not your own facts.” Opinions are the sorts of things about which we can take a poll. They are sometimes well-informed, but rarely expected to be anything other than subjective. Facts, on the other hand, are “out there” in the world, separate from us, so it makes little sense to ask people what they think of them. As the comedian John Oliver so aptly put it in commenting on a recent Gallup poll that found that one in four Americans disbelieve in climate change: “You don’t need people’s opinion on a fact. You might as well have a poll asking: ‘Which number is bigger, 15 or 5?’ Or ‘Do owls exist’ or ‘Are there hats?’”
With the United Nations’ conference on climate change set to begin in Paris this month, and the presidential election only a year away, we are about to be steeped in political arguments on every conceivable issue, all carried out with the usual confusing mix of fact, opinion, opinion stated as fact and fact portrayed as opinion. How can we prepare ourselves to make sense of it?
A good first step would be to distinguish between skepticism and what has come to be known as denialism. In other words, we need to be able to tell when we believe or disbelieve in something based on high standards of evidence and when we are just engaging in a bit of motivated reasoning and letting our opinions take over. When we withhold belief because the evidence does not live up to the standards of science, we are skeptical. When we refuse to believe something, even in the face of what most others would take to be compelling evidence, we are engaging in denial. In most cases, we do this because at some level it upsets us to think that the theory is true.
The throes of denial must feel a lot like skepticism. The rest of the world “just doesn’t get it.” We are the ones being rigorous. How can others be so gullible as to believe that something is “true” before all of the facts are in? Yet a warning sign should go up when these stars align and we find ourselves feeling self-righteous about a belief that apparently means more to us than the preservation of good standards of evidence. Whether they are willing to admit it or not — perhaps even to themselves — denialists often know in advance what they would like to be true. But where does that leave the rest of us, who think that our own beliefs are simply the result of sound reasoning?
As Daniel Kahneman so beautifully demonstrates in his book “Thinking, Fast and Slow,” the human mind has all sorts of wired-in cognitive shortcuts that can feel an awful lot like thinking. Confirmation bias is only the beginning: an entire field of academic inquiry (behavioral economics) now proposes to explain whole swaths of human behavior on the basis of such mental foibles. And entire television news networks now make their living by exploiting this, telling us exactly what we want to hear.
So how to tell a fact from an opinion? By the time we sit down to evaluate the evidence for a scientific theory, it is probably too late. If we take the easy path in our thinking, it eventually becomes a habit. If we lie to others, sooner or later we may believe the lie ourselves. The real battle comes in training ourselves to embrace the right attitudes about belief formation in the first place, and for this we need to do a little philosophy.
We hear a lot of folks in Washington claiming to be “skeptics” about climate change. They start off by saying something like, “Well, I’m no scientist, but …” and then proceed to rattle off a series of evidential demands so strict that they would make Newton blush. What normally comes along for the ride, however, is a telltale sign of denialism: these alleged skeptics usually apply different standards of evidence to the theories they want to believe (for which they have cherry-picked a few pieces of heavily massaged data against climate change) than to the theories they oppose.
Surely few would willingly embrace the title of “denialist.” It sounds so much more rigorous and fair-minded to maintain one’s “skepticism.” To hold that the facts are not yet settled. That there is so much more that we do not know. That the science isn’t certain. The problem here, however, is that this rests not only on a grave misunderstanding of science (which in a sense is never settled), but also on a misunderstanding of what it means to be a skeptic. Doubting the overwhelming consensus of scientists on an empirical question, for which one has only the spottiest ideologically motivated “evidence,” is not skepticism; it is the height of gullibility. It is to claim that a vast conspiracy among thousands of climate scientists is much more likely than that they have all merely arrived at the same conclusion because that is where the evidence led them.
Couldn’t the scientists nonetheless be wrong? Yes, of course. The history of science has shown us that any scientific theory (even Newton’s theory of gravity) can be wrong. And it is helpful to remember that not every field that claims scientific status — like certain branches of the social sciences — necessarily deserves it. But this does not mean that one is a good skeptic merely for disbelieving the well-corroborated conclusions of science. To reject a cascade of scientific evidence showing that the global temperature is warming and that humans are almost certainly the cause of it is not good reasoning, even if some long-shot hypothesis comes along in 50 years to show us why we were wrong.
In scientific reasoning, there is such a thing as warrant. Our beliefs must be justified. This means that we should believe what the evidence tells us, even while science insists that we must also try our best to show how any given theory might be wrong. Science will sometimes miss the mark, but its successful track record suggests that there is no superior competitor in discovering the facts about the empirical world. The fact that scientists sometimes make mistakes in their research or conclusions is no reason for us to prefer opinions over facts.
True skepticism must be more than an ideological reflex; skepticism must be earned by a prudent and consistent disposition to be convinced only by evidence. When we cynically pretend to withhold belief long past the point at which ample evidence should have convinced us that something is true, we have stumbled past skepticism and landed in the realm of willful ignorance. This is not the realm of science, but of ideological crackpots. And we don’t need a poll to tell us that this is the doorstep to denialism.
Lee McIntyre is a research fellow at the Center for Philosophy and History of Science at Boston University and the author of “Respecting Truth: Willful Ignorance in the Internet Age.”