The Razor Blade in the Apple - Part I
Scientific consensus-making is the process of navigating fears.
To start things off in this commonplace book, I’d like to do some housekeeping: I’m going to offer some parting thoughts on a topic that was ostensibly the subject of the podcast over the past four years, although we often ventured off course. In our defense, we did say from the get-go, when we started “Book 2: Race – Is That a Thing?”, that we were more interested in the larger metaphysical question, What makes anything a “thing”? But today I’ll give some context and then in Parts 2 and 3 I’ll confine myself to one narrow subject: the current scientific stance on race.
Creating a scientific consensus
and buttressing it from constant siege is a massive enterprise. In fact, that might be a decent rough-and-ready definition of “science.” But sometimes all that buttressing and defending winds up being more political than anything Karl Popper had in mind when he proposed falsifiability as the chief criterion of empirical investigation.
In the political sphere, we have learned over the past few years that the sort of open exchange of ideas J.S. Mill advocated might not be an unalloyed good. Back in 2018, Supreme Court Justice Elena Kagan wrote in her dissent to the Janus v. AFSCME decision—which held that requiring union contribution payments from public-sector employees who disagree with political stances espoused by those unions was a violation of the employees’ free speech rights—that the decision was one more instance of American conservatives “weaponizing the First Amendment.” Supposedly a principled defense of individual rights, the decision hobbled unions’ ability to collectively bargain in all public servants’ interests—including those who didn’t want to pay. Whether the motivating factor behind such claims of restricted speech is, as in the Janus case, greedy cynicism (which always seems to redound to the benefit of petty plutocrats), misguided identity politics (which only seems to lend an ear to members of groups with entrenched power), or some perverse marriage of the two, I think it is indisputable that an absolutist embrace of free speech undercuts other values—maybe even values that should have a higher priority. After all, the Janus case is a jurisprudential codification of a nifty rhetorical trick (saying unconscionable things, and then claiming your own rights are being violated when people castigate you) that gets wielded all the time by folks who want to spread lies and bullshit.
So a lack of faith in open communication among those who hold this “weaponization” of free speech in contempt is understandable. Nonetheless, efforts to curb disinformation or misinformation always feel heavy-handed and undemocratic. In that vein, Yascha Mounk worries that
“well-intentioned people can use these very real concerns about misinformation or disinformation to erect an effective censorship regime, which actually often will misfire. And that can have bad substantive consequences if it means that certain truths actually get banned in the name of fighting misinformation. And also, of course, that only serves to delegitimize mainstream institutions even more.”
The thing that compels everyone—whether we’re talking about those flooding the zone with shit or those hoping to clean the shit up with censorship—is the fear of risk and uncertainty.
People really don’t like
risk and uncertainty. That’s trivial, I know. (Perhaps you were also unaware that the Pope is, in fact, Catholic.) But it is a legitimate fear: How should we deal with uncertainty? Documented cases of someone sneaking a foreign object into an apple are vanishingly rare. So, unless you go out of your way to swallow a Wilkinson Sword, your chances of dying of razor-blade consumption are pretty slim. (In fact, nearly every reported case of tainted treats has turned out to be a persistent fiction.) The consensus of those among us whose blood runs a little cooler is that we should draw a lesson from statistical knowledge: Be not afraid. It is irrational to let yourself be consumed by fear of things that are not likely to happen.
But there are two formidable responses to that argument. First, “rationality” is not necessarily defined as strict adherence to the rules of probability. It has become old hat to point out that we are not constrained to the behavioral traits that define “rational” individuals in game theory. And second, it is easy to be nonchalant about potential threats when they do not materialize. You have less reason to worry about something if you never personally see any of its negative consequences. But reciting the statistical improbability of a child dying from consuming over-the-counter medicine coated in poison would be cold comfort if your child did in fact die in the Chicago Tylenol poisoning cases of the 1980s. Similarly, no one likes to imagine what thoughts go through the minds of people who are about to die in an airplane crash, but I doubt many of them mumble to themselves, “Well, this sucks, but if I could do it over, I’d still get on this plane, because the likelihood of its crashing was so low.”
Individual subjective experience has an immediacy that overmatches the weaker pull of generalized rationality. Just a little bit of pain, genuinely suffered by someone, somewhere, is grounds enough to arouse our fear. And it is possible to hold this fear—let’s call it the fear of the razor blade in the apple—up to scrutiny and regard it by turns with a sympathetic and a critical eye.
The past two years of the pandemic can leave no doubt that the same dynamic of bullshit vs. censorship-of-bullshit plays out in the realm of science communication. As I reported in Book 1 Episode 2: “Party and science and Bullsh*t,” Dan Kahan and his team at the Yale Cultural Cognition Project have spent the last 15 years investigating this phenomenon by using opinion surveys on disputed matters of fact—matters like climate change and the efficacy of vaccines—and providing respondents with increasing amounts of new scientific information. And they have repeatedly encountered the same disconcerting dynamic: When people are already inclined to use a heuristic like cultural affiliation to help interpret data, giving them new scientific information just makes them dig their heels in.
Kahan has noted that people who express greater scientific curiosity are less prone to this behavior, but in a pluralistic democratic society, not everyone is scientifically curious. His policy proposal to deal with this crisis is to treat polarization itself as a sort of virus and to control the timing and release of scientific communication.
I have to side with Yascha Mounk: Kahan’s seems like a pretty chilling solution. People faced with uncertainty and risk tend to resort to identitarian ideas to avoid thinking about that risk. Fine. But when authorities, faced with the risk of that very phenomenon spreading through society, react by stifling free and open communication, that seems ironic at best and tragically misguided at worst. Looking back at 2020, it’s hard to imagine former President Trump’s handling of the pandemic having been worse—but consider the misinformation he could have spread if there had been career civil servants whose mandate was to "clean up" the "pollution" in the science communication environment.
I submit that sometimes scientists—especially in their public-facing positions, but also in the very process of forming scientific consensus—are prone to the fear of the razor blade in the apple. They fear the motives of malefactors among them who are offering something supposedly innocuous but in fact quite nefarious.
In Part 2, I will offer my best and most honest effort to explain just such a phenomenon; one that has been seething among scientists, philosophers, and historians for decades at the subcutaneous level—with unsightly rashes erupting in public view every so often. . . .