Following the Science in Leaps and Bounds

For people who don’t fully understand a particular branch of science or scientific inquiry (which, we imagine, describes most non-scientists), “following the science” comes down to making an informed leap of faith. There is nothing wrong with that. Leaps of faith are necessary on a daily basis in every part of life. Without them, we would be unable to function as individuals or as a society. However, we want our leaps to be as short, unstressful, and error-free as possible. We might speak about life as a series of cautious assumptions and educated guesses, because it is impossible to know very much with absolute certainty.

For example, I will take NASA at its word when it tells me something about Mars. And I will believe my lawyer when he tells me my best chance is to settle out of court. This is because I have more confidence in NASA when it comes to space and in my lawyer when it comes to lawsuits than I do in myself (or in other non-experts) regarding those areas. The leap of faith I have to make seems small and therefore less subject to error because I know NASA is an expert space organization and my lawyer has a professional license to practice law. I could persuasively cite NASA in a paper on space and my lawyer in a paper on litigation.

Conversely, I will not reference NASA on settling a lawsuit or my lawyer on exploring Mars. They might have opinions about those things, but because they have no authority to speak professionally about them, my leap of faith in the credibility of their claims would be too great, stressful, and subject to error. I might enjoy their opinions, but I wouldn’t cite them as documentation or support in a paper.

Opinions outside one’s field of expertise carry far less weight. When I taught college-level rhetoric, I’d talk to students about the true purposes of legitimate sourcing and documentation in their essays—not primarily to provide additional reading or resources, but to establish credibility and authority on the part of the writer and, by extension, within his or her claim structure.

You can claim anything in a paper, but you will only be persuasive if you can support those claims with authoritative references (where the leap of faith you’re asking the reader to make is small and easy). If I want to say something about Mars, I will show you how NASA agrees with me. If I want to make a point about an aspect of law, I will show you how my lawyer wrote an article on it in The American Lawyer. Their expertise, authority, and credibility will give my argument an aura of expertise, authority, and credibility. This is a powerful aspect of persuasive rhetoric. We encounter it all the time, formally and informally.

Unfortunately, when it comes to “following the science” about Covid, the authority of scientists and national health experts has been eroded by a range of political and social counter-arguments, usually employing what we call the fallacy of “Faulty Comparison.” Faulty Comparison is bad logic that draws an equals sign between things that should not be presented as equal.

Using the above example, if I wrote, “NASA says that Mars rocks are highly radioactive, but my lawyer says they aren’t. Now it is unclear who to believe,” it wouldn’t be hard to see the bad logic. I’m making a Faulty Comparison between what NASA thinks about space and what my lawyer thinks about space. Then on the basis of that faulty comparison, I’m claiming it is impossible to tell who is more credible. One opinion is clearly credible (that of NASA) and has persuasive weight. The other (that of my lawyer) does not. They should not be presented as persuasively equal. And there should be no confusion about where the shorter, less stressful, and less error-prone leap of faith can be made.

But if I use a politician or faith leader to attack the expertise of NASA, it’s a bit harder to spot the fallacy: “NASA says Mars rocks are highly radioactive, but the President and Reverend Osteen both disagree. What, then, can we safely believe?” That’s still bad rhetoric, but it widens the necessary leap of faith and generates stress in the audience, especially if the audience strongly supports the President and Reverend Osteen. The politician’s and the minister’s opinions are being presented as carrying the same weight on the subject of Mars as NASA’s. It’s still a Faulty Comparison, just slightly hidden.

Trump and his staff made a lot of Faulty Comparisons during his Administration, claiming “fake news” and “alternative facts” as a way of neutralizing negative press and keeping their political base activated and incensed. They tried to make necessary and appropriate leaps of faith as difficult and stressful as possible by politicizing Covid data and playing on already existing suspicions that academics and experts are inherently leftist or even crypto-Marxist (which isn’t always false but isn’t as uniformly true as many on the right seem to believe) and acting in bad faith.

Asking Trump or Kellyanne Conway or Biden or Pelosi about the nature or behavior of Covid is like asking your lawyer about Mars. Their political and bureaucratic authority does not translate into scientific authority. Putting faith in their pronouncements about the virus is not the same as putting your faith in the Centers for Disease Control and Prevention on the subject. The same goes for questions of mask protocols and vaccines.

Rhetorically, the leap of faith is much smaller when you do “follow the science,” even if it’s still an act of faith, an assumption that someone knows more than you do about a subject in which you are ignorant. By sourcing the most credible authorities, you are, in effect, asking NASA about Mars and your lawyer about law. You are making the most reasonable assumption, the most educated guess, about a subject you do not understand.

Rhetorical Edgelordism and the Summary Dismissal

[Edgelord:] Even from its earliest uses, the word carries the connotation of eye-rolling skepticism.  The edge in edgelord comes from expressions like cutting edge or the idea of being edgy, applying a sense of boldness or unconventionality to such behavior; the lord half elevates such a person ironically with the rank of a deity or member of British nobility, with echoes of Voldemort, Sauron, and other dark-spirited, villainous characters who hold that title. — “Doing the Work of the Edgelord,” Merriam-Webster.com

Lately, on political news blogs, Twitter, and Facebook, we’ve been seeing a lot of summary dismissals of arguments, particularly those which are racially or pandemically charged.  This might suggest people are more stressed out than ever.  One rarely sees argumentative moves like this when times are calm, even in the divisive cesspools of social media and in the freewheeling comment sections still permitted by news sites.

Only when people begin to crack under sweeping emotional strain do they start to become rhetorically evasive and nihilistic.  They want to appear as though they’re open to reasoned discussion and debate, but really they want to close down the conversation and talk about their cats.  In a sense, I don’t blame them.  We’re in a very emotionally difficult moment right now.  And no one wants to admit to having an exploding head.  

We might classify this particular evasion as a form of “rhetorical edgelordism”—an attempt to disingenuously self-protect by dismissing an argument while also trying to seem like the smartest, most incisive person in the room. 

If someone says, “It could be A or it could be B,” the edgelord adds, “No, A and B are a false choice because C,” which invalidates them both, ostensibly ending the discussion.  Usually the person bringing C is upset at having to choose between A and B and wishes to redefine the choice as (A vs B) vs C—where C is much less controversial, threatening, or applicable.

C is usually something exotic. In order to function as a blanket dismissal, C can’t use the ideas from A or B (because then it falls within the scope of the original discussion).  It has to come from a distant discipline or sphere, so far outside the purview of A or B that the core argument gets derailed.

Here’s an example: “COVID-19 originated in fruit bats” (A) vs. “It was bio-evolved in a Chinese lab” (B). Then (C) pops up: “Actually, statistics have shown social attitudes to pandemics track according to political party affiliation, if you want to talk relevance when it comes to the virus.”  Ironically, C itself is immensely and obviously irrelevant to what’s being talked about.  But unless it is instantly ignored by everyone, its work is done.

People who see this move might point out the scope creep.  But by then the thrust of the original discussion has already fractured.  In our example, we’re now talking about at least 3 issues: (1) the bat theory vs the lab theory, (2) the new political party theory, and (3) whether the new political party theory matters or is an irrelevant digression.  Now it’s much easier for the edgelord to divert the argument, self-soothe, and still pose as the edgy freethinker not caught up in the preoccupations of A vs B conformist thinking.  At this point, we’re about three or four rhetorical steps away from looking at a jpg of his cat, Waffles.

In healthy discussions (with psychologically healthy people), this is sometimes called “reframing the issue,” and it’s a perfectly legitimate way of clarifying a subject under consideration—when it focuses on getting at a deeper point significant to A and B.  In the example, this might be something like, “The issue of whether the virus originated in fruit bats or in a lab actually raises the deeper question of whether determining the origin will matter to developing a vaccine.”  Here, the reframe is aiming at a link between A and B and trying to enhance and clarify the discussion by pointing that link out.  The test is relevance: A and B are both compelling because they bear on how we understand, and therefore might control, the global outbreak.  But when reframing is done as a way to distract and dismiss by bringing in an extraneous consideration, there are usually disingenuous motives at work.

People who didn’t live through the online evolution of bulletin boards, newsgroups, and discussion forums (all of which eventually disappeared into the reeking maw of social media) might not recognize this tactic as a largely online way of posturing and pseudo-arguing.  Like most rhetorical strategies born in the disinhibited, critical-thinking-starved world of the internet, it’s largely an empty, counterproductive tactic, an emotional time and energy sink best avoided.

Still, during a lockdown, when we’re spending more of our lives online as opposed to in person, pointing these things out might be worthwhile.  They’re no longer the sole province of trolls, basement dwellers, loudmouths, and fakes.  As we move toward the 2020 US Presidential election, social tensions flare, and the virus dances in the streets, stress levels are likely to soar.  And, in cases where public discourse is critical, we might even see close friends and family posing as the edgelord in the room while surreptitiously looking for the exit.

Tiredness, Truth, and Mockery: The American Way

[Image: Early rendition of Alfred E. Neuman, 1908.]

Today, I wonder whether I should re-think some of my ultra-liberal biases and attendant leftist news consumption.  This is good.  But, man, I’m beat.

The alt-right (and the radical religious right) seems to me like a uniquely American expression of deep stupidity but, of course, I would say that. Look at my demographic: college-educated, Democrat, fiction writer, from Southern California, who’s been an expat for almost a decade. Of course, I think Trump is the worst thing that could possibly have happened to the world. Of course, I wanted Bernie but voted Hillary. Of course, I want net neutrality. Of course, I support many (but not all) positions taken by the ACLU. Of course, I believe that, in an earlier era, Obama would have been considered a moderate Republican. Of course, I have a problem with drones, civilian casualties, the terrific scope creep of the Patriot Act, and the “war on drugs.” Of course, I care about my country.

If I didn’t think the Green Party was run by bumblers, I would probably join. I’m pro-choice, pro-Planned Parenthood, and I support gay marriage. I think many of these things should not even have to be controversial in a liberal democracy. I dream of a day when there will be universal healthcare and free college tuition. I think climate change is one of the most serious issues we face today, if not the most serious. But the truth is that most of these biases and beliefs can be (and are) predicted by an algorithm. The even sadder truth is that I only have so much energy I can devote to fact checking and being outraged. This is a problem. Tiredness is a problem.

The problem is not that there is a right answer we have to find. The problem is that uncertainty and complexity are exhausting over time, especially when you’re necessarily engaged in other things. Most Americans are not, actually, stupid. They’re invested in certain areas—mostly job and family—and in most other respects have a general (superficial) understanding of the world, including political issues, along with some ability to identify yellow journalism, confirmation bias, and what passes for fear-mongering clickbait. I have also seen this in European and Asian countries, relative to various cultural differences and levels of education. The USA doesn’t own “stupid.” Every country with a powerful media has a horse as a proconsul somewhere. The difference is that the States likes to put its toga-wearing horses on display, whereas other countries historically have not. But this is changing.

In an enormous post-industrial society, you will have many levels of political, historical, and economic awareness and many opinions emerging constantly in the news media. You will also have crackpot theories; secessionism; separatism based on race, religion, and/or gender biases; conspiracy paranoia; multi-directional shaming; late-night talk show infotainment; social justice fanatics; religious absolutists; new age hucksters; ambulance chasers; a continuous horde of cynics; doom-saying historians looking for their 15 minutes; the resurgence of failed orthodoxies (like Nazism, ethno-nationalism, and whatever Steve Bannon happens to be reading); and the all-encompassing opportunism that feeds off these things. What you won’t have is a simple black-and-white truth. You will have truthiness.

To live in an information society infected with truthiness is extremely taxing. But just as there is no black-and-white truth, there is no easy solution. A friend of mine has suggested “slow news” as opposed to internet news feeds. There seem to be some merits there. But slow news does not necessarily safeguard against yellow journalism, which has been around since newspapers could fold. In many ways, the 24-hour news cycle and its problematic presence on social media make it harder for governments and corporations to spin interpretations in their favor. We should be grateful for the ineptitude of Sean Spicer and the alacrity with which he and his boss are covered by the press corps.

I don’t have answers. I don’t think there is a single version of what is true—at least not one that can be had through the media. But I also don’t think the cross-eyed chants of “burn it down” and “fuck your feelings” have done any good. They helped Trump get elected president, and he has thus far made a mockery of America. The left understandably wants him gone. The GOP wants him to calm down and let them get on with the kleptocracy. His base supporters are currently upset because he bowed 5 inches to receive an award in Saudi Arabia. Some of his supporters are no doubt upset that the Reich hasn’t yet emerged in all its glory. I suspect they will still be upset when he gets impeached.

“Nothing is an absolute reality; all is permitted” – Hassan-i Sabbah

Nobody Knows It But Me

Long ago, I was an English teacher at a private high school in central California. It was a good, if demanding, job, and unlike many of my colleagues, I seemed to manage occasional moments of non-misery in the workplace. In fact, the two years I spent working there taught me more about human nature than two decades of university teaching, freelance writing, and working abroad ever did.

Without a doubt, teaching over 100 adolescents each semester schooled me not only in how people behave while going through some of the worst years of their lives but also in the extent to which many parents are feverishly inadequate when it comes to raising kids. With respect to family, no one wants to admit they have no clue what they’re doing. Everyone must pretend things are running smoothly and that they’re in complete control.

I found this pretense interesting, particularly during parent-teacher conferences when ashamed, bewildered parents would whisper, “What do you think I should do?” as if my ability to manage large groups of adolescents somehow qualified me to give them advice. At first, I would explain that my two degrees in English plus minor gifts in speaking in front of people and writing did not mean I had a solution to why Jimmy couldn’t sit still or why Leticia cried through every class and felt compelled to layer everything around her in Purell, or why Leo circulated pictures of his girlfriend’s vagina. Over time, I developed a less draining response: “I do not know.” All Quiet on the Western Front may prepare us to think about the meaning of war, but it will not prepare us for Leo’s girlfriend’s vagina.

I suspected then, as I still do, that confronting such situations is not within the job description of a high school English teacher. But maybe, in the hundreds of outrageous situations in which I found myself in that job, I could have done more. The questions I ask myself now are the questions many parents asked me then: what should I have done? Was there anything to be done at all? There must be an expert somewhere, a veteran administrator or someone with a PhD in education theory, who can speak to this. Maybe a prison psychologist.

I wish I could believe that. In spite of my lingering questions, I think I’ve come to believe the opposite: there actually are no rules—not just for teaching or parenting, but for any area of human experience. A friend once said to me when we were going through our own high school torment: “This is the meaning of life: we all suck and we’re nothing.” I don’t think he fully appreciated how profound that statement was when he said it. 27 years later, I’m still seeing it prove out.

We all suck: no one—and I mean this in the broadest, most inclusive, most general sense—actually knows what they’re doing to the extent that assumptions and judgment calls are unnecessary. Perfect human understanding does not exist and human error is ubiquitous. Even our attempts at artificial intelligence are subject to our limited assumptions about what intelligence actually is (or can be). What can we know beyond a shadow of a doubt? The truth is: nothing, unfortunately.

Surely an engineer will feel confident that, say, as energy is transformed or transferred, an increasing amount of it is wasted. Surely something as dependable and consistent as a physical law (in this case, the Second Law of Thermodynamics) is immutable, absolute, not a matter for interpretation. But even something as ironclad as a law of physics is not without its exceptions. Some things are givens within the parameters of a particular knowledge paradigm, but those givens are always relative to and dependent upon the parameters themselves.

For example, within the agreed-upon bounds of thermodynamic theory, basic laws obtain as a reliable set of rules for the behavior of energy, entropy, and temperature at thermal equilibrium. But we also know that even within that theoretical framework, an empirical finding like the Second Law is subject to exceptions. In 2002, researchers at the Australian National University, in a paper entitled “Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales,” found that “systems can undergo fleeting energy increases that seem to violate the venerable law.” And while this is only one small example, it is by no means so isolated or anomalous that we could dismiss all such exceptions out of hand.
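For readers who want a sense of how such exceptions can coexist with an “ironclad” law, results like the ANU experiment are usually framed in terms of a fluctuation theorem. One common statement of it (my gloss under the usual assumptions, not the paper’s exact formulation) compares the probability of entropy being produced at some average rate with the probability of its being consumed at the same rate:

\[
\frac{P(\bar{\Sigma}_t = A)}{P(\bar{\Sigma}_t = -A)} = e^{At}
\]

Here \(\bar{\Sigma}_t\) is the entropy production averaged over an observation window of length \(t\). On this account, Second Law “violations” (negative entropy production) are not forbidden, merely exponentially unlikely as the system grows and the observation time lengthens, which is exactly why they can only be caught in small systems over short time scales.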

In fact, our entire narrative of scientific progress is predicated on discoveries which alter existing assumptions about how the world works. As Thomas Kuhn observes in The Structure of Scientific Revolutions, “though the world does not change with a change of paradigm, the scientist afterward works in a different world.” The scientific narrative changes. Because it was always a narrative, never an unassailable, omniscient catalogue.

Nothing is beyond interpretation, not even the bedrock assumptions of our most materialistic sciences. Rather, ways of knowing amount to best possible premises always subject to discourse and development over time (to say nothing of the complexities of the information technology we rely on to document and transmit that discourse). We do the best we can. We develop and codify optimal principles for a given field. And then we work with those principles until we encounter a paradigm-disruptive discovery that forces us to revise our theories.

But we’re nothing: Even the most qualified and intellectually responsible claims are part of a conversation (discourse) which is grounded in work that came before and which will be superseded by discoveries and realizations that follow. In many cases, an individual contribution to any field is no greater than a minuscule inch forward with vastly disproportionate implications.

Still, there are careers to develop and Cessnas to purchase and grants to chase and colleagues to slander and books to write and mistresses to support and students to convince. In Polishing the Mirror, the guru Ram Dass recalls the hollowness he felt at the center of western academia back when he was still a social psychology professor named Richard Alpert:

In 1961, I was thirty and at the height of my academic career. I had a PhD from Stanford University, and I was a professor of social relations at Harvard. I had arrived at a pinnacle of life as I thought it should be, professionally, socially, and economically. But inside there remained an emptiness—a feeling that, with all I had, something was still missing. Here I was at Harvard, the mecca of the intellect. But when I looked into the eyes of my peers, wondering “Do you know?” I saw in their eyes that what I was looking for was nowhere to be found. In a social or family setting, people looked up to me and hung on my every word because I was a Harvard professor, and they clearly assumed that I knew. But to me, the nature of life remained a mystery.

In Ram Dass’s sense of the word, we “do not know” much about the world in any absolute sense. We cannot know because our intellectual tools are as finite as the time we have in which to use them. This is not to argue that we should be content with ignorance. But it is a way to foreground a simple suggestion: speculation is absolutely necessary when it comes to developing knowledge.

Assumptions are necessary. Ultimately, belief is necessary. Kuhn, at least, seems to agree: “Normal science, the activity in which most scientists inevitably spend almost all their time, is predicated on the assumption that the scientific community knows what the world is like.” This seems reasonable not just in science but in any field of human activity.

So what remains to be done if we can never fully know our world and ourselves? Everything! Our inability to attain perfect understanding is no reason to abandon the pursuit of outer and inner knowledge. Rather, it leads us to an acceptance of our limitations as individuals and as a species and, in that acceptance, a very personal yet very real sense of freedom.

Maybe the right answer to those parents who wanted advice was this: you already know how to raise your kids, because what you think is best will be the best you can possibly do. Maybe, as my high school friend seemed to imply back in 1989, we are not static, monolithic, isolate objects. We are no thing.

Instead, we are dynamic, dialectic, fluid collaborations—living syntheses of what was known in the past and what will be discovered in the future. Maybe “discourse” is the most accurate metaphor for human experience. If so, all we can do is try to engage in life’s conversation as robustly as possible. Maybe there are no rules beyond that.

“Baby,” I said, “I’m a genius but nobody knows it but me.” 
― Charles Bukowski, Factotum