Worshiping the Dead, Part 2

I look back at the stressed-out writing I did four years ago after Trump was elected and recall what my European friends were saying at the time.  They were a bit blasé about the new administration, a bit cynical about whether anything had changed or would change in U.S. geopolitics. 

They wanted to know why I was so upset.  They saw Trump as a slightly more pungent embodiment of the same old repellent post-WWII American hubris.  Yes, he lacked Obama’s varnish and capacity.  But, in that sense, they seemed to think Trump was a little better—take the makeup off the beast and you can see it more clearly.  Take off its camouflage and you can more easily track its movements.  America, they felt, is evil, uneducated, childish, and wrong.  Let’s be honest.  Let’s reveal its true nature.  Why not accomplish this by letting Trump flail about and demonstrate American dysfunction on the international stage?

But the imperial, crass, clumsy, adventurist America that colored the perceptions of my friends abroad wasn’t what I was mourning.  Countries, like the people that create and sustain them, are not simple.  Reducing them to an exploitative foreign policy (which their people may not even fully understand or approve of) and the philistine values of their nastiest and most brutish citizens is disingenuous at best. 

The America in which I grew up was liberally and tolerantly Democrat and Republican.  People voted locally, believed in civil rights, valued humanistic education, and, incidentally, were not part of anything that could be considered “systemically racist.”  Higher education and health care were expensive but were seen as inherently good and worth working for.  My Southern California neighborhood was diverse.  My childhood friends were Hmong, Vietnamese, Italian, African-American, Polish, Mexican-American, and every other ethnicity you can imagine.

This is not to say that the country (including my small part of it) didn’t have serious problems or that there was no racism or crime.  It’s to say that America was seen as a decent place in which to live, in spite of those things.  And I grew up in a fairly poor neighborhood in a house with the roof always falling in.  I wasn’t dirt poor, but I wasn’t “rolling like a hog in the fat house,” either.  So, with deep reservations, I voted for Hillary in 2016.  I was well aware of the evils in my country.  But it was the good that I cared about and wanted to protect by voting Democrat.  I suspect most who voted Republican felt the same way.

The reason I was upset four years ago, and why I disagreed with the pessimistic view of my European friends, was that Trump brought a vision of “American carnage”—a distorted view of failure and fallout that didn’t square with what I knew from firsthand experience as a citizen.  He encouraged us to envision ourselves as a nigh-failed state.  And I knew that what you envision eventually becomes real.

But none of us could foresee the damage that this would do to the political continuum in the United States and, by extension, to American society.  The last four years have been tragic, convoluted, and intense to an almost unbelievable degree, such that the best analysts and political commentators now seem occupied with catching up or doomsaying instead of predicting what’s to come or offering solutions, though admittedly foreseeability is always an issue.

The complexity of this moment in American history is as broad and deep as it is disturbing.  And we might forgive the pundits for having a hard time with it.  It’s hard to think of any moment in American history simple and clear enough that we can say it’s open and shut, that we can easily understand it without much discussion.

For example, it’s not enough to argue that Truman dropped the bomb in 1945 to curtail protracted war in the Pacific and ultimately save lives.  It’s not enough to say that he was spineless and unduly influenced by hawkish generals and politicians looking for payback and glory.  It’s not enough to say that the American public had grown hardened by the war to the extent that the mass-murder of Japanese civilians seemed like an acceptable trade-off for victory.  If we’re in search of the broadest, clearest, most unvarnished view, we have to say all these things and more. 

Such considerations and a hundred others like them, existing side by side, are what make American history so confounding and fascinating (and are what make the New York Times’ dubious “1619 Project” more race-oriented speculative fiction than history, on par with creationist textbooks and What If the Nazis Won the War fan fic).  History is not simple because we are not simple.  The breadth of a cultural, historical moment is always hard to grasp, even in retrospect.  And, at the time, suffused as it is with emotion and rhetoric, it’s nearly impossible to fully and clearly understand what’s going on, even if we have a great deal of information. 

Nevertheless, we all agree that it’s the job of journalists, philosophers, artists, historians (even politicians) to set aside the fear and make sense of things.  When they succeed, we can arrive at a partial understanding of what’s happening, perhaps even to the extent that we can act in accordance with the best possible premises and mitigate the damage.  But after four years of American carnage, we seem to have unambiguously failed in that respect.  And we can’t pin it all on Trump.  We’re the ones who fed him tequila and acid and took him off the leash.  We’re the ones who burned our own neighborhoods.  Political writers, in particular, share the blame.

What we unfortunately have now is vicious black-and-white thinking across the political spectrum, the sort of irrationality that greenlights violence, tribalism, feuding, and revenge and thinks it’s all for the best, the sort of illiberal extremism that forgets how to come together and resolve differences.  And foreseeability remains a key problem—even the partial sort that sets aside the Huxleyan herd poison in favor of the common good.

The vision of American carnage is coming to pass when we could have imagined and brought forth something far better.  Such is the root of my discontent—what I felt in 2016 and what I feel now.  It’s why I regard political conventions as rituals of death worship, paying homage to dead systems and broken ideologies, rigidly entrenched in old enmities and feuds, and enslaved to a partisanship so obsolete and toxic that it has become clownish and absurd.

Cynicism and black-and-white thinking are too easy in times like this.  My deepest wish for the United States is that it will let go of those things and embrace classical liberalism—the radical notion that the left and the right can come together in the middle, take the best of what they are, and form a more perfect union.  We’ve done it before.  I don’t see why we can’t do it again. 

Read my latest at Splice Today . . .

Read it here: https://www.splicetoday.com/politics-and-media/the-new-york-times-is-rotting-at-the-seams

The follow-up on Jeffrey Epstein that I didn’t want to write . . .

Read it here: https://www.splicetoday.com/politics-and-media/jeffrey-epstein-and-the-usual-media-hate-porn

STEM, Scientism, and the Decline of the Humanities – my latest on Splice Today

Read it here: https://www.splicetoday.com/on-campus/stem-scientism-and-the-decline-of-the-humanities

All Hail the Tooth Fairy: Reasonability and Belief in Times of Rapid Change


We come into the world screaming and it remains for us to learn how to impose order on our experiences such that we can live in a reasonable way. We could say that adult life comes down to whether we have behaved reasonably, according to the information available. In the tension between what we know (including perhaps what we should know) and what we do, agreed upon behavioral standards emerge as a social foundation.

This seems obvious, but the question as to what constitutes reasonable behavior (and, by extension, right action) is always a matter for interpretation. The standard of a “reasonable person” for determining negligence, at least in U.S. civil law, usually goes like this:

The so-called reasonable person in the law of negligence is a creation of legal fiction. Such a “person” is really an ideal, focusing on how a typical person, with ordinary prudence, would act in certain circumstances. The test as to whether a person has acted as a reasonable person is an objective one, and so it does not take into account the specific abilities of a defendant. Thus, even a person who has low intelligence or is chronically careless is held to the same standard as a more careful person or a person of higher intelligence.

A jury generally decides whether a defendant has acted as a reasonable person would have acted. In making this decision, the jury generally considers the defendant’s conduct in light of what the defendant actually knows, has experienced, or has perceived. (http://bit.ly/2fWFUqi)

When this is effective, the reasonable person standard incorporates social assumptions about “ordinary prudence” (aka common sense) and still allows for the particular situation of the defendant, creating polarity between the ideal and the real, insofar as the court is able to imagine these things. In other words, we try to come up with a standard for what should have been done and then ask whether the individual in question could / should have followed suit.

Due to its flexibility, however, the standard can lead to what may seem like odd or inappropriate conclusions. And as useful as it may be, sometimes we can’t help but suspect that any attempt to establish reasonability will still be rooted in the arbitrary biases and popular assumptions of those involved. It calls to mind the old philistine response to art: we don’t know what art is, but we know what we like. Similarly, we don’t always know what reasonability is, but we always like to think we know what we would have probably done. And just as this remains a difficult question for the courts, it remains a difficult question in every individual life.

Birth is painful—the first, most profound lesson we are taught by our mothers. Every subsequent lesson, whether taught to us by family, friends, or schools, is a matter of determining and enacting reasonability in a particular context. This may often be just as painful as giving birth, but it’s something we must do and something most of us do automatically. It is equally necessary in the life of a chemical engineer, an entrepreneur, an impresario, and a stay-at-home mom. When undertaking anything, we have to ask: What are the best practices? What are the best premises? How should we proceed according to what we already know? Our overall effectiveness depends on how we answer these questions and put the answers into practice.

Again, this is all very obvious on its face, though the questions become more difficult and complex when someone raises “policy.” Policy comes into play in arguments where “each party offers plausible interpretations of the law. In this situation, the judge may then decide the case on the basis of the social goals that the decision will promote, and the purposes behind the particular rules” (http://bit.ly/2f3JC0c). Simply put: we may not only ask what should have been done and whether the person could / should have followed suit. We may also ask whether this is the sort of behavioral standard we want to encourage. As the old saying goes, “If you can’t argue the facts, argue the law. If you can’t argue the law, argue the facts. And if you can’t argue either, argue the best policy.”

Such determinations are challenging in established courts with rules for how arguments are made. It’s that much harder to determine reasonability and best policy in private organizations that require everyone to accept the group’s ideological viewpoint before discussion can even take place. Courts of law are bound to examine a broad range of ethical, political, and social policy considerations, whereas private groups—even those that are extremely large and diverse—need only keep their narrow institutional assumptions in mind when determining what should be.

For example, while one might argue against punishment in a particular case because it could invite a deluge of frivolous litigation (and it is good policy to discourage frivolous litigation in our already overburdened courts), a religious tribunal might decide that the same party should be punished because they are in violation of holy law. To the religious court, the highest standard of reasonability, and therefore the most beneficial policy, rests on scripture, which is the ultimate value and authority, even where this is at variance with the values of mainstream culture.

With this in mind, the members of such organizations find themselves faced with an important question: are the beliefs of the organization more persuasive, i.e., more reasonable, than those of mainstream society? This opens up additional problematic questions, like: is there something unreasonable in mainstream assumptions as I have come to understand them? Is that what led me to this organization (with its alternate viewpoint)? Do I have the knowledge and insight to make a reasonable choice now that I’m in this organization? Did I have the requisite knowledge and insight when I decided to join?

In this, as in so many other things, only lived experience and its results can provide clarity. In other words, when trying to decide what a reasonable person would do and the best policies that should support that behavior, one must look to the “real world”—to what passes for objective material existence—as the only reliable test.

For instance, it’s all well and good to believe that if you pray to the tooth fairy, she’ll bring you a plate of peppermint candies at midnight. It may be more pleasant to believe in a world where that could happen, but believing doesn’t make it so—doesn’t make it a feature of lived, objective, material existence. And if you choose to sincerely believe that this is going to happen, you shouldn’t be disappointed when you don’t see the peppermints in the morning. So one comes back to the necessity of testing one’s assumptions in the laboratory of everyday life.

Saying this presupposes that it is more reasonable to behave in a way that squares with lived experience and to found one’s assumptions on observable, material phenomena. In other words, it is reasonable to assume there is an objective, physical reality that is more or less dependable. Some things prove out, again and again, when we do them. If you put your hand in the flame, your hand will burn.

Without falling into Cartesian hypotheticals about external forces capable of conditioning our perceptions, we can say that ideologically driven organizations run into difficulties when their foundational assumptions do not prove out in lived experience. Most religions have this problem, but so do cults, social clubs, governments, corporations, and private societies.

The question for the individual remains, no matter which group’s assumptions are being used: what is the most reasonable standard of behavior? How do I know? And what should I do when I’m satisfied that I do know?

So here are a few questions you can ask yourself if changing times make you feel like your beliefs are in jeopardy: What groups have you joined? What is your “reality tunnel”? What is the reasonable person standard of these groups? What biases and assumptions do they contain? How do you think they contrast with the standards of mainstream society? What values do these standards reveal? What “best policies” are implied? Where and how do you lack the knowledge to answer these questions? Are there obstacles to gaining this knowledge? What are they and why are they in place?

Maybe the most important question is: how has your lived experience supported / refuted the assumptions about reasonable behavior in your political party, cult, religion, club, society, or corporation? If your lived experience has nothing to do with what you’re being asked to believe, you might want to make some changes.