Nobody Knows It But Me

Long ago, I was an English teacher at a private high school in central California. It was a good, if demanding, job, and unlike many of my colleagues, I seemed to manage occasional moments of non-misery in the workplace. In fact, the two years I spent working there taught me more about human nature than two decades of university teaching, freelance writing, and working abroad ever did.

Without a doubt, teaching over 100 adolescents each semester schooled me not only in how people will behave when going through some of the worst years of their lives but also in the extent to which many parents are feverishly inadequate when it comes to raising kids. With respect to family, no one wants to admit they have no clue what they’re doing. Everyone must pretend things are running smoothly and they’re in complete control.

I found this pretense interesting, particularly during parent-teacher conferences when ashamed, bewildered parents would whisper, “What do you think I should do?” as if my ability to manage large groups of adolescents somehow qualified me to give them advice. At first, I would explain that my two degrees in English plus minor gifts in speaking in front of people and writing did not mean I had a solution to why Jimmy couldn’t sit still or why Leticia cried through every class and felt compelled to layer everything around her in Purell, or why Leo circulated pictures of his girlfriend’s vagina. Over time, I developed a less draining response: “I do not know.” All Quiet on the Western Front may prepare us to think about the meaning of war, but it will not prepare us for Leo’s girlfriend’s vagina.

I suspected then, as I still do, that confronting such situations is not within the job description of a high school English teacher. But maybe, in the hundreds of outrageous situations in which I found myself in that job, I could have done more. The questions I ask myself now are the questions many parents asked me then: what should I have done? Was there anything to be done at all? There must be an expert somewhere, a veteran administrator or someone with a PhD in education theory, who can speak to this. Maybe a prison psychologist.

I wish I could believe that. In spite of my lingering questions, I think I’ve come to believe the opposite: there actually are no rules—not just for teaching or parenting, but for any area of human experience. A friend once said to me when we were going through our own high school torment: “This is the meaning of life: we all suck and we’re nothing.” I don’t think he fully appreciated how profound that statement was when he said it. Twenty-seven years later, I’m still seeing it prove out.

We all suck: no one—and I mean this in the broadest, most inclusive, most general sense—actually knows what they’re doing to the extent that assumptions and judgment calls are unnecessary. Perfect human understanding does not exist and human error is ubiquitous. Even our attempts at artificial intelligence are subject to our limited assumptions about what intelligence actually is (or can be). What can we know beyond a shadow of a doubt? The truth is: nothing, unfortunately.

Surely an engineer will feel confident that, say, as energy is transformed or transferred, an increasing amount of it is wasted. Surely something as dependable and consistent as a physical law (in this case, the Second Law of Thermodynamics) is immutable, absolute, not a matter for interpretation. But even something as ironclad as a law of physics is not without its exceptions. Some things are givens within the parameters of a particular knowledge paradigm, but those givens are always relative to and dependent upon the parameters themselves.

For example, within the agreed-upon bounds of thermodynamic theory, basic laws obtain as a reliable set of rules for the behavior of energy, entropy, and temperature at thermal equilibrium. But we also know that even within that theoretical framework, an empirical finding like the Second Law is subject to exceptions. In 2002, researchers at the Australian National University, in a paper entitled, “Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales,” found that “systems can undergo fleeting energy increases that seem to violate the venerable law.” And while this is only one small example, it is by no means isolated or anomalous to the extent that we could dismiss all such exceptions out of hand.
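For readers who want the quantitative version of that finding, the result the ANU experiment tested is usually stated as a fluctuation theorem. The notation below follows the physics literature, not anything in the original post, and is offered only as a sketch of the idea:

```latex
% Evans–Searles fluctuation theorem (dimensionless form):
% the probability of observing total entropy production A over an
% interval, relative to observing -A over the same interval, is
% exponentially weighted in favor of the positive value.
\[
\frac{P(\Omega_t = A)}{P(\Omega_t = -A)} = e^{A}
\]
% For macroscopic systems and long times, A is enormous, so
% apparent "violations" (A < 0) become vanishingly rare. The Second
% Law re-emerges as a statistical statement rather than an absolute
% one, which is exactly the point the essay is making.
```

In other words, the law the engineer relies on is overwhelmingly probable rather than logically necessary, and at small scales and short times the improbable cases become measurable.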

In fact, our entire narrative of scientific progress is predicated on discoveries which alter existing assumptions about how the world works. As Thomas Kuhn observes in The Structure of Scientific Revolutions, “though the world does not change with a change of paradigm, the scientist afterward works in a different world.” The scientific narrative changes. Because it was always a narrative, never an unassailable, omniscient catalogue.

Nothing is beyond interpretation, not even the bedrock assumptions of our most materialistic sciences. Rather, ways of knowing amount to best possible premises always subject to discourse and development over time (to say nothing of the complexities of the information technology we rely on to document and transmit that discourse). We do the best we can. We develop and codify optimal principles for a given field. And then we work with those principles until we encounter a paradigm-disruptive discovery that forces us to revise our theories.

But we’re nothing: Even the most qualified and intellectually responsible claims are part of a conversation (discourse) which is grounded in work that came before and which will be superseded by discoveries and realizations that follow. In many cases, an individual contribution to any field is no greater than a minuscule inch forward with vastly disproportionate implications.

Still, there are careers to develop and Cessnas to purchase and grants to chase and colleagues to slander and books to write and mistresses to support and students to convince. In Polishing the Mirror, the guru Ram Dass—then a social psychology professor named Richard Alpert—describes what he felt was a hollowness at the center of western academia:

In 1961, I was thirty and at the height of my academic career. I had a PhD from Stanford University, and I was a professor of social relations at Harvard. I had arrived at a pinnacle of life as I thought it should be, professionally, socially, and economically. But inside there remained an emptiness—a feeling that, with all I had, something was still missing. Here I was at Harvard, the mecca of the intellect. But when I looked into the eyes of my peers, wondering “Do you know?” I saw in their eyes that what I was looking for was nowhere to be found. In a social or family setting, people looked up to me and hung on my every word because I was a Harvard professor, and they clearly assumed that I knew. But to me, the nature of life remained a mystery.

In Ram Dass’ use of the term, we “do not know” much about the world in any absolute sense. We cannot know because our intellectual tools are as finite as the time we have in which to use them. This is not to argue that we should be content with ignorance. But it is a way to foreground a simple suggestion: speculation is absolutely necessary when it comes to developing knowledge.

Assumptions are necessary. Ultimately, belief is necessary. Kuhn, at least, seems to agree: “Normal science, the activity in which most scientists inevitably spend almost all their time, is predicated on the assumption that the scientific community knows what the world is like.” This seems reasonable not just in science but in any field of human activity.

So what remains to be done if we can never fully know our world and ourselves? Everything! Our inability to attain perfect understanding is no reason to abandon the pursuit of outer and inner knowledge. Rather, it leads us to an acceptance of our limitations as individuals and as a species and, in that acceptance, a very personal yet very real sense of freedom.

Maybe the right answer to those parents who wanted advice should have been: you already know how to raise your kids because what you think is best will be the best you can possibly do. Maybe, as my high school friend seemed to imply back in 1989, we are not static, monolithic, isolate objects. We are no thing.

Instead, we are dynamic, dialectic, fluid collaborations—living syntheses of what was known in the past and what will be discovered in the future. Maybe “discourse” is the most accurate metaphor for human experience. If so, all we can do is try to engage in life’s conversation as robustly as possible. Maybe there are no rules beyond that.

“Baby,” I said, “I’m a genius but nobody knows it but me.” 
― Charles Bukowski, Factotum

Planespotting and the Persistence of Facts

These are strange times to be an American.
― Michael Davis


Thoughts on Sally Yates

Sally Yates at Carter Center

Woke up this morning thinking about Sally Yates—how standing up to President Trump seems to have dramatically influenced the course of her life, how I’ve watched part of her emotional transformation through social media, specifically Twitter, and how her public narrative seems to reveal and confirm things I’ve suspected about the nature of personal meaning and career.

She seems to be undergoing a kind of emotional rebirth.  As someone who works primarily in the emotional mind—emotional intelligence being the primary resource for teaching and doing creative writing—I have learned to recognize when someone is emerging into a deeper, more meaningful emotional life.  She certainly is, even if only by a slight degree.

Frankl’s Man’s Search for Meaning consistently seems to prove out: it doesn’t matter what we do or where we are as long as we can find or create meaning for ourselves.  And so I return to the question of my own career, my own meaning.  When I think back to the teaching I have done, I’m faced with the choice of believing that most of my professional life has been meaningful vs. meaningless.  Obviously, I prefer to think my work has made some kind of difference.

It’s hard to believe in things I cannot see, but I have to nurture a certain degree of faith in the teaching and writing I’ve done.  Sally Yates, someone who has lived primarily in the analytical mind, is now at the beginning of something new—one hopes, something emotionally significant and transformative.  To see someone publicly come into being like this is to bear witness to a largely unnoticed dimension of human experience.  It’s something that sincere teachers get to see more often than those in any other profession.

But my personal question remains: how am I coming into being?  Just as someone with Yates’ background and skill set might step into a more intuitive life (by running for public office instead of remaining in the legal-bureaucratic infrastructure), I bear the responsibility for my own development.  Where am I going now?  What’s next?  The future is never fixed, never certain.

Sater, Cohen, and the Collapsing House of Cards

More developments are emerging regarding Trump’s Russian connections – a new post on Splice Today.  Read it here:

My new piece today on Splice Today . . .

Speak of the Devil: Journalism in the Trump Era

Good nutritious narrative angst from yours truly…

Publication News: my 32nd short story published.

Fun news: I just published my 32nd piece of short magazine fiction, this time in Ink & Coda magazine.  You can read it for free on their website:

Best wishes to everyone in 2017!




All Hail the Tooth Fairy: Reasonability and Belief in Times of Rapid Change


We come into the world screaming and it remains for us to learn how to impose order on our experiences such that we can live in a reasonable way. We could say that adult life comes down to whether we have behaved reasonably, according to the information available. In the tension between what we know (including perhaps what we should know) and what we do, agreed-upon behavioral standards emerge as a social foundation.

This seems obvious, but the question as to what constitutes reasonable behavior (and, by extension, right action) is always a matter for interpretation. The standard of a “reasonable person” for determining negligence, at least in U.S. civil law, usually goes like this:

The so-called reasonable person in the law of negligence is a creation of legal fiction. Such a “person” is really an ideal, focusing on how a typical person, with ordinary prudence, would act in certain circumstances. The test as to whether a person has acted as a reasonable person is an objective one, and so it does not take into account the specific abilities of a defendant. Thus, even a person who has low intelligence or is chronically careless is held to the same standard as a more careful person or a person of higher intelligence.

A jury generally decides whether a defendant has acted as a reasonable person would have acted. In making this decision, the jury generally considers the defendant’s conduct in light of what the defendant actually knows, has experienced, or has perceived.

When this is effective, the reasonable person standard incorporates social assumptions about “ordinary prudence” (aka common sense) and still allows for the particular situation of the defendant, creating polarity between the ideal and the real, insofar as the court is able to imagine these things. In other words, we try to come up with a standard for what should have been done and then ask whether the individual in question could / should have followed suit.

Due to its flexibility, however, the standard can lead to what may seem like odd or inappropriate conclusions. And as useful as it may be, sometimes we can’t help but suspect that any attempt to establish reasonability will still be rooted in the arbitrary biases and popular assumptions of those involved. It calls to mind the old philistine response to art: we don’t know what art is, but we know what we like. Similarly, we don’t always know what reasonability is, but we always like to think we know what we would have probably done. And just as this remains a difficult question for the courts, it remains a difficult question in every individual life.

Birth is painful—the first, most profound lesson we are taught by our mothers. Every subsequent lesson, whether taught to us by family, friends, or schools, is a matter of determining and enacting reasonability in a particular context. This may often be just as painful as giving birth, but it’s something we must do and something most of us do automatically. It is equally necessary in the life of a chemical engineer, an entrepreneur, an impresario, and a stay-at-home mom. When undertaking anything, we have to ask: What are the best practices? What are the best premises? How should we proceed according to what we already know? Our overall effectiveness depends on how we answer these questions and put the answers into practice.

Again, this is all very obvious on its face, though the questions become more difficult and complex when someone raises “policy.” Policy comes into play in arguments where “each party offers plausible interpretations of the law. In this situation, the judge may then decide the case on the basis of the social goals that the decision will promote, and the purposes behind the particular rules.” Simply put: we may not only ask what should have been done and whether the person could / should have followed suit. We may also ask whether this is the sort of behavioral standard we want to encourage. As the old saying goes, “If you can’t argue the facts, argue the law. If you can’t argue the law, argue the facts. And if you can’t argue either, argue the best policy.”

Such determinations are challenging in established courts with rules for how arguments are made. It’s that much harder to determine reasonability and best policy in private organizations that require everyone to accept the group’s ideological viewpoint before discussion can even take place. Courts of law are bound to examine a broad range of ethical, political, and social policy considerations, whereas private groups—even those that are extremely large and diverse—need only keep their narrow institutional assumptions in mind when determining what should be.

For example, while one might argue against punishment in a particular case because it could invite a deluge of frivolous litigation (and it is good policy to discourage frivolous litigation in our already overburdened courts), a religious tribunal might decide that the same party should be punished because they are in violation of holy law. To the religious court, the highest standard of reasonability, and therefore the most beneficial policy, rests on scripture, which is the ultimate value and authority, no matter whether this is at variance with the values of mainstream culture.

With this in mind, the members of such organizations find themselves faced with an important question: are the beliefs of the organization more persuasive, i.e. more reasonable, than those of mainstream society? This opens up additional problematic questions, like: is there something unreasonable in mainstream assumptions as I have come to understand them? Is that what led me to this organization (with its alternate viewpoint)? Do I have the knowledge and insight to make a reasonable choice now that I’m in this organization? Did I have the requisite knowledge and insight when I decided to join?

In this, as in so many other things, only lived experience and its results can provide clarity. In other words, when trying to decide what a reasonable person would do and the best policies that should support that behavior, one must look to the “real world”—to what passes for objective material existence—as the only reliable test.

For instance, it’s all well and good to believe that if you pray to the tooth fairy, she’ll bring you a plate of peppermint candies at midnight. It may be more pleasant to believe in a world where that could happen, but believing doesn’t make it so—doesn’t make it a feature of lived, objective, material existence. And if you choose to sincerely believe that this is going to happen, you shouldn’t be disappointed when you don’t see the peppermints in the morning. So one comes back to the necessity of testing one’s assumptions in the laboratory of everyday life.

Saying this presupposes that it is more reasonable to behave in a way that squares with lived experience and to found one’s assumptions on observable, material phenomena. In other words, it is reasonable to assume there is an objective, physical reality that is more or less dependable. Some things prove out, again and again, when we do them. If you put your hand in the flame, your hand will burn.

Without falling into Cartesian hypotheticals about external forces capable of conditioning our perceptions, we can say that ideologically driven organizations run into difficulties when their foundational assumptions do not prove out in lived experience. Most religions have this problem, but so do cults, social clubs, governments, corporations, and private societies.

The question for the individual remains, no matter which group’s set of assumptions is being used: what is the most reasonable standard of behavior? How do I know? And what should I do when I’m satisfied that I do know?

So here are a few questions you can ask yourself if changing times make you feel like your beliefs are in jeopardy: What groups have you joined? What is your “reality tunnel”? What is the reasonable person standard of these groups? What biases and assumptions do they contain? How do you think they contrast with the standards of mainstream society? What values do these standards reveal? What “best policies” are implied? Where and how do you lack the knowledge to answer these questions? Are there obstacles to gaining this knowledge? What are they and why are they in place?

Maybe the most important question is: how has your lived experience supported / refuted the assumptions about reasonable behavior in your political party, cult, religion, club, society, or corporation? If your lived experience has nothing to do with what you’re being asked to believe, you might want to make some changes.