The Killer is Disturbing and Excellent

Michael Caine once said he thought Tom Cruise was an excellent actor but that Cruise wouldn't be acknowledged for it until he got older and stopped appearing in teen heartthrob roles.  Caine was pointing out how Hollywood tends to appreciate actors for looks first and artistry second—part of a self-deprecating explanation for why Caine was considered a "great actor" early on.  Whether or not this is true, Michael Caine certainly belongs to what we sometimes think of as an earlier, classier generation of actors, many of whom were English and seemed to be good as well as good looking.

It's not hard to think of Michael Fassbender in that category, an Irish-German actor trained in England, capable of a decent Macbeth as well as many supporting roles in sci-fi films.  And even if Fassbender couldn't be more different from Cruise in terms of affect and culture, he still gets a lot of Cruise-ish work.  David Fincher's The Killer is a recent example.  We can easily imagine Cruise's "Vincent" from Collateral as the nameless hitman in The Killer; though, Fassbender's voiceover is more like Edward Norton's in Fight Club (also a Fincher movie) and less in the Heat/Collateral tradition of Michael Mann's stylized noir.

Fincher loves to generate dramatic tension by starting off in an undramatic way, even in a thriller about an international assassin.  Routine and mundanity are a Fincher signature, and he handles them better than anyone, as The Spectator's Alexander Larman points out, calling The Killer "a black-comic masterpiece" on par with Fight Club: "If you found Fight Club and Gone Girl funny, you're going to love the similarly bleak tone here."  Maybe it's bleak.  It's certainly laced with black humor, but it also shows us something slightly new, diverging from the formulaic morality play that hitman films usually seem destined to become.

Larman writes, "Early hype suggested that The Killer would follow the exploits of Fassbender's anonymous assassin as he (inconveniently) develops a conscience, presumably setting up an existential quandary," which sounds straight out of assassin-film Central Casting.  Thankfully, this is not what The Killer does.  Fassbender's character never develops a conventionally moralistic conscience.  Nor are we encouraged to see him as a product of his environment, a beleaguered anti-hero merely playing out the hand he was dealt by circumstances and fate.  Rather, he tries, with varying degrees of success, to adhere to his personal philosophy: "Anticipate; don't improvise."  His highest value is not redemption in the eyes of the audience but job efficiency, which is what makes The Killer different and interesting.

Contrast this with Jessica Chastain's character in Ava, another international assassin film, released in 2020 and directed by Tate Taylor, who, for all his ostensible qualities as a director, is evidently no David Fincher.  Ava is an assassin struggling with addiction, tormented by family drama, and, for some reason never fully explained, so roundly hated by one of her handlers that he tries to eliminate her.  It's a movie doused in conventional morality and reeking of guilt as the protagonist meanders through the emotional wasteland of her career between unpleasant family visits and AA meetings.  Essentially, Ava is a movie trying to get us on the main character's side, trying to motivate us to put an arm around her and say, yes, you kill people for a living, but you really shouldn't be so hard on yourself.  That is also very funny and absurd, but unintentionally so.

Ava: guilt, self-loathing, and family melodrama.

The Killer is different.  It's a movie that takes its ethos not from soppy Hollywood morality but from the mission statement of an Amazon fulfillment warehouse: efficiency is job one.  It's bleak because we've come to find that perspective vaguely horrifying in an era of technocratic, anti-life post-industrial culture, where STEM has bludgeoned the humanities into Peking opera irrelevance and artificial intelligence has grown increasingly monstrous in service of profit.  It's an echo of Black Mirror's "Metalhead," where a woman robbing such a warehouse gets hunted down by a killer robot that goes far beyond the bounds of anything resembling fairness or reason.  Human concerns like restorative justice, punishment fitting the crime, and personal honor have no place in this brave new Gradgrind-ish corporate utilitarianism—which may not be that new after all (thank you, Charles Dickens).

Fassbender's character embodies this, subscribes to it, aspires only to peak efficiency. He explains, "If I'm effective, it's because of one simple fact.  I . . . don't . . . give . . . a . . . fuck."  He isn't hung up on the usual human concerns.  "Luck isn't real," he says.  "Nor is karma, or sadly, justice.  As much as I'd like to pretend these concepts exist."  There's only job performance and production standards, which he pursues with the exactitude of a killer robot, even when an unforeseen twist forces him to turn against his employer.

Ava turns against her employer because she feels like she’s bad and wants to be good.  The Killer’s narrator does it because that’s the best practice indicated in the job manual.  The former is a clichéd, forgettable film we’ve seen a hundred times.  The latter is an interesting, disturbing comment on an emergent aspect of our cultural mindset.  And that is why The Killer may haunt us long after we’ve moved on to other, far less insightful, entertainment.

Interview with the Vampire Reconsidered

I rewatched Interview with the Vampire last night and it just doesn’t seem dark enough. Maybe that’s a reflection of how my emotional self has darkened after Covid, rapacious politics, and so much social turmoil. But it seems to me that the story, the myth, of the vampire is dangerous because it is Dionysian and feral.  It has to be dark.  It has to flirt with real evil and suffering. 

Interview is too tame, too inhibited.  It tries to show evil but it stops at longing for redemption.  There has to be heartbroken bitterness (Lestat pretends to be bitter, but he’s just bored and infatuated).  And that bitterness has to become so intense that it doubles over into malice.  Then we have something.  That would be a vampire story fit for 2021.

Anne Rice (who became a super-Christian) thought of the vampire more the way Mary Shelley thought of Frankenstein’s monster: a messianic anti-hero.  That’s great.  But Rice didn’t come up with the vampire mythos.  And when you make a vampire movie, it goes beyond your particular ideas into the greater mythic paradigm that contains all vampire symbolism and stories, especially those of the vampire as a 19th century expression of human suffering and desire, a twisted reaction to the oppressive side of industrial capitalism. 

Romance, blood, eternal life, its price, and its consequences only come with the darkening of the world—a rejection of daylight, machines, industry, and Protestant ideas of clean living. The vampire seems like an embodiment of Victorian longing for nature, for Pan, for the Wordsworthian overflow of feelings denied by the western progress narrative and cynical social Darwinism.  And so you only get the vampire if you’re willing to accept a certain amount of darkness and violence.  It’s why you traditionally have to invite the vampire over your threshold.  It has to be your choice to let the darkness in.  Of course, you might turn into a rotting corpse or a raving madman like Dracula’s R. M. Renfield.  But nothing comes for free in this mythology.

I guess most of us are over messiahs and redemption narratives these days.  I think I definitely am.  I don't care about being brought back into the great huddled mass with its dead gods.  Maybe I'm looking for a different sort of vampire tale, not one born in the lingering optimism of the 1990s, before Anne Rice's return to Christianity.  Suffering.  Darkness.  The Eleusinian Mysteries come around again in a story drenched in blood and derangement.  Joaquin Phoenix's Joker.  That's where it is for me these days.  The vampire archetype still matters, but it goes a lot deeper and gets a lot more disturbing than sexy-but-guilty anti-heroes in velvet, tormented by their otherness, seeking some kind of reintegration into banal conformist culture.

Mapping the Swamp

Today, I think I overcame my hitherto impassable mental block, the one I always get between pages 50 and 70, the one that tells me I've hit the "swampy middle." The term "great swampy middle" wasn't invented by me. In fact, I have no desire to discover who first coined it because I have no desire to utter it ever again; though, I fear that's just wishful thinking. Of course, I'm going to talk about, think about, and confront the GSM again. I always get bogged down in the middle. It's stopped me from completing whole books. It hits me in longer stories, too. The hideous abyss waiting for writers at the middle of a piece of fiction is an inevitable occupational hazard.

I’ve been struggling with this novel for several weeks. The first 50 pages emerged quickly. And, in all seriousness, I think they’re very good pages, some of my best. So I can’t allow myself to seriously entertain thoughts of abandoning the project. I have to see it through if only for those good pages.

The only way out is to make an outline. I hate outlines. When I write, I want to be in a creative trance, driving the muse’s burning chariot through the dark firmament of hell. Or something like that. Bukowski promised that you’d know the gods and your nights would flame with fire.  When his promise comes true, it really is the best thing. When the divine chariot is half-submerged in the swamp, when it backfires a cloud of rancid bio-diesel and won’t even start, when the muse doesn’t even show up because she was partying with some publishing industry types last night and has to sleep it off, when the way forward is just a mucky green-brown maze of shit-streaked walls, you need a scaffold. You need to build a ladder out of the swamp. You need to draw a map. So that’s what I did.

I will always hate outlines. But now the editor part of my brain can see the way forward. Now I have a schematic. I know I can follow it—if everything doesn’t change tomorrow, if the muse doesn’t laugh at me and send me a dream that completely turns my scaffold upside-down. That happens, too. We’ll see.


Maybe You Can’t Handle the Truth

Today, after all the Covidy Trump ups and downs, the questions about Amy Coney Barrett's confirmation, and the hard questions about whether there should even be a vice-presidential debate, I'm thinking again about Chris Beck's excellent piece in Splice Today, "The Media Reports Narratives, Not Facts."

We all live online now. We look at the world through electrified windows. All we see in our non-digital lives is our homes and immediate neighborhoods. Maybe we travel some, but we don’t get much of an overview of what’s going on unless we use digital media.  This is good and bad.

The Good: we live in an information society where communication, news, and knowledge can be produced instantaneously.

The Bad: we live in an information society where communication, news, and knowledge can be produced instantaneously.

He / She / It who controls the location and size of the digital window (and do take a moment to learn about the “Overton Window” as well) controls what is seen. Is it true that the United States is collapsing? What does the New York Times say about it? More importantly, how, when, and to what end does the NYT cover the “decline of America”? You can’t just think about the content; you have to think about how it’s framed and marketed to you.

All media is a product. This is capitalism. And the truth (often much more complex than how it is presented in one “window” or another) is out there, but it is always, always beholden to the bottom line for any media platform. Of course, they all say they’re dedicated to the truth.

Is Fox News a legitimate news source? Sure. It's about as legit as CNN. But it will seem more or less reliable depending on your assumptions about the world, your values, your community, and your culture. How about the Daily Wire? Take a look at it (especially if you consider yourself a liberal) and you won't see a whole lot of variation between what's in there and what's running in the Wall Street Journal on a given news day.

You might notice that certain stories are emphasized more than others or are framed to imply certain conclusions (the "secret message" in a news story that used to be called "slant" or "an angle" but which is now called "news bias"). But the Daily Wire is considered to be much farther to the political right on the American spectrum than the WSJ. Why? Probably because the conservative pundit Ben Shapiro founded and, until recently, ran the Daily Wire. But that really isn't a good reason. It's just perceptual media bias.

Try the same comparison between The Washington Post and Mother Jones. How about The Daily Beast and Vox? How about any of these and Breitbart or The Drudge Report? Products. Marketing. Stoking controversy in targeted audiences. Know why I don't watch Russia Today news? Google it and the reason should jump off the screen. Even search engines have slant, bias, implicit preferences that show the world a certain way. You can't escape slant.

But you can do this: if you're a liberal, read conservative news along with your liberal stuff. If you're a conservative, read liberal news along with your conservative stuff. Look at Media Bias Fact Check and search for your favorite media sources there. Do this in order to see the world through more windows, even though you'll never get a comprehensive view of anything.

Don’t let any media source trick you into thinking that what you’re seeing is the whole truth or the entire scope of something. You have to work to get that on your own.  As Beck puts it in his Splice Today piece: “It’s no surprise that Americans’ trust in the media is minuscule. The New York Times can’t even recognize third-rate journalism. As a consumer of media, the only way to be well-informed is to remain skeptical about the media’s competence, understand that they’re reporting a narrative instead of the facts, and get your news from a variety of sources.” 

Here are some questions to ponder for yourself:

  • Is there a problem with the stories on Zero Hedge? What might it be?
  • What makes The National Review a “libertarian” publication? Is it?
  • Why aren’t more writers for Quillette publishing in The New Yorker and The New York Times?
  • Is the NYT’s “1619 Project” history or speculative fiction? How can you tell?
  • What is the primary difference between Rachel Maddow’s and Ben Shapiro’s coverage? Why might this be a pointless question to ask?
  • I say above that “you can’t escape slant.” So why do all this thinking and reading about media? If bias is inevitable, why try to see past it?
  • Does believing a QAnon conspiracy theory indicate that you are intelligent, stupid, or just misinformed? How do you know? How about believing in the tenets of the religion of your choice? Smart? Stupid? How about believing that Critical Race Theory realistically depicts power relations in the world? Smart? Something else? What do these three belief systems have in common?

The End of Supernatural

Supernatural has come to an end after 15 long seasons.  That's probably for the best.  No TV show should run that long.  And it had multiple corny, goofy, B-movie arcs that were impossible to believe and sometimes made us groan.  But when it was good—and it usually was at some point in every season, especially when it got back to the fable of two brothers taking ghost-hunting road trips in their muscle car through middle America—it was unique and fascinating.

I found myself keeping up with the show over the years, even at times when there were ostensibly more serious things I wanted to watch.  I even read some of the scripts in order to figure out how, exactly, they pulled off certain nigh-unbelievable episodes, and I talked with a teacher of mine about a few of them.

I learned TV writing from a very smart, funny woman who taught me a lot about the business and the machinelike precision that often goes into making a TV episode.  It changed the way I thought of television as a creative medium and sharpened my sense of how to make something happen in a scene.  Literary fiction writers often have a hard time with plot.  They tend to think more about the inner landscape of their characters.  But from a TV writer’s point of view, inner upheavals, quiet moments, and realizations emerge in the acting.  Good TV writing is plot.  And Supernatural’s writers never forgot that.

Often, I’d be watching an utterly goofy episode about a swamp monster eating cheerleaders in central Iowa and I’d realize the immense skill being employed to pack a fully formed dramatic arc into a single episode with surprisingly good character actors filling in the blanks.  Nothing about that is simple or easy.  Good TV never is.

Say you’ve got 50 pages of script for about 30 minutes of content in a tripartite dramatic structure.  An episode needs to sustain tension across commercial breaks, involve most of the cast regulars, and keep within the boundaries of the “series bible,” the style book for the show.  In a continuing series, it has to do these things while advancing the broad dramatic arc of the season.  Nothing can be wasted.  Every available minute must be used.  In this highly commercial form of storytelling, time is always money. 

There were a few episodes that astonished me in that respect.  And I started to follow some of the show’s writers on social media.  I’m not much of a fan, but sometimes I’d see some particularly acrobatic bit of dramatic structure and think, damn, who wrote this?  Who can build that sort of clockwork mechanism episode after episode, show after show well enough to make a career out of it?  I’d describe such a person as highly disciplined, precise, and obsessive.  She has to have all the skills with language that every writer has plus a fanatical work ethic, the willingness to commit to someone else’s creative guidelines, and an overwhelming amount of determination to dust herself off and get back into it when Hollywood inevitably hands her a beating.  Supernatural seemed to have a number of these ringers in its rotation.  You could see the craft.

As someone who stepped into that world, realized how harsh it is, and stepped back out just as quickly, I harbor a deep respect for what goes on in a show’s writers room.  I try to hold myself to a similar standard in my work and always enjoy discovering other writers who do the same, even though I’m writing stories and novels and not building three-act chronographs.  But a work of great craftsmanship is a wonderful thing to see, whether it’s pretending to be TV b-horror or something more serious.

I’m going to miss Supernatural, as much for these writerly things as for how entertaining and fun the show could be.  People talk about the Battlestar Galactica reboot, Breaking Bad, Better Call Saul, and Dexter as being really well-crafted shows in the new golden age of streaming television.  And I think they are.  But a show like Supernatural sneaks in the back door.  It comes dressed as lowbrow pulp, as a jester that doesn’t take itself seriously, and it does such a good job pantomiming and parodying that we overlook its immense skill.  Of course, that’s what we were meant to do all along.

Surpassing Meritocracy: the Artist’s Way

There are many different paths to greatness, not just the ones most commonly identified by conformist culture.  As long as your basic needs are met, where you put your energy—how you pursue excellence—is completely your business.  Realizing this can be difficult and gradual.

This seems true, even if we admit that discourses (value systems) will always compete with each other for dominance.  And one of the most ruthless and rapacious, at least in the West, is that of "meritocracy."  A meritocracy is inherently based on an assumed set of cultural values.  But you need to realize that you are free to opt out of those assumed values.  What the masses consider to be good doesn't have to define your life.

If you don’t accept meritocratic cultural values, merit-based judgments by those who do are irrelevant.  In other words, it is a mistake to impose the rules of a game on someone who refuses to play; though, because discourses will compete with each other, people will usually try to impose their personal values-discourse on you.  Often, they will do so because they’re not aware of alternatives.  They may not even remember the moment they chose to buy in.  And they may not understand that imposing values on someone else is an act of violence.

Remove the question of merit (and its various implications) and the locus of meaning in life shifts (possibly returns) from an external authority to the individual.  One arrives squarely within Viktor Frankl's "Will to Meaning"—not seeking meaning / value relative to others, but exploring what is already resonant / resident in the self.  "Thy Will be Done" becomes "My Will be Done," with all the freedoms and responsibilities arising from that shift.

It makes no difference if your private world is idiosyncratic to the point at which it would seem very strange to more common sensibilities.  As long as you’re not behaving like a hypocrite by harming or otherwise curtailing the autonomy of others, your interiority (including the way you choose to perceive the world outside your self) is completely yours.  And it doesn’t seem outrageous to conclude that this is how it should be.  If you don’t own your thoughts, can you ever own anything else?  In fact, it seems that the more you personalize your unique way of seeing and acting in the world, the stronger and more persuasive that uniqueness becomes. 

Because discourse is grounded in conflict and competition, this self-originating, self-describing narrative you are spinning can have a destabilizing effect on others, who may accuse you of being delusional, a dreamer, someone out of touch with (what the dominant culture considers) reality.  But if it works for you, isn't it the right thing?  Isn't that choosing inner freedom instead of pledging fealty to ideas and to a lifestyle that was designed (or emerged) without you particularly in mind?

Walking away from a meritocracy takes a lot of courage and effort.  Because you are a social being, it can involve a certain amount of suffering, alienation, and lonesomeness.  You risk being called a deviant, being labeled as a disaffected undesirable.  Even if you don’t agree with those judgments, they will still hurt.  Hopefully, your growing curiosity about your own sui generis greatness and freedom will mitigate that pain.

You might call this the “inward path,” the “artist’s way,” or “the path beyond the campfire” which leads into dark unmapped places, where all new things wait to be discovered.

FYI: New Story Forthcoming in Visitant Magazine

A recent short short of mine, “You Are Somewhere Else,” is forthcoming in Visitant and should be available online.  As usual, I will post the links when the story comes out. – M

Tiredness, Truth, and Mockery: the American Way

Early rendition of Alfred E. Neuman, 1908.

Today, I wonder whether I should re-think some of my ultra-liberal biases and attendant leftist news consumption.  This is good.  But, man, I’m beat.

To me, the alt-right (and the radical religious right) seems like a uniquely American expression of deep stupidity, but, of course, I would say that. Look at my demographic: college-educated Democrat, fiction writer, from Southern California, who's been an expat for almost a decade. Of course, I think Trump is the worst thing that could have possibly happened to the world. Of course, I wanted Bernie but voted Hillary. Of course, I want net neutrality. Of course, I support many (but not all) positions taken by the ACLU. Of course, I believe that, in an earlier era, Obama would have been considered a moderate Republican. Of course, I have a problem with drones, civilian casualties, the terrific scope creep of the Patriot Act, and the "war on drugs." Of course, I care about my country.

If I didn’t think the Green Party was run by bumblers, I would probably join. I’m pro-choice, pro-Planned Parenthood, and I support gay marriage. I think many of these things should not even have to be controversial in a liberal democracy. I dream of a day when there will be universal healthcare and free college tuition. I think climate change is one of the most, if not the most, serious issues we face today. But the truth is that most of these biases and beliefs can be (and are) predicted by an algorithm. The even sadder truth is that I only have so much energy I can devote to fact checking and being outraged. This is a problem. Tiredness is a problem.

The problem is not that there is a right answer we have to find. The problem is that uncertainty and complexity are exhausting over time, especially when you're necessarily engaged in other things. Most Americans are not, actually, stupid. They're invested in certain areas (mostly job and family) and in most other respects have a general (superficial) understanding of the world, including political issues, yellow journalism, confirmation bias, and what passes for fear-mongering clickbait. I have also seen this in European and Asian countries, relative to various cultural differences and levels of education. The USA doesn't own "stupid." Every country with a powerful media has a horse as a proconsul somewhere. The difference is that the States likes to put its toga-wearing horses on display, whereas other countries do not. But this is changing.

In an enormous post-industrial society, you will have many levels of political, historical, and economic awareness and many opinions emerging constantly in the news media. You will also have crackpot theories; secessionism; separatism based on race, religion, and / or gender biases; conspiracy paranoia; multi-directional shaming; late night talk show infotainment; social justice fanatics; religious absolutists; new age hucksters; ambulance chasers; a continuous horde of cynics; doom-saying historians looking for their 15 minutes; the resurgence of failed orthodoxies (like Nazism, ethno-nationalism, and whatever Steve Bannon happens to be reading); and the all-encompassing opportunism that feeds off these things. What you won’t have is a simple black-and-white truth. You will have truthiness.

To live in an information society infected with truthiness is extremely taxing. But just as there is no black-and-white truth, there is no easy solution. A friend of mine has suggested "slow news" as opposed to internet news feeds. There seem to be some merits to that. But slow news does not necessarily safeguard against yellow journalism, which has been around since newspapers could fold. In many ways, the 24-hour news cycle and its problematic presence on social media make it harder for governments and corporations to spin interpretations in their favor. We should be grateful for the ineptitude of Sean Spicer and the alacrity with which he and his boss are covered by the press corps.

I don't have answers. I don't think there is a single version of what is true—at least not one that can be had through the media. But I also don't think the cross-eyed chants of "burn it down" and "fuck your feelings" have done any good. They helped Trump get elected president, and he has thus far made a mockery of America. The left understandably wants him gone. The GOP wants him to calm down and let them get on with the kleptocracy. His base supporters are currently upset because he bowed five inches to receive an award in Saudi Arabia. Some of his supporters are no doubt upset that the Reich hasn't yet emerged in all its glory. I suspect they will still be upset when he gets impeached.

“Nothing is an absolute reality; all is permitted” – Hassan-i Sabbah

Nobody Knows It But Me

Long ago, I was an English teacher at a private high school in central California. It was a good, if demanding, job, and, unlike many of my colleagues, I seemed to manage occasional moments of non-misery in the workplace. In fact, the two years I spent working there taught me more about human nature than two decades of university teaching, freelance writing, and working abroad ever did.

Without a doubt, teaching over 100 adolescents each semester schooled me not only in how people will behave when going through some of the worst years of their lives but also in the extent to which many parents are feverishly inadequate when it comes to raising kids. With respect to family, no one wants to admit they have no clue what they're doing. Everyone must pretend things are running smoothly and they're in complete control.

I found this pretense interesting, particularly during parent-teacher conferences when ashamed, bewildered parents would whisper, “What do you think I should do?” as if my ability to manage large groups of adolescents somehow qualified me to give them advice. At first, I would explain that my two degrees in English plus minor gifts in speaking in front of people and writing did not mean I had a solution to why Jimmy couldn’t sit still or why Leticia cried through every class and felt compelled to layer everything around her in Purell, or why Leo circulated pictures of his girlfriend’s vagina. Over time, I developed a less draining response: “I do not know.” All Quiet on the Western Front may prepare us to think about the meaning of war, but it will not prepare us for Leo’s girlfriend’s vagina.

I suspected then, as I still do, that confronting such situations is not within the job description of a high school English teacher. But maybe, in the hundreds of outrageous situations in which I found myself in that job, I could have done more. The questions I ask myself now are the questions many parents asked me then: what should I have done? Was there anything to be done at all? There must be an expert somewhere, a veteran administrator or someone with a PhD in education theory, who can speak to this. Maybe a prison psychologist.

I wish I could believe that. In spite of my lingering questions, I think I've come to believe the opposite: there actually are no rules—not just for teaching or parenting, but for any area of human experience. A friend once said to me when we were going through our own high school torment: "This is the meaning of life: we all suck and we're nothing." I don't think he fully appreciated how profound that statement was when he said it. Twenty-seven years later, I'm still seeing it prove out.

We all suck: no one—and I mean this in the broadest, most inclusive, most general sense—actually knows what they’re doing to the extent that assumptions and judgment calls are unnecessary. Perfect human understanding does not exist and human error is ubiquitous. Even our attempts at artificial intelligence are subject to our limited assumptions about what intelligence actually is (or can be). What can we know beyond a shadow of a doubt? The truth is: nothing, unfortunately.

Surely an engineer will feel confident that, say, as energy is transformed or transferred, an increasing amount of it is wasted. Surely something as dependable and consistent as a physical law (in this case, the Second Law of Thermodynamics) is immutable, absolute, not a matter for interpretation. But even something as ironclad as a law of physics is not without its exceptions. Some things are givens within the parameters of a particular knowledge paradigm, but those givens are always relative to and dependent upon the parameters themselves.

For example, within the agreed-upon bounds of thermodynamic theory, basic laws obtain as a reliable set of rules for the behavior of energy, entropy, and temperature at thermal equilibrium. But we also know that even within that theoretical framework, an empirical finding like the Second Law is subject to exceptions. In 2002, researchers at the Australian National University, in a paper entitled "Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales," found that "systems can undergo fleeting energy increases that seem to violate the venerable law." And while this is only one small example, it is by no means isolated or anomalous to the extent that we could dismiss all such exceptions out of hand.
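To make the scale-dependence concrete, here is the standard statement of the fluctuation theorem that underlies experiments like that one (a textbook gloss, not a formula quoted from the ANU paper). It relates the probability of a trajectory that produces entropy to the probability of one that consumes it:

P(Σ_t = A) / P(Σ_t = −A) = e^A

where Σ_t is the dimensionless entropy produced over an observation time t. Entropy can decrease, but the odds fall off exponentially, and because entropy production grows with system size and observation time, such "violations" become unobservable for anything much larger than a colloidal particle watched for more than a couple of seconds.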

In fact, our entire narrative of scientific progress is predicated on discoveries which alter existing assumptions about how the world works. As Thomas Kuhn observes in The Structure of Scientific Revolutions, “though the world does not change with a change of paradigm, the scientist afterward works in a different world.” The scientific narrative changes. Because it was always a narrative, never an unassailable, omniscient catalogue.

Nothing is beyond interpretation, not even the bedrock assumptions of our most materialistic sciences. Rather, ways of knowing amount to best possible premises always subject to discourse and development over time (to say nothing of the complexities of the information technology we rely on to document and transmit that discourse). We do the best we can. We develop and codify optimal principles for a given field. And then we work with those principles until we encounter a paradigm-disruptive discovery that forces us to revise our theories.

But we’re nothing: Even the most qualified and intellectually responsible claims are part of a conversation (discourse) which is grounded in work that came before and which will be superseded by discoveries and realizations that follow. In many cases, an individual contribution to any field is no greater than a minuscule inch forward with vastly disproportionate implications.

Still, there are careers to develop and Cessnas to purchase and grants to chase and colleagues to slander and books to write and mistresses to support and students to convince. In Polishing the Mirror, the guru Ram Dass—formerly a social psychology professor named Richard Alpert—describes what he felt was a hollowness at the center of western academia:

In 1961, I was thirty and at the height of my academic career. I had a PhD from Stanford University, and I was a professor of social relations at Harvard. I had arrived at a pinnacle of life as I thought it should be, professionally, socially, and economically. But inside there remained an emptiness—a feeling that, with all I had, something was still missing. Here I was at Harvard, the mecca of the intellect. But when I looked into the eyes of my peers, wondering “Do you know?” I saw in their eyes that what I was looking for was nowhere to be found. In a social or family setting, people looked up to me and hung on my every word because I was a Harvard professor, and they clearly assumed that I knew. But to me, the nature of life remained a mystery.

In Ram Dass’ use of the term, we “do not know” much about the world in any absolute sense. We cannot know because our intellectual tools are as finite as the time we have in which to use them. This is not to argue that we should be content with ignorance. But it is a way to foreground a simple suggestion: speculation is absolutely necessary when it comes to developing knowledge.

Assumptions are necessary. Ultimately, belief is necessary. Kuhn, at least, seems to agree: “Normal science, the activity in which most scientists inevitably spend almost all their time, is predicated on the assumption that the scientific community knows what the world is like.” This seems reasonable not just in science but in any field of human activity.

So what remains to be done if we can never fully know our world and ourselves? Everything! Our inability to attain perfect understanding is no reason to abandon the pursuit of outer and inner knowledge. Rather, it leads us to an acceptance of our limitations as individuals and as a species and, in that acceptance, a very personal yet very real sense of freedom.

Maybe the right answer to those parents who wanted advice should have been: you already know how to raise your kids because what you think is best will be the best you can possibly do. Maybe, as my high school friend seemed to imply back in 1989, we are not static, monolithic, isolate objects. We are no thing.

Instead, we are dynamic, dialectic, fluid collaborations—living syntheses of what was known in the past and what will be discovered in the future. Maybe “discourse” is the most accurate metaphor for human experience. If so, all we can do is try to engage in life’s conversation as robustly as possible. Maybe there are no rules beyond that.

“Baby,” I said, “I’m a genius but nobody knows it but me.” 
― Charles Bukowski, Factotum