
[Edgelord:] Even from its earliest uses, the word carries the connotation of eye-rolling skepticism.  The edge in edgelord comes from expressions like cutting edge or the idea of being edgy, applying a sense of boldness or unconventionality to such behavior; the lord half elevates such a person ironically with the rank of a deity or member of British nobility, with echoes of Voldemort, Sauron, and other dark-spirited, villainous characters who hold that title. — “Doing the Work of the Edgelord”

Lately, on political news blogs, Twitter, and Facebook, we’ve been seeing a lot of summary dismissals of arguments, particularly those that are racially or pandemically charged.  This might suggest people are more stressed out than ever.  One rarely sees argumentative moves like this when times are calm, even in the divisive cesspools of social media and in the freewheeling comment sections still permitted by news sites.

Only when people begin to crack under sweeping emotional strain do they start to become rhetorically evasive and nihilistic.  They want to appear as though they’re open to reasoned discussion and debate, but really they want to close down the conversation and talk about their cats.  In a sense, I don’t blame them.  We’re in a very emotionally difficult moment right now.  And no one wants to admit to having an exploding head.  

We might classify this particular evasion as a form of “rhetorical edgelordism”—an attempt to disingenuously self-protect by dismissing an argument while also trying to seem like the smartest, most incisive person in the room. 

If someone says, “It could be A or it could be B,” the edgelord adds, “No, A and B are a false choice because C,” which invalidates them, ostensibly ending the discussion.  Usually the person bringing C is upset at having to choose between A and B and wishes to redefine the choice as (A vs B) vs C—where C is much less controversial, threatening, or applicable.

C is usually something exotic. In order to function as a blanket dismissal, C can’t use the ideas from A or B (because then it falls into the scope of the original discussion).  It has to be from a distant discipline or sphere, so far outside the purview of A or B that the core argument gets derailed.

Here’s an example: “COVID-19 originated in fruit bats” (A) vs. “It was bio-evolved in a Chinese lab” (B). Then (C) pops up: “Actually, statistics have shown social attitudes to pandemics track according to political party affiliation, if you want to talk relevance when it comes to the virus.”  Ironically, C itself is immensely and obviously irrelevant to what’s being talked about.  But unless it is instantly ignored by everyone, its work is done.

People who see this move might point out the scope creep.  But by then the thrust of the original discussion has already fractured.  In our example, we’re now talking about at least three issues: (1) the bat theory vs the lab theory, (2) the new political party theory, and (3) whether the new political party theory matters or is an irrelevant digression.  Now it’s much easier for the edgelord to divert the argument, self-soothe, and still pose as the edgy freethinker not caught up in the preoccupations of A vs B conformist thinking.  At this point, we’re about three or four rhetorical steps away from looking at a jpg of his cat, Waffles.

In healthy discussions (with psychologically healthy people), this is sometimes called “reframing the issue,” and it’s a perfectly legitimate way of clarifying a subject under consideration—when it focuses on getting at a deeper point significant to A and B.  In the example, this might be something like, “The issue of whether the virus originated in fruit bats or in a lab actually raises the deeper question of whether determining the origin will matter to developing a vaccine.”  Here, the reframe is aiming at a link between both A and B and trying to enhance and clarify the discussion by pointing that link out.  The test is relevance: A and B are both compelling because they bear on how we understand, and therefore can control, the global outbreak.  But when reframing is done as a way to distract and dismiss by bringing in an extraneous consideration, there are usually disingenuous motives at work.

People who didn’t live through the online evolution of bulletin boards, newsgroups, and discussion forums (all of which disappeared eventually into the reeking maw of social media), might not recognize this tactic as a largely online way of posturing and pseudo-arguing.  Like most rhetorical strategies born in the disinhibited, critical-thinking-starved world of the internet, it’s largely an empty, counterproductive tactic, an emotional time and energy sink best avoided.

Still, during a lockdown, when we’re spending more of our lives online as opposed to in person, pointing these things out might be worthwhile.  They’re no longer the sole province of trolls, basement dwellers, loudmouths, and fakes.  As we move toward the 2020 US Presidential election, social tensions flare, and the virus dances in the streets, stress levels are likely to soar.  And, in cases where public discourse is critical, we might even see close friends and family posing as the edgelord in the room while surreptitiously looking for the exit.


Early rendition of Alfred E. Neuman, 1908.

Today, I wonder whether I should re-think some of my ultra-liberal biases and attendant leftist news consumption.  This is good.  But, man, I’m beat.

The alt-right (and the radical religious right) to me seems like a uniquely American expression of deep stupidity but, of course, I would say that. Look at my demographic: college educated, Democrat, fiction writer, from Southern California, who’s been an expat for almost a decade. Of course, I think Trump is the worst thing that could have possibly happened to the world. Of course, I wanted Bernie but voted Hillary. Of course, I want net neutrality. Of course, I support many (but not all) positions taken by the ACLU. Of course, I believe that, in an earlier era, Obama would have been considered a moderate Republican. Of course, I have a problem with drones, civilian casualties, the terrific scope creep of the Patriot Act, and the “war on drugs.” Of course, I care about my country.

If I didn’t think the Green Party was run by bumblers, I would probably join. I’m pro-choice, pro-Planned Parenthood, and I support gay marriage. I think many of these things should not even have to be controversial in a liberal democracy. I dream of a day when there will be universal healthcare and free college tuition. I think climate change is one of the most, if not the most, serious issues we face today. But the truth is that most of these biases and beliefs can be (and are) predicted by an algorithm. The even sadder truth is that I only have so much energy I can devote to fact checking and being outraged. This is a problem. Tiredness is a problem.

The problem is not that there is a right answer we have to find. The problem is that uncertainty and complexity are exhausting over time, especially when you’re necessarily engaged in other things. Most Americans are not, actually, stupid. They’re invested in certain areas, mostly job and family, and in most other respects have a general (superficial) understanding of the world, including political issues, and some facility in identifying yellow journalism, confirmation bias, and what passes for fear-mongering clickbait. I have also seen this in European and Asian countries, relative to various cultural differences and levels of education. The USA doesn’t own “stupid.” Every country with a powerful media has a horse as a proconsul somewhere. The difference is that the States likes to put its toga-wearing horses on display, whereas other countries do not. But this is changing.

In an enormous post-industrial society, you will have many levels of political, historical, and economic awareness and many opinions emerging constantly in the news media. You will also have crackpot theories; secessionism; separatism based on race, religion, and / or gender biases; conspiracy paranoia; multi-directional shaming; late night talk show infotainment; social justice fanatics; religious absolutists; new age hucksters; ambulance chasers; a continuous horde of cynics; doom-saying historians looking for their 15 minutes; the resurgence of failed orthodoxies (like Nazism, ethno-nationalism, and whatever Steve Bannon happens to be reading); and the all-encompassing opportunism that feeds off these things. What you won’t have is a simple black-and-white truth. You will have truthiness.

To live in an information society infected with truthiness is extremely taxing. But just as there is no black-and-white truth, there is no easy solution. A friend of mine has suggested “slow news” as opposed to internet news feeds. There seems to be some merit in that. But slow news does not necessarily safeguard against yellow journalism, which has been around since newspapers could fold. In many ways, the 24-hour news cycle and its problematic presence on social media make it harder for governments and corporations to spin interpretations in their favor. We should be grateful for the ineptitude of Sean Spicer and the alacrity with which he and his boss are covered by the press corps.

I don’t have answers. I don’t think there is a single version of what is true—at least not one that can be had through the media. But I also don’t think the cross-eyed chants of “burn it down” and “fuck your feelings” have done any good. They helped Trump get elected president, and he has thus far made a mockery of America. The left understandably wants him gone. The GOP wants him to calm down and let them get on with the kleptocracy. His base supporters are currently upset because he bowed 5 inches to receive an award in Saudi Arabia. Some of his supporters are no doubt upset that the Reich hasn’t yet emerged in all its glory. I suspect they will still be upset when he gets impeached.

“Nothing is an absolute reality; all is permitted” – Hassan-i Sabbah

Long ago, I was an English teacher at a private high school in central California. It was a good, if demanding, job and unlike many of my colleagues, I seemed to manage occasional moments of non-misery in the workplace. In fact, the two years I spent working there taught me more about human nature than two decades of university teaching, freelance writing, and working abroad ever did.

Without a doubt, teaching over 100 adolescents each semester schooled me not only in how people will behave when going through some of the worst years of their lives but also in the extent to which many parents are feverishly inadequate when it comes to raising kids. With respect to family, no one wants to admit they have no clue what they’re doing. Everyone must pretend things are running smoothly and they’re in complete control.

I found this pretense interesting, particularly during parent-teacher conferences when ashamed, bewildered parents would whisper, “What do you think I should do?” as if my ability to manage large groups of adolescents somehow qualified me to give them advice. At first, I would explain that my two degrees in English plus minor gifts in speaking in front of people and writing did not mean I had a solution to why Jimmy couldn’t sit still or why Leticia cried through every class and felt compelled to layer everything around her in Purell, or why Leo circulated pictures of his girlfriend’s vagina. Over time, I developed a less draining response: “I do not know.” All Quiet on the Western Front may prepare us to think about the meaning of war, but it will not prepare us for Leo’s girlfriend’s vagina.

I suspected then, as I still do, that confronting such situations is not within the job description of a high school English teacher. But maybe, in the hundreds of outrageous situations in which I found myself in that job, I could have done more. The questions I ask myself now are the questions many parents asked me then: what should I have done? Was there anything to be done at all? There must be an expert somewhere, a veteran administrator or someone with a PhD in education theory, who can speak to this. Maybe a prison psychologist.

I wish I could believe that. In spite of my lingering questions, I think I’ve come to believe the opposite: there actually are no rules—not just for teaching or parenting, but for any area of human experience. A friend once said to me when we were going through our own high school torment: “This is the meaning of life: we all suck and we’re nothing.” I don’t think he fully appreciated how profound that statement was when he said it. Twenty-seven years later, I’m still seeing it prove out.

We all suck: no one—and I mean this in the broadest, most inclusive, most general sense—actually knows what they’re doing to the extent that assumptions and judgment calls are unnecessary. Perfect human understanding does not exist and human error is ubiquitous. Even our attempts at artificial intelligence are subject to our limited assumptions about what intelligence actually is (or can be). What can we know beyond a shadow of a doubt? The truth is: nothing, unfortunately.

Surely an engineer will feel confident that, say, as energy is transformed or transferred, an increasing amount of it is wasted. Surely something as dependable and consistent as a physical law (in this case, the Second Law of Thermodynamics) is immutable, absolute, not a matter for interpretation. But even something as ironclad as a law of physics is not without its exceptions. Some things are givens within the parameters of a particular knowledge paradigm, but those givens are always relative to and dependent upon the parameters themselves.

For example, within the agreed-upon bounds of thermodynamic theory, basic laws obtain as a reliable set of rules for the behavior of energy, entropy, and temperature at thermal equilibrium. But we also know that even within that theoretical framework, an empirical finding like the Second Law is subject to exceptions. In 2002, researchers at the Australian National University, in a paper entitled “Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales,” found that “systems can undergo fleeting energy increases that seem to violate the venerable law.” And while this is only one small example, it is neither so isolated nor so anomalous that we could dismiss all such exceptions out of hand.

In fact, our entire narrative of scientific progress is predicated on discoveries which alter existing assumptions about how the world works. As Thomas Kuhn observes in The Structure of Scientific Revolutions, “though the world does not change with a change of paradigm, the scientist afterward works in a different world.” The scientific narrative changes. Because it was always a narrative, never an unassailable, omniscient catalogue.

Nothing is beyond interpretation, not even the bedrock assumptions of our most materialistic sciences. Rather, ways of knowing amount to best possible premises always subject to discourse and development over time (to say nothing of the complexities of the information technology we rely on to document and transmit that discourse). We do the best we can. We develop and codify optimal principles for a given field. And then we work with those principles until we encounter a paradigm-disruptive discovery that forces us to revise our theories.

But we’re nothing: Even the most qualified and intellectually responsible claims are part of a conversation (discourse) which is grounded in work that came before and which will be superseded by discoveries and realizations that follow. In many cases, an individual contribution to any field is no greater than a minuscule inch forward with vastly disproportionate implications.

Still, there are careers to develop and Cessnas to purchase and grants to chase and colleagues to slander and books to write and mistresses to support and students to convince. In Polishing the Mirror, the guru Ram Dass—then a social psychology professor named Richard Alpert—describes what he felt was a hollowness at the center of western academia:

In 1961, I was thirty and at the height of my academic career. I had a PhD from Stanford University, and I was a professor of social relations at Harvard. I had arrived at a pinnacle of life as I thought it should be, professionally, socially, and economically. But inside there remained an emptiness—a feeling that, with all I had, something was still missing. Here I was at Harvard, the mecca of the intellect. But when I looked into the eyes of my peers, wondering “Do you know?” I saw in their eyes that what I was looking for was nowhere to be found. In a social or family setting, people looked up to me and hung on my every word because I was a Harvard professor, and they clearly assumed that I knew. But to me, the nature of life remained a mystery.

In Ram Dass’ use of the term, we “do not know” much about the world in any absolute sense. We cannot know because our intellectual tools are as finite as the time we have in which to use them. This is not to argue that we should be content with ignorance. But it is a way to foreground a simple suggestion: speculation is absolutely necessary when it comes to developing knowledge.

Assumptions are necessary. Ultimately, belief is necessary. Kuhn, at least, seems to agree: “Normal science, the activity in which most scientists inevitably spend almost all their time, is predicated on the assumption that the scientific community knows what the world is like.” This seems reasonable not just in science but in any field of human activity.

So what remains to be done if we can never fully know our world and ourselves? Everything! Our inability to attain perfect understanding is no reason to abandon the pursuit of outer and inner knowledge. Rather, it leads us to an acceptance of our limitations as individuals and as a species and, in that acceptance, a very personal yet very real sense of freedom.

Maybe the right answer to those parents who wanted advice should have been: you already know how to raise your kids because what you think is best will be the best you can possibly do. Maybe, as my high school friend seemed to imply back in 1989, we are not static, monolithic, isolate objects. We are no thing.

Instead, we are dynamic, dialectic, fluid collaborations—living syntheses of what was known in the past and what will be discovered in the future. Maybe “discourse” is the most accurate metaphor for human experience. If so, all we can do is try to engage in life’s conversation as robustly as possible. Maybe there are no rules beyond that.

“Baby,” I said, “I’m a genius but nobody knows it but me.” 
― Charles Bukowski, Factotum

Welcome . . .

I write fiction and nonfiction for magazines, work as a freelance writer / editor / journalist, and teach composition and fiction writing.

This blog is mostly dedicated to writing about politics and media, travel essays, creative non-fiction, discussions about books, the MFA experience, publishing, and work I’ve already placed in magazines. But I might write anything.

“To educate is to seek meaning in everything. It is to teach others to seek the meaning of things. This means mixing the dreams of children and young people with the experience of adults and the elderly. This exchange must always take place, or else there can be no humanity because there would be no roots, no history, no promise, no growth, and no prophecy.”

— Pope Francis, 5 June 20

If I don’t believe what you believe, that’s okay.  I don’t have to.  I will defend your right to hold, express, and live by your own belief system, but you have no right to impose any of it on me.

— Helen Pluckrose

“Either you repeat the same conventional doctrines everybody is saying, or else you say something true, and it will sound like it’s from Neptune.”

― Noam Chomsky

“Perhaps if you know you are insane then you are not insane. Or you are becoming sane, finally. Waking up.”

― Philip K. Dick, The Man in the High Castle


“I write it myself, edit it myself, censor it myself, publish it myself, distribute it myself, and spend time in prison for it myself.”

— Vladimir Bukovsky

“If you’re going to try, go all the way. Otherwise, don’t even start. This could mean losing girlfriends, wives, relatives and maybe even your mind. It could mean not eating for three or four days. It could mean freezing on a park bench. It could mean jail. It could mean derision. It could mean mockery—isolation. Isolation is the gift. All the others are a test of your endurance, of how much you really want to do it. And, you’ll do it, despite rejection and the worst odds. And it will be better than anything else you can imagine. If you’re going to try, go all the way. There is no other feeling like that. You will be alone with the gods, and the nights will flame with fire. You will ride life straight to perfect laughter. It’s the only good fight there is.”

― Charles Bukowski, Factotum