Rhetorical Edgelordism and the Summary Dismissal

[Edgelord:] Even from its earliest uses, the word carries the connotation of eye-rolling skepticism.  The edge in edgelord comes from expressions like cutting edge or the idea of being edgy, applying a sense of boldness or unconventionality to such behavior; the lord half elevates such a person ironically with the rank of a deity or member of British nobility, with echoes of Voldemort, Sauron, and other dark-spirited, villainous characters who hold that title. — “Doing the Work of the Edgelord,” Merriam-Webster.com

Lately, on political news blogs, Twitter, and Facebook, we’ve been seeing a lot of summary dismissals of arguments, particularly those that are racially or pandemically charged.  This might suggest people are more stressed out than ever.  One rarely sees argumentative moves like this when times are calm, even in the divisive cesspools of social media and in the freewheeling comment sections still permitted by news sites.

Only when people begin to crack under sweeping emotional strain do they start to become rhetorically evasive and nihilistic.  They want to appear as though they’re open to reasoned discussion and debate, but really they want to close down the conversation and talk about their cats.  In a sense, I don’t blame them.  We’re in a very emotionally difficult moment right now.  And no one wants to admit to having an exploding head.  

We might classify this particular evasion as a form of “rhetorical edgelordism”—an attempt to disingenuously self-protect by dismissing an argument while also trying to seem like the smartest, most incisive person in the room. 

If someone says, “It could be A or it could be B,” the edgelord adds, “No, A and B are a false choice because C,” which invalidates both and ostensibly ends the discussion.  Usually the person bringing C is upset at having to choose between A and B and wishes to redefine the choice as (A vs B) vs C—where C is much less controversial, threatening, or applicable.

C is usually something exotic. In order to function as a blanket dismissal, C can’t use the ideas from A or B (because then it falls within the scope of the original discussion).  It has to be from a distant discipline or sphere, so far outside the purview of A or B that the core argument gets derailed.

Here’s an example: “COVID-19 originated in fruit bats” (A) vs. “It was bio-evolved in a Chinese lab” (B). Then (C) pops up: “Actually, statistics have shown social attitudes to pandemics track according to political party affiliation, if you want to talk relevance when it comes to the virus.”  Ironically, C itself is immensely and obviously irrelevant to what’s being talked about.  But unless it is instantly ignored by everyone, its work is done.

People who see this move might point out the scope creep.  But by then the thrust of the original discussion has already fractured.  In our example, we’re now talking about at least three issues: (1) the bat theory vs the lab theory, (2) the new political party theory, and (3) whether the new political party theory matters or is an irrelevant digression.  Now it’s much easier for the edgelord to divert the argument, self-soothe, and still pose as the edgy freethinker not caught up in the preoccupations of A vs B conformist thinking.  At this point, we’re about three or four rhetorical steps away from looking at a jpg of his cat, Waffles.

In healthy discussions (with psychologically healthy people), this is sometimes called “reframing the issue,” and it’s a perfectly legitimate way of clarifying a subject under consideration—when it focuses on getting at a deeper point significant to A and B.  In the example, this might be something like, “The issue of whether the virus originated in fruit bats or in a lab actually raises the deeper question of whether determining the origin will matter to developing a vaccine.”  Here, the reframe is aiming at a link between A and B and trying to enhance and clarify the discussion by pointing that link out.  The test is relevance: A and B are both compelling because they bear on how we understand, and therefore might control, the global outbreak.  But when reframing is done as a way to distract and dismiss by bringing in an extraneous consideration, there are usually disingenuous motives at work.

People who didn’t live through the online evolution of bulletin boards, newsgroups, and discussion forums (all of which eventually disappeared into the reeking maw of social media) might not recognize this tactic as a largely online way of posturing and pseudo-arguing.  Like most rhetorical strategies born in the disinhibited, critical-thinking-starved world of the internet, it’s an empty, counterproductive tactic, an emotional time and energy sink best avoided.

Still, during a lockdown, when we’re spending more of our lives online as opposed to in person, pointing these things out might be worthwhile.  These tactics are no longer the sole province of trolls, basement dwellers, loudmouths, and fakes.  As we move toward the 2020 US Presidential election, social tensions flare, and the virus dances in the streets, stress levels are likely to soar.  And, in cases where public discourse is critical, we might even see close friends and family posing as the edgelord in the room while surreptitiously looking for the exit.

Surpassing Meritocracy: The Artist’s Way

There are many different paths to greatness, not just the ones most commonly identified by conformist culture.  As long as your basic needs are met, where you put your energy—how you pursue excellence—is completely your business.  Realizing this can be difficult and gradual.

It seems true, even if we admit that discourses (value systems) will always compete with each other for dominance.  And one of the most ruthless and rapacious, at least in the West, is that of “meritocracy.”  A meritocracy is inherently based on an assumed set of cultural values.  But you need to realize that you are free to opt out of those assumed values.  What the masses consider to be good doesn’t have to define your life.  

If you don’t accept meritocratic cultural values, merit-based judgments by those who do are irrelevant.  In other words, it is a mistake to impose the rules of a game on someone who refuses to play; though, because discourses will compete with each other, people will usually try to impose their personal values-discourse on you.  Often, they will do so because they’re not aware of alternatives.  They may not even remember the moment they chose to buy in.  And they may not understand that imposing values on someone else is an act of violence.

Remove the question of merit (and its various implications) and the locus of meaning in life shifts (possibly returns) from an external authority to the individual.  One arrives squarely within Viktor Frankl’s “Will to Meaning”—not seeking meaning / value relative to others, but exploring what is already resonant / resident in the self.  “Thy Will be Done” becomes “My Will be Done,” with all the freedoms and responsibilities arising from that shift.

It makes no difference if your private world is idiosyncratic to the point at which it would seem very strange to more common sensibilities.  As long as you’re not behaving like a hypocrite by harming or otherwise curtailing the autonomy of others, your interiority (including the way you choose to perceive the world outside your self) is completely yours.  And it doesn’t seem outrageous to conclude that this is how it should be.  If you don’t own your thoughts, can you ever own anything else?  In fact, it seems that the more you personalize your unique way of seeing and acting in the world, the stronger and more persuasive that uniqueness becomes. 

Because discourse is grounded in conflict and competition, this self-originating, self-describing narrative you are spinning can have a destabilizing effect on others, who may accuse you of being delusional, a dreamer, someone out of touch with (what the dominant culture considers) reality.  But if it works for you, isn’t it the right thing?  Isn’t that choosing inner freedom instead of pledging fealty to ideas and to a lifestyle that was designed (or emerged) without you particularly in mind?

Walking away from a meritocracy takes a lot of courage and effort.  Because you are a social being, it can involve a certain amount of suffering, alienation, and lonesomeness.  You risk being called a deviant, being labeled as a disaffected undesirable.  Even if you don’t agree with those judgments, they will still hurt.  Hopefully, your growing curiosity about your own sui generis greatness and freedom will mitigate that pain.

You might call this the “inward path,” the “artist’s way,” or “the path beyond the campfire” which leads into dark unmapped places, where all new things wait to be discovered.

Nobody Knows It But Me

Long ago, I was an English teacher at a private high school in central California. It was a good, if demanding, job and unlike many of my colleagues, I seemed to manage occasional moments of non-misery in the workplace. In fact, the two years I spent working there taught me more about human nature than two decades of university teaching, freelance writing, and working abroad ever did.

Without a doubt, teaching over 100 adolescents each semester schooled me not only in how people will behave when going through some of the worst years of their lives but also in the extent to which many parents are feverishly inadequate when it comes to raising kids. With respect to family, no one wants to admit they have no clue what they’re doing. Everyone must pretend things are running smoothly and they’re in complete control.

I found this pretense interesting, particularly during parent-teacher conferences when ashamed, bewildered parents would whisper, “What do you think I should do?” as if my ability to manage large groups of adolescents somehow qualified me to give them advice. At first, I would explain that my two degrees in English plus minor gifts in speaking in front of people and writing did not mean I had a solution to why Jimmy couldn’t sit still or why Leticia cried through every class and felt compelled to layer everything around her in Purell, or why Leo circulated pictures of his girlfriend’s vagina. Over time, I developed a less draining response: “I do not know.” All Quiet on the Western Front may prepare us to think about the meaning of war, but it will not prepare us for Leo’s girlfriend’s vagina.

I suspected then, as I still do, that confronting such situations is not within the job description of a high school English teacher. But maybe, in the hundreds of outrageous situations in which I found myself in that job, I could have done more. The questions I ask myself now are the questions many parents asked me then: what should I have done? Was there anything to be done at all? There must be an expert somewhere, a veteran administrator or someone with a PhD in education theory, who can speak to this. Maybe a prison psychologist.

I wish I could believe that. In spite of my lingering questions, I think I’ve come to believe the opposite: there actually are no rules—not just for teaching or parenting, but for any area of human experience. A friend once said to me when we were going through our own high school torment: “This is the meaning of life: we all suck and we’re nothing.” I don’t think he fully appreciated how profound that statement was when he said it. 27 years later, I’m still seeing it prove out.

We all suck: no one—and I mean this in the broadest, most inclusive, most general sense—actually knows what they’re doing to the extent that assumptions and judgment calls are unnecessary. Perfect human understanding does not exist and human error is ubiquitous. Even our attempts at artificial intelligence are subject to our limited assumptions about what intelligence actually is (or can be). What can we know beyond a shadow of a doubt? The truth is: nothing, unfortunately.

Surely an engineer will feel confident that, say, as energy is transformed or transferred, an increasing amount of it is wasted. Surely something as dependable and consistent as a physical law (in this case, the Second Law of Thermodynamics) is immutable, absolute, not a matter for interpretation. But even something as ironclad as a law of physics is not without its exceptions. Some things are givens within the parameters of a particular knowledge paradigm, but those givens are always relative to and dependent upon the parameters themselves.

For example, within the agreed-upon bounds of thermodynamic theory, basic laws obtain as a reliable set of rules for the behavior of energy, entropy, and temperature at thermal equilibrium. But we also know that even within that theoretical framework, an empirical finding like the Second Law is subject to exceptions. In 2002, researchers at the Australian National University, in a paper entitled “Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales,” found that “systems can undergo fleeting energy increases that seem to violate the venerable law.” And while this is only one small example, it is not so isolated or anomalous that we could dismiss all such exceptions out of hand.
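To make the contrast concrete, here is a minimal sketch in standard textbook notation (the notation is mine, not drawn from the cited paper). The macroscopic law says the total entropy of an isolated system never decreases:

\[ \Delta S \geq 0 \]

The fluctuation theorem tested in that experiment says that, for small systems watched over short intervals, entropy-consuming trajectories are not impossible, only exponentially unlikely:

\[ \frac{P(\Sigma_t = A)}{P(\Sigma_t = -A)} = e^{At} \]

where \(\Sigma_t\) is the entropy production averaged over an observation time \(t\). The smaller the system and the shorter the time, the more often the “forbidden” fluctuations show up, which is consistent with what the ANU group reported.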

In fact, our entire narrative of scientific progress is predicated on discoveries which alter existing assumptions about how the world works. As Thomas Kuhn observes in The Structure of Scientific Revolutions, “though the world does not change with a change of paradigm, the scientist afterward works in a different world.” The scientific narrative changes. Because it was always a narrative, never an unassailable, omniscient catalogue.

Nothing is beyond interpretation, not even the bedrock assumptions of our most materialistic sciences. Rather, ways of knowing amount to best possible premises always subject to discourse and development over time (to say nothing of the complexities of the information technology we rely on to document and transmit that discourse). We do the best we can. We develop and codify optimal principles for a given field. And then we work with those principles until we encounter a paradigm-disruptive discovery that forces us to revise our theories.

But we’re nothing: Even the most qualified and intellectually responsible claims are part of a conversation (discourse) which is grounded in work that came before and which will be superseded by discoveries and realizations that follow. In many cases, an individual contribution to any field is no greater than a minuscule inch forward with vastly disproportionate implications.

Still, there are careers to develop and Cessnas to purchase and grants to chase and colleagues to slander and books to write and mistresses to support and students to convince. In Polishing the Mirror, the guru Ram Dass—then a social psychology professor named Richard Alpert—describes what he felt was a hollowness at the center of western academia:

In 1961, I was thirty and at the height of my academic career. I had a PhD from Stanford University, and I was a professor of social relations at Harvard. I had arrived at a pinnacle of life as I thought it should be, professionally, socially, and economically. But inside there remained an emptiness—a feeling that, with all I had, something was still missing. Here I was at Harvard, the mecca of the intellect. But when I looked into the eyes of my peers, wondering “Do you know?” I saw in their eyes that what I was looking for was nowhere to be found. In a social or family setting, people looked up to me and hung on my every word because I was a Harvard professor, and they clearly assumed that I knew. But to me, the nature of life remained a mystery.

In Ram Dass’ use of the term, we “do not know” much about the world in any absolute sense. We cannot know because our intellectual tools are as finite as the time we have in which to use them. This is not to argue that we should be content with ignorance. But it is a way to foreground a simple suggestion: speculation is absolutely necessary when it comes to developing knowledge.

Assumptions are necessary. Ultimately, belief is necessary. Kuhn, at least, seems to agree: “Normal science, the activity in which most scientists inevitably spend almost all their time, is predicated on the assumption that the scientific community knows what the world is like.” This seems reasonable not just in science but in any field of human activity.

So what remains to be done if we can never fully know our world and ourselves? Everything! Our inability to attain perfect understanding is no reason to abandon the pursuit of outer and inner knowledge. Rather, it leads us to an acceptance of our limitations as individuals and as a species and, in that acceptance, a very personal yet very real sense of freedom.

Maybe the right answer to those parents who wanted advice should have been: you already know how to raise your kids because what you think is best will be the best you can possibly do. Maybe, as my high school friend seemed to imply back in 1989, we are not static, monolithic, isolate objects. We are no thing.

Instead, we are dynamic, dialectic, fluid collaborations—living syntheses of what was known in the past and what will be discovered in the future. Maybe “discourse” is the most accurate metaphor for human experience. If so, all we can do is try to engage in life’s conversation as robustly as possible. Maybe there are no rules beyond that.

“Baby,” I said, “I’m a genius but nobody knows it but me.” 
― Charles Bukowski, Factotum