Category Archives: conformity

Read my latest at Splice Today . . .

 

 

Read it here: https://www.splicetoday.com/politics-and-media/the-new-york-times-is-rotting-at-the-seams


Writing the Hard Thing


If I tried to tell you the number of stories and novels I’ve begun writing and not finished, we’d be here too long.  But “not finished” doesn’t mean “discarded.”  It means what it says.

The difficulty comes when I’ve convinced myself that I’m one sort of writer (the consistent, cheerfully productive kind) as opposed to the other, less glamorous (or, at least, less visible) sort—a slave to the vicissitudes of the moon or some shit, the guy with 25 ongoing projects and an inability to stop working on any of them.

I know this about myself.  I tell myself that it’s all part of the bigger creative process.  I imagine all these incomplete pieces fermenting, cross-pollinating, mutating.  Nothing lost.  Everything in motion.  And I take refuge in those ideas and metaphors so I can keep working.  Being a writer, I tell myself a story.  But it might be bullshit self-deceit.

The Romantics smoked opium to get closer to the moon and further from the Victorian head trauma of  “productivity.”  And when my genre writer pals do highly Victorian social media posts that go, “Sigh.  Only 10 pages today,” I wonder whether they’re writing from inspiration or simply turning a lathe in some Dickensian word factory.  Productivity equals commercial success, while moonbeams are their own reward.  Still, I have word count envy no matter what I do. 

The problems of productivity and self-deceit are at the center of trying to write the hard thing.  They are the essential obstacles in making the fiction I came here to make instead of clocking in and lathing out a bunch of words to satisfy something or someone else.  I don’t want to produce that which has been assigned to me by industry, necessity, or convention.  I hate obeying.  But am I achieving anything in my disobedience?  For that matter, is achievement even the point?

When yet another publishing industry blog post comes out sounding like the vehement Alec Baldwin scene from Glengarry Glen Ross, I feel repulsed.  I don’t want to spend time creating a fucking audience platform.  Being an artist is not about “closing.”  Just doing the actual writing takes up all my energy.  I don’t want to frame pieces of my fiction as marketable units.  I want to sit in a moonbeam and make something that arises from my own unique imperatives and disposition.  I want the serendipity of inspiration.  I live for it.  And I resist the overtures of commercialism dedicated to consumption and to bullying artists into seeing themselves as part of a service industry.

Unfortunately, I also can’t avoid wanting the world to read my work and maybe give me some money so I can feed and clothe myself.  It’s terrifying sometimes.  Years ago, at an AWP conference, talking with a publisher after I put out Gravity, my first collection of stories, I felt like Nunez in “The Country of the Blind”—faced with the choice of getting what I loved if I voluntarily blinded myself or seeing clearly and climbing out of the hidden valley forever.  In the end, I chose to keep my eyes.

“If you want to get a second book out using the momentum of your first,” he said, “you need to complete the manuscript in less than a year.  More than that and people forget who you are.  You won’t be able to position it.  You’ll be starting over.”  Six years later, my second book was done.  And he was correct: from the marketing, word factory standpoint, I was starting over.  From a creative-process standpoint, those six years were predicated on the six that came before.  I wasn’t starting over.  I was writing something hard that had emerged from my ongoing creative process, something I couldn’t have written in under a year.

Finishing writing in one’s own time instead of in service to the word factory is difficult.  Discovering one’s limitations as an artist and then transcending them is very difficult.  Putting in the years is difficult.  Doing this up to and beyond age 30 is not only difficult but scary.  Nevertheless, all can be accomplished if one is willing to believe in something greater than the word count.  One says, “It’s all part of my creative process,” and tries to calm down.  One decides not to read (or write) certain self-aggrandizing Facebook posts.

Of course, there might not be a bigger process.  Maybe there is only Random House, Amazon, AWP conference ugliness, building a platform, positioning and branding, and Best American Monotony.  Maybe.  Maybe we exist in a world full of cynical anti-creative money-making ventures, cautious art, and nothing else.  It’s always possible.  The thought of it sometimes keeps me up at night, especially in those blocked periods of worrying and not writing.

It’s like reading about nuclear war or the earth dying from climate change: you have no agency, no option to mitigate the damage, soulless politicians are making horrible decisions, and there is only one way this can end.  Apocalypse.  Tragedy.  No one at the wheel.  Inhuman corporations controlling everything.  And death, ignominious and unnoticed, unless you get with the program and start churning out formulaic units. 

Capitalism wins.  It usually does.  But if there is a bigger process at work in your struggle to be an artist, it can’t have anything to do with metaphors of productivity on a factory timeline.  That is a reality you must not accept.

How does a writer know what’s real?  Is it moonbeam or production line?  Is it both?  Can it be both?  Andy Warhol, Ernest Hemingway, and David Bowie say yes.  For the rest of us, maybe not.  For every Warhol, Hemingway, and Bowie, there are multitudes who weren’t lucky enough to have their unique artistry coincide with commercial demand. 

Hugh Howey likes to write about Wool the way Elon Musk talks about launching a roadster into space: let me tell you about my unique genius and the origin of my success.  But self-publishing fame and running a car company have one thing in common that never gets discussed: they exist because they are timely.  So it is with any highly lucrative creative effort.  And landing at that intersection of work and moment is mostly a matter of luck.  Meanwhile, someone out there is no doubt making Peking opera, but they are unlikely to be buying villas on the Riviera anytime soon.  Nobody cares.  Their units don’t ship.  And yet they also have the favor of the moon.

Writers are especially predisposed to misunderstand what is real—what is objective versus just a moonbeam.  They spend a lot of time deliberately thinking in metaphors, some more useful than others.  And if they’re not paying attention to their minds, they can mistake such metaphors for objective reality (which, incidentally, has nothing to do with capitalist realism).  Over-absorption in a world of imaginative metaphors can become a source of anxiety when the non-make-believe world reaches out and reminds us that we can’t live totally in our imaginations.  Make your Peking opera, sure, but also accept that the six years you put into it mean nothing in terms of branding and positioning.

A writer will see something and begin to imagine things about it—everyone does this, but writers seem to do it with particular intensity—and before long the writer starts to feel like he or she knows it or, even worse, is it.  Then something from the world of physics and money communicates: no, you are not that.  You can’t imagine yourself to fame and fortune if you’re doing original work.  You might get lucky, yes, and I hope you (I hope I) do.  But commerce and true creativity exist in different spaces.

So I look at my 25 open projects with a bit of trepidation as the days go by.  I’m turning 46 this month.  I’ve published a lot of stories in magazines and two books.  These have been hard things.  Are they enough?  Will they ever be enough?

Don’t worry, I tell myself.  There’s a bigger process at work.  There must be.


Maybe being a success-bot isn’t the way after all?


The Introspective Ferret’s Guide to Parties

The quiet introspective ferret feels he has only been to two kinds of parties: those where people assess each other from behind smokescreens of shallow small talk and those where people get as drunk and as high as possible to avoid being aware of such assessment.  Office / department parties tend to be a blend of the two, with clever coworkers staying sober so they can capitalize on the rare opportunity to interrogate / insult the drunkards or make time with someone normally uninterested in them.  This is not misanthropy on the part of our gentle introspective ferret. He has simply learned that he likes individuals way more than groups.

Staying home is nearly always a better choice.  It keeps our ferret from having to dwell on the loathsome behavior that inevitably comes out in people after a few hours of drinking and frustration.  It’s way better not to see it, not to have to recall it, in people the ferret would otherwise prefer to respect.  But if he must attend, our ferret prefers to bring his own non-alcoholic beverages and disappear after about 90 minutes of watching people force smiles and reposition themselves feverishly around a room.  Also, having a palate-cleansing activity lined up, like a movie or some other distracting event, helps an introspective ferret shake off the bad vibes.

No one cares about what a ferret does at a party anyway. No one cares that his drink is non-alcoholic.  In fact, they probably don’t even notice.  And no one really cares that he left after 90 minutes, unless they came to the party on a mission with the poor ferret in mind, in which case he should definitely scamper out with a quickness after no more than an hour and preferably by the back exit.

In the following days, the drama and innuendo about what happened between various drunkards at the party will become known.  But our gentle ferret will be an innocent child of the earth, oblivious and free, a wild polecat in the grass amid the butterflies. For he will be able to tell the simple truth: “I’d already left when X-horrible-thing happened between Bleary Mule and Angry Snake.  So I really have no idea.”  And people will turn their boredom and obsessiveness on someone more entertaining—Squawking Rooster, perhaps.


Surpassing Meritocracy: the Artist’s Way

There are many different paths to greatness, not just the ones most commonly identified by conformist culture.  As long as your basic needs are met, where you put your energy—how you pursue excellence—is completely your business.  Realizing this can be difficult and gradual.

It seems true, even if we admit that discourses (value systems) will always compete with each other for dominance.  And one of the most ruthless and rapacious, at least in the West, is that of “meritocracy.”  A meritocracy is inherently based on an assumed set of cultural values.  But you need to realize that you are free to opt out of those assumed values.  What the masses consider to be good doesn’t have to define your life.  

If you don’t accept meritocratic cultural values, merit-based judgments by those who do are irrelevant.  In other words, it is a mistake to impose the rules of a game on someone who refuses to play; though, because discourses will compete with each other, people will usually try to impose their personal values-discourse on you.  Often, they will do so because they’re not aware of alternatives.  They may not even remember the moment they chose to buy in.  And they may not understand that imposing values on someone else is an act of violence.

Remove the question of merit (and its various implications) and the locus of meaning in life shifts (possibly returns) from an external authority to the individual.  One arrives squarely within Viktor Frankl’s “Will to Meaning”—not seeking meaning / value relative to others, but exploring what is already resonant / resident in the self.  “Thy Will be Done” becomes “My Will be Done,” with all the freedoms and responsibilities arising from that shift.

It makes no difference if your private world is idiosyncratic to the point at which it would seem very strange to more common sensibilities.  As long as you’re not behaving like a hypocrite by harming or otherwise curtailing the autonomy of others, your interiority (including the way you choose to perceive the world outside your self) is completely yours.  And it doesn’t seem outrageous to conclude that this is how it should be.  If you don’t own your thoughts, can you ever own anything else?  In fact, it seems that the more you personalize your unique way of seeing and acting in the world, the stronger and more persuasive that uniqueness becomes. 

Because discourse is grounded in conflict and competition, this self-originating, self-describing narrative you are spinning can have a destabilizing effect on others, who may accuse you of being delusional, a dreamer, someone out of touch with (what the dominant culture considers) reality.  But if it works for you, isn’t it the right thing?  Isn’t that choosing inner freedom instead of pledging fealty to ideas and to a lifestyle that was designed (or emerged) without you particularly in mind?

Walking away from a meritocracy takes a lot of courage and effort.  Because you are a social being, it can involve a certain amount of suffering, alienation, and lonesomeness.  You risk being called a deviant, being labeled as a disaffected undesirable.  Even if you don’t agree with those judgments, they will still hurt.  Hopefully, your growing curiosity about your own sui generis greatness and freedom will mitigate that pain.

You might call this the “inward path,” the “artist’s way,” or “the path beyond the campfire” which leads into dark unmapped places, where all new things wait to be discovered.


Nobody Knows It But Me

Long ago, I was an English teacher at a private high school in central California. It was a good, if demanding, job, and unlike many of my colleagues, I seemed to manage occasional moments of non-misery in the workplace. In fact, the two years I spent working there taught me more about human nature than two decades of university teaching, freelance writing, and working abroad ever did.

Without a doubt, teaching over 100 adolescents each semester schooled me not only in how people will behave when going through some of the worst years of their lives but also in the extent to which many parents are feverishly inadequate when it comes to raising kids. With respect to family, no one wants to admit they have no clue what they’re doing. Everyone must pretend things are running smoothly and they’re in complete control.

I found this pretense interesting, particularly during parent-teacher conferences when ashamed, bewildered parents would whisper, “What do you think I should do?” as if my ability to manage large groups of adolescents somehow qualified me to give them advice. At first, I would explain that my two degrees in English plus minor gifts in speaking in front of people and writing did not mean I had a solution to why Jimmy couldn’t sit still or why Leticia cried through every class and felt compelled to layer everything around her in Purell, or why Leo circulated pictures of his girlfriend’s vagina. Over time, I developed a less draining response: “I do not know.” All Quiet on the Western Front may prepare us to think about the meaning of war, but it will not prepare us for Leo’s girlfriend’s vagina.

I suspected then, as I still do, that confronting such situations is not within the job description of a high school English teacher. But maybe, in the hundreds of outrageous situations in which I found myself in that job, I could have done more. The questions I ask myself now are the questions many parents asked me then: what should I have done? Was there anything to be done at all? There must be an expert somewhere, a veteran administrator or someone with a PhD in education theory, who can speak to this. Maybe a prison psychologist.

I wish I could believe that. In spite of my lingering questions, I think I’ve come to believe the opposite: there actually are no rules—not just for teaching or parenting, but for any area of human experience. A friend once said to me when we were going through our own high school torment: “This is the meaning of life: we all suck and we’re nothing.” I don’t think he fully appreciated how profound that statement was when he said it. 27 years later, I’m still seeing it prove out.

We all suck: no one—and I mean this in the broadest, most inclusive, most general sense—actually knows what they’re doing to the extent that assumptions and judgment calls are unnecessary. Perfect human understanding does not exist and human error is ubiquitous. Even our attempts at artificial intelligence are subject to our limited assumptions about what intelligence actually is (or can be). What can we know beyond a shadow of a doubt? The truth is: nothing, unfortunately.

Surely an engineer will feel confident that, say, as energy is transformed or transferred, an increasing amount of it is wasted. Surely something as dependable and consistent as a physical law (in this case, the Second Law of Thermodynamics) is immutable, absolute, not a matter for interpretation. But even something as ironclad as a law of physics is not without its exceptions. Some things are givens within the parameters of a particular knowledge paradigm, but those givens are always relative to and dependent upon the parameters themselves.

For example, within the agreed-upon bounds of thermodynamic theory, basic laws obtain as a reliable set of rules for the behavior of energy, entropy, and temperature at thermal equilibrium. But we also know that even within that theoretical framework, an empirical finding like the Second Law is subject to exceptions. In 2002, researchers at the Australian National University, in a paper entitled, “Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales,” found that “systems can undergo fleeting energy increases that seem to violate the venerable law.” And while this is only one small example, it is by no means isolated or anomalous to the extent that we could dismiss all such exceptions out of hand.

In fact, our entire narrative of scientific progress is predicated on discoveries which alter existing assumptions about how the world works. As Thomas Kuhn observes in The Structure of Scientific Revolutions, “though the world does not change with a change of paradigm, the scientist afterward works in a different world.” The scientific narrative changes. Because it was always a narrative, never an unassailable, omniscient catalogue.

Nothing is beyond interpretation, not even the bedrock assumptions of our most materialistic sciences. Rather, ways of knowing amount to best possible premises always subject to discourse and development over time (to say nothing of the complexities of the information technology we rely on to document and transmit that discourse). We do the best we can. We develop and codify optimal principles for a given field. And then we work with those principles until we encounter a paradigm-disruptive discovery that forces us to revise our theories.

But we’re nothing: Even the most qualified and intellectually responsible claims are part of a conversation (discourse) which is grounded in work that came before and which will be superseded by discoveries and realizations that follow. In many cases, an individual contribution to any field is no greater than a minuscule inch forward with vastly disproportionate implications.

Still, there are careers to develop and Cessnas to purchase and grants to chase and colleagues to slander and books to write and mistresses to support and students to convince. In Polishing the Mirror, the guru Ram Dass—formerly a social psychology professor named Richard Alpert—describes what he felt was a hollowness at the center of western academia:

In 1961, I was thirty and at the height of my academic career. I had a PhD from Stanford University, and I was a professor of social relations at Harvard. I had arrived at a pinnacle of life as I thought it should be, professionally, socially, and economically. But inside there remained an emptiness—a feeling that, with all I had, something was still missing. Here I was at Harvard, the mecca of the intellect. But when I looked into the eyes of my peers, wondering “Do you know?” I saw in their eyes that what I was looking for was nowhere to be found. In a social or family setting, people looked up to me and hung on my every word because I was a Harvard professor, and they clearly assumed that I knew. But to me, the nature of life remained a mystery.

In Ram Dass’ use of the term, we “do not know” much about the world in any absolute sense. We cannot know because our intellectual tools are as finite as the time we have in which to use them. This is not to argue that we should be content with ignorance. But it is a way to foreground a simple suggestion: speculation is absolutely necessary when it comes to developing knowledge.

Assumptions are necessary. Ultimately, belief is necessary. Kuhn, at least, seems to agree: “Normal science, the activity in which most scientists inevitably spend almost all their time, is predicated on the assumption that the scientific community knows what the world is like.” This seems reasonable not just in science but in any field of human activity.

So what remains to be done if we can never fully know our world and ourselves? Everything! Our inability to attain perfect understanding is no reason to abandon the pursuit of outer and inner knowledge. Rather, it leads us to an acceptance of our limitations as individuals and as a species and, in that acceptance, a very personal yet very real sense of freedom.

Maybe the right answer to those parents who wanted advice should have been: you already know how to raise your kids because what you think is best will be the best you can possibly do. Maybe, as my high school friend seemed to imply back in 1989, we are not static, monolithic, isolate objects. We are no thing.

Instead, we are dynamic, dialectic, fluid collaborations—living syntheses of what was known in the past and what will be discovered in the future. Maybe “discourse” is the most accurate metaphor for human experience. If so, all we can do is try to engage in life’s conversation as robustly as possible. Maybe there are no rules beyond that.

“Baby,” I said, “I’m a genius but nobody knows it but me.” 
― Charles Bukowski, Factotum


Happiness is a Warm Gun

She was my first real girlfriend and she terrified me. More precisely, the possibility of her getting bored with me terrified me. And she was always at great pains to remind me that boredom would have dire consequences. Boredom was the end.

I had no idea who she really was and neither did she; though, I had developed some ideas over the months we’d been dating. I’d projected and imagined. Meanwhile, she thought I was Ross from Friends. She’d say it all the time: “You’re just like Ross. Oh my god.” Sometimes, she’d say, “You’re just like my brother,” but, mostly, it was Ross.

Between her brother and his collection of hot mustards and Ross, there was a very narrow margin for keeping Christina entertained. Still, I tried like a motherfucker. I was 18 years old and fully believed she was The One.

She broke up with me while we were sitting in a theater before a movie. I don’t remember what movie it was, but I do recall that it was a Sunday afternoon and it was extremely hot outside. She told me she needed to feel safe and protected. And she didn’t think I could do that. She needed someone who would fight for her. We stared at the blank screen and listened to the Muzak. Then she said she hoped we could stay friends.

For the next two hours, Christina rested her head on my shoulder while I contemplated the infinite tunnel of grief and abandonment I had just entered. I suspected the real reason she was breaking up with me was that she’d gotten bored. And I hated myself for letting it happen when I’d known ahead of time THAT WAS THE ONE THING THAT COULD NOT BE TOLERATED. 25 years later, I still remember her saying, “You don’t even own a gun.”

To my credit, I did not run to the corner gun shop and buy a nine. I’ve always felt that having a gun around would make it too easy for me to permanently check out. But I thought about what she said for a long time and eventually the reality of who Christina was blossomed in my mind. She was, for all intents and purposes, an idiot. But then, as the wise man once said, stupid is as stupid does. Her point about me not owning a gun would prove to be the set-up for a 25-year-long joke, the punchline of which was delivered only last week . . .

 

My 9x19mm Parabellum Romance

At the end of Games People Play, the psychiatrist Eric Berne, founder of transactional analysis, suggests, not unlike Orwell in 1984, that most people are concerned primarily with reproducing and distracting themselves while they wait for death. However, Berne notes that exceptional individuals can transcend the default monotony of an unexamined life:

The somber picture presented in Parts I and II of this book, in which human life is mainly a process of filling in time until the arrival of death, or Santa Claus, with very little choice, if any, of what kind of business one is going to transact during the long wait, is a commonplace but not the final answer. For certain fortunate people there is something which transcends all classifications of behavior, and that is awareness; something which rises above the programming of the past, and that is spontaneity; and something that is more rewarding than games, and that is intimacy. But all three of these may be frightening and even perilous to the unprepared. Perhaps they are better off as they are, seeking their solutions in popular techniques of social action, such as “togetherness.” This may mean that there is no hope for the human race, but there is hope for individual members of it. (81)

I’ve entertained this idea for a long time. In fact, it has bothered me enough over the years that I’ve structured a large part of my life trying to be an individual who isn’t caught up in mindless self-distraction and reactivity. But sometimes I feel incredibly afraid of this binary. I’m never sure if I’ve succeeded in avoiding groupthink or if success is even possible.

It’s not Berne’s “fright of the unprepared”; rather, I think I get afraid because I worry that the binary itself is oversimple and therefore misleading and impossible. Can I honestly believe that individuality is directly and cleanly opposed to conformity and that these are the available options in my life? I sometimes wonder whether I’ve fallen prey to false assumptions in this respect. Frankly, the thought leaves me cold.

For example, I think, I’m going to avoid going to that melodramatic tear-jerking romance. How boring. Looks like Titanic in space. Having seen over a hundred similar formula-bound period pieces, I feel on some level that I must be right. But then I have to wonder how many people out there are thinking the exact same thing. How many people, by selecting out of the Mindy Project-Titanic-Atonement-A Walk in the Clouds-Bridget Jones demographic, have opted into the anti-Mindy Project-Titanic-Atonement-A Walk in the Clouds-Bridget Jones demographic? How many of these “non-conformists” are straight men within a certain age group? How many of their life choices could be predicted based on the Facebook groups they follow and don’t follow? Should I then run out to buy a TEC-9 to impress my empty-headed girlfriend? What sort of man would make her feel safe and protected? Who decides for me if I don’t? Who decided for her?

 

Your Brain in its Vat will be Tangy and Scrumptious

There appears to be a marketing angle for everyone, now that someone has written a mass-market manual entitled The Art of Non-Conformity. But isn’t this about more than just marketing and consumerism? Isn’t it really about whether we’re sleepwalking through our lives? How does one lead, in Berne’s terms, an aware, spontaneous, and intimate life? Do any of us know what these things mean such that we could define this sort of life in a sentence? Or is such an existence necessarily so idiosyncratic that it resists generalization?

Orwell wrote that the proles “went to work at twelve, they passed through a brief blossoming period of beauty and sexual desire, they married at twenty, they were middle-aged at thirty, they died, for the most part, at sixty. Heavy physical work, the care of home and children, petty quarrels with neighbors, films, football, beer, and, above all, gambling filled up the horizon of their minds.” That sounds a lot like most people. But it’s certainly not the picture of an aware, spontaneous, and intimate life—is it? How can we tell?

If, like Orwell, we assume that there is a persistent objective universe that could be mediated at all times by the political ideology of a state apparatus (or a Cartesian brain-in-a-vat AI), then we can’t be individuals unless we assure ourselves that we know and can act meaningfully on what is objective. We can’t lead authentic lives unless we have both knowledge and agency grounded in an objectively persistent material world.

Moreover, as philosophically complex as this becomes, I have to wonder whether it is even possible to determine whether an objectively persistent material world even exists—back to Descartes and the problem of the Cogito. In the end, there are no definitive answers and it would be naive to expect them. There is only the eternal interplay between narratives and questions. What do we prefer to believe?

 

My Killing Joke

The punchline came when a mutual friend of ours tagged her on Facebook and the hand of the past reached out to tweak my nose. There she was in the same town, SUV, kids, cop husband. Over the last 25 years, she got exactly what she wanted. I avoided all that, which is exactly what I wanted and I still don’t own a gun. Have either of us changed? Or are we both still playing the hands we’ve been dealt, telling ourselves we’re special and authentic and clever? There’s no way to know. We can believe what we like.

In the end, it seems, the joke is on us. No matter who we decide we are—no matter if I’m actually Ross or Christina’s actually a rocket scientist—there’s no central authority to tell us what’s objectively real and what isn’t. This is not Oceania and there’s no dominant newspeak (yet). Instead, we’re forced to carry the burden of meaning as individuals: “Ultimately, man should not ask what the meaning of his life is, but rather he must recognize that it is he who is asked. In a word, each man is questioned by life; and he can only answer to life by answering for his own life; to life he can only respond by being responsible” (Frankl 131).

The most responsible thing might be to keep spinning narratives and keep asking questions. If nothing else, we can be consistent in that.

 

 

Works Referenced:

Berne, Eric. Games People Play: The Psychology of Human Relationships. Ballantine, 1996.

Frankl, Viktor. Man’s Search for Meaning. Beacon P, 2006.

Guillebeau, Chris. The Art of Non-Conformity: Set Your Own Rules, Live the Life You Want, and Change the World. TarcherPerigee, 2010.

Hickey, Lance. “The Brain in a Vat Argument.” Internet Encyclopedia of Philosophy, http://www.iep.utm.edu/brainvat/. Accessed 26 September 2016.

Orwell, George. 1984. U of Adelaide, Feb. 2016, https://ebooks.adelaide.edu.au/o/orwell/george/o79n/index.html. Accessed 26 September 2016.

Skirry, Justin. “René Descartes (1596—1650).” Internet Encyclopedia of Philosophy, http://www.iep.utm.edu/descarte/#SH4a. Accessed 26 September 2016.