“What would you say… you do here?”
—Bob, Office Space
I would like to begin by presenting, without any additional commentary, three anecdotes that hopefully can serve to illustrate both what I plan to argue against, and why you may detect an undercurrent of frustration in the ensuing 4,000 words.
It has been alternately my pleasure and my misfortune to have taken a few classes on Canadian literature. One of these, memorably entitled “Canadian Literature in Transnational Times,” skewed toward the latter. Among several other works, our professor had selected Wayson Choy’s All That Matters for study. The novel, a quiet investigation of the life of a young Chinese immigrant and his family in the 1930s and 40s, explores issues of race, cultural clashes, the historical injustices enacted upon nonwhite Canadians, and probably much more.1 Imagine my surprise, then, when my professor began one of her lectures by saying “I’d like to explore the possibility of a queer reading,” and proceeded to argue that the book’s protagonist is in fact gay, unbeknownst to his family, his supposed love interest, himself, and the reader. Her reasoning for this was simple: since homosexuality was so strongly detested in this time period, one would expect only tiny, barely extant clues as to its existence in any one character, or perhaps even none at all. This, she claimed, is exactly what we find in All That Matters. The protagonist never comes out, or expresses any outright desire for another man, but there is one point, when his best friend is about to leave for World War II, where it seems like our protagonist is trying to say something, but can’t, instead trailing off in an ellipsis. This, my professor contended, is perfectly consistent with the character being gay. In her words, “you can fit a lot inside an ellipsis.”
Run Lola Run is a German film from 1998 that tells the story of Lola, who must somehow get 100,000 Deutsche Mark within twenty minutes in order to save her boyfriend. The movie is elliptical: Lola’s attempt to get the money plays out three times, with slightly different results each time, both for her and for the people she meets. As you might expect, this brings up themes of chance vs. fate, the ethics of breaking the law to save someone, etc., and the movie is far from subtle about this. And yet, when my film studies class met to discuss it, we spent about half of our sixty-minute running time talking about an element that one of my classmates found very important: a small, out-of-focus sign at the edge of the frame, visible for about a second, that indicates the entrance to the subway. This was meant to be important, according to my fellow student, because the sign is blue, in contrast to Lola’s red hair, and thus this sign exists as an antithesis to Lola’s character,2 acting as a symbol of escape, even as the camera relegates this symbol to the background to show Lola’s inability to leave her difficult situation.
About midway through a booksmash3 in my YA literature class, one of the competitors recommended Me and Earl and the Dying Girl, a book about a boy who befriends a girl with terminal cancer in the last months of her life, and through the inspirational power of her painful death grows as a person. “But don’t bother with the movie,” she said, “it changes his character so much. He’s a bit of an asshole in the movie.” I silently agreed with the latter point, even if I think that filmmakers can kind of do what they want with an adaptation. The girl in front of me, however, felt the need to raise her hand (odd, considering the speaker was being timed and had made no indication that she would be fielding questions) and then to just start speaking without being called on (even odder, for the same reasons), asking the speaker “but did you notice his autism?” Whether out of extreme politeness or surprise at the incongruity of such a sentence, the woman stopped dead in her time-sensitive tracks and tried to parse the sentence one more time before just muttering “what?”
“He’s on the spectrum. It never comes out and says it or anything, but if you look at how he acts, how he’s kind of closed off and keeps people at arm’s length, and how he really closely analyzes every social situation he’s in,4 it totally makes sense. The movie really comes alive if you think about it like that.”
The duck quack that had been set up as a buzzer helpfully filled in the moment’s silence, before three or four people made a few generally agreeable hmms, grunts, and “that’s interesting”s. This, about a movie that gives no indication whatsoever that the protagonist has any diagnosable psychological abnormality, in which his social machinations are shown to actually be very successful, and through which the protagonist changes to become more emotionally open and less relentlessly self-aware (thus doing who knows what to his supposed condition).
Now then, those of you who excel at reading titles can likely guess both where I stand on this issue and where I think it is most rampant, so I want to say firstly that it would be irresponsible of me to imply that the above represents the only way art is studied in higher education. It does not, and I have been in many classes that have done far better analyses than those presented here. Nor is this sort of reasoning exclusive to universities. I’ve seen it in secondary classrooms and heard it on talk radio. So to those of you who felt the inclusion of the above anecdotes was implying that this is the way art is talked about in post-secondary education, I recognize that professors vary. But I do think this method is connected to academia in a special way, a connection that stems from the nature of academia itself and is not found in other communities, and that it’s therefore not unreasonable to think of it in general as the academic approach. This approach is easy enough to criticize as pretentious, unrealistic, or self-important, but while these criticisms aren’t necessarily wrong, I don’t think they get to the heart of the issue with academia’s current approach, which is that this method, at its core, runs counter to just about everything that makes language itself and the study of it worthwhile.
Them’s fighting words, though, so before I get as negative as that thesis statement hinted I might, I want to define exactly what I mean by the academic approach, then back up and consider what the study of language—English, in our case—is actually good for at the best of times, starting with the most practical, and ending with the most important.
Defining the academic approach specifically is difficult not only because I stand against it, but also because its methods are diffuse and usually overloaded with jargon. But here are a few touchstones that can stand as useful red flags in identifying an academic argument. A focus on alternative readings, à la anecdotes 1 and 3. A focus on minute and seemingly insignificant details over and above big-picture clues, à la anecdote 2. Shout-outs to Derrida, Lacan, Foucault, Barthes. The use of artistic examples to argue for empirical realities. Any of the following words: semiotics, discourse, deconstruction, alterity, subversion, binary, polyphonic (in reference to anything but music), intertextual. Alternative constructions of same. Meta-, post-, and -ism being used more than once (combined) per fifteen-minute increment. A preference for “rethinking.” Declarative statements that characters stand for or are social movements or abstract concepts. A suspicious lack of first-person pronouns. “Other” as a verb.5 Titles like “Tracing the Absent Referent in Marian Engel’s Bear and J. M. Coetzee’s The Lives of Animals.” It’s tempting to just say you’ll know it when you see it, but that may be seen as a cop-out. Suffice it to say that the academic approach is probably what you think of when you think of Serious Literary Criticism: it focuses on “deciphering” stories, explaining them in (sometimes hilariously) cognitive terms, and generally shies away from their gut-level emotional impact.6
So then, if that is our tool, what problems does it hope to solve? There are many, of course, possibly as many as there are English students. But I’ve broken them down into four categories that hopefully cover a representative sample of what language study can do.
Firstly, a proper understanding of English is crucial for the workforce. Yes, I am well aware of the going assumption that studying English (or anything other than STEM fields, if we’re being honest) leaves students with few to no useful skills, and zilch in terms of helpful formal qualifications, but I’m not talking here about English degrees, or the skills that current English faculties teach students. I’m talking here about the ideals of studying language, be they lofty or mundane—spelling and grammar, an ear for tone, a properly sized vocabulary, the basics of rhetoric, etc. While many of these may seem at first glance to be irrelevant to your job of choice, I invite you to try to get a job with a typo-laden résumé, without any words to describe your performance beyond “good, great, the best,”7 without understanding the subtle differences between nice-but-business-minded comments and actual friendly ones, and with no ability to shape your speech to engender an employer’s interest. Beyond that, I doubt that a proper understanding of human communication is irrelevant to any job, once you’ve landed it. Anyone who has worked in an email-heavy environment knows how tricky it can be to properly convey tone with text, as do most people who text/FB/Snap/Telegram/WhatsApp/etc.8 Even if you couldn’t care less how your emails make others feel, I would bet actual money that you’ve received emails or texts that make your eyes narrow, your brows furrow, your head tilt slightly, and your meninges pulse with a dull throb as you try to work out what exactly someone is trying to say to you.9 Put simply, a competent, working understanding of communication is helpful (albeit unexciting) for both individual workers and for the workforce as a whole.
But let’s say that you either already have a decent job, or you think there are more pressing issues facing our economy—oil prices, bureaucratic red tape, increasingly automated workforces, China. You are likely correct, but it’s worth bearing in mind where these concerns of yours may have come from. You may have heard about them on T.V. or online, from a local activist or a prominent politician. But whatever the source, you now care about these issues at least partly as a result of the media you consume. This isn’t automatically nefarious, of course, but this sort of cause-effect chain—person A says something, and now person B feels the same way—doesn’t just happen. There is rhyme and reason to it. The media you consume, be it local news, BuzzFeed, Twitter personalities, talk radio, or whatever your friends post on Facebook, is biased. Certain words are selected over others; certain facts are headlined and others are ignored; certain connections are drawn and others are left unexplored. Yet while this is almost universally understood on a theoretical level, most of us, if left to our own devices, will only remember to consider it when imbibing media from the “opposite” side. Those on the pro-life side will complain that the label of “pro-choice” is an unfair rhetorical flourish, since choice isn’t really the issue at hand, and no one would want to take a stand against choice. Those on the pro-choice side will make exactly the same argument about the label “pro-life.” And neither will stop to consider the ways in which their own side has influenced their opinions, and seeks to influence the opinions of others. This is not a problem of activists; it is a human problem. We don’t consider our own side’s rhetoric because it is uncomfortable to do so. 
And yet doing so is crucial for any sort of communication across political divides—without it, both sides just cling doggedly to the idea that they are simply stating the facts, while everyone else is trying to twist things into their own ideology. Breaking down these communicative barriers is a matter of tremendous civic importance, and whether this happens in a faux-trendy media studies class or something as supposedly stuffy as a Shakespeare unit,10 this is something that language study is uniquely suited to do.
Economics and civics are boring, though, and a surprising number of people are convinced they can avoid them and thus pay them no mind. What is unavoidable in nigh every ordinary human life, however, is relationships with other people. Whether they’re friends, coworkers, bosses, underlings, family, romantic partners, or someone you see frequently at the grocery store, other people have a tendency to become a part of your life whether you want them to or not. And once you’re in a relationship with a person, you are inevitably faced with the problem of communicating with them. I of course don’t necessarily know you or the hypothetical person with whom you’re having this supposed relationship, so I can’t say with any specificity exactly how you should communicate, whether you should tend toward more or less emotional disclosure, what the optimal decibel level is for an indoor disagreement, etc. But I can say for certain that there will be some ways of communicating that work better than others—whether “better” means that you will understand your interlocutor with more accuracy, that you will be able to make yourself understood with more accuracy, or that in the end both parties end up happier.11 This may sound rather odd coming from me, seeing as I’m neither a Casanova nor a Svengali and so can hardly lord my linguistic skills above you as some ideal to be striven toward, but to the extent that I’ve been able to understand others and make myself understood, this has been an asset. It has saved some relationships, minimized the fallout of others, and big-picture has just made me a lot happier than I might otherwise have been. This sort of study, focussing on non-workplace, interpersonal communication, is not a major part of most of the English classes I’ve seen,12 but I see no reason why it couldn’t be. At any rate, it is a potential benefit of language study, whether or not it’s capitalized on.
But really, none of these gets at why most English students go into the field, which in my experience is simply that they love stories, enjoy reading them, and count their lives the richer for having done so. Funnily enough, this is one area that many people would count as a mark against formal language education, on the grounds that people don’t really need to be taught what they like, nor do they need to be compelled to do it—people will follow their desires naturally, and with gusto. But while I would agree with the general assumption that preferences don’t need to be taught,13 this hardly makes English class irrelevant. Rather, it places guidelines on how an English class can or should be run. Students will always have preferences, yes. They will naturally gravitate toward things they like and away from things they don’t. All else being equal this is fine, and we should encourage kids in the things they love. But I don’t see why this would imply that we shouldn’t push students beyond what they would normally do. Even if our only goal is to increase students’ enjoyment of the books they read, it’s still worthwhile to push them into more and more challenging material, if only because this will make any relatively less challenging material more manageable, and therefore more fun.14 Moreover, there’s a wealth of material often considered challenging—Homer, Shakespeare, Milton—that illuminates other works even if it isn’t exactly “fun” in its own right. Hamlet, as an example, is to my eyes kind of an overlong slog, but knowing the story and the language used in it has allowed me to understand allusions I might have otherwise missed, and to appreciate newer takes on the same story structure (cf. The Lion King). And yet despite my love for English, I wouldn’t have read Homer or Shakespeare if left to my own devices: I don’t like them. This, it seems, is a perfect place for English education (and in fact education in general) to fit in.
It needn’t waste students’ time with things they could do just as well on their own, but can instead force them to do things they may not like or intuitively choose in the moment but which will ultimately be indispensable.
Someone could probably come up with other reasons for studying language, but these four seem to me like a pretty good outline, as they cover major areas of all our lives: economic, political, interpersonal, and internal.15 How, then, does the academic approach stack up, when compared against these four specific and practical ideals of language study?
In short, not very well. On the employability front, English academia has all but accepted its lack of applicability on most job applications, and its relative uselessness within the workplace.16 It’s not hard to see why: extricating the postcolonial semiotics underlying modernist discourse might seem fun, but is unlikely to come in handy even in relatively cerebral jobs. As to matters of civics, proponents of the academic method could claim that the discursive assumptions behind our politics in some way harm or taint our democracy, and that thus deciphering the poststructuralist reality of contemporary political speech is crucial to restore good government, but it’s hard to see how this would be as significant as the numerous basic and definitively-there rhetorical tricks that politicians use to make their opponents seem scary, unhinged, or just not ready for political office.17 Interpersonally, the academic approach is at best irrelevant and at worst an abject failure: almost no conversations apply it, and when they do they quickly devolve into an ouroborotic quagmire, testing each sentence against such and such Lacanian critical theory and ultimately just tiring everyone out without communicating any valuable knowledge about either party.18 And as to the hedonic argument, one only has to look at English students before and after their degrees. Students enter with a love for story and its function(s) in our lives, and exit with a sort of resigned apathy to the knowledge that all text is meaningless, story is merely a Eurocentric schema hegemonically imposed by who knows who, and by the way would you like 3,000 words on that by Friday? Five articles cited? MLA? Done. The reality is that for many students (the majority that I’ve talked to), the academic method reduces complex and lovely things with beauty and worth in and of themselves to mere riddles, cognitive exercises with the potential for a certain word count and grade.19
On hearing this, one is rightly tempted to ask why, exactly, this approach continues to stick around despite its apparent uselessness. It’s not impossible that it serves some function other than the four broad ones I outlined above. If that is so, I’m all ears as to what that function is. But I think the answer is more pernicious and pedestrian than that. First and foremost is the fact that academia has as a central part of its machinery the continued publishing of new theses, which theses are judged at least partly by their novelty. The overwhelming volume of already published work means that new theses must reach farther and farther down the interpretive rabbit hole of critical theory, until they fall in completely.20 Second, and quite probably necessary for the first, post-structuralism in general, and the notion of the death of the author specifically, have taken hold so securely that in many English classes they are assumed a priori and never even discussed.21 (For those happy few unaware of the death of the author, the gist of it is that an author’s intention has no bearing on what a text actually communicates, and even if it did we can never know what an author’s intention was anyway, so we shouldn’t bother with it. It’s about as insipid and stuffy and tweed as an ostensibly radical ideology can be.)22 The practical outcome of this is that many classes spend almost no time talking about the intention behind a piece, or trying to coalesce around some fuzzy but still relatively determinate meaning in a work, instead spending hours of intellectual labour on so many alternative readings which, though often only tangentially related to the text in question, are all considered as important as the (sometimes rather obvious) intention of the author.
It remains an open question as to why thinking that is both unhelpful and fraught with problems hasn’t been challenged in some systematic way.23 It could just be that killing the author makes us feel important, as it allows us to twist art into whatever shape pleases us. Or maybe it is simply the outcome of a machinery that needs to create infinitely new theses based on a limited supply of texts. But whatever the reason, I doubt it’s sufficient. The fact is that there are real benefits to the study of language, benefits that are lost in the academic model, and in their place we have, at best, the opening of a dialogue that no one in their right mind could stand to listen to anyway.
Like the majority of students in the majority of English classes, I read less than half of the assigned readings, which is another sort of inbuilt problem with current upper-level education, but one probably best dealt with another time.↩
It’s like competitive book recommendation, and is about as exciting as it sounds.↩
It’s unclear where this girl got the impression that normal teenage standoffishness and insecurity are signs of being on the autism spectrum. Seeing as I am by her definition autistic as well, it makes sense that I didn’t bother to ask.↩
To all social progressives reading this, I’m truly sorry that the language of your social justice movements overlaps as much as it does with such stuffy nonsense as all this. It’s possible that this is merely an unfortunate coincidence.↩
The specifics of how a particular story elicits the emotions it does are generally seen as boring mechanics and kinda below the pay grade.↩
Or, probably worse, with nothing but dross like “exemplary,” “irreproachable,” “outstanding.”↩
Our current onslaught of emojis has to some extent alleviated this—it’s not hard to tell if someone’s joking when their sentence ends with an emoji—but this is only a fix in the sense that it avoids the issue altogether: we still don’t know how to write a particular tone.↩
To those whose souls glaze over when they hear the word “grammar,” know that this right here is the problem grammar attempts to solve. That your persnickety high-school teacher took its recommendations way too far is not grammar’s fault.↩
Julius Caesar, as an example, is about 90% subtle rhetorical battles, punctuated only briefly by violence.↩
E.g. consider the subtly different tone you may take when breaking up with someone you want to remain actual friends with vs. someone with whom you only want to pretend to remain friends, and whom you’d prefer not to see any more. Or else consider being on the receiving end of one of these: the distinction matters, and it’s important to be able to both communicate and understand it.↩
At least, not explicitly so. It’s possible that this sort of interpersonal nuance is best picked up indirectly, through study of the language as a whole.↩
Though there’s certainly an argument to be made that some preferences should be.↩
Much the same argument applies to math education as well: sure, many students won’t use logarithms in their daily post-high-school life, but the point is that the peak of our skills should be well beyond what we need to do every day: working at peak capacity is exhausting and uncomfortable.↩
You could add “moral,” as I considered doing, but that opens a whole other argument about the moral responsibilities of art(ists), and that’s not a conversation either of us wants to have right now.↩
This is the running joke amongst both English students and English professors.↩
To which a persistent proponent might reply that we are talking about the same thing. To which I would say fine, but I managed to communicate this in plain English, so I still, in a very real sense, win.↩
Not to mention the fact that ignoring authorial intent (discussed in just a few moments, below) is a disastrously moronic approach in any real life conversation.↩
I am not, by the way, at all opposed to essay assignments. I quite like them, in fact. But they are not automatically beneficial, and are almost guaranteed not to be when the guiding impulse of one’s course (or indeed degree) is that of the nihilistic, “say anything so long as it sounds clever” variety.↩
Then the author gets a Ph.D.↩
Indeed, if they were debated I doubt many classes would stick with them.↩
Not to mention sloppy and over-applied, to boot. Of course we can’t know an author’s intention for certain, but that doesn’t mean we can’t have some idea what it is. And why does the fact that an author’s intention doesn’t dictate 1:1 what their work communicates imply that their intention doesn’t mean anything? And while we’re at it, how exactly does one get from “all text is polyphonic and subject to interpretation” to the absolute supremacy of the reader over and above the author?↩
The closest I’ve seen is a few teachers who simply ignore post-structuralism and focus very specifically on the likely meaning(s) of the work itself, but I’ve not seen much actual criticism of their intellectual competition.↩