If I Disagree with You in a Forest, Is It Still Wrong?


How can a movie review be written in the third person, as if it were an account of facts? If it isn’t subjective, there’s something false about it.

— Roger Ebert, Life Itself

Introduction

I had powered through my blue cheese burger, always being something of a fast eater by virtue of my single-minded focus on the eating process over and above any conversational maxims, particularly when my food is slathered in blue cheese, and was halfway through a side of fries that seemed to be trying to make up in seasoning what it lacked in overall size—a blatantly profit-motivated technique that would have been roundly criticized by all five of us if the joint hadn’t been so trendy (a basically pointless and image-obsessed social concession that irked me even as it reassured me that I’d finally reached adulthood). The server had performed the ritual of asking “how is everything?” and, as per usual, my mouth was full, and on this night it seemed of great importance that I let this server know that I was on to her pesky routine, and so, mouth full, I looked her square in the eye, brow furrowed, eyes dead, and slowly nodded until she backed away. Comments were made regarding how that always happens, with the food in the mouth and everything. No comments were made regarding the obvious point that, if a time is selected at random for a question to be asked of a diner, it is quite likely, given that the diner is here to eat, that he or she will have food in his or her mouth, and so you can’t really blame the server. Instead the conversation had turned to the Oscars, and I didn’t feel I’d be able to turn it away without invoking the Streisand Effect, so I kept mostly silent as my friends mutually agreed that yes, there had been some good movies last year, and yes, there would probably be some good ones this year, too. The conversation was unprofitable, then, shall we say, as it offered little in the way of new information or individual expression, but it was enlivened by a sense of good cheer (at least one other person had gotten the blue cheese burger, and there was chipotle enough to last the night), and the intelligent and gracious conversation of good friends.

Someone, it may well have been me, said the words American Sniper. Cue much discussion of the possibility of a special effects supervisor who had never seen an actual, live baby, or else a supervisor who was in fact himself a baby, followed by discussions of Eastwood’s personal life and historical political leanings (which, through later research, I learned we had gotten wrong, and dramatically so), and five people all trying not to ask what we really thought of the moral implications of warrior glorification, since none of us had an answer. Cue the comment which, unbeknownst to me or anyone else, would lead me to ruin the evening for everyone involved: “Yeah, I haven’t seen it, but my friend did and she said it was the worst movie ever made.”

My considerable lack of social grace, by some Carrollian logic, led me to the hasty conclusion that, since this friend of my friend wasn’t here, I could let loose on her without hurting anyone’s feelings, so I said, with a falling and sarcastic intonation more appropriate in its exaggeration for a sketch comedy show, “Really. The worst movie ever made. In a hundred years of film history, the one that’s just happened just happens to be the most exceptional.”

She startled, hesitated, “Well no, you’re right—”

I prepared the words “I accept your apology.”

“—she actually said it was second worst. Behind Spring Breakers.”

Numerous thoughts flashed through my mind, not the least of which concerned how exactly it counters accusations of historical ignorance to cite something two years old, but I settled on “That seems a little ridiculously extreme.”

“Well maybe, but she’s right, though.”

I put down my french fry.

“Look, I know you and [redacted] like it, but it’s shit.”

I opened my mouth.

“I haven’t seen it, but I don’t have to to know: it’s shit.”

Those in more cinephilic circles should know that this girl is not stupid, nor will she serve as some sort of stand-in for the ignorance of people who, through some egregious sort of cultural cluelessness, have managed to slog through their whole lives without watching the entire filmography of Harmony Korine. Such behaviour is natural and may well be the healthier option. But those thoroughly outside these circles should note that comments like these are surefire ways of making a person’s devotion to watching and understanding film feel wholly devalued.1 Suffice it to say that the preceding comments made me angry, and that I proceeded to launch into an extended defence of Spring Breakers, fielding concerns ranging from perceived sexual perversion on Korine’s part to a lack of humour on the part of the film.2 I am not going to recreate that argument here for two simple reasons. Firstly, the arguments, though potentially valid in their logic, were, on my part, far too spiteful to deserve any repetition. And secondly, it would be self-aggrandizing nonsense completely tangential to the overall point of this essay, which point lies in the main dispute that ended up arising, that of the indisputability of subjective experience. Because of course, a conversation about a movie between someone as insensitive as I am and someone who hasn’t even seen it will never get that far before it stops really being about the movie and starts being about respect and good conduct.3 The central contention of my friend was that, in outlining the ways in which I thought her dislike for Spring Breakers was definitively ill-founded or wrong, I was disrespecting her right to her own opinion. In her words, “Why can’t you just let me feel how I feel? Why do I have to think what you think?” I will not, in the rest of this piece, argue that I was not being disrespectful. I was. What I will argue firstly is that disrespect is not a necessary outcome of disagreements in subjective matters, and secondly that the belief that it is has done some real harm to our appreciation of film.

The Facts

Subjectivity is hard, though, especially for empiricists like us. So I’d like to work up to it slowly, by starting with our cold, dead, uncontroversial friend, the objective truth. What is probably the most important objective truth to keep in mind when encountering this debate is that the definitions of both “objective” and “subjective,” though moderately clear in denotation, are, in practical use, muddled. Ordinary dictionaries will tell you that “objective” is an adjective describing a judgment which is not influenced by feelings or personal opinion, whereas “subjective” describes a judgment which is so influenced. More philosophically minded, pedantic, or ostentatious sources will tell you that an objective truth is one that would continue to be true even if no subject were there to perceive it, while subjective truths rely on an agent perceiving them.4 Neither of these short definitions is very good, since the former implies that “judgment” can somehow be made sans any and all feeling/opinion, which is unlikely to be true, and the latter sucks all the fun out of the tree-in-forest discussion;5 but the real problem with both is that they don’t capture the wholeness of what people use these terms to mean. In practice, objective things can be verified externally through sense perception or reason—something objective can be argued for or against, while something subjective cannot. Or at least, a subjective thing should not be argued, since it represents a person’s opinion, and people, we believe, have a right to their opinions and feelings.6 That people have this right is, funnily enough, just the sort of opinion that belief in this right would preclude anyone from challenging, but it is undeniably true that, if someone claims to believe or feel a certain way, there is nothing anyone else can do to validate or disprove that. One’s own experience of a movie, then, is, for that same person, objective in the sense that it does not appear to them as a matter of opinion whether or not they have a particular opinion. For everyone else, however, there is no way to distinguish between a person who feels some emotion and a person who just appears to feel this emotion. This distinction brings to light the first flicker of the difficulties of discerning precisely between subjective and objective phenomena.7 For another example, read a few positive reviews of movies you dislike, or negative reviews of some of your favourites. If you like Inception, you may feel strange when the Observer’s Rex Reed tells you that you shouldn’t, because when you watch it, “you never know who anyone is, what their goals are, who they work for or what they’re doing.” If you don’t, you might feel similarly about Laremy Legel’s claim that “[Christopher Nolan] tugs at your heartstrings with Marion Cotillard, then he brings you back to a feeling of dire peril with Ellen Page’s architect persona.” Such sloppy use of the second person is commonplace in English, and so neither author should be faulted too much for it, but it can’t be denied that what they’ve written is nonsense: they have no way of knowing what it is you feel when watching Inception. Maybe you recently saw a picture of Marion Cotillard’s face photoshopped onto a cat, and so her scenes are extremely funny to you. Maybe since childhood you’ve had a fixation on Ellen Page that would make Freud blush, and so her scenes are marginally less nail-biting and more… imaginative, shall we say.
Quotes like the above, which blatantly use the second person, are but the most obvious example of a widespread problem not at all limited to film criticism, but quite visible in it. Slightly more advanced writers might obfuscate their assumptions about other people’s feelings by substituting “the audience” for “you,” which could improve these writers’ marks in first-year media studies but is really no better. Once this is found out, some will retreat to burying the lead, saying, for example, that “the gradual lowering of the camera and lengthening of lenses makes the room of 12 Angry Men feel progressively more claustrophobic,” simply leaving it up to the reader who, exactly, is feeling this claustrophobia. It is, of course, the viewer, i.e. you.

And yet in spite of the impregnability of other mind-states, it also can’t be denied that there are objective facts about movies. It is objectively true, for instance, that there was a movie called Whiplash released in what we call 2014. And this movie was, objectively, directed by Damien Chazelle. It starred Miles Teller and J. K. Simmons. It was released in a 2.40:1 aspect ratio. It won Academy Awards for editing and sound mixing. All of these facts are objectively knowable.8 And not all the objective facts about movies are as boring as these. Jaws has been blamed for spawning both the modern summer blockbuster and our extreme fear of sharks.9 The extent of Jaws’ responsibility for either of these phenomena can obviously be disputed, but it nevertheless remains clear that the kind of answer we’re looking for is objective: no one cares whether you feel like Jaws impacted shark conservation—it either had an effect or it didn’t. The same goes for Kieślowski’s A Short Film About Killing, which may or may not have impacted Poland’s decision to end the death penalty, but not both. Ditto Fight Club spawning real life fight clubs and Projects Mayhem, and The Jinx getting Robert Durst back in the spotlight.

Though a bit more abstract, it is also objectively true that there are tendencies or norms as to how humans communicate. I speak a dialect of English, as do many others.10 English has, like any language, rules. Some of these rules are, at least on the scale of a human life, fixed and definite, like the distinction between “I am ten years old” and “I was ten years old.” Long term, of course, the conventions surrounding the “be” verbs will change, but for your modern, everyday English speaker, making this distinction between what is currently happening and what has happened in the past is done by using either “am” or “was.” Other rules are not so set in stone. Shifts can be relatively fast, obvious, and political, as in the shift in meaning and eventual reclamation of “gay,”11 or they can have no real intentionality to speak of, as has happened with the emergence of the quotative like,12 or they can emerge slowly and without anyone later knowing they’ve happened, as with the word “awful.”13 This may appear to be inconsequential jibber-jabber that is about as far from Spring Breakers as one could get, but the point I’m trying to drive home is that how people communicate in English depends on the rules of communication, and these rules exist outside of any one person’s subjective opinions. Yes, all our words are, in a sense, “just” social convention, but the social convention itself exists outside of any one person’s opinion, and can be debated on something like objective grounds. That is, even though “yes” and “no” only mean what they mean because we think they do, I can’t swap their meanings just by believing this is so. Nor can I reasonably expect to use “gay” to mean happy without connoting homosexuality. Even if I somehow managed to honestly believe it only means “happy,” it objectively doesn’t, in actual, real-world use.

And these guiding conventions apply to art, as well, which is how this is all relevant. For a time, a movie being black and white carried little meaning in and of itself, because there was no other mainstream option. Now, however, one can’t shoot sans colour without implying that the choice to do so had some sort of motivation, be it financial, artistic, or otherwise.14 Similarly, there was a time when the majority of a movie’s credits rolled at its beginning, rather than its end. Now this has switched, and having four minutes of credits at the opening of a film gives it a rather different tone than what we’re used to. My contention isn’t that deciding to go with or against these conventions is right or wrong, but rather that, given that a convention does exist, the choice to go with it or not says something. Shooting in black and white with credits at the intro is an intentional choice with meaning that is dictated by objectively real cultural assumptions and understandings, and so it’s not, strictly speaking, only a matter of opinion what this decision would say. There may be more than one possibility, but they are not infinite, and many can be argued for or against without any appeal to feelings or mere opinion. Nothing I write could possibly be a critique of governmental policies that have yet to exist, nor could it be a message of hate toward the citizens of a country I haven’t heard of. And it is unlikely, though not impossible, that my writing can rightly be interpreted as veiled apologetics in support of the Shinto religion, given that I have very little connection to Shinto. Similarly, when Harmony Korine has a voiceover of a character, Faith, saying of her spring break trip, “we saw some beautiful things here,” while we see the four main characters urinating on the side of a road, or saying “everyone’s so sweet here—so warm and friendly,” while we see a woman clearly yelling “spring break, bitch,” and then a close-up of a woman’s barely-clad butt cheeks being rapidly shaken at us—this absurdly sanitized description being directed at Faith’s grandma, for Pete’s sake—I would argue it’s not simply a matter of opinion what stance Korine is taking here. Or at least, it’s not a matter of opinion in the “everybody gets a say and no one can be wrong” sense: the logic of how we communicate dictates that what we see (rampant hedonism, debauchery, and STI-friendly condom-free sex) influences how we hear the voiceover. Namely, calling four friends urinating by the roadside beautiful is, by current standards, objectively unlikely to be genuine, and this is made all the more certain by another line: “I wanna come back again next year with you.” Again, this is said to this character’s grandmother. Faith would like to take her grandmother on a trip to St. Petersburg, Florida, so that this elderly woman can pee on sidewalks, snort cocaine, let out her sexually experimental side, and listen to terrible rap. Without generalizing to a larger claim that Spring Breakers as a whole is a good movie, I would say that the message, at least of this specific sequence, is more than just my opinion, since the alternative propositions—that Korine just happened to juxtapose these images perfectly against the voiceover, by some sort of divine accident; or that he actually intended the sequence to be taken genuinely—are objectively unlikely.15

As regards the issue of respect in personal disagreements, this is the extent of my point: given that the tendencies of human communication can be observed and discussed outside of any one person’s opinions, how is it necessarily disrespectful to disagree with someone’s interpretation of a work? I must always keep in mind, of course, that the reality of that other person’s own experience can’t be questioned,16 although even that wouldn’t be disrespectful so much as it would be stupid and pointless. And for simple rhetorical reasons I must make clear that I’m not telling anyone they must agree with me.17 But provided these pitfalls are avoided, I see no reason why it constitutes disrespect for me to disagree with someone, or vice versa. However, this is not, alas, the biggest issue with our current understanding of subjectivity. The primary issue, in my view, is that the doctrine of mere subjectivity—which holds that, in what are seen as matters of taste, relativism should prevail and no one should disagree with, condemn, criticize, or dismiss opinions alternative to their own—has infiltrated all sorts of areas it really shouldn’t have, and that we’ve come out the worse as a result.

The Non-facts, or The Real Argument

It should be made clear that I’m not against tolerance, understanding, living and letting live, letting others float their own boats, and the like. Our broad acceptance of subjectivity and the right to an opinion has much to recommend it. At the very least, this mindset is realistic about how art functions: individually, internally, emotionally, cognitively, personally. Very little about art has any meaning at all outside of some viewer perceiving it, and so it makes sense to give precedence to the personal, emotional reactions that stem from it. Beyond this, it’s altogether possible that how we generally think about artistic statements is tied up with all sorts of ideas regarding individual freedom of belief and conscience. That is, the same worldview that lets us say “I like this movie, and you don’t have to” also allows us to say “I like this religion, and you don’t have to,” or “I like this current government administration, and you don’t have to.” Essentially, this respect for, or at least acknowledgment of, other people’s intellectual autonomy helps us get away from legislating thought patterns, at least explicitly. And for those of us who happen not to write our nation’s laws, a healthy understanding of others’ subjectivity, our own, and the difference between those, just makes us less annoying in general. But like most good things, this doesn’t come without tradeoffs.

One obvious issue with it is that it carries at least the potential of shutting down any productive conversation. On the one hand, if everyone in the conversation plays strictly by the rules of relativism, nothing much can happen beyond person A saying “I have this opinion,” and person B saying “That is fine that you have that opinion. Also, I have this other one,” with both people either repeating their opinions without engaging with each other’s, or more likely just ending the conversation. On the other hand, if person B ignores convention and disagrees with person A’s opinion, all conversation about the actual subject of interest and its relative merits and/or demerits immediately ceases, being replaced by a conversation that ends up being about whether or not it is O.K. to disagree with someone on what is, after all, just a matter of opinion. Q.v. this essay for an example of this, and note that in many ways I’m not saying anything important or controversial, except that I think we should allow each other to say important or controversial things.18 Also note the ending of the conversation that sparked this extended analysis, which was that it took probably only around forty-five seconds for both me and my friend to realize that absolutely no good could come of our disagreement, and yet neither of us knew quite how to finish it with any grace, and so it went on for maybe five or ten excruciating minutes until I finally settled on the nuclear option, saying to my friend that I thought what was really disrespectful was her disagreement with my disagreement, which is as much as to saw off the branch I was sitting on just so I could get back to solid ground, albeit painfully: the only way not to continue being disrespectful was to attempt to shift the blame, making someone else appear to be the intolerant one, even though the crux of my argument had been that none of us, least of all her, should be seen as disrespectful in the first place.19

Which ultimate refusal of meaningful, opinionated discussion is a strange outcome for a belief system that claims to greatly respect each person’s opinion. That is to say, mutual smiling ignorance is a pretty low kind of respect, and perhaps our paeans to subjectivity don’t really give it its fair due—instead relegating it to “private life,” making it unsharable (and especially so if people would rather not hear it), which is as much as to say making it unimportant, except when it is being defended against the absolutist barbarian hordes—i.e. maybe all this “let’s all think what we want” talk is really just lip service.

Most damning, though, is how easily relativism can be retooled into a weapon of willful self-isolation, a trump card we can use to summarily ignore any viewpoint we’d just rather not think about. I would posit that we do this a lot. For pot-stirring, SEO, and general levity, I will point to the recent anti-vaccine movement as an example. It may seem to some as if medicine and epidemiology are more or less objective pursuits, with readily quantifiable data which can quickly get to a definite answer as to whether, for example, our current MMR vaccine actually reduces the rates of measles, mumps, and rubella, or whether it shows any evidence of inadvertently causing autism. And yet, despite the fact that the data are readily quantifiable, and have been quantified, despite there being much evidence of efficacy and little evidence of harm, Colin Phipps, a chiropractor in San Francisco, still describes the debate as “a question of faith. […] Do you have faith in the CDC? The FDA? Big pharma? Science not driven by profit or motive? Do politicians have our best intentions at heart? Or do you have faith that the body is built to deal with pathogens in the universe?”20 Phipps’ point seems to be that, at its core, medical science is really a matter of opinion. Or at least, that there can be no ultimate authority on the subject, and so it is a matter of opinion as to which authority should be believed. Coming from a person who is clearly trying to establish the authority of his own, anti-vaccine stance, this is more than a little disingenuous, especially when one can see clearly how Phipps juxtaposes spooky-sounding pro-vaxx agents (CDC/FDA, Big Pharma, corrupt scientists, politicians) with pleasant, airy, or uplifting anti-vaxx thinking (the body and its strength; the odd, New Age-y “universe”). The worry here is that what may have begun or been intended as a tool of increased understanding ends up in practice as a way of passing judgment on something and then not having to deal with any of the repercussions of this judgment, or even having to explain oneself. Person A can call vaccines dangerous and ineffective, and when Person B—whose child contracted measles from Person A’s unvaccinated child—becomes angry, Person A can immediately cite subjectivity: the thoughtless or hurtful words she directed toward Person B’s now-deaf child are unassailable, and any attempt to challenge them is an assault on precious autonomy.21

This is not to say that anti-vaxxers are alone in this, however. I would argue that most of us have a much less healthy respect for others’ autonomy than we realize. In theory, we may say everyone has a right to his or her own opinions, and live and let live, and all such nice cuddly platitudes. Yet I, who firmly believe that movies do have subjective components, joined in a discussion slamming American Sniper because the production’s baby was apparently too sick to film that day and they had to make do with a doll, this discussion being monstrously one-sided and framed as completely objective fact: that movie sucked, and that’s all there is to it. This all, remember, happening mere moments before the inciting Spring Breakers comments. The only reason I didn’t bother to speak up in American Sniper’s defence is that I didn’t like it either.

The most insidious thing about our tendency to objectivize our own thinking and subjectivize others’ isn’t so much that it’s hideously self-centred—that much we should probably just expect of ourselves at this point—but that it’s often invisible to us, or else we just don’t care to do much about it. Facebook took about fifteen minutes’ worth of heat last year for participating in a study22 in which users’ news feeds were altered to display either fewer negative posts or fewer positive posts. It was then examined whether viewing more positive content, in general, leads to an increase in positive content on the user’s part, and vice versa. Setting aside the considerable number of responses to the criticisms this study faced,23 what mostly got lost in the shuffle was an understanding of the extent to which Facebook already manipulates users’ news feeds. Facebook has always been a constructed reality, in ways far bigger than users only posting their most attractive or fun-looking pictures. If you are like most people, your Facebook friends post more content in a day than you could read through even if you wanted to, and so Facebook has a choice: either let most shared content go unseen with little rhyme or reason, or intentionally pick certain things to show certain users. The latter option likely results in more user engagement, and so they went with that. What you see on Facebook is not a simple list of things your friends have done. It is a selective list of things your friends have shared, which Facebook thinks you will already like. Google does much the same thing with its search results, personalizing them based on past searches, browsing history, age, sex, racial demographics, location, and, one can only assume, your chakras. Every other search company in the game would likely do the same thing, and at the same scale, if they could. Netflix’s star ratings, which many of my friends have assumed to be an average of user or critic ratings, are actually algorithmic predictions of how much you will like a given movie, based on previous viewing habits, ratings, general audience reaction, the director and stars, and so on.24 This notion of a unique, curated internet experience for each user that ends up shielding him from anything he may not like has been called a filter bubble, and there has been some debate as to how meaningfully it ends up changing one’s internet experience—some people have found it does, others not—but one has to admit that insofar as it does meaningfully change one’s experience and shield users from contrary perspectives, it is at least important to keep in mind. These sorts of analytics may be right, even unsettlingly so, and it’s not as if anyone can blame companies for trying to just give customers what customers want. I’ve done much the same thing to some of my closest friends from time to time. But I can’t help thinking there is a danger in the underlying implication that none of us have any need for things we don’t already like, and that by extension there’s no such thing as worth that you don’t personally recognize. I’ve seen many a friend and acquaintance laugh upon seeing a movie they hate rated poorly, seeing this as a sort of vindication, quite reasonably not understanding that this “rating” is just Netflix correctly predicting their response.
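
To make the Netflix example concrete, here is a minimal sketch, in Python, of the kind of prediction Amatriain describes (see note 24): item-to-item collaborative filtering run on a toy, entirely invented set of users and star ratings. It bears no resemblance to Netflix’s actual pipeline, which draws on far more signals than this, but it does show the essential move: the number a user sees is extrapolated from that user’s own history, never averaged from anyone’s reviews.

    # Toy item-to-item collaborative filtering: "similar viewing
    # patterns represent similar user tastes." All users, movies,
    # and ratings below are invented for illustration.
    import math

    ratings = {
        "ana":   {"Whiplash": 5, "Jaws": 4, "Inception": 5},
        "ben":   {"Whiplash": 4, "Jaws": 5, "Spring Breakers": 2},
        "carla": {"Whiplash": 5, "Inception": 4, "Spring Breakers": 1},
    }

    def similarity(movie_a, movie_b):
        """Cosine similarity between two movies' rating vectors,
        computed over the users who have rated both."""
        common = [u for u in ratings
                  if movie_a in ratings[u] and movie_b in ratings[u]]
        if not common:
            return 0.0
        dot = sum(ratings[u][movie_a] * ratings[u][movie_b] for u in common)
        norm_a = math.sqrt(sum(ratings[u][movie_a] ** 2 for u in common))
        norm_b = math.sqrt(sum(ratings[u][movie_b] ** 2 for u in common))
        return dot / (norm_a * norm_b)

    def predict(user, movie):
        """Predict a user's star rating for an unseen movie as a
        similarity-weighted average of the ratings they've given."""
        seen = ratings[user]
        weights = {m: similarity(movie, m) for m in seen}
        total = sum(weights.values())
        if total == 0:
            return None  # no overlap at all: nothing to extrapolate from
        return sum(weights[m] * seen[m] for m in seen) / total

    # The "rating" ben sees for Inception is not anyone's review of
    # Inception; it is an extrapolation from ben's own history.
    print(predict("ben", "Inception"))

On this invented data the prediction for ben and Inception comes out around 3.7 stars, even though no one has told the system a single thing about how ben feels about Inception; which is, of course, exactly the point.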

This sort of relativism brings out the evangelical in me. Mustn’t there be something more to a movie than how I feel about it? Is there never value in watching something I dislike? Is the foundational principle of consumer choice, “It’s my money and I’ll spend it how I like,” really the be-all and end-all of everything? Even art? Because if so, you could go your whole life without ever expanding one iota, and there’s no sane way to argue for anything else. And the fact is, if I could, there’s a huge part of me that would love nothing more than to like what I like, hate what I hate, and enjoy my stasis until the day I die, yet I’ve seen, even through the limited hindsight I have, just how much growth I’ve needed in the past,25 and, extrapolating from that, it would appear as if I’m going to need even more growth in the future.26 It’s almost enough to make me want a few pigheaded, absolutist assholes whose job it is to argue plainly and intelligently about film, so that I can have a more informed appreciation of it than a few transient hormonal responses can give me. The problem, of course, is that we all hate critics. Critics might tell us our favourite artistic work has problematic implications for race relations, or issues of sexism, or deifies violence in a way it shouldn’t, or just plain isn’t as mind-blowing as we thought, and this doesn’t feel very good at all. Alternatively, critics might recommend something that, on watching, we don’t like, and this can feel like a waste of time or a betrayal of trust. Our movies don’t help us out much in this regard, with, for example, Birdman, Top Five, and Ratatouille all painting critics as amoral mercenaries more concerned with coming up with witty remarks than actually thinking about movies, or else just stuck-up snobs who are genetically unable to just sit back and enjoy anything on account of the Douglas Fir they have stuck up their ass.27

Thing is, I can’t do much to rebut this perception, because I hate critics, too. The experience of reading their work almost universally falls into one of two categories. Either the critic tells me something I already know, which is unhelpful, or the critic says something I flat out disagree with as if it’s an obvious fact, which makes me wonder who this uppity prick thinks he is to act like he knows better than I do about movies.28

And besides, as we all know, everyone’s a critic. At this point, individual autonomy, in tandem with the democratization of both publishing power and film itself, has gotten us to a point where everyone’s opinion is important, which is as much as to say no one’s is, which brings us right back to the original problem: why would anyone consider anyone else to be an authority? Why not just watch what you like?

The response, I think, stems from the very small portion of criticism I’ve read or heard that doesn’t fall into the two categories mentioned above. Again, yes, the majority of it appears to me as either facile or just plain wrong, but every now and then I read, say, an impassioned defence of Tom Cruise, and am suddenly able to appreciate his movies without any ironic distance; or else by conversation with whip-smart friends I am suddenly able to understand that the musical tradition of characters breaking out into song needn’t be maligned for not appearing “realistic,” because, far from the songs being a vehicle for the characters to naturalistically express themselves, they are really a way for the filmmakers to express their intention, and so now I can watch musicals without any inner sneering;29 or else it could be something as simple as someone commending Happy-Go-Lucky to me, which, as it turns out, is fantastic, and the start of a longstanding admiration for Mike Leigh. This sort of criticism broadens me, stretches me, makes me uncomfortable, and ultimately improves me, as far as I can tell. And it’s not just positive, either. An extensive and insightful deconstruction of Fight Club can lead to a newfound appreciation of the power of presentation to subvert (or support) a story’s actual intention, and clue me in to the multiplicities of goodness, some subset of which is extant in pretty much anything.30 This is criticism that I would argue is worthwhile in almost an objective sense. It has helped me appreciate more art, thoughtlessly dismiss less of it, and in doing so has helped me become a better person in tangible ways. Directly, the process of learning to give as much thoughtful consideration as is possible to every act of expression one comes across, though obviously not a practice I’ve perfected, has made me a dramatically more empathetic person. Indirectly, a broadened appreciation of art has led me to such things as Short Term 12, The Bridge (2006), Calvary, and The Voices, all of which can be praised as masterpieces or dismissed as overrated trash, but, more importantly, all of which taught me something real and practical about human relationships.31 And these tangible benefits have genuinely improved me and my life irrespective of whether I “like” the movies or the criticism of them. But this all just serves to bring us right back to the central problem: even if artistic appreciation and critical reflection are objective goods, how could anyone convince anyone that this is the case, given that, as of right now, subjectivity reigns? Why would you, reader, consider anything I’ve written to be more than a written account of what it is I happen to feel, with no bearing on how you should feel? What authority could I possibly have?

Conclusion

There is one obvious solution to all this, which is for me to shut up and enjoy my burger. There is definitely something to be said for refraining from over-intellectualization, especially since such a lengthy, academic consideration of a topic may, in practice, serve only to make the few people who read it less interested in the topic at hand. Unfortunately, this doesn’t fix any of the problems, which I believe and have argued are real and important. Specifically, our over-reliance on subjectivity as an argument-stopper can end up cutting us off from ideas that might improve us, and can be used as an excuse for disrespect, mine included.

I’m not sure what to do about this. I don’t know if the solution rests with filmmakers, with critics, or with audiences. Probably all three. And I don’t know if fixing these problems would just end up causing a whole new set of interpretive issues. It probably would. But I firmly believe that these problems matter, and that we should consider them. I think Spring Breakers is good. It’s not a big deal to me if you think it’s good too, but it matters to me—and you too, though you’re free to disagree—that I be able to talk to you about it.


  1. Those who would see this conflict as a manifestation of latent animosity should know that there is none. Our relationship is friendly, and I like her. About the closest thing to a criticism I could have of it would be to say that we didn’t know each other very well, which admittedly did then strike me as a grave offence for some reason, but I would argue this was caused by the cinephilic concerns, rather than the other way round.

  2. The central and heartbreaking irony of this whole interaction—that criticism of a movie I liked personally offended me, like a sort of insult by proxy, and that yet, at the same time, I refused to acknowledge the proxy insult of badmouthing someone’s good friend in the middle of a dinner—was, mercifully, not touched on.

  3. This is unfortunate, as I’m much better versed in movies.

  4. An e.g. being: a stop sign will continue to reflect light with a wavelength of around 650 nanometres, regardless of whether or not anyone is looking at it. This is objective. The stop sign won’t, however, be red, since redness is a perception, and therefore subjective. I know what you’re thinking, and yes: distinctions like this are the reason so many philosophers have gone stark raving mad.

  5. The philosopher’s answer: “Well it depends if we consider ‘sound’ to be an objective or a subjective phenomenon. If the former, then yes; if the latter, then no.” Which is right, but wearyingly lacking in mystery.

  6. The relevant Latin phrase here is De gustibus non est disputandum, and so help me if anyone pretends that fact is even remotely important I don’t know what I’ll do.

  7. Further investigation into this sort of distinction, and some borderline relevant musings on the existence of subjective experience, can be found in David Chalmers’ The Conscious Mind. Or else, here’s Daniel Dennett explaining how everything you’ve ever experienced is just an outgrowth of your stupidity and inability to be as objective as he.

  8. Some of you may sense an ulterior motive in my diction, an attempt to make the words “objective” and “subjective” so annoyingly frequent as to make every reader toss them aside. I plead the fifth.

  9. Personally, for the latter, I blame their ability to swim at road speeds, this map, which shows their range to be pretty much anywhere you might want to swim, and those teeth.

  10. I’ve had a hard time pinpointing exactly what this dialect is, since it draws on some conventions from the States (I occasionally say “eighth grader” in lieu of “grade eight-er,” for example, though the later grades are enumerated rather than labelled), others from too much BBC (“literally” being pronounced with three syllables, “innit”), and stuff from small-town BC (various terms for alcohol, upwardly intoned “right” suffixed to a sentence, “carbeque,” etc.). Mostly I’d say it’s Hollywood English, in the sense that it’s the sort of talk you hear from an unimportant character in a Hollywood movie who isn’t meant to have anything distinctive about his speech.

  11. See also “queer,” which is experiencing a similar shift toward positive use.

  12. The quotative like is the use of “like” to refer to someone speaking, as in “I was like, ‘yes please.’” This formal linguistic term is a bit academic and therefore icky, but it’s just abstruse enough to sound clinical, avoiding a tone of criticism toward this use of “like,” which use is more or less like the tide and here to stay, not to mention actually useful.

  13. The word used to mean exactly what it looks like: full of awe, or awe-inspiring. This meaning still lives on in sentences like “I made an awful lot of money last year,” but in general the word is now negative.

  14. Cinéastes note: I know this is a very simplistic portrayal of the B&W vs. colour discussion, and that pre-colour movies could have a lot to say with their palette. I only mean that the basic decision of shooting without colour meant little, as there was no other realistic possibility.

  15. Ditto the notion that Korine just happened to cast ex-Disney stars, have them watch My Little Pony, give them props with childish overtones (squirt guns, plush backpacks, popsicles), get them to alternate between singing Britney Spears and talking about wanting to pause life or not grow up. Clear points are being made about either the infantilization of sexuality, or the sexualization of infancy, or both. These points may or may not be conscious on Korine’s part, and it’s fair to say this is all creepy and uncomfortable to watch, but I would still argue that it’s objectively more likely that a point is being made than that Korine is just a skeezy old perv.

  16. But N.B. that this doesn’t mean a person’s opinion of their own experience is necessarily true. It’s only that one can’t objectively disprove it.

  17. Just for future reference, though, I’m skinny, timid, and seriously lacking in either thuggish minions or legislative power, so it’s basically impossible for me to make anyone believe anything, so perhaps this rhetorical move shouldn’t be necessary.

  18. Controversial in the more literal sense of causing disagreement, not the racist joke sense.

  19. Except me, of course, but that was because I was being caustic, not strictly because I was disagreeing.

  20. The relevant quote can be found here, provided you’re able to maintain a bit of distance from the tone of the article, which is pretty accusatory.

  21. The other common defence tactic is free speech, which is possibly even worse, since it refuses to even consider the distinction between a legal code and decent behaviour.

  22. Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790. doi:10.1073/pnas.1320040111

  23. Firstly, the results of this study are hardly trivial or obvious. It might seem to be a no-brainer that positive content makes people happy, but it is also intuitive to say that viewing friends’ happiness could make people feel envious and depressed. Secondly, while it may be easy to paint Facebook as malevolent in this case, it’s possible that it merely wanted to understand better how to make its users happy, which isn’t as nefarious as all that. Thirdly, the effects found, though statistically significant, were tiny, the largest being two hundredths of a standard deviation. Fourthly, the study only examined subjects’ subsequent posts on Facebook, doing nothing to verify whether the content of these posts reflected actual moods on the part of the subjects, or whether the change in positivity was merely a linguistic issue. Basically, none of it was as big a deal as many made it out to be.

  24. As Xavier Amatriain, engineering director at Netflix, puts it, “most of our algorithms are based on the assumption that similar viewing patterns represent similar user tastes.”

  25. I won’t go into the seedy details, but just so we’re clear: I loved Garden State. I thought it really spoke to me.

  26. Maybe someday I’ll be able to like Stalker.

  27. Yes, Ratatouille and Top Five have a sort of redemption for the critics involved, but this redemption comes through the critics in question changing, to become less prodigiously dickish. Anton Ego, in particular, quips that “In the grand scheme of things, the average piece of junk is probably more meaningful than our criticism designating it so,” going on to make the argument that, while critics are generally human garbage, they do occasionally do something worthwhile, which hardly legitimizes the profession.

    Tangent: The derision we reserve for critics, who supposedly shit on work people have slaved over, is curiously absent in our casual discussion—cf. any current discussion of Keanu Reeves—which is odd given that, once someone’s written 300 words on something, they’ve probably given it more care and attention than most of us.

  28. Usually someone with a film studies degree or two, which tends to mean they do actually know better than I do, which just makes it worse.

  29. Well, some of them.

  30. Plus it helps keep me from making yet another painfully unfunny Fight Club reference.

  31. Working backwards: maybe the scariest thing about a murder is the reality of the killer’s experience; a whole lot about meaningful, unglamorous self-sacrifice; about a hundred more nuances to the suicide discussion than I’ve heard anywhere else; kids, right?