In Defense of Heresy in Criticism

Full English Breakfast

Once a week, Criticwire asks a group of film critics a question and compiles their responses.  This week’s Criticwire Survey seems to have caused a bit of a stir.  Here is the question posed by Matt Singer:

What movie widely regarded as a cinematic masterpiece do you dislike (or maybe even hate)?

This question and its responses were promoted under the incendiary headline: “Overrated Masterpieces.”  Needless to say, this provoked some outrage, both in the comments and across the web.  Only one critic, Glenn Kenny, appears to have left the proceedings unscathed.  The reason for this is that he refused to name a film:

I find this question especially dispiriting, as it’s really just a form of bait, and a cue for individuals to come up with objects to snicker at, feel superior to, and all that. I’m sure many critics will have a blast with it.

Kenny follows this with a passage from Richard Hell’s autobiography where Hell writes of an encounter with Susan Sontag in which she laments the fact that she has opinions because, as Hell puts it, “opinions will solidify into prejudices that substitute for perception.”

On Twitter, New York Times critic A. O. Scott singled out Kenny for praise:

watch @Glenn__Kenny enlist Susan Sontag and Richard Hell to smack down glib link-trolling pseudo-contrarianism

First of all, I would argue that Kenny himself is using this opportunity to “snicker at” and “feel superior to” his fellow critics.  Second, I would argue that the point of this particular survey is to counter popular opinions that may have solidified into prejudices, not the other way around.  Finally, I think that it is Scott who is being “glib” in his dismissal of the exercise as “pseudo-contrarianism.”

Each individual critic (Kenny included) will have points of divergence from the critical community to which he or she belongs.  This is only natural; individuals have individual tastes (e.g., likes and dislikes) based on individual life experiences.  But here is an unsettling fact: many people will accept that certain films are sacred—sometimes irrationally and without having actually seen them—for the single reason that the films have been blessed with critical approval and labeled masterpieces.  The critics who answered the Criticwire Survey are simply challenging this automatic acceptance, some even going so far as to offer rational and articulate defenses of their opinions (the opposite of pseudo-contrarianism, I would say).

Interestingly, James Ramsden, a food blogger at The Guardian, wrote a piece last week called “The Great British fry-up: it’s a national disgrace.”  The article comes with the following blurb:

The full English breakfast is the most overrated of British dishes – even the name is shuddersome. How did we become shackled to this fried fiasco?

Just as with the Criticwire Survey (and perhaps again due to the word “overrated”), Ramsden experienced a lot of backlash.  He felt compelled to write a response (published only a day after the Criticwire Survey): “Which well-loved foods do you hate?”  In this piece, we learn that Ramsden received accusations similar to those received by the film critics.  For example, he, too, was accused of trolling (maybe by the A. O. Scott of the British food blogging world).  However, Ramsden understands where the attacks are coming from:

I understand it because I’ve felt it too. It is perhaps not a rational reaction to a subjective aversion […], but we feel strongly about food and are thus oddly offended by someone vehemently opposing that which we cherish.

Yes, and people apparently feel strongly about film as well and will oppose subjective aversions to well-loved films with equal vehemence and irrationality.  Ramsden, after providing a long list of similar aversions from some notable chefs and food critics, ends his piece by stating:

The common denominator with all of these dislikes is the mutual conviction that the other person is a loon, even a heretic. There are certain aversions – anchovies, haggis, balut, kidneys – that are entirely understandable (you don’t often hear cries of “you don’t like kimchi?!” except perhaps in certain foodish circles), but when it comes to dissing curry, fish and chips, pasta, or indeed a fry-up, it turns out people are, at best, going to think you very odd indeed. Still, can’t blame a man for trying.

Glenn Kenny chose not to name a film on which his opinion differs from that of the masses.  Does that mean he holds no such opinion?  That no such film exists?  Hardly.  As I said, he used this opportunity to elevate himself above his fellow critics under the pretense that criticism has loftier goals than this sort of muckraking.  I think that he just didn’t want to get his hands dirty.  I prefer the “loons” and the “heretics” who are unafraid of their own subjectivity.  On a related note, I believe that Pauline Kael would have loved this week’s Criticwire Survey.  Especially the word “overrated.”

Further reading:

Hume, Kael, and the Role of Subjectivity in Criticism

On Morality in Criticism

Zero Dark Thirty

An interesting question has been making the rounds in certain critical circles since the release of Kathryn Bigelow’s Zero Dark Thirty this past December.  And I’m not talking about the question of whether or not the film endorses torture (it doesn’t).  I’m talking about the broader question that has been phrased this way by Danny Bowes at Movie Mezzanine:

[…] is a critic under any obligation to render a moral judgment on a film?

After pointing out that the debate extends beyond Zero Dark Thirty to films like Django Unchained and Beasts of the Southern Wild, Bowes states:

With each of these films, critics praising the aesthetics of each have been accused of ignoring, rationalizing, or even siding with offensive content therein. In response, critics have been forced into a “no I do not” defensive posture, and a great deal of huffiness about art for art’s sake and the primacy of the work over the given critic’s personal beliefs and austere objectivity and so forth has ensued.

In the past, I would have agreed with the l’art pour l’art critics who claim that they can separate their personal beliefs from their aesthetic evaluations of a given film and adopt an “objective” or an “impersonal” position from which to judge the work in question.  But not anymore.  Indeed, it is my understanding that an aesthetic judgment is inseparable from a moral judgment, and vice versa.  I think that Bowes agrees:

Every act of criticism is a moral judgment, and not in a glib, media-trolling, mid-’60s Jean-Luc Godard way, either. However objective any critic tries to be in evaluating any work, the evaluation is being conducted by a matrix of observation, cognition, and the innately unique assembly of life experience and education that makes up all the things the critic knows and how s/he knows them.

Yes.  Each person who makes an aesthetic judgment on a work of art cannot escape his or her “unique assembly of life experience and education,” and this assembly includes a person’s adopted morality.  Thus, I cannot consciously separate my moral leanings from my critical evaluations of artworks any more than I can separate my aesthetic taste from my moral judgments, no matter how hard I might try to hide the influence of one over the other.  As the character Bill Haydon says in regard to his treason in Tinker Tailor Soldier Spy, “It was an aesthetic choice as much as a moral one.”

Bowes writes at the end of his piece:

The decision a critic makes to approach a movie on its own terms with as much objectivity as s/he can muster is a moral decision. Not everyone succeeds in completely divesting their preexisting baggage.

Not exactly.  I would say that no one succeeds in this and that the morality present in a work of criticism is never a “decision”; it is inevitable.  In addition, we can never really know the multitude of factors that have brought us to our critical assessments (factors as disparate as temperature, mood, and peer pressure), so how can we choose to ignore some while allowing for others?  We can’t.

In Daybreak, Friedrich Nietzsche writes:

You dislike him and present many grounds for this dislike—but I believe only in your dislike, not in your grounds!  You flatter yourself in your own eyes when you suggest to yourself and to me that what has happened through instinct is the result of a process of reasoning. (D358)

Though criticism remains our best attempt to account for our likes and dislikes, we must recognize the limitations of the undertaking (e.g., the fact that it might just be a post-hoc rationalization of a knee-jerk judgment).  And we must stop pretending that we can consciously control what influences our opinions and what doesn’t, whether it be our moral conditioning, environmental factors, or something else entirely.  The best we can do is be honest regarding the extent of our knowledge in this area.  In most cases it will be minimal.

Further reading:

5 Bizarre Factors That Secretly Influence Your Opinions

Prometheus: “There Is Nothing in the Desert, and No Man Needs Nothing”

Please note that the following post may contain spoilers.

Ridley Scott’s Prometheus is chilling science fiction, a Lovecraftian space odyssey that poses some big questions about the origin of life and its ultimate purpose.  David Denby has called it “a metaphysical ‘Boo!’ movie.”  Andrew O’Hehir compared it to Terrence Malick’s The Tree of Life:

Both are mightily impressive spectacles that will maybe, kinda, blow your mind, en route to a hip-deep swamp of pseudo-Christian religiosity.

I want to counter those claims by demonstrating that, though characters in the film may have faith in something beyond the material world, the film itself (mostly through the android David) depicts a world incompatible with that faith.

The film opens with a humanoid on what is presumably primordial earth.  A spaceship is seen in the distance, apparently abandoning him.  He drinks something from a cup and begins to disintegrate.  His genetic material, we’re led to believe, helped spawn life on earth.  Thus, we’re immediately given the film’s premise: an alien race “engineered” humans through this initial act of terraforming.  This premise, quite naturally, invites skepticism.  Even if an alien race did spark life on earth, there is no way that they could have predicted the paths that this life would take.  There is no way that they would have been able to engineer the many happy accidents that allowed a branch from this seed to evolve into humans.  Later, we will meet a biologist among the crew of the spaceship Prometheus.  He knows how life evolved on earth and voices his skepticism at the idea that we were somehow designed.  How does the script handle this contradiction?  It renders the biologist irrelevant, as nothing more than a cowardly stock character.  But skepticism hardly matters; we have already seen the creation of life on earth, so we must accept this premise, believable or not, as a fact in the world of the film.

This brings us to our protagonist, archaeologist Elizabeth Shaw. She (along with boyfriend Charlie Holloway) is the one who uncovered the cave paintings supporting the theory of extraterrestrial parentage.  The mission of the Prometheus, we learn, is to find our alien ancestors and ask them why they created us.  The assumption, of course, is that there is a meaning to human life, a reason for us being here.  And this meaning, according to Shaw, is out there among the stars for us to discover.  She wears her faith in this idea like a virtue; she also wears a cross.

But Shaw isn’t the only one who has a religious worldview at stake.  Even Peter Weyland (the sinister corporate interest who is funding the mission) expresses faith in metaphysical gobbledygook when he says that David, his android creation, differs from humans in that he does not possess a “soul.”

In a character analysis at the blog Virtual Borderland, the author writes:

We are told that David is different from humans because he has no soul — but is the trick really that David knows humans don’t either? Where humans pretend that they are different, that we have creators with answers to our questions, gods who will elevate us above the rest of the universe, David accepts the empty desert and the trick is simply: not minding that it hurts.

I agree with this analysis, and I think it is a key to understanding David’s function in the film and his obsession with Lawrence of Arabia.  His fondness for the David Lean film is particularly fascinating.  He even attempts to mimic Peter O’Toole through his appearance and mannerisms.  In this ability to learn through experience and observation and to mimic the behavior of model figures, David is perhaps more human than the other characters can comfortably realize, despite his lack of a “soul.”  As the author of the character analysis suggests, maybe David differs most from humans in that he can accept the meaninglessness of existence.  For example, David knows all too well why he was created:

DAVID:  Why do you think your people made me?

HOLLOWAY:  We made you because we could.

DAVID:  Can you imagine how disappointing it would be for you to hear the same thing from your creator?

In exchanges such as this, David perfectly undermines the metaphysical delusions of his companions.

So what of Shaw’s faith?  What does it mean in this context?  As I already discussed, we are shown the creation of life right at the start, so we at least know that Shaw’s theory of extraterrestrial parentage is correct (absurd as it is).  We then see Shaw and Holloway uncover physical evidence to support their claim (cave paintings around the world that depict giant figures pointing to a specific star system).  People are reasonably skeptical, but rather than arguing from the strength of their evidence, Shaw relies on a typical religious defense: “It’s what I choose to believe.”  She clearly possesses a metaphysical bent; she demands a meaning for her life outside of her own making, and as I said earlier, she wears her faith in this objective value like a virtue.  But the manner in which life was created, designed, or engineered is depicted as a material process–not a spiritual one.

Thus, Shaw can accept her theory of extraterrestrial parentage without the need of a metaphysical foundation for this belief.  She has data that supports it (including strong DNA evidence), even if it goes against the established body of scientific data.  So her conviction and her cross are peculiar affectations, much like Captain Janek’s Christmas tree (a cultural symbol that survives through habit and custom).  What’s even more interesting is that Shaw does not discard her faith at the film’s end, even after she exclaims quite exuberantly: “We were so wrong.”  She requests her cross back from David, who had removed it earlier.  He asks: “Even after all this, you still believe, don’t you?”  It’s a valid point.  How can we take Shaw seriously as a scientist if she is so willing to turn a blind eye to all that she has just witnessed?  We are left silently snickering at this all-too-human foible, just as David mocks it in his own special way.

So Prometheus does not support a metaphysical outlook, even if its characters adopt one.  As Jim Emerson points out:  “Not unlike Star Trek V: The Final Frontier, Prometheus uses god as a MacGuffin.”  Furthermore, David the android serves as the perfect foil to the humans and their odd beliefs.  Toward the end of the film, on the brink of death, Weyland declares: “There is nothing.”  “I know,” David responds with appropriate coldness.  “Have a pleasant journey, Mr. Weyland.”

Further reading:

Art and Criticism (Again)

When I started this blog last year, I had a more esoteric view of art than I do now.  Also, if one thing should be clear from my most recent posts, I no longer think that the usual definitions of art (e.g., Joyce’s) are sufficient to cover the full spectrum of human aesthetic experience.  Indeed, I already amended Joyce’s definition (via institutional theories of art) to suit my purposes:

Art is the human disposition of sensible or intelligible matter for an aesthetic end, whereby the aesthetic end is determined by context, tradition (i.e., established evaluative criteria), and audience (i.e., critical appraisal)–not by the artist.

This is adequate, but it still sounds unnecessarily academic.  Are there better definitions out there?

In a recent essay about an appearance of the Blue Man Group on The Celebrity Apprentice, Penn Jillette offered his partner Teller’s definition of art: “Whatever we do after the chores are done.”  I kind of like that.  I’m also fond of Marshall McLuhan’s dictum: “Art is whatever you can get away with.”  These definitions, though unsatisfying in any metaphysical sense, have the benefit of being more in line with how humans actually create and interact with art in practice.

To be clear, this approach (which some will deride as “anything goes”) does not make criticism irrelevant.  I have written extensively about this already, most recently in “Hume, Kael, and the Role of Subjectivity in Criticism.”  However, I would like to point you to a recent video conversation between A. O. Scott and David Carr concerning the purpose of criticism.  (I also recommend Jim Emerson’s sharp analysis of this conversation.)  In particular, I want to highlight the following exchange:

CARR:  But there is no objective excellence, no objective truth. There is only your subjective version of it.

SCOTT:  Do you really think that there’s no common project of deciding what’s beautiful and what’s good and what’s true?

Like Carr, I accept that there are no objective values.  However, like Scott, I believe in the “common project” of criticism: a community of people coming together to decide “what’s beautiful and what’s good and what’s true.”

Scott continues:

I don’t think it’s ever arrived at for all time, but I don’t think that you or anyone else actually believes that we just carry around our own little private, you know, canons of taste that we just sort of protect. Otherwise we’d never talk about any of this stuff. Otherwise, why would we have an arts section in the newspaper? Why would we talk about movies with our friends? Why would we have book clubs?

Well, I think that we do carry around and protect our own “canons of taste.”  However, the point that Scott is making is that taste is malleable (another idea that I have stressed on this blog).  Taste can be transformed through reading and participating in criticism.  Thus, what you think is beautiful, good, and true today will not be beautiful, good, and true “for all time.”

In sum, the role of criticism is not to dictate taste; however, we should remember that it plays an important role (intended or not) in establishing it.

Further reading:

The Meaning of Lists

In an interview with Spiegel in 2009, Umberto Eco discusses an exhibition he curated at the Louvre and its accompanying text, which he edited, called The Infinity of Lists.  Of lists, he says:

The list is the origin of culture. It’s part of the history of art and literature. What does culture want? To make infinity comprehensible. It also wants to create order–not always, but often. And how, as a human being, does one face infinity? How does one attempt to grasp the incomprehensible? Through lists, through catalogs, through collections in museums and through encyclopedias and dictionaries. There is an allure to enumerating how many women Don Giovanni slept with: It was 2,063, at least according to Mozart’s librettist, Lorenzo da Ponte. We also have completely practical lists–the shopping list, the will, the menu–that are also cultural achievements in their own right.

Yes.  And another cultural achievement is the top ten list, dreaded as it is by Roger Ebert.  Film critic Andrew O’Hehir, in introducing his top ten list for 2011, writes:

Crafting an annual top-10 list is no doubt a ludicrous exercise, and I’m not promising I’d have given you the same answers a month ago, or will give you the same ones a month from now. But over the years I’ve grown to appreciate the fact that it forces critics to stop hiding behind relativistic weasel words and high-flown rhetoric, and forces me to defend the murky and individual question of taste. The fact that I–ever so slightly–prefer “Coriolanus” to “Drive,” and “Mysteries of Lisbon” to “Uncle Boonmee Who Can Recall His Past Lives,” definitely tells you something about me as a person and a movie critic.

Indeed, as I argue in a previous post, we cannot help but respond to films subjectively.  Thus, on its own, any list of one’s preferred art objects is going to tell you more about the person making the list than about any universal criteria by which the art objects should be judged.  It can be valued for that reason alone.  Even when a list is accompanied by strong rational arguments in support of its ranking, those arguments say nothing of its objective worth.  Perhaps this is Ebert’s gripe, as he claims (correctly) that lists “have next to nothing to do with the quality of movies.”

That being said, I believe these types of lists are valuable for another reason. Taken together, from a wide variety of critics, they can help us reach a usable standard by which to judge artworks.  The BFI Sight & Sound list of the top ten films of all time (the one list that even Ebert appreciates) is a good example of how this can be done effectively.

Every ten years, the BFI surveys a large number of film critics from all over the world and asks them for their individual top ten lists.  From these individual lists, the BFI compiles the definitive list of the top ten films of all time.  The ten films on this list become exemplars of the standard of taste exhibited by this group of critics.  I doubt that any two individual lists are identical (and you can view all of the individual lists if you do not believe me), but Citizen Kane is continually deemed the standard of excellence in cinematic art.  Kristin Thompson recently wrote a compelling argument against this consensus, but even she concedes that Kane’s status appears to be cemented for the time being.  That’s not to say that tastes won’t change; they will, just as they did back in 1962 when Kane dethroned Vittorio De Sica’s Bicycle Thieves for the number one spot.

The BFI list and others like it are especially valuable for those who do not usually swim the murky waters of film criticism or know which critics might offer the most trustworthy opinions. For them, an aggregated “best of” list is an easy way to discover films that might be worth watching.  The two biggest review aggregators offering such lists are Rotten Tomatoes and Metacritic; in addition, Movie City News always creates a beautiful year-end chart from aggregated top ten lists. What the review aggregators do is take a film’s multiple reviews and give the film a score based on the critical consensus.  Thus, individual tastes can be merged into a single standard of taste, just as with the BFI lists.  People who want to get in on the conversation surrounding film would do well to follow these aggregators, check out their lists of the best reviewed films, watch the films, and then read the reviews.  In this way, one will learn the current standards of taste among critics, learn the arguments in favor of these standards, discover the critics with whom one’s own standard of taste might align (or misalign), and finally, if one wishes, take an active part in the community of criticism.  That is when the true fun begins, for with each new critic (if his or her opinions are heard and deemed valid by the rest of the community) comes a new chance to push the standard of taste in new and exciting directions.

Roger Ebert claims that “all lists are meaningless.”  Clearly, this statement is untrue.  Lists can be full of meaning, as long as there are people who read them and utilize them.  At the very least, they are expressions of the tastes of those who have made them, whether an individual or a critical community.  Thus, they can tell us something about a person in the former case, and they can tell us something about critical standards in the latter.  In either case, the word “meaningless” does not apply.

Further reading:

The Turin Horse

Please note that the following post may contain spoilers.

Béla Tarr’s The Turin Horse is a bleak and beautiful film, one that portrays quite masterfully the frailty of human endeavor, of human civilization.  It does this through breathtaking black and white cinematography captured in long, thoughtful takes.  There is little dialogue, and the music (when present) simply nudges the film along, like the eponymous horse, with its melancholic, plodding rhythm: a funeral dirge for humanity.

The opening narration recounts the following fable:

In Turin on January 3, 1889, Friedrich Nietzsche steps out of the door of number six Via Carlo Alberto, perhaps to take a stroll, perhaps to go by the post office to collect his mail. Not far from him, or indeed very removed from him, a cabman is having trouble with his stubborn horse. Despite all his urging, the horse refuses to move, whereupon the cabman…Giuseppe? Carlo? Ettore?…loses his patience and takes his whip to it. Nietzsche comes up to the throng and puts an end to the brutal scene of the cabman, who by this time is foaming with rage. The solidly built and full-mustached Nietzsche suddenly jumps up to the cab and throws his arms around the horse’s neck sobbing. His neighbor takes him home, where he lies still and silent for two days on a divan until he mutters the obligatory last words: “Mutter, ich bin dumm,” and lives for another ten years, gentle and demented, in the care of his mother and sisters. Of the horse, we know nothing.

This has little bearing on what follows, unless you are familiar with the philosophy of Nietzsche, in which case the film will unfold as both a confirmation of Nietzsche’s anti-metaphysical view of the world and a fascinating refutation of his optimism in regard to our relationship to it.

We begin, appropriately enough, on the horse, a pathetically tired animal, as it carts an old man through a wind-swept wasteland.  After this long take, in which the camera follows the weary journey with an upward gaze, the man and his horse arrive at their humble home.  The man’s daughter rushes out to assist her father in stabling the horse and cart.  Meanwhile, the wind, loud and unceasing, continues pillaging the already gloomy landscape.

We stay with the man and his daughter for six days.  We follow them through their daily routine and observe this routine degrade more and more each day until the characters no longer seem to get any pleasure or meaning from it.  The main obstacle is the unexplainable and incessant wind storm, which the man and his daughter simply gaze upon through a window.  The other problem is the horse, which will no longer obey commands or even eat.  It is as if it has simply resigned from life.

The daily routine of the characters consists of waking, dressing, fetching water from the well, cleaning the horse’s stall, boiling potatoes, eating the potatoes, washing the dishes, and sleeping.  I imagine the man would normally ride the horse into town, but that part of the routine, of course, is disrupted, as others soon will be.

Two notable disruptions arrive in the form of visitors.  The first is a man seeking pálinka (Hungarian brandy).  He cannot find any in town because of the wind storm, our hint that civilization itself is collapsing from the relentless onslaught of nature. The man recounts a fable (a philosophy?) about why the world is the way it is: man’s judgment of himself, god’s hand in all that is terrible, the debasement of the world through touch and acquisition; it is an indictment of sorts.  The character reminds me of the jovial squire from Bergman’s The Seventh Seal.  He seems to take what pleasure he can from existence without guilt and despite horrid circumstances; he is in on the joke that nothing matters.  And if what he says is true, the storm is the means by which the world, though indifferent, will reclaim itself from those who would debase it.  “Come off it,” the old man responds.  “That’s rubbish.”

The second visitation is an unwelcome band of gypsies.  They attempt to take water from the well.  The old man sends his daughter out to disperse them, but he eventually comes out to aid her with an axe.  The gypsies disband, laughing merrily, and they taunt the man and his daughter: “You are weak.  Drop dead.”

The gypsies are a fitting counterpoint to our sad protagonists.  They have healthy horses of which they are in command, they vocally claim the land and the water as their own, and they are not suffering.  Indeed, they appear to be striving–living rather than dying.

The Turin Horse, unlike last year’s visually rich auteurist statement, Terrence Malick’s The Tree of Life, does not depict the world or humanity with any imagined telos (an end goal or purpose).  There is only a tumultuous sense of becoming–chaotic and without reason.  The forms which we inflict upon the formlessness are what give our lives pleasure and meaning.  And those forms (including our routines), as Tarr shows us, are weak, flawed, and ultimately inadequate.  This is where the film refutes Nietzsche’s optimism.  If, as Nietzsche believes, “we possess art lest we perish of the truth,” what happens when art (our form-giving capability) fails us?  What happens when the ugly truth (the valueless nature of existence) is all that remains?  This is why, perhaps, the film opens with the tale of Nietzsche’s resignation to insanity.  Even for him, the tale suggests, the blackness of life became too much to bear.

Consider the plight of our characters:

First, their horse, their taming of wild nature, no longer responds to their bidding.  Then, their well, their taming of the earth, dries up.  Then, their lamps, their taming of the darkness, do not light.  At this point, we become conscious that even cinema–the film we are watching, which has indeed been beautifying the ugliness of existence for us–even that ultimately fails.  As the light goes out, we, along with the characters, are consumed by blackness.  As the characters resign themselves to nothingness, so too must we. “Tomorrow we’ll try again,” the man says.

The film’s narrator finishes the tale that the camera can no longer tell.  The man and his daughter go to sleep, and the storm, comically enough, subsides.  We see the man and his daughter one final time: eating potatoes, joylessly, as they do.  This time, though, a heavy darkness weighs down on them from above like the pulsating black space of a Mark Rothko painting.  Is this ending hopeful?  Maybe–the storm has ended, and our protagonists can now get back to their daily routines.  But is there a difference between simply sustaining life and actually living it?  The gypsies seem to think so.  But not all of us are as capable of adapting to nature’s frightful whims.  We prefer that nature adapt to us, be tamed by our art, and abide by our laws and routines.  When nature refuses? That is the despairing tale of The Turin Horse.

Further reading:

Doomsday Cinema, Part 2: The Turin Horse

Hume, Kael, and the Role of Subjectivity in Criticism

In a previous post, I discuss why I prefer the word “impersonal” to the word “objective” in questions of aesthetic judgment.  I state: “[…] we can make aesthetic judgments independent of personal taste, based solely on our knowledge, experience, and critical understanding of the art in question.  Rather than taking art personally, we can take it impersonally.”

Simply put, I no longer believe this.  I no longer think that an “impersonal” approach to art is possible.  My reason is that I no longer understand “taste” as something separate from “knowledge, experience, and critical understanding.”  Instead, I understand taste as that which encompasses all of those elements (as well as others).  For example, a person’s adopted evaluative criteria will become a part of that person’s taste, along with his or her experience, learning, and values.  For truly, these all play a part in a person’s subjective appraisal of a work.  No matter how much we may want to experience something objectively, impersonally, or purely rationally, we remain stubbornly tied to our individual tastes.

As a case in point, I want to examine the notorious film critic Pauline Kael.  Last year saw the release of both a biography of Kael and a collection of her work.  This prompted many active critics and journalists to write their own appraisals of Kael.  Roger Ebert had this to say:

Pauline had no theory, no rules, no guidelines, no objective standards. You couldn’t apply her “approach” to a film. With her it was all personal. Faithful readers will know I am tiresome in how often I quote Robert Warshow, who in his book The Immediate Experience wrote: “A man goes to the movies. The critic must be honest enough to admit he is that man.” Pauline Kael was that honest. She wrote about her immediate experience, about what she felt.

She’s accused of being inconsistent and contradicting herself. Directors would fall in and out of favor. With her there was no possibility of inconsistency, because she always wrote about what she felt right now. What was the purpose of tilting that emotion to reflect something she wrote earlier? I sat next to her once in a New York screening room. She responded audibly. “Oh, oh, oh!” she’d say, in praise or disapproval. Talking like that would get her in trouble in Chicago. Pauline had–or took–license. You sensed something physical was happening as she watched.

Of his own criticism, Ebert concedes: “In my reviews and those of a great many others you are going to find, for better or worse, my feelings. I feel a responsibility to provide some notion of what you’re getting yourself in for, but after that it’s all subjective.”

Manohla Dargis, in a discussion regarding the merits of Kael in The New York Times, comes to a similar conclusion:

As critics, all we have are our beliefs, ideals, prejudices, blind spots, our reservoirs of historical and personal knowledge, and the strength of our arguments. There are empirical truths that we can say about a movie: it was shot in black and white or color, on film or digital, in widescreen or not, directed by this or that filmmaker. But beyond these absolutes there is only our thinking, opinions, ideologies, methodological approaches and moments in time. That isn’t to say that criticism is a postmodern anything goes; it is to admit that critics are historical actors and that our relationships with movies, as with everything in life, are contingent on those moments.

What Ebert and Dargis seem to be saying, what I have already claimed, and what the example of Kael proves is that there are indeed individual subjective elements that come into play in a critical judgment.

To see how this works, I think that we can apply (interestingly enough) Jean Anthelme Brillat-Savarin’s model of tasting from The Physiology of Taste, in which there are three stages.  However, I think we can simplify it to two concurrent stages.  When appraising an object, we first sense it; as our brain registers the sensation, we immediately start “considering” it (not necessarily consciously or rationally, although that can indeed occur and provide the illusion that we’re operating independently of our body’s conditioning).  What happens when we consider an object?  Our past experiences, our memories, our feelings, our learning, our adopted criteria, and (most importantly) our values all come together (or work against one another) to pass judgment.  Reason might help us sort some of this into a clear, articulate response, but such conscious rationalization is usually unnecessary and will probably only occur, anyway, after a judgment has already been made.  That being said, these rationalizations serve a different purpose–they are what constitute criticism.

Of course, this idea of “no theory, no rules, no guidelines, no objective standards” teeters on the brink of nihilism.  If, ultimately, we each experience an artwork subjectively, what is the point in debating the merit of one opinion over another? How is criticism not simply “postmodern anything goes”?

Fortunately, David Hume addresses this very issue in “Of the Standard of Taste.”  Carolyn Korsmeyer, in her analysis of that work (“Hume and the Foundations of Taste”), expresses the problem in this manner:

If beauty is identified with a particular kind of pleasure, if aesthetic and artistic value is measured by the feelings of the individual perceiver, then one would expect that there would be no grounds for asserting that one aesthetic judgment or expression of pleasure is preferable to any other. People differ, and so do their tastes. However, it becomes clear when reading Hume’s writings on criticism, that tastes, on his account, are not so subjective that no standards can be discerned. In fact, it is quite evident that Hume considered some artistic and literary tastes preferable by far to others.

To be sure, Hume states: “It is natural for us to seek a Standard of Taste; a rule by which the various sentiments of men may be reconciled; at least a decision afforded confirming one sentiment, and condemning another.”

Like Dargis, Hume does not believe that criticism is “postmodern anything goes,” even as he allows for the subjectivity inherent in the wide variety of individual tastes.  No–as he points out, the “joint verdict” of the best critics (consisting of “similarities of sentiment”–the common, shared elements of their opinions–and not necessarily the individual subjective elements) becomes “the true standard of taste and beauty.” This standard of taste, then, if adopted, becomes the very context in which criticism (and, thus, art) becomes communicable and meaningful (i.e., not nihilistic).

Korsmeyer offers the following as an example of how a standard of taste can develop out of individual subjective tastes:

Time is a reliable filter for passing fads and poor judgments, and the verdict of history cancels out individual foibles and produces a universally valid consensus concerning great art. Therefore, according to Hume, although rules of art cannot be codified, standards of taste do emerge as one takes a long look at human society and history and sees how that art which is best suited to please the human frame attains an unquestioned superiority over other, ephemeral creations.

Despite the apparent universal applicability of such standards of taste, however, the individual subjective elements remain the lifeblood of criticism; ultimately, that is why criticism remains an imperfect, mutable process. That is also why it remains fun, engaging, stimulating, and relevant, as the example of Pauline Kael clearly demonstrates.  To be sure, Keith Phipps says of Kael: “Even when she’s wrong, she’s worth reading. I can’t think of any higher praise for a film critic.”

Kael resisted being standardized.  Not many, for example, will share her distaste for Stanley Kubrick.  But she also championed films such as Bonnie and Clyde and directors such as Godard, whose qualities have indeed informed the standard of taste adopted by today’s film critics.  So admitting the subjective nature of criticism does no harm to the practice.  Despite that subjectivity, a standard of taste still develops, and it is that standard which both shapes and challenges our own critical judgments, and vice versa.

So why practice criticism?  Why read it?  Art critic Jonathan Jones offers the following summation:

No [critical] judgment is final. No critic is right, necessarily. It’s just that criticism offers a more honest and realistic understanding of the deep strangeness of our encounters with these mysterious human creations called works of art.

Yes–and in the spirit of subjectivity, that answer is certainly good enough for me.

Further reading: