In Defense of Heresy in Criticism

Full English Breakfast

Once a week, Criticwire asks a group of film critics a question and compiles their responses.  This week’s Criticwire Survey seems to have caused a bit of a stir.  Here is the question posed by Matt Singer:

What movie widely regarded as a cinematic masterpiece do you dislike (or maybe even hate)?

This question and its responses were promoted under the incendiary headline: “Overrated Masterpieces.”  Needless to say, this provoked some outrage, both in the comments and across the web.  Only one critic, Glenn Kenny, appears to have left the proceedings unscathed.  The reason for this is that he refused to name a film:

I find this question especially dispiriting, as it’s really just a form of bait, and a cue for individuals to come up with objects to snicker at, feel superior to, and all that. I’m sure many critics will have a blast with it.

Kenny follows this with a passage from Richard Hell’s autobiography where Hell writes of an encounter with Susan Sontag in which she laments the fact that she has opinions because, as Hell puts it, “opinions will solidify into prejudices that substitute for perception.”

On Twitter, New York Times critic A. O. Scott singled out Kenny for praise:

watch @Glenn__Kenny enlist Susan Sontag and Richard Hell to smack down glib link-trolling pseudo-contrarianism

First of all, I would argue that Kenny himself is using this opportunity to “snicker at” and “feel superior to” his fellow critics.  Second, I would argue that the point of this particular survey is to counter popular opinions that may have solidified into prejudices, not the other way around.  Finally, I think that it is Scott who is being “glib” in his dismissal of the exercise as “pseudo-contrarianism.”

Each individual critic (Kenny included) will have points of divergence from the critical community to which he or she belongs.  This is only natural; individuals have individual tastes (e.g., likes and dislikes) based on individual life experiences.  But here is an unsettling fact: many people will accept that certain films are sacred—sometimes irrationally and without having actually seen them—for the single reason that the films have been blessed with critical approval and labeled masterpieces.  The critics who answered the Criticwire Survey are simply challenging this automatic acceptance, some even going so far as to offer rational and articulate defenses of their opinions (the opposite of pseudo-contrarianism, I would say).

Interestingly, James Ramsden, a food blogger at The Guardian, wrote a piece last week called “The Great British fry-up: it’s a national disgrace.”  The article comes with the following blurb:

The full English breakfast is the most overrated of British dishes – even the name is shuddersome. How did we become shackled to this fried fiasco?

Just as with the Criticwire Survey (and perhaps again due to the word “overrated”), Ramsden experienced a lot of backlash.  He felt compelled to write a response (published only a day after the Criticwire Survey): “Which well-loved foods do you hate?”  In this piece, we learn that Ramsden faced accusations similar to those leveled at the film critics.  For example, he, too, was accused of trolling (maybe by the A. O. Scott of the British food blogging world).  However, Ramsden understands where the attacks are coming from:

I understand it because I’ve felt it too. It is perhaps not a rational reaction to a subjective aversion […], but we feel strongly about food and are thus oddly offended by someone vehemently opposing that which we cherish.

Yes, and people apparently feel strongly about film as well and will oppose subjective aversions to well-loved films with equal vehemence and irrationality.  Ramsden, after providing a long list of similar aversions from some notable chefs and food critics, ends his piece by stating:

The common denominator with all of these dislikes is the mutual conviction that the other person is a loon, even a heretic. There are certain aversions – anchovies, haggis, balut, kidneys – that are entirely understandable (you don’t often hear cries of “you don’t like kimchi?!” except perhaps in certain foodish circles), but when it comes to dissing curry, fish and chips, pasta, or indeed a fry-up, it turns out people are, at best, going to think you very odd indeed. Still, can’t blame a man for trying.

Glenn Kenny chose not to name a film on which his opinion differs from that of the masses.  Does that mean he holds no such opinion?  That no such film exists?  Hardly.  As I said, he used this opportunity to elevate himself above his fellow critics under the pretense that criticism has loftier goals than this sort of muckraking.  I think that he just didn’t want to get his hands dirty.  I prefer the “loons” and the “heretics” who are unafraid of their own subjectivity.  On a related note, I believe that Pauline Kael would have loved this week’s Criticwire Survey.  Especially the word “overrated.”

Further reading:

Hume, Kael, and the Role of Subjectivity in Criticism

A Defense of Banksy

Dancers on a Plane by Jasper Johns

Once again, I feel compelled to address some claims made by the art critic Jonathan Jones at The Guardian.  This time, Jones has written a piece attacking Banksy.  This in itself is not the problem.  The problem is that the attack makes very little sense under close examination.

Here is the crux of Jones’s argument:

Some art can exist just as well in silence and obscurity as on the pages of newspapers. The Mona Lisa is always being talked about, but even if no one ever again concocted a headline about this roughly 510-year-old painting it would still be as great. The same is true of real modern art. A Jasper Johns painting of a network of diagonal marks surrounded by cutlery stuck to the frame, called Dancers On a Plane – currently in an exhibition at the Barbican – was just as real, vital and profound when it was hidden away in the Tate stores as it is under the gallery lights. Johns does not need fame to be an artist; he does not even need an audience. He just is an artist, and would be if no one knew about him. Banksy is not an artist in that authentic way.

I strongly disagree that art can exist in a vacuum; I think it needs an audience to be art.  Thus, I cannot get past the absurdity of the claim that Jasper Johns “does not even need an audience” to be an artist.  How does that work exactly?  It doesn’t.  Jones is simply presupposing a metaphysical reality in which art possesses inherent value independent of humans, a fiction for which he offers no support.  How can a work remain profound if no one is around to bestow the value of profundity upon it?  And does it not take a human mind to transform Jasper Johns’s “network of diagonal marks surrounded by cutlery stuck to the frame” into a cohesive whole?  One cannot, then, dismiss Banksy on the grounds that his work demands an audience.  All art does.

Another problem that I have with Jones’s argument is that he takes the properties that make Banksy aesthetically interesting to most people and transforms them into Banksy’s aesthetic shortcomings:

Banksy, as an artist, stops existing when there is no news about him. Right now he is a story once again, because a “mural” by him (street art and graffiti no longer suffice to describe his pricey works) has been removed from a wall and put up for auction. Next week the story will be forgotten, and so will Banksy – until the next time he becomes a headline.

Part of Banksy’s “art” lies in the impermanence of his pieces and in the confrontational nature of his “murals,” which are designed to jolt people out of their daily routines, to make them stop and notice something, to see things differently.  Perhaps comparisons to static pieces like the Mona Lisa are not the best means of understanding performance-based work of this nature (though I admit that because the art market has laid claim to Banksy, such comparisons are not necessarily off base, either).

But “street art” is hardly the first recognized art form to be temporary and confrontational in the manner adopted by Banksy.  And why does Jones treat fame and branding as faults or weaknesses of the artist?  These attributes were obviously as essential in solidifying the legacies of the artists whom Jones admires as they were in elevating Banksy above his peers.

Jones claims that he wants “art that is physically and intellectually and emotionally real.”  Unfortunately for him, as his blog on Banksy makes clear, he seems to have no idea what that even means.

Further reading:

Banksy goes AWOL

On Morality in Criticism

Zero Dark Thirty

An interesting question has been making the rounds in certain critical circles since the release of Kathryn Bigelow’s Zero Dark Thirty this past December.  And I’m not talking about the question of whether or not the film endorses torture (it doesn’t).  I’m talking about the broader question that has been phrased this way by Danny Bowes at Movie Mezzanine:

[…] is a critic under any obligation to render a moral judgment on a film?

After pointing out that the debate extends beyond Zero Dark Thirty to films like Django Unchained and Beasts of the Southern Wild, Bowes states:

With each of these films, critics praising the aesthetics of each have been accused of ignoring, rationalizing, or even siding with offensive content therein. In response, critics have been forced into a “no I do not” defensive posture, and a great deal of huffiness about art for art’s sake and the primacy of the work over the given critic’s personal beliefs and austere objectivity and so forth has ensued.

In the past, I would have agreed with the l’art pour l’art critics who claim that they can separate their personal beliefs from their aesthetic evaluations of a given film and adopt an “objective” or an “impersonal” position from which to judge the work in question.  But not anymore.  I now understand that an aesthetic judgment is inseparable from a moral judgment, and vice versa.  I think that Bowes agrees:

Every act of criticism is a moral judgment, and not in a glib, media-trolling, mid-’60s Jean-Luc Godard way, either. However objective any critic tries to be in evaluating any work, the evaluation is being conducted by a matrix of observation, cognition, and the innately unique assembly of life experience and education that makes up all the things the critic knows and how s/he knows them.

Yes.  Each person who makes an aesthetic judgment on a work of art cannot escape his or her “unique assembly of life experience and education,” and this assembly includes a person’s adopted morality.  Thus, I cannot consciously separate my moral leanings from my critical evaluations of artworks any more than I can separate my aesthetic taste from my moral judgments, no matter how hard I might try to hide the influence of one over the other.  As the character Bill Haydon says in regard to his treason in Tinker Tailor Soldier Spy, “It was an aesthetic choice as much as a moral one.”

Bowes writes at the end of his piece:

The decision a critic makes to approach a movie on its own terms with as much objectivity as s/he can muster is a moral decision. Not everyone succeeds in completely divesting their preexisting baggage.

Not exactly.  I would say that no one succeeds in this and that the morality present in a work of criticism is never a “decision” but inevitable.  In addition, we can never really know the multitude of factors that have brought us to our critical assessments (factors as disparate as temperature, mood, and peer pressure), so how can we choose to ignore some while allowing for others?  We can’t.

In Daybreak, Friedrich Nietzsche writes:

You dislike him and present many grounds for this dislike—but I believe only in your dislike, not in your grounds!  You flatter yourself in your own eyes when you suggest to yourself and to me that what has happened through instinct is the result of a process of reasoning. (D358)

Though criticism remains our best attempt to account for our likes and dislikes, we must recognize the limitations of the undertaking (e.g., the fact that it might just be a post-hoc rationalization of a knee-jerk judgment).  And we must stop pretending that we can consciously control what influences our opinions and what doesn’t, whether it be our moral conditioning, environmental factors, or something else entirely.  The best we can do is be honest regarding the extent of our knowledge in this area.  In most cases it will be minimal.

Further reading:

5 Bizarre Factors That Secretly Influence Your Opinions

Video Games Are Art

Smithsonian American Art Museum

I had wanted to write about video games as art for some time, but I was worried that the question was no longer relevant–that most people (including me) had finally accepted the fact that video games can be art.  This past November, Disney released Wreck-It Ralph, a film which brings to life video game characters and worlds in the manner of Pixar’s Toy Story.  In his review of the film in The New York Times, A. O. Scott writes:

The secret to its success is a genuine enthusiasm for the creative potential of games, a willingness to take them seriously without descending into nerdy pomposity.

Clearly, I thought, this means that we’ve reached a turning point–that critics like A. O. Scott are now on board and willing to accept the aesthetic potential of games.

But I was wrong.  On November 30, Jonathan Jones, the art critic at The Guardian, published a blog entitled “Sorry MoMA, video games are not art.”  His blog is a response to the fact that the Museum of Modern Art in New York plans to curate a selection of video games as part of its Architecture and Design collection.  Despite the fact that this is not the first time an art museum has played host to video games (the Smithsonian American Art Museum held such an exhibit earlier this year), Jones has decided to put his foot down and play the predictable role of arbiter of what is and isn’t art (the role once famously played by Roger Ebert in this particular debate).  He writes:

Walk around the Museum of Modern Art, look at those masterpieces it holds by Picasso and Jackson Pollock, and what you are seeing is a series of personal visions. A work of art is one person’s reaction to life. Any definition of art that robs it of this inner response by a human creator is a worthless definition. Art may be made with a paintbrush or selected as a ready-made, but it has to be an act of personal imagination.

Whether through ignorance or idiocy, Jones has made an argument that is simply not applicable to video games.  If he were to watch the great documentary from this year on the subject of independent game design, Indie Game: The Movie, he would realize that he has no right to claim that video games are not the work of personal imaginations.  In that film, we see just how personal games can be to their creators.  We watch Phil Fish, for example, as he obsesses endlessly over every detail of his game FEZ, postponing its scheduled release for years and revealing how much of himself is in the game–how it has become his identity.  We also watch Edmund McMillen and Tommy Refenes as they complete Super Meat Boy, an ode to their childhood video gaming experiences. From the Wikipedia synopsis of the film:

McMillen talks about his lifelong goal of communicating to others through his work.  He goes on to talk about his 2008 game Aether that chronicles his childhood feelings of loneliness, nervousness, and fear of abandonment.

Surely this suggests the extent to which games can be the works of personal imagination.  Another film playing the festival circuit this past year, From Nothing, Something, a documentary about the creative process, also features a video game designer among its artist subjects: Jason Rohrer, who “programs, designs, and scores” his games “entirely by himself.”  It does not get more personal than that.

And this is not even limited to independent game design (a field which Jones might not even know exists).  Surely the games of Nintendo’s Shigeru Miyamoto are recognizable as products of that creator’s personal vision.  Through pioneering works such as Donkey Kong, Super Mario Bros., and The Legend of Zelda, Miyamoto became one of the first auteurs of game design.

Regardless, Jones ends his argument against video games as art by making a point about chess:

Chess is a great game, but even the finest chess player in the world isn’t an artist. She is a chess player. Artistry may have gone into the design of the chess pieces. But the game of chess itself is not art nor does it generate art — it is just a game.

Jones’s use of chess to illustrate his case against the aesthetic value of games is interesting because he writes about the game in a previous blog titled “Checkmates: how artists fell in love with chess.”  In this piece, he doesn’t necessarily call chess art (he seems content to assign it the role of muse), but he comes awfully close:

It is a game that creates an imaginative world, with powerful “characters”: this must be why artists were inspired to create designer chess sets long before modern times.

On top of this, Jones seems willing to concede that chess pieces can be art.  Would he also concede that pixelated characters, orchestral scores, and other “pieces” of a video game can be art?  (To be sure, there are clearly “traditional” artists who work on individual aspects of games: graphic designers, writers, and musicians.)  My question would then become:  Why can’t the many artistic pieces cohere into a single work of art that also happens to be a game?  Architects create buildings that serve as works of art as well as living spaces.  Imagine an art critic who would perhaps recognize the artistry in a stained glass window yet say condescendingly of the cathedral in which it is found: “It’s just a building.”  The idea is absurd.

I am all in favor of meaningful distinctions between objects.  We can have art and games as separate categories.  But we must acknowledge that there can indeed be overlap.  I already demonstrated on this blog how food can serve both instrumental and aesthetic ends.  The same is true for games.

In his classic essay “The Artworld,” Arthur Danto writes:

To see something as art requires something the eye cannot descry — an atmosphere of artistic theory, a knowledge of the history of art: an artworld.

The fact of the matter is that video games have now been allowed into two respected art museums (the Smithsonian American Art Museum and the Museum of Modern Art), the National Endowment for the Arts has started to allow funding for game designers, and the conversation about the artistic merits of games is alive and well–within the general populace, yes, but also within the hallowed halls of academia.  This is enough, in my opinion, to qualify video games as art.  Clearly, in practice, that is simply what they are.  Psychologically, people are experiencing them in the same way that they experience objects more commonly classified as art (e.g., novels and movies).  The fact that critics such as Jonathan Jones and Roger Ebert will not allow for the status of art to be extended to games–and that they would rely on smug and silly arguments to prove their points–says more about them than it does about the reality of the situation.  They are great critics, but here, where perhaps they feel their grasp loosening on a subject over which they believed themselves to be experts, they are simply wrong.  We see some metaphysical justifications for their beliefs, but primarily we see the constricting influence of habit and conditioning–their inability to see other than what they have been trained (or educated) to see.  But no matter.  Others seem to have a much easier time seeing the artistic potential of games.

In an interview with USA Today about composing the theme song for the game Call of Duty: Black Ops II, Trent Reznor says:

I’ve watched with a kind of wary eye how gaming has progressed. I was there at the beginning with Pong in the arcade, and a lot of my great childhood memories were around a Tempest machine. I really looked at gaming as a real art form that is able to take a machine and turn it into something that is a challenging, human interaction puzzle game strategy.

And according to Penn Jillette (from the November 18 episode of his Penn’s Sunday School podcast):

Video games are culture; they are a new way of doing art.  You know, I fought against them at first.  I used to say that, you know, being able to make up a story as you went, I fought against that.  I did a couple of whole speeches about how you want the plot in Shakespeare.  But I’ve now understood.

And so have I.  The more interesting questions, moving forward, are: “By what criteria are people recognizing games as art?  By what standards of taste are these games being critiqued?”  As Luke Cuddy puts it in his review of the book The Art of Video Games in The Journal of Aesthetics and Art Criticism:

We must remember to compare the good to the bad, the same way we compare Foucault’s Pendulum (Umberto Eco, 1988) to The Da Vinci Code (Dan Brown, 2003).

So what are the best games?  What are the worst?  What distinguishes them from each other?  I will leave those questions to the more experienced gamers and critics.

Further reading:

Prometheus: “There Is Nothing in the Desert, and No Man Needs Nothing”

Please note that the following post may contain spoilers.

Ridley Scott’s Prometheus is chilling science fiction, a Lovecraftian space odyssey that poses some big questions about the origin of life and its ultimate purpose.  David Denby has called it “a metaphysical ‘Boo!’ movie.”  Andrew O’Hehir compared it to Terrence Malick’s The Tree of Life:

Both are mightily impressive spectacles that will maybe, kinda, blow your mind, en route to a hip-deep swamp of pseudo-Christian religiosity.

I want to counter those claims by demonstrating that, though characters in the film may have faith in something beyond the material world, the film itself (mostly through the android David) depicts a world incompatible with that faith.

The film opens with a humanoid on what is presumably primordial earth.  A spaceship is seen in the distance, apparently abandoning him.  He drinks something from a cup and begins to disintegrate.  His genetic material, we’re led to believe, helped spawn life on earth.  Thus, we’re immediately given the film’s premise: an alien race “engineered” humans through this initial act of terraforming.  This premise, quite naturally, invites skepticism.  Even if an alien race did spark life on earth, there is no way that they could have predicted the paths that this life would take.  There is no way that they would have been able to engineer the many happy accidents that allowed a branch from this seed to evolve into humans.  Later, we will meet a biologist among the crew of the spaceship Prometheus.  He knows how life evolved on earth and voices his skepticism at the idea that we were somehow designed.  How does the script handle this contradiction?  It renders the biologist irrelevant, as nothing more than a cowardly stock character.  But skepticism hardly matters; we have already seen the creation of life on earth, so we must accept this premise, believable or not, as a fact in the world of the film.

This brings us to our protagonist, archaeologist Elizabeth Shaw. She (along with boyfriend Charlie Holloway) is the one who uncovered the cave paintings supporting the theory of extraterrestrial parentage.  The mission of the Prometheus, we learn, is to find our alien ancestors and ask them why they created us.  The assumption, of course, is that there is a meaning to human life, a reason for us being here.  And this meaning, according to Shaw, is out there among the stars for us to discover.  She wears her faith in this idea like a virtue; she also wears a cross.

But Shaw isn’t the only one who has a religious worldview at stake.  Even Peter Weyland (the sinister corporate interest who is funding the mission) expresses faith in metaphysical gobbledygook when he says that David, his android creation, differs from humans in that he does not possess a “soul.”

In a character analysis at the blog Virtual Borderland, the author writes:

We are told that David is different from humans because he has no soul — but is the trick really that David knows humans don’t either? Where humans pretend that they are different, that we have creators with answers to our questions, gods who will elevate us above the rest of the universe, David accepts the empty desert and the trick is simply: not minding that it hurts.

I agree with this analysis, and I think it is a key to understanding David’s function in the film and his obsession with Lawrence of Arabia.  His fondness for the David Lean film is particularly fascinating.   He even attempts to mimic Peter O’Toole through his appearance and mannerisms.  In this ability to learn through experience and observation and to mimic the behavior of model figures, David is perhaps more human than the other characters can comfortably realize, despite his lack of a “soul.”  As the author of the character analysis suggests, maybe David differs most from humans in that  he can accept the meaninglessness of existence.  For example, David knows all too well why he was created:

DAVID:  Why do you think your people made me?

HOLLOWAY:  We made you because we could.

DAVID:  Can you imagine how disappointing it would be for you to hear the same thing from your creator?

In exchanges such as this, David perfectly undermines the metaphysical delusions of his companions.

So what of Shaw’s faith?  What does it mean in this context?  As I already discussed, we are shown the creation of life right at the start, so we at least know that Shaw’s theory of extraterrestrial parentage is correct (absurd as it is).  We then see Shaw and Holloway uncover physical evidence to support their claim (cave paintings around the world that depict giant figures pointing to a specific star system).  People are reasonably skeptical, but rather than argue from the strength of her evidence, Shaw relies on a typical religious defense: “It’s what I choose to believe.”  She clearly possesses a metaphysical bent; she demands a meaning for her life outside of her own making, and as I said earlier, she wears her faith in this objective value like a virtue.  But the manner in which life was created, designed, or engineered is depicted as a material process–not a spiritual one.

Thus, Shaw can accept her theory of extraterrestrial parentage without the need of a metaphysical foundation for this belief.  She has data that supports it (including strong DNA evidence), even if it goes against the established body of scientific data.  So her conviction and her cross are peculiar affectations, much like Captain Janek’s Christmas tree (a cultural symbol that survives through habit and custom).  What’s even more interesting is that Shaw does not discard her faith at the film’s end, even after she exclaims quite exuberantly: “We were so wrong.”  She requests her cross back from David, who had removed it earlier.  He asks: “Even after all this, you still believe, don’t you?”  It’s a valid point.  How can we take Shaw seriously as a scientist if she is so willing to turn a blind eye to all that she has just witnessed?  We are left silently snickering at this all-too-human foible, just as David mocks it in his own special way.

So Prometheus does not support a metaphysical outlook, even if its characters adopt one.  As Jim Emerson points out:  “Not unlike Star Trek V: The Final Frontier, Prometheus uses god as a MacGuffin.”  Furthermore, David the android serves as the perfect foil to the humans and their odd beliefs.  Toward the end of the film, on the brink of death, Weyland declares: “There is nothing.”  “I know,” David responds with appropriate coldness.  “Have a pleasant journey, Mr. Weyland.”

Further reading:

Neuroaesthetics

Philosophy is in a strange place right now.  It struggles for relevance while the empirical sciences continue to master every area over which it once held sway.  In an interview with The Atlantic, physicist Lawrence Krauss puts it this way:

There are areas of philosophy that are important, but I think of them as being subsumed by other fields. In the case of descriptive philosophy you have literature or logic, which in my view is really mathematics. Formal logic is mathematics, and there are philosophers like Wittgenstein that are very mathematical, but what they’re really doing is mathematics—it’s not talking about things that have affected computer science, it’s mathematical logic. And again, I think of the interesting work in philosophy as being subsumed by other disciplines like history, literature, and to some extent political science insofar as ethics can be said to fall under that heading. To me what philosophy does best is reflect on knowledge that’s generated in other areas.

Alas, it seems that even aesthetics must now be subsumed by the sciences.  This is not a bad thing.  Why rely on metaphysical conjecture when physical data exists?  Indeed, the most interesting work being done in aesthetics right now relies on behavioral, psychological, and neurological data.  This interdisciplinary approach to art, I learned, is called “neuroaesthetics.”

In a blog post on Psychology Today, Dr. William Hirstein makes a strong case for this exciting new field “in which researchers attempt to understand how the brain responds to art.”   He asks:

What happens in the brain when people listen to their favorite piece of music or appreciate a great painting? Why do all human societies create and value art? How did a creature subject to the evolutionary process evolve the need for art? Does producing art have some sort of survival value for us, or is it merely associated with some more pragmatic trait that does?

Hirstein makes it clear that not all are happy with the idea of philosophy moving in this direction.  I am sure that neuroaesthetics strikes many as cold and detached.  For them, the mysteries of art remain forever outside the province of science.  But I think that such an approach is just what is needed.  In fact, I see no alternative.

According to Hirstein:

If we refuse to look inside the skull, the tremendous variety of artworks can start to make the process of understanding what they have in common look hopeless. According to a view called “particularism” each artwork must be understood on its own merits, which may have nothing in common with any other artwork. But then how can we ever meaningfully speak and think about artists and art in general? Neuroaesthetics promises to break this deadlock by finding that the vast variety of artworks do have something in common: the response they provoke in our brains.

When I was studying the similarities between art and food, this type of neurological data was most helpful.  I suspect, too, that further studies in neuroaesthetics will demonstrate that the brain responds to other types of phenomena in a way eerily similar to that in which it responds to art. This will prove especially enlightening, and I look forward to the many new discoveries that work in this field is bound to yield.

Further reading:

Art and Criticism (Again)

When I started this blog last year, I had a more esoteric view of art than I do now.  And if one thing should be clear from my most recent posts, it is that I no longer think that the usual definitions of art (e.g., Joyce’s) are sufficient to cover the full spectrum of human aesthetic experience.  Indeed, I already amended Joyce’s definition (via institutional theories of art) to suit my purposes:

Art is the human disposition of sensible or intelligible matter for an aesthetic end, whereby the aesthetic end is determined by context, tradition (i.e., established evaluative criteria), and audience (i.e., critical appraisal)–not by the artist.

This is adequate, but it still sounds unnecessarily academic.  Are there better definitions out there?

In a recent essay about an appearance of the Blue Man Group on The Celebrity Apprentice, Penn Jillette offered his partner Teller’s definition of art: “Whatever we do after the chores are done.”  I kind of like that.  I’m also fond of Marshall McLuhan’s dictum: “Art is whatever you can get away with.”  These definitions, though unsatisfying in any metaphysical sense, have the benefit of being more in line with how humans in practice actually create and interact with art.

To be clear, this approach (which some will deride as “anything goes”) does not make criticism irrelevant.  I have written extensively about this already, most recently in “Hume, Kael, and the Role of Subjectivity in Criticism.”  However, I would like to point you to a recent video conversation between A. O. Scott and David Carr concerning the purpose of criticism.  (I also recommend Jim Emerson’s sharp analysis of this conversation.)  In particular, I want to highlight the following exchange:

CARR:  But there is no objective excellence, no objective truth. There is only your subjective version of it.

SCOTT:  Do you really think that there’s no common project of deciding what’s beautiful and what’s good and what’s true?

Like Carr, I accept that there are no objective values.  However, like Scott, I believe in the “common project” of criticism: a community of people coming together to decide “what’s beautiful and what’s good and what’s true.”

Scott continues:

I don’t think it’s ever arrived at for all time, but I don’t think that you or anyone else actually believes that we just carry around our own little private, you know, canons of taste that we just sort of protect. Otherwise we’d never talk about any of this stuff. Otherwise, why would we have an arts section in the newspaper? Why would we talk about movies with our friends? Why would we have book clubs?

Well, I think that we do carry around and protect our own “canons of taste.”  However, the point that Scott is making is that taste is malleable (another idea that I have stressed on this blog).  Taste can be transformed through reading and participating in criticism.  Thus, what you think is beautiful, good, and true today will not be beautiful, good, and true “for all time.”

In sum, the purpose of criticism is not to dictate taste; however, we should remember that criticism plays an important role (intended or not) in establishing it.

Further reading:

The Meaning of Lists

In an interview with Spiegel in 2009, Umberto Eco discusses an exhibition he curated at the Louvre and its accompanying text, which he edited, called The Infinity of Lists.  Of lists, he says:

The list is the origin of culture. It’s part of the history of art and literature. What does culture want? To make infinity comprehensible. It also wants to create order–not always, but often. And how, as a human being, does one face infinity? How does one attempt to grasp the incomprehensible? Through lists, through catalogs, through collections in museums and through encyclopedias and dictionaries. There is an allure to enumerating how many women Don Giovanni slept with: It was 2,063, at least according to Mozart’s librettist, Lorenzo da Ponte. We also have completely practical lists–the shopping list, the will, the menu–that are also cultural achievements in their own right.

Yes.  And another cultural achievement is the top ten list, dreaded as it is by Roger Ebert.  Film critic Andrew O’Hehir, in introducing his top ten list for 2011, writes:

Crafting an annual top-10 list is no doubt a ludicrous exercise, and I’m not promising I’d have given you the same answers a month ago, or will give you the same ones a month from now. But over the years I’ve grown to appreciate the fact that it forces critics to stop hiding behind relativistic weasel words and high-flown rhetoric, and forces me to defend the murky and individual question of taste. The fact that I–ever so slightly–prefer “Coriolanus” to “Drive,” and “Mysteries of Lisbon” to “Uncle Boonmee Who Can Recall His Past Lives,” definitely tells you something about me as a person and a movie critic.

Indeed, as I argue in a previous post, we cannot help but respond to films subjectively.  Thus, on its own, any list of one’s preferred art objects is going to tell you more about the person making the list than about any universal criteria by which the art objects should be judged.  It can be valued for that reason alone.  Even if a list is accompanied by strong rational arguments in support of its ranking, that says nothing about its objective worth.  Perhaps this is Ebert’s gripe, as he claims (correctly) that lists “have next to nothing to do with the quality of movies.”

That being said, I believe these types of lists are valuable for another reason. Taken together, from a wide variety of critics, they can help us reach a usable standard by which to judge artworks.  The BFI Sight & Sound list of the top ten films of all time (the one list that even Ebert appreciates) is a good example of how this can be done effectively.

Every ten years, the BFI surveys a large number of film critics from all over the world and asks them for their individual top ten lists.  From these individual lists, the BFI compiles the definitive list of the top ten films of all time.  The ten films on this list become exemplars of the standard of taste exhibited by this group of critics.  I doubt that any two individual lists are identical (and you can view all of the individual lists if you do not believe me), but Citizen Kane is continually deemed the standard of excellence in cinematic art.  Kristin Thompson recently wrote a compelling argument against this consensus, but even she concedes that Kane’s status appears to be cemented for the time being.  That’s not to say that tastes won’t change; they will, just as they did back in 1962 when Kane dethroned Vittorio De Sica’s Bicycle Thieves for the number one spot.

The BFI list and others like it are especially valuable for those who do not usually swim the murky waters of film criticism or know which critics might offer the most trustworthy opinions.  For them, an aggregated “best of” list is an easy way to discover films that might be worth watching.  The two biggest review aggregators offering such lists are Rotten Tomatoes and Metacritic; in addition, Movie City News always creates a beautiful year-end chart from aggregated top ten lists.  What the review aggregators do is take a film’s multiple reviews and give the film a score based on the critical consensus.  Thus, individual tastes can be merged into a single standard of taste, just as with the BFI lists.  People who want to get in on the conversation surrounding film would do well to follow these aggregators, check out their lists of the best-reviewed films, watch the films, and then read the reviews.  In this way, one will learn the current standards of taste among critics, learn the arguments in favor of these standards, discover the critics with whom one’s own standard of taste might align (or misalign), and finally, if one wishes, take an active part in the community of criticism.  That is when the true fun begins, for with each new critic (if his or her opinions are heard and deemed valid by the rest of the community) comes a new chance to push the standard of taste in new and exciting directions.
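For the curious, here is a minimal sketch of how such a merger of individual lists might work, written in Python purely for illustration.  The function name, the point values, and the sample ballots are my own assumptions; the BFI, Rotten Tomatoes, and Metacritic each use their own, more involved methodologies.

```python
from collections import Counter

def aggregate_top_tens(critic_lists, points_for_rank=None):
    """Merge individual ranked top-ten lists into one consensus ranking.

    critic_lists: a list of lists, each one critic's ranked top ten.
    points_for_rank: points awarded per position (index 0 = first place).
    """
    if points_for_rank is None:
        # Simple Borda-style scoring: first place earns 10 points, tenth earns 1.
        points_for_rank = list(range(10, 0, -1))

    scores = Counter()
    for ranked_list in critic_lists:
        for position, film in enumerate(ranked_list[:10]):
            scores[film] += points_for_rank[position]

    # Highest total first: a rough "joint verdict" of the surveyed critics.
    return scores.most_common()

# Hypothetical ballots from three critics with partially overlapping tastes.
ballots = [
    ["Citizen Kane", "Vertigo", "Tokyo Story"],
    ["Vertigo", "Citizen Kane", "The Rules of the Game"],
    ["Citizen Kane", "Tokyo Story", "Sunrise"],
]
print(aggregate_top_tens(ballots))
# Citizen Kane tops the aggregate even though one critic ranked it second.
```

Even this toy version makes the point: no individual ballot has to match the aggregate, yet a shared standard of taste emerges from the pile of subjective lists.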

Roger Ebert claims that “all lists are meaningless.”  Clearly, this statement is untrue.  Lists can be full of meaning, as long as there are people who read them and utilize them.  At the very least, they are expressions of the tastes of those who have made them, whether an individual or a critical community.  Thus, they can tell us something about a person in the former case, and they can tell us something about critical standards in the latter.  In either case, the word “meaningless” does not apply.

Further reading:

The Turin Horse

Please note that the following post may contain spoilers.

Béla Tarr’s The Turin Horse is a bleak and beautiful film, one that portrays quite masterfully the frailty of human endeavor, of human civilization.  It does this through breathtaking black and white cinematography captured in long, thoughtful takes.  There is little dialogue, and the music (when present) simply nudges the film along, like the eponymous horse, with its melancholic, plodding rhythm: a funeral dirge for humanity.

The opening narration recounts the following fable:

In Turin on January 3, 1889, Friedrich Nietzsche steps out of the door of number six Via Carlo Alberto, perhaps to take a stroll, perhaps to go by the post office to collect his mail. Not far from him, or indeed very removed from him, a cabman is having trouble with his stubborn horse. Despite all his urging, the horse refuses to move, whereupon the cabman…Giuseppe? Carlo? Ettore?…loses his patience and takes his whip to it. Nietzsche comes up to the throng and puts an end to the brutal scene of the cabman, who by this time is foaming with rage. The solidly built and full-mustached Nietzsche suddenly jumps up to the cab and throws his arms around the horse’s neck sobbing. His neighbor takes him home, where he lies still and silent for two days on a divan until he mutters the obligatory last words: “Mutter, ich bin dumm,” and lives for another ten years, gentle and demented, in the care of his mother and sisters. Of the horse, we know nothing.

This has little bearing on what follows, unless you are familiar with the philosophy of Nietzsche, in which case the film will unfold both as a confirmation of Nietzsche’s anti-metaphysical view of the world and as a fascinating refutation of his optimism in regard to our relationship to it.

We begin, appropriately enough, on the horse, a pathetically tired animal, as it carts an old man through a wind-swept wasteland.  After this long take, in which the camera follows the weary journey with an upward gaze, the man and his horse arrive at their humble home.  The man’s daughter rushes out to assist her father in stabling the horse and cart.  Meanwhile, the wind, loud and unceasing, continues pillaging the already gloomy landscape.

We stay with the man and his daughter for six days.  We follow them through their daily routine and observe this routine degrade more and more each day until the characters no longer seem to get any pleasure or meaning from it.  The main obstacle is the unexplainable and incessant wind storm, which the man and his daughter simply gaze upon through a window.  The other problem is the horse, which will no longer obey commands or even eat.  It is as if it has simply resigned from life.

The daily routine of the characters consists of waking, dressing, fetching water from the well, cleaning the horse’s stall, boiling potatoes, eating the potatoes, washing the dishes, and sleeping.  I imagine the man would ride the horse into town, but that part of the routine, of course, is disrupted, as others soon will be.

Two notable disruptions arrive in the form of visitors.  The first is a man seeking pálinka (Hungarian brandy).  He cannot find any in town because of the wind storm, our hint that civilization itself is collapsing from the relentless onslaught of nature. The man recounts a fable (a philosophy?) about why the world is the way it is: man’s judgment of himself, god’s hand in all that is terrible, the debasement of the world through touch and acquisition; it is an indictment of sorts.  The character reminds me of the jovial squire from Bergman’s The Seventh Seal.  He seems to take what pleasure he can from existence without guilt and despite horrid circumstances; he is in on the joke that nothing matters.  And if what he says is true, the storm is the means by which the world, though indifferent, will reclaim itself from those who would debase it.  “Come off it,” the old man responds.  “That’s rubbish.”

The second visitation is an unwelcome band of gypsies.  They attempt to take water from the well.  The old man sends his daughter out to disperse them, but he eventually comes out to aid her with an axe.  The gypsies disband, laughing merrily, and they taunt the man and his daughter: “You are weak.  Drop dead.”

The gypsies are a fitting counterpoint to our sad protagonists.  They have healthy horses of which they are in command, they vocally claim the land and the water as their own, and they are not suffering.  Indeed, they appear to be striving–living rather than dying.

The Turin Horse, unlike last year’s visually rich auteurist statement, Terrence Malick’s The Tree of Life, does not depict the world or humanity with any imagined telos (an end goal or purpose).  There is only a tumultuous sense of becoming–chaotic and without reason.  The forms which we inflict upon the formlessness are what give our lives pleasure and meaning.  And those forms (including our routines), as Tarr shows us, are weak, flawed, and ultimately inadequate.  This is where the film refutes Nietzsche’s optimism.  If, as Nietzsche believes, “we possess art lest we perish of the truth,” what happens when art (our form-giving capability) fails us?  What happens when the ugly truth (the valueless nature of existence) is all that remains?  This is why, perhaps, the film opens with the tale of Nietzsche’s resignation to insanity.  Even for him, the tale suggests, the blackness of life became too much to bear.

Consider the plight of our characters:

First, their horse, their taming of wild nature, no longer responds to their bidding.  Then, their well, their taming of the earth, dries up.  Then, their lamps, their taming of the darkness, do not light.  At this point, we become conscious that even cinema–the film we are watching, which has indeed been beautifying the ugliness of existence for us–even that ultimately fails.  As the light goes out, we, along with the characters, are consumed by blackness.  As the characters resign themselves to nothingness, so too must we. “Tomorrow we’ll try again,” the man says.

The film’s narrator finishes the tale that the camera can no longer tell.  The man and his daughter go to sleep, and the storm, comically enough, subsides.  We see the man and his daughter one final time: eating potatoes, joylessly, as they do.  This time, though, a heavy darkness weighs down on them from above like the pulsating black space of a Mark Rothko painting.  Is this ending hopeful?  Maybe–the storm has ended, and our protagonists can now get back to their daily routines.  But is there a difference between simply sustaining life and actually living it?  The gypsies seem to think so.  But not all of us are as capable of adapting to nature’s frightful whims.  We prefer that nature adapt to us, be tamed by our art, and abide by our laws and routines.  When nature refuses? That is the despairing tale of The Turin Horse.

Further reading:

Doomsday Cinema, Part 2: The Turin Horse

Hume, Kael, and the Role of Subjectivity in Criticism

In a previous post, I discuss why I prefer the word “impersonal” to the word “objective” in questions of aesthetic judgment.  I state: “ […] we can make aesthetic judgments independent of personal taste, based solely on our knowledge, experience, and critical understanding of the art in question.  Rather than taking art personally, we can take it impersonally.”

Simply put, I no longer believe this.  I no longer think that an “impersonal” approach to art is possible.  My reason is that I no longer understand “taste” as something separate from “knowledge, experience, and critical understanding.”  Instead, I understand taste as that which encompasses all of those elements (as well as others).  For example, a person’s adopted evaluative criteria will become a part of that person’s taste, along with his or her experience, learning, and values.  For truly, these all play a part in a person’s subjective appraisal of a work.  No matter how much we may want to experience something objectively, impersonally, or purely rationally, we remain stubbornly tied to our individual tastes.

As a case in point, I want to examine the notorious film critic Pauline Kael.  Last year saw the release of both a biography of Kael and a collection of her work.  This prompted many active critics and journalists to write their own appraisals of Kael.  Roger Ebert had this to say:

Pauline had no theory, no rules, no guidelines, no objective standards. You couldn’t apply her “approach” to a film. With her it was all personal. Faithful readers will know I am tiresome in how often I quote Robert Warshow, who in his book The Immediate Experience wrote: “A man goes to the movies. The critic must be honest enough to admit he is that man.” Pauline Kael was that honest. She wrote about her immediate experience, about what she felt.

She’s accused of being inconsistent and contradicting herself. Directors would fall in and out of favor. With her there was no possibility of inconsistency, because she always wrote about what she felt right now. What was the purpose of tilting that emotion to reflect something she wrote earlier? I sat next to her once in a New York screening room. She responded audibly. “Oh, oh, oh!” she’d say, in praise or disapproval. Talking like that would get her in trouble in Chicago. Pauline had–or took–license. You sensed something physical was happening as she watched.

Of his own criticism, Ebert concedes: “In my reviews and those of a great many others you are going to find, for better or worse, my feelings. I feel a responsibility to provide some notion of what you’re getting yourself in for, but after that it’s all subjective.”

Manohla Dargis, in a discussion regarding the merits of Kael in The New York Times, comes to a similar conclusion:

As critics, all we have are our beliefs, ideals, prejudices, blind spots, our reservoirs of historical and personal knowledge, and the strength of our arguments. There are empirical truths that we can say about a movie: it was shot in black and white or color, on film or digital, in widescreen or not, directed by this or that filmmaker. But beyond these absolutes there is only our thinking, opinions, ideologies, methodological approaches and moments in time. That isn’t to say that criticism is a postmodern anything goes; it is to admit that critics are historical actors and that our relationships with movies, as with everything in life, are contingent on those moments.

What Ebert and Dargis seem to be saying, what I have already claimed, and what the example of Kael proves is that there are indeed individual subjective elements that come into play in a critical judgment.

To see how this works, I think that we can apply (interestingly enough) Jean Anthelme Brillat-Savarin’s model of tasting from The Physiology of Taste, in which there are three stages.  However, I think we can simplify it to two concurrent stages.  When appraising an object, we first sense it; as our brain registers the sensation, we immediately start “considering” it (not necessarily consciously or rationally, although that can indeed occur and provide the illusion that we’re operating independently of our body’s conditioning).  What happens when we consider an object?  Our past experiences, our memories, our feelings, our learning, our adopted criteria, and (most importantly) our values all come together (or work against one another) to pass judgment.  Reason might help us sort some of this into a clear, articulate response, but such conscious rationalization is usually unnecessary and will probably only occur, anyway, after a judgment has already been made.  That being said, these rationalizations serve a different purpose–they are what constitute criticism.

Of course, this idea of “no theory, no rules, no guidelines, no objective standards” teeters on the brink of nihilism.  If, ultimately, we each experience an artwork subjectively, what is the point in debating the merit of one opinion over another? How is criticism not simply “postmodern anything goes”?

Fortunately, David Hume addresses this very issue in “Of the Standard of Taste.”  Carolyn Korsmeyer, in her analysis of that work (“Hume and the Foundations of Taste”), expresses the problem in this manner:

If beauty is identified with a particular kind of pleasure, if aesthetic and artistic value is measured by the feelings of the individual perceiver, then one would expect that there would be no grounds for asserting that one aesthetic judgment or expression of pleasure is preferable to any other. People differ, and so do their tastes. However, it becomes clear when reading Hume’s writings on criticism, that tastes, on his account, are not so subjective that no standards can be discerned. In fact, it is quite evident that Hume considered some artistic and literary tastes preferable by far to others.

To be sure, Hume states: “It is natural for us to seek a Standard of Taste; a rule by which the various sentiments of men may be reconciled; at least a decision afforded confirming one sentiment, and condemning another.”

Like Dargis, Hume does not believe that criticism is “postmodern anything goes,” even as he allows for the subjectivity inherent in the wide variety of individual tastes.  No–as he points out, the “joint verdict” of the best critics (consisting of “similarities of sentiment”–the common, shared elements of their opinions–and not necessarily the individual subjective elements) becomes “the true standard of taste and beauty.” This standard of taste, then, if adopted, becomes the very context in which criticism (and, thus, art) becomes communicable and meaningful (i.e., not nihilistic).

Korsmeyer offers the following as an example of how a standard of taste can develop out of individual subjective tastes:

Time is a reliable filter for passing fads and poor judgments, and the verdict of history cancels out individual foibles and produces a universally valid consensus concerning great art. Therefore, according to Hume, although rules of art cannot be codified, standards of taste do emerge as one takes a long look at human society and history and sees how that art which is best suited to please the human frame attains an unquestioned superiority over other, ephemeral creations.

Despite the apparent universal applicability of such standards of taste, however, the individual subjective elements remain the lifeblood of criticism; ultimately, that is why criticism remains an imperfect, mutable process. That is also why it remains fun, engaging, stimulating, and relevant, as the example of Pauline Kael clearly demonstrates.  To be sure, Keith Phipps says of Kael: “Even when she’s wrong, she’s worth reading. I can’t think of any higher praise for a film critic.”

Kael resisted being standardized.  Not many, for example, will share her distaste for Stanley Kubrick.  But she still championed films such as Bonnie and Clyde and directors such as Godard whose qualities have indeed informed the standard of taste adopted by today’s film critics.  So admitting the subjective nature of criticism does no harm to the practice.  In spite of that subjectivity, a standard of taste still develops, and it is that standard which both shapes and challenges our own critical judgments, and vice versa.

So why practice criticism?  Why read it?  Art critic Jonathan Jones offers the following summation:

No [critical] judgment is final. No critic is right, necessarily. It’s just that criticism offers a more honest and realistic understanding of the deep strangeness of our encounters with these mysterious human creations called works of art.

Yes–and in the spirit of subjectivity, that answer is certainly good enough for me.

Further reading: