Atonement

A few brief thoughts on this film, which I saw last night. First of all, it is just as exquisitely made as Joe Wright’s last film, 2005’s Pride and Prejudice. Like that film (also starring Keira Knightley and based on a beloved book), Atonement is chock-full of sumptuous costumes, sets, and luxuriant camera movement. The film is as stylish and artistically accomplished as anything you’ll see this year.

There is one incredibly beautiful single-shot sequence in particular—at the bombed-out beach city of Dunkirk. It’s at least five minutes long, and the roving camera effortlessly captures an eye-popping array of fluid sights and sounds. It’s one of those sequences that only the cinematic art form can deliver, and at times like these the film offers something Ian McEwan’s novel cannot: a jaw-dropping experience of sight, sound, space and time.

But more than the striking artistry of this film (and its great performances), I was most affected by the final ten minutes, in which—without spoiling it—the entire story is thrown into doubt. Or, I should say, the film redefines itself as more than just an epic love story: it becomes a meditation on the nature of art and storytelling.

What are the personal or psychological motivations to tell stories or make art? Is it for the benefit of others? Or is it to make amends with ourselves and atone for our former sins? And in telling stories, is fidelity to “how it was” really as important as making the story as whole and fulfilling as possible?

In addition to raising all of these questions, Atonement seems to be emphasizing the formalist split between story and plot (fabula and syuzhet, to use Russian formalist terminology). That is, the raw chronology of events as they actually happened (the “story,” or fabula) and the author’s selective arrangement and presentation of them (the “plot,” or syuzhet). With a film like Atonement, there are at least three realities going on at any given moment: the reality of what actually happened (presumably in the life of the author), the reality of the author’s portrayal of it (which in this case is admittedly skewed and subjective), and the reality of the audience (which is always different, viewer to viewer). Atonement weaves a gorgeous, epic and tragic tale, but it intentionally undermines itself by questioning the truthfulness of perspective, memory, and reconstructed reality.

Oddly, after I left the theater my thoughts went to Harry Potter. Specifically, I was thinking about J.K. Rowling’s recent announcement that the headmaster of Hogwarts, Albus Dumbledore, is and always has been a homosexual. When she made the stunning revelation I immediately thought, “hey, what right does she have to make Dumbledore gay?” But then I thought that just because she says he is doesn’t mean he is in my construction of that world. After all, the “reality” of Potter world in Rowling’s mind is not necessarily ever the same as mine, or any other reader’s. Art—even if it is objectified (and mass-reproduced)—is always subjectively experienced and interpreted. Thus, its “reality” is never as concrete as we think (or hope) it is. Rowling can say Dumbledore is gay all she wants—and even write him that way if she wishes… but the fact is, he is only an idea on pages. He is whatever the reader makes him to be.

The word “atonement” means “at one” and is defined by Merriam-Webster as “the reconciliation of God and humankind through the sacrificial death of Jesus Christ.” In the novel and film, the character of Briony (who is ultimately revealed to be the author of the novel, Atonement) is trying to achieve a peace and at-oneness by “playing God” and reconciling herself to the worlds she has shattered. But the tragedy of it all is that she is only “God” within the pages of her fictional accounts and reparative revisionism. Fiction can help heal, but it can never alter history. Or maybe it can—depending on how you define “history.”

Let the Lists Begin!

As you know, I loooove lists... so if you thought I was going to limit myself to one "best of 2007" post on Dec. 31 and that's it, you'd be mistaken! "Best of the Year" frenzy begins today on The Search, and will go through the end of the month (culminating with my top ten movies of 2007 on New Year's Eve!)

So, to kick things off, and because this category has no risk of missing any latecomer additions, today I'm listing my picks for the best television shows of 2007:

10) American Idol (Fox): What can I say? As trifling as it is, this show is the most compelling television for four months out of the year... I'm addicted.

9) The Hills (MTV): Am I joking? Sort of… But anyone who has seen this show must admit it has a definite “can’t turn away” quality. Plus, from a theoretical, “what is real?” point of view, the show is fascinating.

8) Project Runway (Bravo): Continues to be the most interesting, consistently quality reality series on television.

7) The Daily Show / Colbert Report (Comedy Central): Yes, these are two different shows, but the spirit is the same in both. It's the Comedy Central "newsblock," and it's ridiculously fun to watch.

6) Rome (HBO): HBO's Caesar series only had two seasons, but its great cast (a who's who of British thespians) and classy period melodrama made for some really good, highbrow TV.

5) The Office (NBC): Gets better and better every season… the cast has nailed down the nuances and hilarious quirks of their characters, and the writing is consistently dead-on.

4) Lost (ABC): This show redeemed itself near the end of its third season, reminding us all why we got so addicted in the first place. Can’t wait for its January return!

3) 30 Rock (NBC): Miles above the majority of comedy on TV in terms of sharp, culturally astute humor. Tina Fey, Alec Baldwin and Tracy Morgan lead the funniest ensemble cast since Arrested Development.

2) Mad Men (AMC): Who knew AMC was in the business of making amazing one-hour dramas? This 60s period piece (about Madison Avenue ad men, their whiskey and their women) was the best new series on TV this fall—with great acting, glossy eye candy, and sharp social commentary.

1) Friday Night Lights (NBC): I suppose it’s getting repetitive by now, but this show really is the best thing on TV. The acting, writing, production, and general “breath of fresh air” spirit of the whole thing is really unrivaled among network shows. Here’s hoping it’ll survive for a third season!

Upside Cinema

“Holiday Season” in Hollywood often means two things: Awards movies and feel-good, family fare. And the two usually do not overlap. Sadly. However, this season a lot of the happiest, most joyful films are also some of the best, most critically acclaimed and Oscar buzzworthy. Could it be that Hollywood is finally waking up to the fact that movies can be real, gritty, AND positive? That “true to life” does not always need to leave the audience feeling morose and down on humanity?

Take the film Juno, for example. Here’s a film that straddles the line between fluffy teen comedy and heavy relationship drama with utmost ease. It’s a beautiful, maturely told story that cuts no emotional corners and doesn’t force-feed anything (whether it be laughs or tears). It deals with serious human turmoil (teen pregnancy, abortion, divorce) with a rare and elegant nuance that manages to make it funny without making light of it. When 16-year-old Juno (the unforgettable Ellen Page) tells her parents that she’s been “dealing with things way beyond my maturity level,” she might as well be speaking for the film as a whole. Juno could so easily have slipped into cheap teen sex comedy, Alexander Payne-esque satire or some heavy-handed family melodrama (Life as a House comes to mind), but it refuses to be pigeonholed in any one of these genres. Instead, Juno focuses on its characters and life in general. The result is a film that feels frayed and weary but ultimately hopeful—and above all, real.

Some other films that have come out recently have also taken the high road and made the messiness of life seem somehow lovely and affirming. Bella is an obvious example—a film that hinges on a serious matter (unwanted pregnancy) but is otherwise concerned with everyday joys like good food, family, and dancing. Dan in Real Life also has this feeling of joyful revelry and earnestness (even if it sometimes feels a tad forced). One of my favorite films of the year, Lars and the Real Girl, also features sadness and strife within an overwhelmingly warm, life-is-good worldview. I would even put The Savages into this group. It’s a seemingly glum film about aging and death, but the way that it refuses to let its characters fall victim to cynicism and self-destruction makes it ultimately just as hopeful and loving as Enchanted (itself a prime example of this “feel-good cinema” trend).

These types of films come as a breath of fresh air amid the typical “prestige/arthouse” fare that wallows in human depravity and despair. I’m thinking specifically of two films that have recently been released (to critical acclaim) that I think are more despairing and nihilistic than they need be: Sidney Lumet’s Before the Devil Knows You’re Dead and Noah Baumbach’s Margot at the Wedding. Both of these films are high-quality and superbly acted (especially Margot… Nicole Kidman is remarkable), but both left me feeling like I needed to take a shower to wash away the despair and macabre family dysfunction. Both films focus on family, and in each case there is little to no love to go around. Devil is about a pair of brothers (Philip Seymour Hoffman and Ethan Hawke) who rob their own parents and descend into a spiral of sex, drugs, and cover-up. Margot is more subtle and passive-aggressive, but perhaps even more insidious. Even Jack Black (at his manic slacker best) cannot lift the film out of its pervasive misanthropy and existential confusion. Even Ingmar Bergman—in all of his nihilistic eloquence—never plumbed the hateful depths that are mined in Margot.

I’m not saying such films are not useful or worth seeing—they are, in small doses, just like Virginia Woolf or James Joyce. But as “true” as they may seem (and as admittedly resonant as the acting is), I’d much rather see a million Juno replicas. These feel-good art films are just as real and skillful and uncompromising as the “downer” best picture nominees—the difference is that the “reality” they choose to portray is the upside of life, not the underbelly. And I daresay our world needs a lot more “upside” stories right now.

I'm Not There

I just saw I’m Not There, the new ultra-artsy “biopic” about Bob Dylan. Let me just say: it’s an amazing film. Whether or not it’s an accessible film is a different story (it isn’t), but as far as a film that is stunning to watch—from first to last frame—I’m Not There is one of the best of the year.

By now you all know the gimmick: six actors of various ages, races, and genders each playing some iteration of “Dylan.” But the thing is, it isn’t a gimmick at all; in fact, it fits so perfectly with what this film is about… I can’t imagine it being right in any other way.

On one level this is a film about Bob Dylan—the complicated artist who had many “phases,” career turns, personalities, and iconic moments. Indeed, the film is as much about Dylan the man as it is about Dylan the decade: the aura and zeitgeist of the “sixties” which he so embodied.

And yet on another level the film is about identity in general. “Dylan”—or the lack of any one identifiable Dylan—is just an easy case study in what we might call the larger “identity crisis” in postmodern Western culture.

Prior to the Industrial Revolution, our sense of “self” or identity was more or less static—our social roles defined, our frames localized. But as the world pushed toward modernity, things became much more fractured and specialized, and our social roles diversified. Soon the “self” became “selves” that corresponded to the different hats we wore and locales we existed in—the home, the workplace, the playground, etc… Modernity eroded our certainty about pretty much everything, including the notion that “who we are” is something we can control or understand… if it exists at all. People like Freud, Jung, and Lacan emphasized the massively complicated and elusive nature of personality, and later postmodernists like Foucault (who positioned “self” as a discursive construct rather than a real entity) and Baudrillard (who called identity “the label of existence”) further deconstructed any notion that there is a Self above and beyond our “selves.”

Dylan was living at a time when all of this was very much in the air—with modernity wreaking havoc (Vietnam, Cold War, etc) and the postwar “technocracy” breeding cookie-cutter specialists and subsequently a counterculture that defined itself in terms of the multiplicity of things it wasn’t. In some ways Dylan is the pop cultural embodiment of all this socio-cultural confusion. As society was struggling to define itself amid a world spinning in so many directions, so too was Bob Dylan.

But beyond the historical context of Dylan and the 60s, this film struck me as something I could relate to on a much more personal level. I’m not sure I subscribe (at least cognitively) to the postmodern theorists and rhetorics of the indefinable self, but watching this film I couldn’t help but recognize myself in the whirlwind of existential fluidity being displayed on screen.

There is a real sadness in the film’s desperate search for the self. The questions are never asked explicitly—and indeed, fractured identity is never problematized but rather organically assumed—but nonetheless, beneath the cool exteriors of each version of Dylan lies a spiritual angst and unsettledness. It is a spirit of confusion and fragmentation that I think we all—in this hyperlinked, frenetic world—can relate to. Who am I really? Am I the guy on TV? On stage? The character written about in the press? Or more germane to the non-celebrities among us today: Am I my Facebook profile? My blog persona? Or is that all a “separate” self from who I am with my closest friends and loved ones? Are all these individuated selves just some version of the same thing? And if so, does what I think I am really matter when everyone I ever meet sees me through different eyes?

The feeling that you get watching I’m Not There is the same hollow, perplexing feeling that the title implies: that in everything I’m seeing, experiencing, saying, performing, there is one thing that is conspicuously absent: myself. It is the feeling of being removed from yourself and simply observing from afar, akin to that dream experience of observing yourself as a character in some narrative that you are simultaneously experiencing first-person. It’s a feeling that reminds me of video games and avatars—playing “myself” in both a first and third person sense.

This is all very confusing and perhaps counterproductive, but there is an undeniable exhilaration to it as well. Being able to step back and analyze the breathtakingly complex nature of humanity—indeed, even within the one human being that is yourself—provides a fittingly diverse array of emotions. If that’s something you are not afraid to experience, go see I’m Not There.

Top Twenty Defining Films of the 00s

Because I LOVED this post about the top ten films of the millennium (thus far), and because I love lists, and probably because I’m sometimes a copycat, I decided to compile a list of the twenty most defining films released since 2000.

The key word here is defining. My list isn’t so much about the BEST as in raw, objective quality as it is about how well these films capture or embody the moment of the 00s. Just as films like The Graduate, Easy Rider, and Medium Cool defined the zeitgeist of the Sixties, what are the films we will look back upon as the best and most defining films of the first decade of the 21st century? Here is my list:

Inland Empire (2007): Though David Lynch’s Mulholland Drive (2001) is probably the better film, Inland Empire is certainly more “of the moment.” The all-digital, hallucinatory epic (it looks like a home video from hell) is a three-hour montage of nightmarish postmodern images and rabbit trails—an assemblage of 21st century anxiety and scatterbrained vignettes of the most mind-bending sort.

The New World (2006): Terrence Malick’s film, though set in the earliest days of English settlement in America, has a lot to say about how we view the world now. The film is an elegiac tone poem for a paradise lost— ecologically, spiritually, and culturally. Its fluid images and hushed voiceover fragments create one long, cathartic purge for our collective, world-weary soul.

A.I. (2001): This film ushered in the 21st century with a particularly 21st century gimmick: the mashup. The Spielberg/Kubrick film is also thoroughly modern in its dystopic imagery and technophobic preoccupations: the all-too-immediate question of what happens when our technology becomes more real to us than our fellow humans.

Lost in Translation (2003): Brilliant in the way that it embodies globalization and its discontents in the twenty-first century, Sofia Coppola’s graceful, nuanced film captures both the joys and existential angst of a glossy, post-industrial, spiritually-wayfaring society.

Me and You and Everyone We Know (2005): This quirky little film from artist Miranda July is all about the odd mutations of human communication and connection in a digital age. What happens when our computer-mediated relationships turn out to be less than appealing in the real world?

Nine Lives (2005): Nine fragments of nine individual lives, told in segments over a series of nine long shots: all of them women, all unresolved glimpses into tangled lives with branching trajectories. It may sound convoluted, but this film evokes so much truth in its snapshot structure. It’s akin to the soundbite news stories or googled tidbits that populate our everyday windows into other peoples’ worlds… only better.

Children of Men (2006): Like A.I., this futuristic sci-fi epic provides an exceptionally dour vision of the not-too-distant future. The film deals not so much with technology, however, as with the consequences of politics, war, terrorism, immigration, and environmental disintegration. Like much of the doomsday rhetoric in society today, however, there are some glimpses of hope within the chaos.

Before Sunset (2004): This film, which is essentially an extended conversation between Ethan Hawke and Julie Delpy, is perhaps the most elegantly urgent film of our increasingly anxious historical moment. It’s about not letting things slip away in a world where nothing, not even a second chance, is guaranteed.

George Washington (2000): David Gordon Green’s debut film is anachronistic (there are no computers or cell phones to be seen), but while it may not feel completely comfortable in the 21st century, neither does it feel at home in the 20th. The film is in some ways a lament for the “olden days” of green grass, safe streets, American dreams—but it is also looking away from all that—towards a new future that leaves behind the racial, relational, and economic strife of bygone days.

Flags of Our Fathers / Letters From Iwo Jima (2006): Clint Eastwood’s WWII double-whammy is both a classic war epic and a totally postmodern recasting of our collective national memory of “The War.” By showing the Battle of Iwo Jima from both the American (Flags) and Japanese (Letters) perspectives, Eastwood provides a decidedly 21st century narrative that is fueled by our general malaise (post Iraq) about America’s foreign policy.

United 93 (2006): 9/11 is the defining event of this decade (thus far), and United 93 is the best filmic representation of it. The documentary-style drama brings us viscerally back to the terror of that day, offering a disturbing glimpse inside the hijacked Flight 93 as well as a resonant look at the unfolding chaos on the ground.

25th Hour (2002): Shot in the shadows of the blue-light specters of the World Trade Center, Spike Lee’s film captures the complicated post-9/11 mood of America. Ostensibly about one man’s (Edward Norton) last night before heading off to prison, 25th Hour is really a letter to NYC and America—full of all the rage, love, sadness, and hope that Lee so keenly conjures up in his films.

The Royal Tenenbaums (2001): Wes Anderson’s most complete, satisfying cinematic entrée, Tenenbaums is a gloriously somber iteration of the sort of hip-cultural/youthful nostalgia that has defined the 00s. Anderson’s hyper-stylized, immaculately arranged art direction and mise-en-scene also seem to have started numerous trends in both film and television.

Being John Malkovich (2000): You could argue that the most pressing question of the digital age is that of identity. Being John Malkovich takes this question and runs with it—to very trippy results. Spike Jonze’s film feels like a video game in an online puppet fetish community; it is wildly postmodern and nonsensical.

Southland Tales (2007): This soon-to-be cult classic from Richard Kelly (Donnie Darko) is a frenetic, insane, balls-out overture to all that is off-kilter in 21st century, post 9/11 America. Like the decade it satirizes, the film is a mashup of politics, religion, entertainment and technology—a warped but beautiful vision of a surrealist Lynchian apocalypse.

The Departed (2006): Last year’s best picture Oscar winner is a thoroughly contemporary film in both obvious ways (the importance of cell phones for the script) and on more subtle levels (the transnational migration of the film from the Hong Kong original—Infernal Affairs—is a decidedly recent phenomenon in cinema). Furthermore, the film’s urban, unrepentant nihilism feels quite authentic in the context of our current cultural quagmire.

Waking Life (2001): One is tempted to write this animated (via rotoscope) film off as an exercise in high style (which it is), but Richard Linklater’s 2001 film is also stunning in its seamless and gloriously garrulous vignettes full of new-millennium philosophizing.

Kill Bill (2003-04): This two-volume epic from Quentin Tarantino throws together a zillion pop culture artifacts to form a surprisingly effective, coherent narrative that fits nicely into the post-9/11 revenge-film trend. The genre, musical, stylistic, and thematic hybridity demonstrated in the film(s) may not be solely Tarantino’s domain any longer, but he still does it the best.

Garden State (2004): Some called it The Graduate for the Net-generation; others called it commodified cool. Whatever you call it, this film struck an unmistakable chord with many, many young people—disillusioned, medicated, and unsure of physical place and “home” in an increasingly de-physicalized world.

Tarnation (2004): Using a mass of accumulated home video from throughout his life, director Jonathan Caouette takes us on a harrowing journey through his troubled childhood, messed-up family, and drug-addled existence. It’s a truly tragic and very personal film, and perhaps the first masterpiece of the “home movie / iMovie” genre.

Derelict Chic

Los Angeles is a place where anyone can be a celebrity—and I mean anyone. It’s also a city that boasts one of the largest homeless populations in the world (50,000 and rising). It was only a matter of time, then, before a homeless person became a celebrity.

Meet John Wesley Jermyn (aka “The Crazy Robertson”)—a streetperson who has lived on Robertson Blvd in L.A. for twenty some years. Like many vagrants in the City of Angels, Jermyn comes from a successful background (he was a star baseball player in high school and college, and was drafted by the Kansas City Royals in 1969). Unlike most vagrants, however, Jermyn has a clothing line named after him.

Kitson, a trendy boutique in the uber-popular Robertson shopping corridor, has recently launched "The Crazy Robertson" brand of T-shirts and sweatshirts. The line includes a $98 hoodie with Jermyn’s likeness on the back and the words “No Money, No Problems.” The twenty-something trio that launched the label made a deal that offers 5% of the line’s net profits to Jermyn, though so far he has refused to accept much cash, preferring to be paid in food, liquor and paper for his art projects.

In the meantime, Beverly Hills hipsters are snatching up the Crazy Robertson shirts—the latest fad in the increasingly odd and self-conscious “gauche/trash” trend in L.A. fashion.

Nary an indie-rock concert today is without dozens of rich kids dressed in Olsen-twin derelict, “ashcan” homeless style. A few weeks ago I was at a Joanna Newsom concert (a freakfolk harpist/singer-songwriter) and there was loads of this boho, straggly-haired unkemptness. I even saw one guy with a stick hoisted over his shoulder with a cloth sack hanging off the end of it, railroad bum style. I felt like I was in a Jack Kerouac novel.

Homelessness is probably not trendy or cool if you are a homeless person, but it is increasingly chic for many wealthy and hip folks in Los Angeles. Look no further than L.A.’s infamous “Skid Row.” At 50 square blocks, this bastion of third-world poverty is the largest encampment of homelessness in the nation. But it is also—increasingly—the hottest site of high-end real estate development in downtown Los Angeles. Literally across the street from the homeless tent camps are newly renovated loft spaces that rent for $1,000 to $2,000 a month. In efforts to (perhaps) get in touch with their unpretentious earthiness, many yuppies are moving into the gentrified shantytown. Oscar-nominated “it” actor Ryan Gosling lives in a loft on Skid Row. “You can't filter yourself from reality there,” Gosling remarked in a Guardian interview.

As bizarre as this all is, it does make some sense. People long to be “homeless-friendly”—especially rich, socially conscious, guilty white folks. And since riding public transportation, working at a soup kitchen or volunteering at a city mission is out of the question for much of the leisure class, moving in next door is the next best option! Spending hundreds of dollars on designer homeless clothes sends a message of solidarity, right?

Well, maybe, but solidarity does nothing to alleviate real world problems. The gentrification of Skid Row may “clean up” downtown L.A., but where will all the homeless people go? I wonder if Ryan Gosling realizes that the “reality” he is paying top dollar to live within will be directly impacted by his being there? Do the patrons of Kitson realize that the $98 they spend on a “Crazy Robertson” sweatshirt could buy ten sweatshirts for people on the streets?

Probably not, but that’s because “derelict chic” is a trend. And trends have little concern for consequences.

The Case for Criticism

This Thursday is Thanksgiving—a day when we should wake up to the overwhelming goodness and bounty in our lives. It is a wonderful and much-needed occasion to reflect on what is good in the world: family, home, health, happiness, etc.

On this day we are often reminded that being “critical” or “cynical” does no one any good. As a critic by trade, I sometimes feel a little guilty around Thanksgiving. Life’s too short to go around criticizing things, right? Shouldn’t we be thankful for what is good in life rather than complain or criticize what is wrong? The world needs more positive thinking, after all.

But criticism gets a bad rap when it is associated with cynicism or some other negative word. Sure, there are types of criticism that are rooted in unhelpful, aggressively deconstructive attitudes. But other criticism comes from a constructive spirit of enrichment and appreciation. Some criticism, in other words, is meant to make the world better. I might even suggest that all criticism—when it is serious, well-informed, and nuanced—benefits humanity.

The world is a complex, overwhelming place. There is a whole lot of good and a whole lot of bad—an absolute glut that gets bigger by the day. Without questions and criticism, it would be unmanageable—or if not unmanageable, then at least unlivable.

Criticism is all about understanding, theorizing, making simple what is complex… It is plucking out and propelling the best of what is otherwise obscure. The purpose of criticism is to champion goodness, truth, and beauty and to criticize that which is bad, dishonest, or ugly (things that are, arguably, increasingly subjective).

I recently wrote an article on criticism for Relevant magazine in which I came to the defense of the much-maligned discipline/vocation. Here’s a brief excerpt:

[The job of a critic] should NOT be to try to keep people away from bad movies. Instead, we should try to keep people from missing the great movies. Sure, I enjoy writing scathing reviews of atrocious films (who doesn’t?), but I’d much rather write about a film that I love. It’s fun to make a dent in the undeserving monstrosities, but it’s way more fulfilling to give some momentum to the deserving-yet-unknown little guy.

I think the food critic character in Ratatouille (a movie I highly recommend) puts it nicely in his final speech of the movie. He speaks for all critics, I think, when he says:

“In many ways, the work of a critic is easy. We risk very little yet enjoy a position over those who offer up their work and their selves to our judgment. We thrive on negative criticism, which is fun to write and to read. But the bitter truth we critics must face is that, in the grand scheme of things, the average piece of junk is more meaningful than our criticism designating it so. But there are times when a critic truly risks something, and that is in the discovery and defense of the new.”

Indeed, the discovery and defense of something new—something true, beautiful, significant, progressive—is when the critic feels most validated. It’s when anyone feels most validated. Yes, it’s true that the cultural significance of “the critic” is waning. In our bottom-up, user-driven society, top-down suggestion is way less persuasive than peer-to-peer recommendations. We see through marketing and are suspicious of opinions—unless they are from people we trust. And yet to be someone that people trust (more than just your best friends and family) takes some devotion to the critical discipline. (read the rest of the article here...)

I’m not saying that all critics are valid or helpful, just that the idea of criticism is not something we should fear or view as inherently negative.

What we have to be thankful for is often brought to our mind, elucidated, and eloquently celebrated by the critical act. Far from something that is contrary to a positive, enriched outlook on life, criticism is an essential and invaluable affirmation of who we are as rationally-endowed, finely-tuned beings.

Southland Tales

Southland Tales is a gloriously incoherent, colossally ambitious, creatively explosive masterpiece of American cinema. Such breathless critical hyperbole is fitting for such a breathlessly hyperbolized film. Everything about Richard Kelly’s second film (following cult hit Donnie Darko) is over the top, super-sized, and shamelessly indulgent. It’s a million little pieces of postmodern ramblings, pop-culture pastiche, and political farce… but it all adds up to an experience that’ll bowl you over in its ballsy cinematic exuberance.

Tales is the type of film critics love to write about. But it’s also the type of film you can’t really write about. There’d never be enough words to capture everything one could say. And because Tales eschews conventional structure, narrative, and a general concern for rhetorical efficacy, so too will my “review.” The film is a fluid, freestyle hemorrhage of scatterbrained thoughts, arguments, and observations. It’s a film that glories in the art of irreverent, probably pretentious mimesis; and so go I down the same meandering path.

  • First scene of the film: home video, Texas, July 4, 2005. Neighborhood kids gathered in general frivolity for a child’s birthday party (à la the South American alien scene in Signs). Suddenly: boom! Mushroom cloud! Terrified screams! WWIII, terrorism, CBS’s Jericho (a name that comes up throughout the film), Pat Frank’s Alas Babylon!
  • I do NOT trust the Cannes Film Festival. Southland Tales almost didn’t get distribution because those snobby cinematic dilettantes booed it to all hell. But they also loathed Marie Antoinette and The Brown Bunny. Case closed.
  • Tales is wicked political farce: a strikingly insidious steam valve for the bubbling rage, seething malaise, and widespread 21st century disgust with American democracy. It’s Fox News Channel, Enron, neo-Cons, hippie liberals, neo-Marxist performance artists, racist cops, gerrymandering (via severed thumbs), militant electioneering, Fallujah war stories, Big Brother surveillance, Patriot Act-sanctioned federalized Internet, frathouse beer bongs, American flags, Hustler billboards on tanks, wildfires, riots, earthquakes, and Armageddon all rolled into one.
  • Video screens and interfaces define the look of this film. “HD Live”—bringing you all the best of Iraq War footage and “Girls Gone Wild” Springbreak debauchery… sponsored by Hustler, Panasonic, and Bud Light. Miranda Richardson (playing the conniving, backroom mastermind wife of a presidential candidate) sits in a control room with wall-to-wall screens pumping real time war updates, LAX bathroom stall surveillance, celebrity news, scrolling updates, and teaser tidbits (“How safe is your car? Find out which cars are the favorites of the terrorist car bomb underground… after the break!”). The revolution WILL be televised!
  • Sarah Michelle Gellar plays Krysta Now, a porn star turned multi-media talkshow host with a cable TV show called “Now”—a “topical discussion chat reality show” in which a panel of porn stars debate a slate of important issues like war, crime, “social reform,” abortion, and teen horniness. The show equates “important national issues” with X-rated banter about sexual positions. It’s as captivating and ridiculous as any Ann Coulter diatribe on a cable news talkshow.
  • In addition to Buffy, the film’s cast is loaded with a who’s who of pop culture icons, B celebrities and other such unconscionable casting choices (“Booger” from Revenge of the Nerds is in it!). The Rock (or, as he likes to be called now, Dwayne something) is the mysterious locus of the film—a celebrity wrestler turned politician/screenwriter who’s married to Mandy Moore (playing a senator’s daughter) and writing a script that becomes Southland Tales somewhere around the one-hour mark. Seann William Scott plays dual roles as an L.A. cop carousing with neo-Marxist revolutionaries (played by SNL stars Cheri Oteri and Amy Poehler) and his identical brother/clone, who spends a good portion of the film in a dumpster. Other stars of the film include Bai Ling (as the heavily eye-shadowed “Serpentine”), Kevin Smith (as a philosophizing mastermind of the 4th dimension… similar to “The Architect” of The Matrix), and Jon Lovitz (as a corrupt and moribund L.A. cop).
  • Justin Timberlake—God bless him—steals the show. He’s a celeb-turned Iraq War vet named Private Pilot Abilene. He also narrates the film, quoting T.S. Eliot (“this is the way the world ends, not with a whimper but with a bang”) and half of the book of Revelation. He’s a God-fearing prophet (perhaps one of the “two witnesses of Jerusalem”?) and a beer-guzzling druggie (addicted to “Fluid Karma,” which is a revolutionary substance somehow key to the oil-strapped world’s energy crisis).
  • Best scene of the movie: Timberlake (in all his SexyBack/NSYNC glory) breaking into a dance sequence/lipsync to “All These Things That I Have Done” by The Killers. Backed by a chorus line of blonde-wigged dancers, Timberlake mouths the words to the song (“I’ve got soul but I’m not a soldier”) while walking toward and glaring at the camera, music-video style (i.e. uber-serious), all the while pouring Bud Light beer onto his head.
  • Tales is a cinematic homage to L.A. (via films like Short Cuts, Magnolia, The Big Lebowski) but more than anything it’s a love letter to David Lynch (himself an auteur of L.A. in all its incomprehensible Southland mayhem). Moby’s thick, synth-happy score is an MTV-age riff on Angelo Badalamenti’s ethereal sonic layering that defines the Lynch brand (Blue Velvet, Twin Peaks, etc). There’s a lurid, primal elegance to the artifice of Tales that both stems from and expands on Lynch’s feverish cinematic delirium. The climax of the Lynch love comes in the film’s final act—with an unforgettable performance of “The Star Spangled Banner” sung partly in Spanish by Rebekah “Llorando” Del Rio (the crystal-voiced mystery singer from Mulholland Drive’s stunning “Club Silencio” sequence). Simply astounding.
  • Krysta Now is the prescient voice of cultural narrative in the film. She comments on the past (“All the pilgrims did was ruin the Indian orgy of freedom”), the future (“Scientists are saying the future is going to be far more futuristic than originally predicted”), and even the apocalyptic now: “The New York Times said God is Dead!” Buffy Michelle Gellar exudes a sharp comic timing that gives the film some of its funniest moments (and there are a lot of them).
  • The music in the film—anchored by Moby’s luxuriant score—kills. Great songs are featured from the Pixies, Black Rebel Motorcycle Club, Jane’s Addiction, Waylon Jennings, Big Head Todd and the Monsters, and a slew of Britpop bands (Radiohead, Blur, Muse, Elbow)… even a well-placed Beethoven’s 9th.

So much more to be said, but I’ll wrap it up with an incoherent string of images, phrases, and adjectives that Southland Tales evokes: American dysfunctional “party on” ethos, decontextualized soundbite windows media, reality as pop art vaudeville, entertainment/political/technocratic superstructure, Huxleyan dystopia, SUVs humping, The Hindenburg in flames, Jesus Christ stigmata tattoos on The Rock, spacetime continuum rifts, Kubrick, Robert Frost, Tarantino, off-road rollerblades, Venice Beach freaks/anarchists, Santa Monica pier snipers, Skid Row version of Children of Men firefight, Cheetos-eating stalkers, pop pastiche trifle with Herculean vigor and convenient election-year timing, epoch-defining, Gen X orgy of cultish Trekkie fanboyisms, desacralized Neo savior complex, O.J. Simpson/early 90s L.A., Greta Van Susternist exploitation, Lacan’s mirror stage, the fractured postmodern self, and pimps who are never suicidal.

The Writers' Strike: Doomsday for TV?

For those outside of Hollywood and NYC, the Writers Guild strike probably seems distant, irrelevant, and maybe a bit superfluous. But soon enough everyone will feel its effects—in the short term (lots of reruns, sports, and reality TV this winter) and in long-term, systemic shifts in the broadcast media landscape.

To quickly summarize what the strike is all about, in a word: the Internet. The last time the WGA went on strike, in 1988, the fight was over home video residuals (i.e., how much does the writer get per video sold or rented?). The debate today is partly over DVD residuals (writers now get only 8 cents per DVD sold), but by most accounts the days of the DVD are numbered. Thus, the real focus of the dispute between writers and studios is compensation for Internet content. For every show streamed or downloaded on a network website, writers get nothing. This is a problem for them, but the networks refuse to budge.

In one of my classes last week, Greg Daniels (creator/show-runner for The Office) spoke to us about the strike. Earlier in the day he had been on the picket lines with other Office staff, which you can see in this video (he’s the guy with glasses). Daniels told us that the strike was all about show content on the Internet, which networks maintain is solely promotional/marketing in purpose, even though—according to Daniels—the ads on the network websites are twice as valuable per 1000 views as anything on TV. But are the writers seeing any of this money? Not a dime.

For obvious (albeit risky) reasons, the networks and their studios are not conceding or negotiating anything. They recognize that the immense money to be made online is the future, and thus they’re taking a hard-line proprietary stance. If the belligerent posturing continues, the strike could last at least as long as the ’88 strike (5 months) or maybe even longer. All your favorite shows will be relegated to reruns, reality shows will enjoy a reluctant renaissance, and American Idol’s ratings will go higher into the stratosphere than ever before.

In the meantime, the writing talent in Hollywood will be jobless… in theory. But the longer the strike goes on, the more I think the good writers will go elsewhere with their material. Everyone pretty much agrees that TV is inevitably going to move online. So why should writers wait for the networks? In the all-access, narrowcast, niche Internet, who needs broadcast networks? Writers may as well circumvent the networks entirely: acquire private financing from a third party, produce the shows independently, market them virally, and exhibit them online.

Lest you think made-for-the-Internet shows are still a long way off, think again. Marshall Herskovitz and Edward Zwick (My So-Called Life, thirtysomething) have a new show called quarterlife that premiered on MySpace on Sunday and will be shown in 36 webisode installments on www.quarterlife.com. The twentysomething ensemble drama is a fictionalized serial that supplements a larger social-networking site for aspiring artists and creative people in their twenties. Sound like a brave new world? That’s because it is.

Television as we used to know it—a place where shows appeared on certain days and times that we had to tune in to, tape, or miss—is disappearing before our eyes. With Tivo, iTunes, webisodes, and online streaming, we are no longer tied down to a day, time or medium through which we consume media. We determine how we consume an episode of a show. It’s a completely me-centric media experience.

I’m convinced that we are just a few years out from a massive change in our very definition of television.

Soon we will buy most of our TV shows the way we buy a magazine—either by subscribing for a year or picking it up à la carte. For $20 or $30 we will be able to buy a season of our favorite shows and have exclusive online access to download or view them. And this money would go directly to the people making the show—with no network or distribution middlemen. Thus, if J.J. Abrams announced a new spinoff Lost series to be shown online to subscribers only, he could feasibly finance it entirely himself. It would require Abrams to convince loyal Lost viewers (about 15 million in the U.S. alone) to shell out $20 for a “season pass” to view or download a 20-episode season. That would mean $300 million in income for Abrams—more than enough to cover the show’s $3–4 million-per-episode budget. And this is without any mention of advertiser revenue, which in the old model of TV was the one and only income source.
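The back-of-the-envelope math here is easy to sanity-check. A quick sketch, using only the rough estimates above (the viewer count, pass price, and per-episode budget are the post’s own guesses, not real data):

```python
# Back-of-the-envelope check of the hypothetical subscriber-funded Lost spinoff.
# Every number below is a rough estimate from the discussion above, not real data.

viewers = 15_000_000          # loyal U.S. Lost viewers (estimate)
season_pass = 20              # dollars for a 20-episode "season pass"
episodes = 20
cost_per_episode = 4_000_000  # high end of the $3-4 million/episode figure

revenue = viewers * season_pass
production_cost = episodes * cost_per_episode

print(f"Subscription revenue: ${revenue:,}")                    # $300,000,000
print(f"Production cost:      ${production_cost:,}")            # $80,000,000
print(f"Margin:               ${revenue - production_cost:,}")  # $220,000,000
```

Notably, even if only a third of those 15 million viewers actually paid, the $100 million in revenue would still cover the production budget.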

Essentially I’m suggesting a new model of entertainment delivery that is funded solely through mini-contributions from millions of viewers. But of course, this is not a new model at all! It’s called the movies! TV and cinema have been converging in style for decades now. Now they are taking the last step of convergence in business: on-demand, web-based, à-la-carte everything.

Call me crazy, but this is the future. The Writers' Strike is just hurrying it all along.

No Country For Old Men

Joel and Ethan Coen’s new film, No Country For Old Men, is not an easy film to watch. It is desperately nihilistic and almost apocalyptic, in the way that Cormac McCarthy is so apt at capturing. It’s an anachronistic Texas western in look and mood—with great action scenes, shootouts, and dead desert imagery. But it is a world-weary, existential western as well: somewhere between Unforgiven and 3:10 to Yuma.

Christianity 101: Exclusivity

I have had several conversations and encounters in recent months that have made me worried about the extent to which the world—including Christians—does not understand what Christianity really means. In June I attended a panel discussion on the film A Mighty Heart, which featured representatives from Christian, Jewish, and Muslim backgrounds. The major theme throughout the discussion was the increasingly popular sentiment of collective goodwill/hope: that all major religions—regardless of who is being worshipped—are chiefly about love and peace. We must stop viewing each other as different or wrong, the thinking goes; we are just on diverse paths to a similar end.

More recently (this weekend), I attended a screening of a new documentary produced by Morgan Spurlock (Super Size Me). The film, entitled What Would Jesus Buy?, uses the forms and traditions of Christianity to mount an argument against out-of-control consumerism, though it never really offers Christianity or Christ as an alternative or solution. The film (which I will write about in more depth soon) follows “Reverend Billy and the Church of Stop Shopping”—a performance art/activist group that looks like a gospel choir but makes no claims of believing in the gospel. Following the screening of the film, I interviewed Spurlock and asked him about how Christianity fits into the message of the film. He said that the film's theme reflects the true meaning of Christmas—the arrival of a man who would revolutionize the world and shake things up through his radical message of peace, love, and equality.

But Christians, as I pointed out to Spurlock, would argue that Christmas represents more than peace and goodwill and love. It represents the Answer to our dissatisfaction in the arrival of a person who becomes a savior. True satisfaction, the Christian argues, comes not simply from the message of Jesus Christ (which, if it is only peace/love/equality, is not unique to him), but through his person. The sacrificial death and resurrection of Jesus—and that alone—provide our redemption and ultimate happiness. Spurlock (who was incredibly nice and easy to talk to) responded by saying that yes, happiness can be found in Jesus Christ, but also in Allah or Buddha or whoever it might be. All of us are essentially about the same business: trying to make a change in the world.

It seems that the Christianity being invoked in What Would Jesus Buy?—and which is cooperating ecumenically for social justice and political causes (a good thing)—is increasingly being stripped of its claims of exclusivity. It is pretty clear in the scriptures that Jesus Christ was not of the mind that his way was just “one of many.” Rather, he said “I am the way, and the truth, and the life. No one comes to the Father except through me” (John 14:6). C.S. Lewis articulates the vital importance of Christ’s claims of exclusivity also in his famous “Lord, liar, or lunatic” reasoning in Mere Christianity:

A man who was merely a man and said the sort of things Jesus said would not be a great moral teacher. He would either be a lunatic – on the level with a man who says he is a poached egg – or he would be the devil of hell. You must take your choice. Either this was, and is, the Son of God, or else a madman or something worse. You can shut Him up for a fool or you can fall at His feet and call Him Lord and God. But let us not come with any patronizing nonsense about His being a great human teacher. He has not left that open to us.

In other words, Jesus Christ cannot merely be a teacher, or prophet, or rhetorical genius (all of which he is). His message of love/peace/equality is great, yes, but part of his message is also that “my way is the only way.” Thus, to accept him as a peace advocate or political revolutionary but reject his claims of divinity is to undermine his whole legacy and legitimacy.

Christians today are struggling with the exclusive nature of our faith. It’s the hardest thing for people to get past, for sure. We don’t want to come across as condemnatory of every other religion. We hate having to tell others that our faith necessarily excludes other faiths as valid alternatives. We want to work together with Jews, Muslims, Hindus, etc without judgment or tension. And we can.

It is possible to live and work amongst other faiths, because we do have some common ground and shared concerns for peace and justice and a better world. But ultimately we cannot equate ourselves, because the final solution, in Christianity’s view, is none other than Jesus Christ himself. Not just the general, social reform causes he championed, but Jesus Christ the man: God incarnate. He offers himself to all—no matter where you were born or what you have done—and in that way he is the most inclusive.

Mii, Myself, and My Online Identity

Recently I’ve been fascinated with the notion of the avatar—whether our Facebook picture or our IM Buddy icon or our actual videogame avatars. I’ve been playing on the Nintendo Wii and having way too much fun creating Miis… little cartoonish avatars that I can make from scratch and then play in games. But it’s a pretty interesting thing to consider on a deeper level—the attraction and increased ubiquity of avatars in a digital age.

In his essay, “Hyperidentities: Postmodern Identity Patterns in Massively Multiplayer Online Role-Playing Games,” Miroslaw Filiciak argues that “on the Internet … we have full control over our own image—other people see us in the way we want to be seen.”

My question is this: To what extent are these avatars or online identities really “identities,” insofar as we recognize them as being in some way “us”? Do we see them as extensions of ourselves, or substitutes, or “one of many” variant, circumstantial identities? Do we empathize with our avatar as a function of being its creator and controller? Or as a result of its being our digital likeness and online persona?

“Identity” as an idea is complicated enough, but “postmodern identity” is another ball game entirely. Filiciak attempts to grasp the postmodern identity in his essay, citing people like Jean Baudrillard (identity is the “label of existence”), Michel Foucault (“self” is only a temporary construct), and Zygmunt Bauman, “the leading sociologist of postmodernism,” who argues that the postmodern identity “is not quite definite, its final form is never reached, and it can be manipulated.” This latter notion seems to be the crux of the matter—the idea that identity in this networked world is not fixed but fluid, endlessly malleable in our multitudinous postmodern existence.

Filiciak cites social psychologist Kenneth Gergen, who writes about how we exist “in the state of continuous construction and deconstruction.” While this is not a new idea (sociologist Erving Goffman argued, in his 1959 classic The Presentation of Self in Everyday Life, that the presentation of self is an ongoing daily process of negotiation and information management, with the individual constantly trying to “perform” the image of themselves that they want others to see), it is an idea that seems ever more appropriate in this DIY, user-generated, “massively multiplayer” society.

The type of “self” we construct and deconstruct in everyday life, however, seems to me subtly different from the type we can (and often do) construct in videogame avatar creation. A primary attraction of avatar creation, I think, is that it allows us to create “selves” that are both our creation and our plaything, something that can be as near to or far from us as we want. We can and often do construct “identities” that are far from who we are or would ever want to be in the “real” world. Why do we do this? Because we can. Where else can I create a detailed character—complete with eyes, nose, hair, lips, eyebrows, all proportioned to my curious heart’s content—whom I have not only authored but can now control and “act as” in a simulated, interactive space?

I find it interesting that when I began to create my first Mii, my initial instinct was not to carefully craft a Mii in my image (I did do this later on, and found it rather boring), but rather to play around with the tools and manipulations at my disposal and create the weirdest looking, side-ponytail-wearing freak I could come up with. Given the opportunity to create any type of Mii, I had no inclination—and I never have, really—to create an avatar that is remotely like who I am (or who I think I am). Thus it strikes me as questionable whether avatars are primarily something that we are to empathize with, at least in the visual sense.

In a sense, my attraction to an avatar is not so much the ability to portray and empathize with a digital alternate to my self as it is an affinity for the ability to create and control this being. To create the avatar is—to me—the most enjoyable part of having one. Of all the things I’ve played on the Wii (Wii Sports, Paper Mario), Mii creating was definitely my favorite part. There is something very attractive about the idea of formulating a person from scratch—assembling features in bizarre and unnatural ways with no penalty for cruelty or ugliness. As Filiciak writes of the avatar creation of MMORPGs:

There is no need for strict diets, exhausting exercise programs, or cosmetic surgeries—a dozen or so mouse clicks is enough to adapt one’s ‘self’ to expectations. Thus, we have an opportunity to painlessly manipulate our identity, to create situations that we could never experience in the real world because of social, sex-, or race-related restrictions.

Indeed, if we view avatars as a sort of extension of our identity, then here is one case in which we truly can be anything we want to be.

We can also do anything we want to do, or at least things that are taboo or unthinkable in our real lives (play Grand Theft Auto for a good example of this). Here again we see that our empathy with the avatar occurs not just in what the avatar is, but perhaps more in what the avatar does, or is able to do at our command. Filiciak believes the freedom we have with the avatar “minimizes the control that social institutions wield over human beings,” and results not in chaos but liberation: “avatars are not an escape from our ‘self,’ they are, rather, a longed-for chance of expressing ourselves beyond physical limitations … a postmodern dream being materialized.”

It’s an interesting notion, to be sure: the vaguely Freudian idea that who we really are (our true identity) can be realized only when the many limitations of everyday life are removed (as in a game). Gonzalo Frasca, in his essay “Simulation versus Narrative,” makes a similar point about how videogames allow for a place where “change is possible”—a form of entertainment providing “a subversive way of contesting the inalterability of our lives.”

I think that the ability to transgress the limitations and inalterability of our real lives is an especially important attraction of the avatar. But within this ability of the avatar (to be and do things that are beyond the scope of our real lives) lie, I think, the very limits of our identification with it. It seems that what draws us to the avatar is the very thing which ultimately alienates us from it. If true empathy between the user and his avatar is possible, he must first get past the fact that this digital incarnation of “self” can do and be things substantively different from what we are—unbound by the many limitations (physical, emotional, cultural, etc.) which mark our existence.

The pleasure we derive from our relation to an avatar, then, seems to be less about empathy or identification than creative control and interactivity. With my Mii creations, for example, my enjoyment came from the ability to create in any way I wanted—to play God in some small way. There was little in the Miis that I could relate to my own identity; little I could really empathize with. But I still enjoyed creating, changing, and controlling them. This reflects a tension that is, in my mind, central to the videogame experience. It is the tension between the “anything is possible” freedom of virtual worlds and the user’s desire for empathy. The former may produce the higher levels of fun and gameplay, but the latter is a fundamental human longing. And I believe the two are negatively correlated: as “anything is possible” increases, the opportunity for empathy decreases, simply because limitation—as opposed to unbounded freedom—is what we know. It’s our human frame of reference.

Halloween Special: The Ten Creepiest Films

In honor of Halloween, everyone seems to make a list like this. As someone who never passes up a chance to compile a top ten list, of course I had to join in the fun! But rather than a top ten horror film list, I thought I’d broaden it to include films of other genres. Thus, my collection is of the creepiest or most insidiously disturbing films: less about blood and slashers than heart-pounding shock and awe.

10) The Last Wave (1977): The horror of this film comes not from blood or violence or other conventional thrills. Rather, Peter Weir’s feverish aboriginal nightmare of an Australian apocalypse provides psychotropic ambience of the most unsettling kind.

9) Freaks (1932): The creepiest thing about this film is its exploitative use of real dwarfs, midgets, Siamese twins, and other circus freaks… And when the maligned freaks get angry and rebel against the “normals,” watch out…

8) Three Women (1977): Though it’s not a typical horror film (some might even call it a comedy), Robert Altman’s serenely pagan study of identity is remarkably ballsy and deeply disturbing. Sissy Spacek is even creepier here than she is in Carrie.

7) Lost Highway (1997): David Lynch makes sick movies. He’s a twisted, tortured soul. Lost Highway is particularly creepy, however, mainly because of the image of Robert Blake’s ghost-white, alarmingly devilish face.

6) The Hitcher (1986): I haven’t seen the recent remake, but the 1986 original is heart-pounding, a thrill a minute. C. Thomas Howell plays a teen stalked by a madman on the highways of the American West… a conventional setup that devolves into uncharted territories of nihilistic despair.

5) The Wicker Man (1973): This British film (not to be confused with the horrible Nicolas Cage remake) about a secluded island in Scotland populated by happy-go-lucky occultists (who like to sing cheerful songs while sacrificing goats) inspired parodies like Hot Fuzz, but it remains one of the most disturbing, shocking films of the 1970s. The last five minutes are unimaginably f*#$%d up.

4) Night of the Living Dead (1968): George Romero’s groundbreaking horror classic terrified audiences back in the already-tense late 60s, and it terrifies even today. The low-budget, black-and-white, “band of survivors in a farmhouse” setup turns into an unrelenting Cold War zombie hysteria by the end.

3) The Exorcist (1973): Demon movies are inherently disturbing, and this one takes the cake. The unrepentantly earnest realism of the film is its most frightening quality, as are the flash-frame images of indescribably scary demon faces (I dare you to pause it on those images!). The subliminal spiritual warfare of this film is intensely terrifying.

2) The Silence of the Lambs (1991): In addition to the indelible horror that is Hannibal Lecter, this film features the most thrilling climax of any film I can think of. The “blackout” moment near the end is quite nearly unbearable to watch.

1) The Shining (1980): When I saw this film for the first time, it was quite simply one of the most scarring moments of my young life. But the months of nightmares are worth it in retrospect, as Stanley Kubrick’s film (from a book by Stephen King) remains one of the most compelling, complex, skin-tingling thrillers of all time.

Next Ten: Psycho, Carrie, Halloween, The Sixth Sense, Rosemary’s Baby, Zodiac, Scream, The Others, The Birds, Alien.

Best “Christian” Albums of all Time

Yes, it is ridiculous that there is such a thing as “Christian music.” I am totally of the mind that the contemporary Christian music industry is something that never should have existed, and that most of its output has, in fact, been utterly forgettable. That said, however, I must admit that not ALL of so-called “Christian” music (and in my definition, it’s basically any music made with Christian spirituality in mind or in heart) is horrific bilge. Some of it is good, and some even great. I suppose that in any largely-crappy genre of anything, there are some standouts. In this case, I think that the following ten albums more than hold their own in the company of any other “best-of” list, secular or otherwise. So, without further ado, here’s my list of the best “Christian” albums of all time (and when I say “all time,” I mean anything after 1990… which is when I started buying albums):

U2, The Joshua Tree (1987): It might seem cheap and superficially obligatory to include this album on a list like this (because U2 have never called themselves a “Christian” band and never will), but there’s no denying it: this album is one of the most glisteningly spiritual creations in pop music history.

Sufjan Stevens, Seven Swans (2004): Again, not a traditional CCM artist, but Sufjan Stevens can’t be left off of this list. I’m convinced that history will look back on Sufjan as a turning point in the musical trajectory of “spiritual” music. Perhaps now Christians who are into good music won’t feel ashamed to care more about being true and artistic than about being obvious and didactic.

Jars of Clay, Much Afraid (1997): Some might claim that Jars of Clay’s debut album (with that happily earthy feel) is their finest work. However, I’ve always contended that Much Afraid is their masterpiece. Subtle, subdued, and sonically rich (with gorgeously lingering songs like “Frail”), this sophomore album from a seminal CCM band is truly worthy of accolades.

Pedro the Lion, It’s Hard to Find a Friend (1998): When David Bazan (aka Pedro the Lion) emerged from the Seattle indie/emo scene in the late 90s, he was like the Christian version of Kurt Cobain (tortured, passionate, dark) with the mellow style of Eddie Vedder. His first full-length album remains his best, with quietly tragic (and catchy) tunes like “Big Trucks” and “When They Really Get to Know You They Will Run.”

Over the Rhine, Ohio (2003): This could be my favorite album of all time. Certainly it’s the best album ever to come from blatantly Christian artists. The folky double-disc masterpiece from Cincinnati’s best kept secret is nothing short of magnificent, with its backwoods mystery and latter days prophetic gravitas (“Changes Come”). There are about six songs from this album that should be sung in churches every Sunday.

Sixpence None the Richer, Sixpence None the Richer (1998): Though the uber-catchy “Kiss Me” got all the press, the rest of this album is equally marvelous. Leigh Nash—the queen of CCM’s “indie” sound—gave beautiful form to Matt Slocum’s well-crafted classics on this album, which remains a rainy-day staple and a major step into mainstream success for CCM.

Caedmon’s Call, Caedmon’s Call (1997): This is an album of the “college folk” movement in the late 90s in which “earthy” bands with world music leanings became “alternatives” for the over-18 set. Caedmon’s Call filled the Christian niche nicely with this album, which—among other things—launched the solo career of Derek Webb, who would later become the Martin Luther of CCM.

Waterdeep, Everyone’s Beautiful (1999): Even more grassroots and folky than their contemporaries Caedmon’s Call, the Kansas City-based Waterdeep became something of a legend among Christian hipsters for a few years in the late 90s/early 00s. Everyone’s Beautiful is their most diverse, satisfying album, though their live shows are still this band’s strongest suit.

DC Talk, Jesus Freak (1995): Though it can’t be denied that this album is a two-year delayed derivative of the grunge craze, it also can’t be denied that Jesus Freak is a super catchy, well-crafted effort from CCM’s favorite boy band. Give the trio credit: they went from rap outfit to rock band in seamless fashion, reinventing the Christian music industry (and giving it license to rock!) along the way.

Switchfoot, New Way to Be Human (1999): Though this San Diego surfer band has since fallen victim to “crossover” MTV irrelevance, their older stuff is actually quite good. I especially like this album for its beautiful ballads (“Sooner or Later,” “Let That Be Enough,” and “Only Hope”), which appeared all over teen media (Dawson’s Creek, Party of Five, A Walk to Remember) in the late 90s.

Honorable mention: Burlap to Cashmere, Anybody Out There? (1998), The Innocence Mission, Christ is My Hope (2000), Eisley, Room Noises (2005), Danielson, Ships (2006), Half-handed Cloud, Halos and Lassoes (2006), Rich Mullins, Songs (1996), Vigilantes of Love, Audible Sigh (1999), Damien Jurado, Rehearsals for Departure (1999), Relient K, The Anatomy of Tongue in Cheek (2001), Audio Adrenaline, Bloom (1996).

The Commodification of Experience

In Wes Anderson’s new film, The Darjeeling Limited, three brothers from an aristocratic family meet in India to go on a “spiritual journey.” Loaded down with designer luggage, laminated trip itineraries, and a hired staffer (“Brendan”) with an albino disease, the dysfunctional trio embarks on a train ride through the richly spiritual terrain of India.

It is clear from the outset that the brothers—or at least Francis (Owen Wilson)—are here to experience something: something deep, profound, and hopefully life-changing. And they are oh-so methodical about maximizing the “spirituality” of it all. Francis stuffs every spare moment of their schedule with a temple visit or some sort of feather prayer ritual. It might be odd and a little offensive that these three rich white guys—decked out in fitted flannel suits by Marc Jacobs—are prancing around such squalor, making light (by juxtaposition) of the decidedly exotic culture that surrounds them… But this is what makes the film funny. It’s a comedy.

But it also rings very true. These guys are swimming in things (designer sunglasses, clothes, trinkets, keychains, etc), but what they really want is to feel. And because acquiring commodities is in their DNA, they assume that these types of immaterial experiences can be collected too. Thus, their exotic pilgrimage to India.

The film made me think a lot about my own life, and how I increasingly feel drawn to experiences rather than things. It’s all about seeking those magic moments—whether on a vacation abroad or on a sunset walk on the beach—when we feel something more. And of course, it helps to have an appropriate song pumping through your iPod to fit whatever mood or genre of life you are living at that moment. In Darjeeling, the “iPod as soundtrack to a nicely enacted existential episode” is given new meaning.

In his book The Age of Access, Jeremy Rifkin applies all of this very neatly to economic theory, pointing out that our post-industrial society is moving away from the physical production of material goods and toward the harnessing of lived experience as a primary economic value. For Rifkin, the challenge facing capitalism is that there is nothing left to buy, so consumers are “casting about for new lived experiences, just as their bourgeois parents and grandparents were continually in search of making new acquisitions.” Rifkin believes that the “new self” is less concerned with having “good character” or “personality” than with being a creative performer whose personal life is an unfolding drama built around accumulated episodes and experiences that fit into a larger narrative. Rifkin keenly articulates how this orientation toward theatricalized existence creates a new economic frontier:

There are millions of personal dramas that need to be scripted and acted out. Each represents a lifelong market with vast commercial potential… For the thespian men and women of the new era, purchasing continuous access to the scripts, stages, other actors, and audiences provided by the commercial sphere will be critical to the nourishing of their multiple personas.

And so as we (the spoiled, affluent westerners among us, at least) become more and more dissatisfied with all the physical goods we’ve amassed, and begin to seek lived experiences and dramatic interaction as a new life pursuit, we must not delude ourselves that this is some higher goal, untainted by commercialism.

On the contrary, the economy is shifting to be ready for the “new selves” of this ever more de-physicalized era. The question is: are we prepared to allow our experiences to become commodities? Are we okay with the fact that our “to-buy” wishlists are now being replaced by “to-do” lists, of equal or greater value to the marketplace? What happens when every moment of our lives becomes just another commodity—something we collect and amass to fill the showcase mantels of our memories?

Christian (Fill in the Blank)

Two summers ago, I heard Rick Warren speak at a conference. Pastor Warren (God bless him) uttered a line in his speech that gave me particular pause: “There is no such thing as Christian music, only Christian lyrics.” It’s a significant line in his theology, and it also appears throughout the Purpose-Driven book empire.

It’s a line that goes to the heart of the crisis in Christian identity.

Essentially, Warren is suggesting that something is made Christian when it is clearly labeled as such. Song lyrics (words) are easy to recognize as Christian: do they contain the words God, Jesus, praise? If so, wham! They’re Christian! Instrumental music cannot be “Christian,” in Warren’s view, because how could we ever tell what it is about? If the song itself doesn’t proclaim itself verbally as such, it is not Christian (even if its composer is Christian).

This way of thinking turns the essence of Christianity into a cheap adjective. Slap it onto anything, and voila! You have redeemed the regular and made it holy! But wait—isn’t Christianity more complicated than that?

Christians are way too slaphappy with the name “Christian.” We cavalierly attach it to the most trivial of things. Let’s consider just some of the “Christian” things that populate our culture: Christian bookstores, Christian music, movies, videogames, radio, magazines, publishing houses, Christian YouTube (“Godtube”), Christian MySpace (“MyPraize”), Christian clothes, shoes, socks, paintings, mousepads, cooking utensils, crockpots, you name it….

But what makes any of this “Christian”? What makes one crockpot more suitable for Christians than another? Do we really need “Christian” alternatives in cutlery?

Long ago, Christians decided that rather than trying to influence mass culture from within, they’d take the more passive route and define themselves as a “subculture.” One more subculture among many. There are many reasons why they did this: 1) it’s easier, 2) niche markets make more money faster, and 3) modernity gave rise to the combative, defensive posture of “us vs. them”—an attitude that has defined pop-Christianity ever since.

As a result, “Christian” seemed to become a word best defined by what it wasn’t (i.e. liberal, gay, postmodern, pro-choice, etc…). Somewhere in there we lost our sense of history and tradition and identity—we lost our idea of what “Christian” really means. And if we don’t know what it means, how will anyone else?

The problem is that our society has convinced us that “Christian” is merely an adjective—a descriptive word that usually connotes a conservative, prudish, bigoted fundamentalist diametrically opposed to everything fun under the sun.

But the truth is that “Christian” is a much better fit as a noun, or even better—a verb. To be a Christian is to live in pursuit of Christ—to not be satisfied with who you are, but to strive for who you might be. It’s an action-oriented life; it’s a process.

We need to stop demeaning Christianity by treating it like just another attribute. “Christian” is not like “red” or “tall.” It’s not just a word to describe. It’s a living, breathing way of being.

Why You Should Watch Commercials

Last week in my Network T.V. Management class at UCLA, our guest speaker was a high-level executive at ABC Primetime. He spoke to us about the business side of broadcast television: how the audience of any given show is basically "sold" to the advertisers, who then invest in a show for its guaranteed spectatorship. If a show is getting good ratings in the 18-49 demographic, for example, the network will be able to charge more for the increasingly sought-after commercial ad space. As we all know (or should know), advertising via audience “labor” is the bread and butter of T.V. financing.

A massive disruptor appeared on the horizon a few years ago, however, and its name is DVR. TiVo and friends have altered the industry’s economic landscape in striking ways, and T.V. executives are scrambling to figure out what to do about it. The problem is that with DVR technology, people are able to fast-forward through commercials. And they do. I do. Advertisers notice this and are increasingly demanding that the networks do something about it. Consequently, ABC Primetime has taken the revolutionary step this fall season of becoming the first network to sell ad space based solely on commercial ratings.

In a nutshell, this striking shift means that ABC (and perhaps the other networks soon) will measure a show’s economic feasibility based only on who is watching the commercials—not the show itself. What does this mean for you? It means that if you use DVR to fast-forward through the commercials of your favorite show, you might as well not be watching (at least in the eyes of the networks, who are always looking for excuses to dump underperforming shows). This may be a bitter pill to swallow, but I'm afraid it is true: your favorite television shows are in danger if you do not watch their commercials.

More generally, however, this shift represents the frantic defensive maneuvers being undertaken by beleaguered media industries in the face of technology and changing audience patterns. Hollywood is trying to adapt its old framework to withstand the erosion that things like DVR, on-demand, video iPod and other technologies are causing. Their worst fear is to become the lame-duck recording industry, which is all but dead now because of its blatant refusal to work with and through new technologies.

It remains to be seen whether the ad-based network T.V. model will survive the digital age, and maybe it shouldn’t. Maybe we should be even more purposeful about fast-forwarding through commercials on our TiVos. Maybe we should send the message that the days of “commercial breaks” are over—that we will no longer tolerate being passive ratings demographics or dollar-sign statistics in the ugly ratings wars. Of course, we’d have to concede a trade-off in some way, most likely the acceptance of brand integration and product placement within our favorite shows. After all, these shows need to be financed somehow.

All I know is that the future of television is completely up in the air (as are the futures of most other media industries), and we the audience will have an ever larger role to play. I have much more to say about it all, so stay tuned…