When someone talks about videogame-to-film adaptations they typically start by lamenting the fact that the films are utterly terrible, and that the ones that aren’t terrible are only good in relation to the ones all the way down at the bottom of the bucket. To me this is the equivalent of winning a world’s sexiest crack whore competition, or only having half of your house burn down in an electrical fire. There’s a very important difference between winning and cutting one’s losses.

The other day I was watching Silent Hill, which is arguably one of the more competent adaptations in that it almost passes for a real film. It has a plot. The characters have personalities that are at least slightly more interesting than a cotton swab museum. Visually, Silent Hill is always passable, and sometimes even borderline inspired. It also climaxes with a frumpy religious zealot getting what amounts to a gynecological examination from a barbed-wire-wielding demon-child before being exploded across an altar. So overall: not bad.

And yet, with the possible exception of Silent Hill, most of these adaptations are rewriting the rules for badness. Even by the basic law of averages, there should be at least one “excellent” or “very good” videogame adaptation by now.

But no.

Some are so bad they seem bad on purpose; others are so bad that anti-intellectuals call them genius. When Uwe Boll released Alone in the Dark a few years ago, I remember a lot of people giving it a pass because they simply weren’t sure what to do with it. The film had somehow transcended the language of film criticism. It seemed pointless to even use words like cinematography and dialogue. Maybe Boll was like Picasso. Maybe he’d destroyed the body to capture its essence, disoriented it to reorient it. Yeah, that’s what he did!

Uwe Boll aside, the natural inclination seems to be to blame the filmmaker for a given adaptation’s failure. Maybe he (only men direct these things) just wasn’t capable enough. Maybe he didn’t get it. Maybe he tried too hard to please everybody. Obviously the property itself isn’t flawed. I mean, the game was awesome... right?

I would argue that this line of thinking is mostly wrong: the word “awesome” has two different definitions depending on which medium is being discussed, and the rules for each are so different as to make them incompatible with one another. Compound this with the fact that the videogame industry still has one of the most arcane systems for discussing the quality of the work it produces. I’ve yet to hear anyone question the meaning of the term “graphics,” which to me is such a broad, general way to talk about the visual quality of something that it’s basically rendered meaningless. And can someone tell me what “sound” is? Really. How can you even begin to translate something into another medium if you don’t have a specific way to talk about what it’s doing in the first place? Final Fantasy VII has a great story? Ok. Why? Was it really great? Or were you just pleasantly engaged? You spent nine hours training a chocobo? Well, what does that mean? How is that a story? Should we make three movies?

Ultimately, I find it hard to blame film, as a medium, or filmmakers, even terrible ones, for the failure of these adaptations. The real problem is that videogames aren’t yet of a quality that makes them filmable. And I don’t mean that videogames aren’t of any quality, just that it’s a singular quality that has everything to do with being a videogame and nothing to do with being in any way filmic. Until more people figure that out, we’re going to have a lot more blood on the altar.