When I started writing about storytelling in videogames almost a month ago, I’d intended to turn it into a series, with more arguments and maybe even some examples more concrete than “videogame stories should be good.” To keep things going, I’ll take some of the comments I’ve read about the topic online (in blogs, articles, and on message boards) in the past few weeks, and offer up a rebuttal to each.
Previously on Spectre Collie…
To recap: writing and storytelling in videogames has traditionally been weak at best. It’s common knowledge — whether it’s accurate or not — that videogame stories suck, and that even the best don’t measure up to the level of the worst movies and books. Objections to cut-scenes and lengthy non-interactive segments have evolved into a whole school of thought saying that story has no place in videogames. According to this view, games are defined by interactivity and their game mechanics, and that’s all that matters. Trying to apply aspects of other media to videogames hasn’t just failed in the past; it’s doomed to fail forever.
I say that not only can you tell a good story in a game, but that storytelling is important to games. In fact, it’s the only way that videogames are going to realize their true potential. Now, this requires a looser definition of “story” to make sense. It’s not just the narrative, or the premise, but everything that’s not purely the game mechanic: setting, characters, dialogue, narrative, and theme.
Myth 1: Videogames are Young
So the first myth about storytelling in videogames always comes in response to the whole “are videogames art?” debate, and you’ll see it repeated in this article from Wired. It usually goes something like this:
Somebody, like say Roger Ebert, asks why, if videogames are capable of art, there hasn’t been a great masterpiece worthy of comparison to the greatest works of film and literature. In other words, why is there no Citizen Kane of videogames?
Inevitably followed by the reply: Videogames and interactive entertainment are still a new medium, and developers are still figuring out how to use it. It took the movie industry decades to produce its definitive classics.
Which seems to me a pretty weak argument. A big deal was made on the internet about the recent 40th anniversary of videogames, and this article by Kyle Orland in Joystiq compares other media at their 40-year mark (using a somewhat arbitrary start date for each, which I won’t argue with here). By the standard presented in that article, it would seem that we’re still in pretty good shape, and that our greatest achievement is due in just a few more years.
But there are problems with that. For starters, the development of a medium of art or entertainment doesn’t take place in a vacuum. Looking back at the Joystiq article, compare the state of film after 40 years versus that of TV, and it’s clear that TV advanced a lot more quickly. They list “The Flintstones” as one of the most popular shows at TV’s cut-off point, while movies at the same mark had only just introduced synchronized soundtracks and the Academy Awards.
“The Flintstones” as Postmodernist Masterpiece
While it may be tough to recognize today, “The Flintstones” was pretty experimental: an animated series airing in prime time, that was itself a parody of an earlier series. Depending on how much credit you want to give Hanna-Barbera, it was either a postmodernist reference back to “The Honeymooners,” or the character types created in “The Honeymooners” were so established by that point that they had become the default for a family comedy. Either way, it took a lot of evolution (no pun intended) and maturity before you could have something like “The Flintstones” even air, much less become one of the top series.
By that standard, games should have been maturing twice as fast as television did. And at least monetarily, that’s the case: the industry is making mad money, and game budgets are already rivaling those of movies. Production values are plenty high, too — there are plenty of scenes in Gears of War and Half-Life 2 that were more convincing to me than the effects in Starship Troopers and the recent War of the Worlds. The videogame business clearly isn’t pacing itself by the same schedule as movies & TV.
My biggest objection to the “games are still new” defense, though, is that artistic media are improved not just by time, but by milestones. You can’t just say that in x number of years, you’re due for your Wizard of Oz or Casablanca or Citizen Kane. The medium doesn’t really grow by evolution, but by intelligent design — you’ve got to have somebody who recognizes the potential of the medium, and then makes something that exploits it, showing the next generation what’s possible.
So just saying “give it time” doesn’t really cut it. The industry has got to put up or shut up. A better rebuttal to the question, “What is the videogame equivalent of Citizen Kane?” is to ask, “What’s so great about Citizen Kane?” It’s universally considered a classic, one of the greatest achievements in film. So what did it accomplish for movies as a form of art?
Orson Welles May Be a Hero To Most…
It’d be easiest for all of us if I could just say, “It’s the story. The end.” But it’s clearly not. “A reporter looks back on the life of an ambitious and powerful man to discover what his greatest desire was” is definitely a solid premise, but on its own, it isn’t enough to warrant universal praise.
It’s not even the way the narrative is structured (part of the storytelling as Hanford Lemoore described it in an earlier comment, and thanks to him for bringing up the distinction). Setting up the central mystery of “What is ‘Rosebud’?” was a brilliant way to drive the story, and it’s one of the best-known conventions in the history of movies. But it’s also one that could’ve happened in any other narrative medium; it’s not so novel that you couldn’t do the same thing in, well, a novel.
And that is why people are still pointing to Citizen Kane as one of the definitive medium-defining movies: it takes a good story, and then tells it in a way that only a movie can. There’s the composition of shots that clearly and instantly establishes characters and the relationships between them (the iconic image of Kane in front of his campaign poster, and the careful placement of characters in the foreground or background to show Kane growing distant from the people close to him). There’s the breakfast table montage that shows Kane’s marriage deteriorating over time via the placement of the actors and the editing of the scenes. And there are all the match-on-action shots that give the movie the sense of jumping around in time. (And, more than that, served as a chance for Orson Welles to show off.)
You can use a lot of the same gimmicks in other media — in comic books, Watchmen gets a lot of use out of symbols and icons to show character, and images that carry through from one scene to the next. But a comic adaptation or novelization of Citizen Kane would fail if it just attempted a direct recreation, just as the cinematic adaptation of Watchmen will fail if it just tries to film the comic book. Unless you exploit the medium to its fullest, doing the things that only that medium can do, you’re going to fall short of the medium’s potential.
“Games are not movies!”
Obviously, what videogames add is interactivity. And that’s the source of the whole debate: games just aren’t yet exploiting that interactivity as well as they could. Because they have all the same storytelling elements as movies (or television, in the case of episodic games such as Telltale’s hilarious Sam and Max series, available for the low low price of $34.95 for the entire first season), the tendency has been to make shambling Frankenstein’s Monster creations that stitch together cinematic sequences and interactive sequences that never quite meld. You either get games that periodically stop being interactive to make you watch a movie, or interesting story sequences that are held together by a predictable and uninspired game. And sometimes the most fun, perfectly-designed, pure games-for-their-own-sake games feel obligated to throw in some token effort at story, putting in an opening cutscene explaining that you’re playing Breakout to rescue a space princess from some evil galactic mega-corporation.
That results in the “Games are not movies! Down with story!” backlash. “Stop with all the pretentious ‘are videogames art?’ talk and just get back to asking ‘are videogames fun?'” Which is pretty unambitious. We already know videogames have the potential to be fun; the industry wouldn’t be making billions and billions of dollars, and games wouldn’t be taking up hours and hours of our time, if they weren’t.
But they’re capable of more than that. So why not try to achieve more than that? We can just keep iterating on Unreal Tournament until we come up with flashier versions of a game that’s undeniably fun but ultimately without purpose. Or, we could try to make interactivity meaningful. I don’t like the Grand Theft Auto series, for example, but I can’t deny that it was hugely significant in showing what you can do with a truly interactive environment. Now, what if you had a GTA with something of more substance than shooting hookers?
The various aspects of Citizen Kane — montages, staging, different types of editing — aren’t interesting on their own. They only stand out because they serve the story and its characters. Plenty of lesser movies use the same techniques, and they remain lesser movies. The cinematic elements alone don’t make it a great movie, just as a great game mechanic by itself doesn’t guarantee a great game. Even if you say that Kane is a case of form over function — that it’s only regarded as a classic because of the way it mastered the cinematic elements in service of a fairly simple story — what’s wrong with that? Why not apply that to games? Having that kind of filter during game development would be a great improvement over what we have now: imagine how many hours of frustration we could’ve avoided if developers had simply asked themselves, “Does this jumping puzzle actually serve any purpose in the overall story?”
It All Has Purpose
That doesn’t mean that every game has to mean something, any more than the existence of “important” movies means we can no longer have movies like Big Trouble in Little China. And I don’t want to live in a world that doesn’t have Guitar Hero.
It just means that we start rewarding the developers who try to move things forward. It’s not just going to happen naturally over time. You’ve got to have people who are willing to step up and experiment with how interactivity and narrative feed off each other, instead of being mutually exclusive. And they’ve got to be able to do it without hearing the old story about how it won’t sell as well as Quake or Madden. Half-Life 2 is experimenting with things from the linear, cinematic perspective, and it seems to be selling all right. And The Sims went at it from the pure game-mechanic/sandbox angle, but it still managed to make a subtle commentary on consumerism and the nature of storytelling, and I believe it made a few dollars for Electronic Arts. I wouldn’t call either of those the Citizen Kane of videogames, but they’re on the right track.
At present, videogames aren’t like the fresh-faced young high-school graduate finding out how to make his way in the world, just years away from his first great achievement. They’re like the 40-year-old stoner who maybe will get around to accomplishing something, eventually. But for now it’s easier and more cost-effective to just sit on the couch and watch shit blow up.
(While I was doing Google searches, I found this article from CBS News from last year, about the “Citizen Kane of videogames.” It gets perspective from a few people and then comes to many of the same conclusions. None of this is particularly new ground.)
Is there anybody going to listen to my story?