Who’s in control here?

That’s Alexander Haig. Look him up.

Also: This post has spoilers for BioShock and Grand Theft Auto IV, in case you’re paranoid about that kind of thing.

Previously on Spectre Collie, I made the claim that videogame developers can learn more from “non-interactive” media than just how to make more cinematic cut-scenes and more literary dialogue. If interactivity is the key aspect of videogame storytelling, then how come everything we borrow from traditional media is non-interactive? Why not look for the ways in which movies, books, TV, and comics interact with the audience, and then try to build on that?

The example I used last time was the “don’t go into that room” scene in horror & suspense movies. Those scenes build tension not by showing the audience what happens next, but by asking the audience what they think is going to happen next. In effect, they’re turning the storytelling duties over to the audience.

This only works because there are always at least two versions of the narrative being told simultaneously: the filmmaker’s version, and the audience’s version. It’s as true for movies, books, and TV as it is for storytelling games. In games, obviously, you put more emphasis on the player’s narrative. Which leads to the assumption:

Myth 6: Player narrative is always more important than developer narrative.

On the one side, you’ve got the arrogant, control-freak game designer, forcing his lame story onto players who don’t want to hear it. One of the designers at Telltale, Heather Logas, described this phenomenon better than I’ve heard anywhere else: “A lot of game designers act like they don’t want players coming in and messing up their story.” So we’ve developed all kinds of ways to ensure our stories don’t get messed up: cut-scenes; choke points; and linear sections that trick the player into believing he has control, when in reality he’s only allowed to do the one thing we want him to do.

On the other side, you’ve got the players, a bunch of whiny malcontents with an inflated sense of entitlement. They insist that their $50-$60 has bought a team of professionals who should dance at their command. The interactivity of a game is supposed to let the player tell his own story. That’s the only story that players care about. Besides, everybody knows that games will never have storytelling and writing that’s as good as movies or even television. If a game developer just wants to tell a story, he should get out of games and just make movies. So players have developed all kinds of ways to ensure their stories don’t get messed up: basically, insisting repeatedly on blogs and message boards that developers’ stories be kept quiet and unobtrusive, and that cut-scenes be made skippable if not cut altogether.

So which narrative is the more important one? If the real potential of interactive storytelling is giving the audience the freedom to tell whatever story they want, then the answer’s obvious: the player’s narrative is everything.

But the real potential of interactive storytelling isn’t giving the audience the freedom to tell whatever story they want. That’s the real potential of the pencil. And if you give someone a pencil and a blank sheet of paper, or a blank page in Microsoft Word, or a blank workspace in Flash, you don’t automatically end up with great storytelling. If you end up with anything at all, more often than not it’s insipid, derivative, filled with cliches. That’s as true of the best screenwriters alive as it is of the guy who writes “FIRST!” on blog comments. Great stories are rare, because great stories are hard. So the player’s narrative isn’t the most important.

But the developer’s narrative isn’t the most important, either. After all, if a game developer just wants to tell a story, he should get out of games and just make movies.

Tear down this wall!

The real potential of interactive storytelling is delivering a story that’s a collaboration between the storyteller and the audience. It’s not the player’s narrative, and it’s not the developer’s narrative; it’s this third thing that’s better than either. As you play the game, the pieces of the story start to come together, and you feel not like you’ve played a part in someone else’s story, but like you helped write it.

So how does a game design make that happen?

Currently, the Holy Grail of Modern Videogame Design is the “sandbox” game. It’s something that’s apparently so wonderful that it’s attained a mirage-like quality; people even see sandbox games where they don’t really exist. You’ll frequently hear developers and reviewers talk as if sandbox games aren’t a design decision, but are the One True Way. They’ll sheepishly acknowledge when their own games fall short: of course we’d drop the player into an open, non-linear world in which he can do whatever he wants, but we just don’t have the time/budget/staff/engineers/talent.

But as much as games try to abstract the real world, it’s important to remember what happens to sandboxes in the real world. When they first jump in, kids go mad with freedom and come up with all kinds of crazy stories. But they quickly get bored and move on to something more structured, leaving the sandbox for stray cats to come shit in.

I’ve probably spent more hours playing sandbox games like SimCity and The Sims than I have all other games combined. And yes, stories do “emerge” from those experiences. But even though I “own” them, they’re never as compelling or interesting to me as the stories in even the most rote Japanese RPG. As tired as “His village was destroyed so he got a crystal to restore the life energy of the planet” may be, it’s still more interesting than “She was trying to become a journalist but lost her job because she wet herself in front of the boss.”

And they ultimately end up being shallow and solitary experiences. There’s no feeling that I’ve really accomplished something permanent, and no sense that I’ve shared something with another person. The storytelling moments that do come through in those games are most often pre-generated sequences: a person catches on fire, and Death shows up at the front door. They come through because those are the moments when you feel like you’ve connected with the game developers and realized, “I see what you did there.”

So it’s the game designer’s job to add in some direction, to establish some potentially interesting characters, and to set up some fodder for drama. It’s not a game unless there are rules, and it’s not a story unless there’s a premise. But at best, the end result is something like improv theater with the roles of the audience and the performers reversed. The developer shouts out some ideas: “You’re a Croatian immigrant!” “In New York City, as if it were populated by people with a fourth-grade sense of humor!” “And you gotta kill a dude!” And then the player runs off to do his thing, until he gets the next batch of instructions.

Even when that’s done well, and it almost never is, you still end up with a great divide between the player narrative and the developer narrative. I’ll tell my story, now you tell yours. Except yours doesn’t really matter that much, because I know what scene is going to play next.

The thing to remember is that the developer will always know what scene is going to play next. Because whether it’s making a cut-scene or plotting drama curves and AI personality matrices into a HyperTime Story Generating Unit, he made the scene. So you can trick the player into thinking he has control of the situation before snatching it away from him at the next choke point. Or you can try to fake “infinite probability space” by generating tons of assets that’ll play in response to the things you think most players are likely to do.
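
To make that second approach concrete, here’s a rough sketch of what “faking infinite probability space” usually boils down to. Everything in it (the character names, the canned lines, the function) is made up for illustration, and no real engine is this simple, but the shape is the same: pre-author reactions for the player actions you expect, and let everything else fall through to a generic response.

```python
# A toy illustration (not any real engine's API) of pre-authored reactivity:
# the writers script responses for the player actions they anticipate, and
# everything else falls back to a generic line.

# Reactions the writers predicted and authored ahead of time.
SCRIPTED_RESPONSES = {
    ("niko", "insult"): "Niko shrugs: 'I have heard worse.'",
    ("niko", "give_money"): "Niko pockets the cash and warms up to you.",
    ("vlad", "insult"): "Vlad laughs it off and threatens your cousin.",
}

GENERIC_FALLBACK = "The character stares at you blankly."

def react(character: str, player_action: str) -> str:
    """Return the pre-authored reaction if one exists, else the bland fallback.

    The illusion of breadth holds only as long as the player does things the
    writers anticipated; every unanticipated action lands on the same fallback.
    """
    return SCRIPTED_RESPONSES.get((character, player_action), GENERIC_FALLBACK)

if __name__ == "__main__":
    print(react("niko", "insult"))     # scripted: the game feels responsive
    print(react("niko", "serenade"))   # unscripted: the seams start to show
```

Generating more of those canned reactions stretches the illusion further, but the developer still wrote every one of them in advance.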

Or, you can take the simplest approach, and just try to make sure that the player’s story is the same as your own.

Just Say No, Already. What are you waiting for?

From GamesRadar.com, here’s a list of Will Wright’s top 8 games, presented as part of an art exhibit. In the explanations for his choices, you see the mantra repeated several times: “Don’t invalidate the player’s narrative.”

But “don’t invalidate the player’s narrative” doesn’t automatically imply giving the player total control over the game and responsibility for the game’s story. It just means that when the developer does take over storytelling, it shouldn’t go against what the player’s trying to do.

Take three examples of “fake interactivity” from recent games. In each case, the game acts as if it’s turning control back over to the player. He can run around, jump up and down, and possibly shoot things. But in reality, there’s only one meaningful thing the player can do. It’s interactivity just for the sake of interactivity; the player doesn’t accomplish anything that couldn’t have been handled just as well in a cut-scene.

1. Near the end of Half-Life 2: Episode 2, the player’s asked to press a button that will launch a missile, the event that the entire episode has been building up to. This is a little cheesy, but it fits in with the philosophy of the Half-Life games: they’re not really about choice, but about immersion and agency. Throughout the games, you’re only doing one thing, but what’s important is that you are the one who’s doing it. So if a button needs pressing, of course Gordon Freeman is the one to do it.

2. Near the middle of BioShock, the player’s told to press a button that will blow up the city. You’ve just watched a climactic scene that’s all about being a slave to orders, or being a man and choosing to make your own decisions. The veil has been lifted! And then immediately afterwards, you’re made a slave to orders. It invalidates the previous cut-scene, and it undermines the entire premise of the game. If it were intended as meta-commentary, it failed miserably, so I’m more inclined to believe they just wanted to break up the pacing of such a long non-interactive segment.

3. Near the beginning of Grand Theft Auto IV, the player’s character decides to kill a guy who’s been having sex with his cousin’s girlfriend. Here, you’re ostensibly given more freedom than in either of the other two examples — the game drops you back into this huge city, and you can keep jacking cars, going on dates, assaulting passersby, watching TV, or just driving around. But from the long cut-scene over the opening credits onward, the game has kept reminding you that this story is important. And at this point, there is only one thing you can do to advance the story. This one scene invalidates the entire game, because it ruins the only thing that made the game worthwhile.

So how can I possibly say that when a game about free will rips away your choice, it’s bad but forgivable; but when a game about killing guys orders you to kill a guy, it ruins the entire game? It’s all about the player’s narrative at that moment.

In the Half-Life 2 example, it’s a no-brainer: everything in the story up to that point has been about launching a missile. You can’t not want to press that button.

In BioShock, it’s pretty much the exact same scene. Everything you’ve done so far has been to get to this point. You still want to push that button, because you’ve spent the last 10 or so hours trying to get to it. It’s just that the scene has been put at the worst possible point in the game. It’s a case of plot fighting with story: the plot is overwhelmingly pushing you towards doing this one thing, but the story has forced you to ask, “How come I have to do it?”

But in GTA IV, you’ve spent the last 4 or 5 hours (the game takes forever to get moving) being taught one thing: you have control. You can do anything you want to. There are these other characters you can interact with, and there’s some mystery to your character that’s going to get revealed over time, but how and when you do it is completely up to you. To reinforce that idea, they’ve given you one mission where you’re explicitly given the choice to kill a guy or let him go. But now, because the designers have decided they want this dramatic moment to happen, they rip that control away from you.

And they rip it away completely: it’s not even someone telling your character to do something. Your character decides to do it, in a cut-scene. To me, it was as jarring as the scene in Adaptation when Susan Orlean says, “We have to kill him!” except here, it wasn’t supposed to be funny or ironic. I actually ended up reverting to a save game in GTA IV because I was positive there’d been some kind of mistake — there had to be some alternate story branch I’d missed, because surely the game wouldn’t spring this on me out of nowhere. I’d been introduced to two loathsome characters, and instead of letting me choose which of the bastards I had allegiance to, it swooped in and made the decision for me.

Again, it’s not the kill/save “choice” that’s important to me. You don’t start playing a GTA game to explore the complexities of human morality; you play to drive around and kill people. By putting the disc in the drive, I’ve already agreed to that premise. The problem is assuming that I want to kill this guy just to advance the designer’s story, when the designer hasn’t earned it yet. He hasn’t established character well enough to explain why I would automatically decide to do this, instead of just telling my cousin to go screw himself.

And like BioShock, GTA is telling me I have choice and then not following through with it. Except in GTA, it’s overwhelming. The more store-fronts you show me as I’m driving around, the more jarring it is when I can’t go into those stores. The more people you show walking the streets, the more jarring it is when they all say the same thing. And the more you tell me that I can go anywhere I want and do anything I want, the more clear it becomes that I really can’t. I’ve got control for all the boring stuff like driving and shooting and hitting pedestrians, and the designer still has total control over everything meaningful. And if that’s the case, I might as well just be watching TV. (Which the game lets you do, and which gets consistent praise in reviews. 10/10!)

Trickle-Down Storytelling

I don’t think there’s anything wrong with a game developer imposing a story on the player. That’s pretty much his job. And you can assume that since the player has bought your game about amnesiac space marines on a haunted Martian base with zombies, for whatever reason, he’s on board with the premise. The problem comes when you take an antagonistic view of the relationship between developer and player, and you treat the game as two sides fighting for control over a story.

Too much developer control, and you end up with what you see in most games: I’ll take care of all the interesting stuff in the story, and every once in a while I’ll throw you a bone by making your joystick work again. Let me know when you’re done killing guys, so I can get back to my story.

Too much player control, and you end up with shallow, solitary experiences: Hey cool, I just ran my car into a telephone pole and it exploded, taking out the cop that was chasing me and the guy I was trying to kill! Did anybody else see that? And err… now everything’s back to normal. Uh, what am I supposed to do now?

Again, the key is to think of storytelling games as a collaboration between the developer and the player. Give the player control every time it matters, and only when it matters. Don’t save the good stuff for cut-scenes and choke points.

When the player does have a choice, try to make his choices meaningful and interesting. I think “Do I use a shotgun or a pistol?” is less interesting than “Do I take out the medic or the heavy?” And “Kill this guy who lives in this building and use these weapons” is less interesting than “Find this guy and kill him.”

Don’t wrest control of the story and try to tell everything in big chunks — not because the player doesn’t care about your stupid story, but because the player will care if he feels like he’s uncovering a story and not just being told a story. The audio logs in System Shock 2 and BioShock are a good step towards doing this; if they were more active, it’d be even better.

Don’t make decisions for the player unless you’re convinced that you’ve earned it. Make sure that you’ve genuinely established character and motivation enough that most players want to get to the next plot point as much as you want them to get there. A cut-scene where the main character says, “I want to blow up the enemy base!” is bad. What’s better is a sequence of scenes where the main character is repeatedly told how bad the enemies are, and discovers a bomb, and discovers a map to the enemy base, and did we mention how bad the enemies are?

It’s a lot more difficult than it sounds from just vague “we should do this more” ramblings on a blog. But I think it’s more a subtle shift in philosophy than a to-do list. We want to believe that games are entirely about enabling the player, because we’ve all played games and felt that rush that comes from knowing you just did something clever and/or awesome and/or obscenely violent. But storytelling is basically communication, not control, and that means keeping both channels, from storyteller to audience and from audience back to storyteller, open all the time.