Merry Christmas

Unless you’re a raging insomniac like I am, by the time you read this I’ll be on my way back east for Christmas with the family. Not having a week of build-up and shopping and a big Christmas tree this year has thrown everything out of whack somehow. It’s hard to believe that it’s only two days away, and I can already tell that it’ll seem like it’s over too soon.

I know I’m looking forward to seeing the family again — it already feels like it’s been much longer than a month since Thanksgiving — and to having a week off with no obligations. It’d be better if I could get off the damn internet and get to packing and such, but I’m getting around to that. (See “raging insomniac,” above.)

I hope everybody has a happy holiday and sees exactly as much of their family as they want to see this year.

400 Gigabytes of Terror

Man, I’m pissed. I bought a new extra-huge hard drive for my Mac mini because the internal one sat me down and had a long heartfelt talk with me about how it needed more space. I’m not sure what exactly was taking up so much room; it sure wasn’t my “novel.” I guess all those lengthy fan letters to the G4 TV show hosts and hostesses started to add up. Or else it was all the scratch audio for my “Pieces of Chuck” podcast.

Because I’ve been working with or around computers for the past 16 years of my life, I’m an expert at them and knew exactly the first thing to do when I installed my new hard drive: copy every single document, music file, photo, video, and game file to the new drive and delete them from the old one. And because I’m extra double-plus smart, I made sure to include all my work files in that.

When I woke up this morning (read: this afternoon), I saw that I’d shut down my Mac but left the hard drive running. I turned it off to go take a shower, and that was the last time anyone ever heard from it again.

I’m pretty sure that not much is lost permanently. It looks like it’s just the enclosure’s power supply that’s bad, and the drive itself is most likely in good shape. And I backed up just about everything to DVD (except my iTunes library) when I got the new drive; worst case, I’d lose some minor changes to the work stuff and have to restore all my music off the iPod. It was annoying enough a couple weeks ago, when I copied my music over to the new drive without doing it exactly the way iTunes wanted and lost all my ratings, playlists, and play history.

If nothing else, it’ll encourage me to do more frequent backups in the future. And I have a healthier respect for all this Web 2.0 business, since I notice my website and Gmail accounts are still golden. Now I just have to mail back the drive (at my expense), wait for them to fix it and mail it back to me, and pray really really hard that they pay attention to my instructions and don’t format the hard drive.

Also: another SFist post about Apple.

No, actually, I don’t yahoo.

I put up another post on SFist about Yahoo’s buying the del.icio.us site. It’s a little bit on the ironic side, since I was just thinking how much better it would be if I had some kind of URL-sharing thing so I didn’t feel the need to put up a new blog post every time I wanted to post a link. It was my intention to make the “Links” section of this site exactly that, a total rip-off of the del.icio.us idea, but it’s another one of my neglected projects and probably won’t see any attention anytime soon.

That SFist article was very briefly one of the top stories on Google news again, which is still just weird. I was looking on there for links to other stuff, and there was a blog post I’d written staring back at me. There’s still something unsettling about that; it makes it seem like I’m trying to pass myself off as a “journalist” or something. Journalists actually get interviews and do research and shit; I just get off on pontificating and trying to be funny in equal doses.

Wikipedia and Intelligent Design

My SFist article this week is brought to you by the letter I, for Insomnia.

The reason I went on so long is that I’ve been reading about Wikipedia ever since I saw that libel story. And the more I read, the more I had the feeling that there was just something troubling about the whole concept. It wasn’t until I read the article from the Encyclopedia Britannica guy calling it a “faith-based encyclopedia,” and the one from the co-founder talking about “anti-elitism,” that I figured out what it was.

The core attitude behind Wikipedia is the same one behind the Intelligent Design “movement.”

Every time you read about Wikipedia, people talk about it in Darwinian terms. The articles get better through natural selection, they say, and only the strongest articles will survive. It’s as close to being a pure democracy as possible, is the claim, and because everyone has equal say, it’ll eventually reach some kind of objective truth — errors are weeded out, as are highly opinionated pieces, and they maintain the rallying cry of “neutral point of view.”

Which is bunk. Robert McHenry used this quote from a Wikipedia article:

“Arguably, he set the path for American economic and military greatness, though the benefits might be argued.”

as a demonstration of just plain poor writing and the lack of editorial oversight. Sure, it reads like a C-average high school student’s history report, but there’s a deeper problem there than just lazy writing. It’s lazy thinking.

This is how they deal with “controversy.” Any crackpot with an internet connection and an opinion has an equal crack at the encyclopedia, which means that even the most innocuous articles — from Mother Teresa to The Andy Griffith Show — can result in a debate. And contributors will invariably begin shouting “NPOV!” and editing articles to acknowledge every inane point of view, watering them down to the point of being meaningless.

And whenever I hear “lazy thinking,” I immediately think of Intelligent Design. Not that the people behind that movement are lazy; on the contrary, they’re insidious and dangerous. But the way they work is by taking advantage of lazy thinking on the part of average people. It’s ingeniously disguised as a populist movement (even though, of course, it’s anything but). It takes advantage of the little sound bites and high-level overviews of fundamental concepts, then twists them in order to discredit them.

The ID crowd takes advantage of the fact that a lot of people hear “man didn’t descend from monkeys!” or “evolution is a theory!” or “there are scientists who don’t believe in human evolution!” and just stop there with their thinking. Even though those three things are true, they don’t do anything to discredit evolution and are in fact an important part of the scientific process.

The ID crowd also takes advantage of the anti-elitist, anti-intellectual attitude — the same attitude that made people think GW Bush would’ve been better suited for the presidency than Al Gore, because the former would be “more fun to have a beer with” — to try and discredit human evolutionary theory. The scare talk is: They want to keep religion out of your children’s schools, but they refuse to have their own beliefs questioned! They’re forcing your kids to blindly accept a controversial theory without listening to everyone’s opinions!

Everyone with any sense should be wary of the ID movement, but it puts liberal Christians (which I consider myself to be) in a particularly tough spot. Complain about Intelligent Design, and you’re labeled an anti-religious secular humanist cultural elitist. Acknowledge that you do believe in an intelligent Creator of the universe, and you’re still lumped in with the ID crowd and labeled a fundamentalist.

But more offensive to me than some religious debate is the idea that dumber is better. That there’s some inherent value in not being an expert or a professional. That just having a different opinion, even if you can’t back it up, is enough to constitute a “controversy.” Just because billions of people, including myself, believe in a higher power doesn’t mean that that belief has any place in a science class. And just because you believe that you are a special snowflake (Jessica’s expression) with strong opinions doesn’t mean that those opinions have any place in an encyclopedia. Get a personal blog, where you can pontificate all you want — just don’t piss on a public resource and then try to claim that it’s the truth.

Katamari Dumb-assy

Another SFist post from me is up, where I go off on a tangent about Roger Ebert’s claim that videogames are inherently inferior to real art like literature and film.

It surprised me that his comments bugged me as much as they did, considering that I don’t technically work in games anymore. And I’m as dismissive of videogames as anyone else. But I’ve always thought that I’m dismissive of them partly because of all the wasted potential. It’s not just the usual complaint that 90% of any medium is crap, although that’s definitely the case with games (probably more like 98%).

It’s that even people who would normally be the strongest advocates of the potential of games — the fans and game developers themselves — are giving up on that potential. People defend games because they’re either defending their hobby or defending their profession, but nobody can seem to agree what a game is supposed to be, exactly. Other than profitable.

People just seem to have this implicit understanding that although Michael Bay and Jerry Bruckheimer movies get the biggest audiences, movies are capable of more than just explosions and car chases; they’ve finally been accepted by most as legitimate art. Even television and comic books, which have an even higher crap-to-quality ratio than movies, get the acknowledgement that they put out something great every once in a while. But more and more, people are saying that games are nothing more than escapist entertainment, and that that’s all they should try to be. I’m fine with escapist entertainment; I don’t think you necessarily have to have meaning to have merit. And I think that good solid game design is an accomplishment in and of itself. So what’s the problem?

The problem is that I’ve played enough games to see what can happen when you get just the right combination of game and narrative, or as Ebert’s complaint put it: player choice and authorial control. It’s the point when you realize, “ah, I use the barrels to float the ramp up into position” in Half-Life 2, or “ah, I have to put a bucket of mud over the door” in Monkey Island 2. Day of the Tentacle was full of them. They’re points where you are actually working with the authors to finish telling the story. The realization hits you like a ton of bricks; it’s “ah, Rosebud represents Kane’s loss of innocence” and “Moby Dick is Ahab’s battle with mortality and fear of the unknown,” times 100. That’s the tool that gives videogames an artistic potential that nothing else has.

And the problem with that is that very few games are actually using that tool to make something of real resonance beyond “just” entertainment. I’ve said that Half-Life 2 is the best videogame ever made, and I still think so. But I think it works on the same level as Aliens — easily one of my top 10 favorite movies, but not exactly a profound statement on the human condition. There shouldn’t be any question that it’s art. If Fantasia qualifies as art, then so does Rez. And if The City of Lost Children is art, so is Grim Fandango.

The real question is whether games will be allowed to take that extra step to make something profound. I honestly think The Sims takes a step in that direction — it’s not just a dollhouse or even a social simulator, but it has something to say as a parody of consumerism and an abstraction of social behavior and mundane life. And somebody on a message board hit on what I couldn’t figure out about Shadow of the Colossus, the thing that makes it noteworthy — it’s not just the act of solving the boss fights that’s cool, but the sense of moral ambiguity throughout the game. You have to go through all the tasks you’re given just to complete the game, but just through the atmosphere of the game and the simple set-up, you spend the entire time wondering whether you’re doing the right thing.

People keep insisting that games are still in their infancy and that’s why there hasn’t been a real stand-out that’s universally acknowledged as a masterpiece instead of just “good for a game.” Technical improvements in rendering and AI will keep coming, and they’ll go a long way towards making games better, but what really needs to happen is for more developers to realize their potential as capital-A Art, and make something that’s not just a fun diversion but actually has some relevance.

Dire

Things are pretty dismal in the world of kludgey, predictable, clichéd literature. I’m still stuck just under 10,000 words and have been stalled for about a week now. I can confirm that the key to the whole NaNoWriMo thing is momentum, since I haven’t been all that compelled to go back to the thing and pick up the slack. After more than a couple days of inactivity, the philosophy of “this isn’t great or even all that good, but at least I’m getting results” turns into “if it’s turning out this boring and predictable, why even bother?” Apparently I was not born with ink in my veins — it was most likely Coke, or maybe gravy — and I lack the desire, no, need to create that fills the hearts of true artists such as Danielle Steel and that guy whose name I forget who writes all the mystery novels around horse racing.

I’m genuinely glad to see my writing buddies doing better than I am, though. Assuming that they’re not, well, lying, and that they haven’t just copied-and-pasted “banana” over and over again, tens of thousands of times. (Which, now that I think about it, would probably be a better artistic achievement, in the James Joycean sense, than what I’ve got so far.) It’s nice to see real evidence that the whole contest works: after a month of concerted effort, you get to check something off your life’s list of things to do.

If it sounds like I’ve given up, I haven’t. I’m not going to admit defeat until midnight on November 30th. And 40,000 words in 15 days amounts to 2,667 words a day, which isn’t completely out of the realm of possibility.

Won’t someone think of the children?!?

There’s another post up at SFist, which I mention only because that’s the only way they show up in the sidebar down below to your right.

Speaking of belated responses to basically inconsequential news: A couple of weeks ago there was a big stink all over the videogame section of the internets about this “lawyer” named Jack Thompson and his run-in with the guys from the webcomic “Penny Arcade.” In brief: he wrote something claiming that he’d donate $10,000 to charity if any videogame company would make a game based on his premise, which was a ridiculous story about a father whose child was killed as a result of game-inspired violence and went on a killing spree murdering game developers, publishers, and retailers. The Penny Arcade guys, to their credit, handled it reasonably well: they pointed out to the guy that they run a charity which raises money and supplies games for sick kids, and they made a $10,000 donation to that charity in Thompson’s name. He responded with legal threats and various letters to the FBI; several webcomics and hundreds of blog articles resulted. (And when somebody did actually make the game, he responded by saying that his claim had all been “satire,” and then with a couple more threats of legal action.)

In short, everybody got what they wanted. The sleazy ambulance-chasing lawyer got the attention he wanted and kept his name in the press. The Penny Arcade guys drew more attention to their charity, which could be seen as self-serving, but was basically a potent way of getting their message across, that most of the people who play videogames are not hyper-violent, semi-autistic selfish children.

I don’t even like mentioning Thompson, because it just adds one more internet reference to him, however insignificant, to make it seem like the guy’s having more impact than he really is. He’s laughably incompetent, and his agenda is completely transparent, even if you’re not aware (as I wasn’t) of his history of grandstanding and dementia. One of the Penny Arcade guys had an unexpectedly mature take on it: he said that they were aware they should just ignore the guy instead of giving him more attention, but that it was essentially a good thing he was at the forefront of the debate. Because if they ever had anyone competent taking all the credit as leader of the anti-videogame crusade, game fans and companies would be screwed.

(Senator Joe Lieberman and SF Assemblyman Leland Yee also make occasional headlines in videogame censorship news, but usually only when it’s around election time. And when they do, it becomes apparent they have no real expertise in the issue other than knowing enough to mention Grand Theft Auto and Postal).

The problem is that there’s nobody particularly competent on the pro-videogame side of the issue, either. All we’ve got is the insistence that there’s no evidence linking game-playing to violent behavior, and the First Amendment. Which means that as soon as someone releases a study showing that there is a correlation between GTA and Columbine, then all you’ve got left is the ACLU and “I know my rights” and an argument that has parents responding, “Well yeah, but…”

Procrastination Nation

Man, I thought I’d learned everything about procrastination from college and then working at EA, but that was strictly amateur class. Now that I’m working from home, I’ve gone professional in my time-wasting.

Before, it was The Sims 2, where I’d do stupid stuff like see an interesting house in the city and then go into the game to try and build that and then put a family in it and then try to get them hooked up with one of my other Sims or have a baby and then before I knew it three or four hours had passed.

That was nothing compared to Civilization IV. That game is pure evil. I always made fun of the people who claimed they had an “internet addiction” or were addicted to games like Everquest and World of Warcraft, and I still do, because they deserve it. The idea of being addicted to a videogame is absurd. But this game is just weird. When I picked it up on Thursday, I was resigned to wasting a whole day on it, and that’s exactly what happened. I got it home around 4pm, and the next thing I knew it was 2am and dark outside and I just felt gross. Really stupid, but I saw it coming so whatever.

But it’s worse than that. Yesterday I was reading a review of the game that mentioned this opening sequence (narrated by Leonard Nimoy) that I didn’t remember seeing. So I started up a game just to check that out. And the next thing I knew, it was 4 hours later. Not even my usual “I know I shouldn’t be doing this now, but I’ll make up for it later” thinking; I genuinely didn’t realize that much time had passed. So, I’ve decided to put that game aside until after I finish my work. Seeing as how I’m not a damn twelve-year-old.

So that’s left all the other stuff to creep in and take over my attention. Like how I became convinced that I wanted to add my AudioScrobbler recently-played tracks to my website like all the cool kids do. Even though I don’t listen to iTunes all that much, and nobody who reads this thing is all that interested in my music — that’s not the point. The point is that it could track all this data that nobody’s interested in, automatically. That’s Web 2.0! The future of the internets! And what’s more, I got it working perfectly, writing the code to get the data and parse it out and put it in a nice little list on the sidebar, rationalizing that I was learning about web programming as I went. But for whatever reason, it doesn’t work from my webserver, and AudioScrobbler’s service is only up intermittently. So scratch that.
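For the curious, the whole experiment boils down to fetching the recently-played feed and turning it into an HTML list. Here’s a minimal sketch in Python; fair warning that the `SAMPLE_FEED` structure is my approximation of the kind of XML the AudioScrobbler feed serves, not its exact schema, and `tracks_to_sidebar_html` is just a name I made up:

```python
import xml.etree.ElementTree as ET

# Stand-in for the recently-played XML the AudioScrobbler service returns.
# The element names here are an approximation, not the exact feed schema.
SAMPLE_FEED = """<recenttracks user="example">
  <track>
    <artist>Talking Heads</artist>
    <name>Once in a Lifetime</name>
  </track>
  <track>
    <artist>Brian Eno</artist>
    <name>1/1</name>
  </track>
</recenttracks>"""

def tracks_to_sidebar_html(xml_text):
    """Parse the feed and render a simple list for a blog sidebar."""
    root = ET.fromstring(xml_text)
    items = []
    for track in root.findall("track"):
        artist = track.findtext("artist", "")
        name = track.findtext("name", "")
        items.append(f"  <li>{artist} &#8211; {name}</li>")
    return '<ul class="recent-tracks">\n' + "\n".join(items) + "\n</ul>"

print(tracks_to_sidebar_html(SAMPLE_FEED))
```

The parsing and rendering really are the easy part; the real version has to fetch the XML over HTTP first, which is exactly where an intermittently-available service ruins everything.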

But hey, check this out! Some guy made a bunch of Flickr Toys to make calendars, mosaics, Magic cards, magazine covers, and such from your Flickr photos. And what impresses me is that Flickr has complete documentation for their API, so you can write your own toys and galleries and stuff using your photo collections. That means I’ve got to write a new gallery for all my travel photos, right?
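Since the API is documented, rolling your own toy really is just two small steps: build the REST request, then turn each photo record in the response into an image URL. A rough sketch with no network calls; the `live.staticflickr.com` URL pattern is the one Flickr documents currently (not what it was in 2005), and the helper names are mine:

```python
import urllib.parse

FLICKR_REST = "https://api.flickr.com/services/rest/"

def search_request_url(api_key, user_id, per_page=12):
    """Build a flickr.photos.search REST call (just the URL, no fetching)."""
    params = {
        "method": "flickr.photos.search",
        "api_key": api_key,
        "user_id": user_id,
        "per_page": per_page,
        "format": "json",
        "nojsoncallback": 1,
    }
    return FLICKR_REST + "?" + urllib.parse.urlencode(params)

def photo_src(photo):
    """Turn one photo record from the parsed response into an image URL."""
    return "https://live.staticflickr.com/{server}/{id}_{secret}_m.jpg".format(**photo)

def gallery_html(photos):
    """Render the photo records as a bare-bones gallery page fragment."""
    imgs = [f'<img src="{photo_src(p)}" alt="{p.get("title", "")}">' for p in photos]
    return '<div class="gallery">' + "".join(imgs) + "</div>"

# One photo record looks roughly like this once the JSON response is parsed:
sample = [{"id": "123", "server": "65535", "secret": "abc", "title": "Golden Gate"}]
print(gallery_html(sample))
```

Everything interesting (paging, photo sizes, titles and descriptions) is a matter of adding parameters to that same request, which is why complete API documentation makes this kind of toy so tempting to write.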

Maybe later. After I get past the outline stage and writer’s block I’ve been having with work.

For now, there’s another SFist column up. I’m only supposed to do one a week, but again: it lets me get distracted from what I’m supposed to be doing.

And finally: Happy birthday, Mac! Welcome to your 30s. It’s not as horrifying as I make it out to be.

At Long Last Zombies

Another SFist post is up, which mentions zombies in passing.

That’s because today is a special day: at last, my little obsession over the past few months is over, and I’m caught up with “Alias.” TNT finally ran the zombie episode. I’d been expecting a whole zombie storyline, but they didn’t show up until the season finale. And they weren’t really zombies. But still, it was pretty damn impressive as a TV show season finale. On par with the best season, season 2. I don’t know if it’s just a coincidence, but what they both have in common is Lena Olin as Sydney’s mom. Kinda sucks when you make a show with one great, stand-out character, the one your staff really knows how to write for and who makes for the best storylines, and you can only have her make guest appearances.

I do think it’s kind of funny that throughout the entire series so far, the only times they’ve shown Jack Bristow kissing a woman, it’s been someone he was angry at or repulsed by. C’mon, dude — you’re an actor! And it’s Lena Olin and Isabella Rossellini for gosh sakes! Can’t you just take one on the chin for ABC, and put some passion in it?

So all that’s left is the two missing episodes from the beginning of season 4, but I already know what happens in those from flashbacks and such. Then I have to pick a new hobby. I do have these “Lost” episodes on DVD sitting around…

If You Go Out In The Woods Today

I wrote earlier about the teddy bear who’d collapsed on his way towards my apartment. Unfortunately not all the toys in San Francisco are so lucky. In rougher neighborhoods, like Haight-Ashbury, you come across scenes like this one. Mac and I had a Stand By Me moment a while back when we came across this bear’s body lying on the road in front of a car. It looked innocent enough until we came closer to the scene and realized that his head had been crushed!

No witnesses saw the incident, but investigators are saying it’s a case of post-picnic violence. The victim was most likely gaily dancing about while his assailant was watching him, catching him unawares. At six o’clock, his mommy and daddy arrived on the scene to take him home to bed, but instead found this.

Also, another SFist post is up.