Would That It Were So Simple

A cursory and non-reductive look at Hail, Caesar! by Joel and Ethan Coen

I’ve been promising for a while that I’m going to be less reductive about movies. So even though I saw Hail, Caesar! last night and have been thinking about it ever since, I won’t try to come up with some belabored explanation of what I think it all means.

After all, trying to assign a “meaning” to everything is insufferably pretentious. It’s the kind of thing that self-important writers do, sitting together flinging high-minded concepts at each other without being able to put them into practice or make them relevant. It’s not actual communication, but just a pretense of superiority to cover up a deep-seated bitterness over the fact that writing is not universally accepted to be the most important thing there is.

It’s all part of the value judgment we tend to put on the labels of “art” vs “entertainment.” As if entertainment is inherently ephemeral, vapid, and valueless. The only way to elevate it to the higher and worthwhile level of “Art” is to put it in service of some important and meaningful statement. By, for instance, inserting ideas into unrelated stories for audiences to decode and pick apart like puzzle boxes, with the goal of “changing minds.” Or having a movie end with a forceful, dramatic monologue that explains to the audience exactly what the whole thing is about.

If it were just self-important ramblings from someone who’d taken too many cinema studies courses, it’d be bad enough. But the problem is that it’s reductive. It’s dismissive of all the wonderful things that movies can do, and it undermines the efforts of the hundreds of people who work to bring those wonderful experiences to audiences.

After all, there’s a ton of artistry that goes into a movie that’s more than just the self-congratulation of screenwriters and attention-grabbing performances by movie stars. There are cinematographers, editors, composers, choreographers, set designers, art directors, dancers, and stunt people. And that’s not counting all the people whose work doesn’t make it directly onto the screen, but who are essential to making the entire thing possible.

It’s an attitude that assumes that the movie industry’s days of spectacle and pageantry and displays of raw talent are gone and no longer missed, because the medium “matured.” It assumes an evolution from cheap tricks and stunts into more refined and intelligent stories of beautiful and sophisticated people delivering clever dialogue. It claims that there’s no true artistry in a well-executed farce, or a perfectly choreographed musical number. It ignores the delight of an audience enjoying something together in favor of being able to say I get it.

So I’m going to resist my natural tendency to talk about the Coen brothers as populists. Or to mention their disdain for pretense and self-importance. Or to put anything in the context of their recurring theme of spirituality vs. religion, and the futility of being so focused on the meaning and the answer that we lose sight of the wonder of life and the beauty of experience itself.

Instead, I’ll just talk about what I liked most in Hail, Caesar!, the Coens’ absurd, spectacular farce about the “golden age” of Hollywood.

I loved that they went all-in on reproducing genre after genre of classic movie, not just as casual reference but as full-blown production. The result is the best living cinematographer and film composer working with scores of the most talented people in the industry to make pitch-perfect recreations of old musical westerns, Gene Kelly-style musicals with elaborate dance numbers, Esther Williams-style “aquamusicals,” pompous sword-and-sandal epics, and high-society melodramas. And then they made the references pitch-perfect as well: to Cold War spy dramas, a film noir car chase, and of course, the Singin’ in the Rain-style movie-within-a-movie.

I loved that the Coens returned to their Blood Simple-era mastery of timing, setting up the slow-burn “Would that it were so simple” gag that’s shown in the trailer, waiting a few more scenes before the punchline, then stretching out its delivery even more by inserting another of the funniest scenes in the movie, with Frances McDormand’s film editor.

Every single performance in the movie was perfect, which is especially remarkable considering that they weren’t all in the same genre of movie. Being the straight man in a screwball comedy is a thankless job, but Josh Brolin keeps it grounded. And Alden Ehrenreich is kind of amazing, not just for completely getting the wild changes in tone from scene to scene, but also for being able to do an affected accent trying to affect a different accent. And this is the first movie where I’ve liked Channing Tatum.

I loved the recurring gag about not “depicting the godhead” (check the disclaimers in the end credits) while making a movie billed as “A Tale of The Christ,” and then not only did they never show the actor playing Jesus, they had an assistant director asking him whether he was principal or supporting cast.

I loved that the high-minded, presumably religious epic On Wings As Eagles was built up with such import and significance and then marvelously deflated. I loved the “No Dames!” musical number that started with a subtle nod to the Gene Kelly performances that seemed just-barely-shy of homoerotic, and then got more and more blatant as the routine got more and more sophisticated.

And I loved that Eddie Mannix’s guilt and loyalty and devotion were depicted as a crisis of faith, and contrasted against the religious leaders who didn’t have the answers; the warmongers who dismissed movies as ephemeral, valueless entertainment; and the biblical epics that depicted sudden, awe-inspiring epiphanies that turned out to be ultimately empty. Mannix didn’t really have any sudden moment of clarity; he just went on helping people who needed help and made movies that reached millions of people and brought joy to their lives.

So no real message, just an astoundingly talented bunch of filmmakers making a silly, funny farce about the “magic” of movies. But maybe I missed the point entirely.

Why I’m Excited About Firewatch

The future of independent game development is now! Or at least two years ago!

Firewatch comes out for PCs and PS4 next week, and you can already preorder it on Steam. There are a couple of obvious reasons I’m happy to see it:

  1. My friends made it.
  2. I mean, just look at it.

But I also can’t help but see it as a victory for independent game development. Not to mention evidence that occasionally, we do actually live in a meritocracy, and you can actually be rewarded for being good at what you do.

It’s likely because I’ve spent the last year being disillusioned by the state of professional game development that I keep thinking about how much the environment has changed since I started working in games [REDACTED] years ago.

Lately, I’ve been focused on all the ways it’s changed for the worse, because that was all I had to work with. I’d forgotten how much it’s changed for the better. I used to imagine a far-off future where game development finally had more in common with film production than toy manufacturing, and it’s happened without my noticing.

I used to take it as a given that making games meant getting a full-time job at a studio. It was such a huge investment that it was all but impossible to try and make anything of consequence otherwise. The popular game engines like Unity and… well, okay, just Unity weren’t available, so in most cases, making a game meant making an engine. Even if you had the chops to write your own game engine, just getting a license for Visual Studio was a not-insignificant investment, something that’s easy to forget now that free compilers and IDEs are ubiquitous.

While I was bitching about Steam as an unnecessary hurdle to jump through so I could play Half-Life 2, I didn’t realize what a gigantic shift it would bring. Once manufacturing and distribution became more accessible, it all but wiped out the necessity of having a publisher. Of course, it didn’t wipe out the need for a marketing budget, not to mention funding for game production itself (especially since art tools are still a huge investment), but it did finally turn the big publishers from gatekeepers into business partners.

This is all pretty obvious stuff, and most of us would just take it as a given that democratizing game development is a good thing. But the implications are a lot more significant than I’d appreciated: making game development more accessible hasn’t just made it possible for more people to make games. It’s allowed for the existence of better games. When a game doesn’t have to depend on risk-averse investors or clueless marketing departments in order to exist, smarter and riskier games get made.

I used to despair at the proliferation of space marines and dwarves and stripper-killing-simulators as a sign that maybe games were as infantile as the “grown-ups” suggested, and that the people making games just didn’t have any original ideas. While that was no doubt true in a lot of cases, it’s more likely that the original ideas were there, but were getting ignored by publishers and marketing teams still pandering to an outdated and very narrow audience that didn’t even exist until they created it.

I first got interested in games because the Monkey Islands and Full Throttle demonstrated that they were capable of an entirely new kind of storytelling, and Sam and Max Hit the Road demonstrated that there was room for weird stuff that seemed to exist only for its own sake, because it was something that the people making it wanted to see. Unfortunately, that was around the same time that the “gamer” stereotype was becoming fully entrenched. Everyone who had any power in the industry seemed to be devoting all their attention to pandering to that very narrow and specific demographic of asshole.

But there was still this idea that games could be more than that. That it’d be possible to have games that made good use of dialogue. Dialogue that wasn’t just treated as an afterthought, but actually given as much care and attention as screenwriting.

One day, you’d see as much variety in video game art direction as you did in animation. You didn’t always have to strive for photo-realism or try desperately to recreate Aliens or Blade Runner; you could make art that was stylized for a reason instead of stylized to cover up a limitation in rendering fidelity.

Games could be quiet. They could have you as deeply invested in relationships as you are in checkpoints or puzzles. They could be more personal stories than epic adventures. They could be mature.

Firewatch isn’t the first indie game, obviously. But so many indie projects (with rare exceptions) can come across — no doubt unfairly, but still — as ego-driven and self-important, or crass and commercial. This is one that I can say for sure is driven entirely by people who are completely passionate (occasionally, infuriatingly passionate) about making games as good as they possibly can be.

I’ve spent the last several months thinking about how the slot machine mentality has so thoroughly taken over game development that studios no longer even pretend to separate creative from marketing; “monetization” is now considered a part of “game design.” Or how game studios treat increasingly specialized employees as interchangeable resources instead of as talent, and how some studios subtly “gaslight” their employees into thinking they have no option but to keep working for them. How the unreasonable and unsustainable hours are now so firmly entrenched that they’re treated as expected instead of exceptional. How the atmosphere has gotten so hostile and opportunistic that people can be as duplicitous as the worst stereotype of the Hollywood movie industry.

And then, I see that a bunch of my friends made a 1980s period piece about a divorcé who spends most of his time walking through absolutely stunning environments that look like fully-animated versions of National Park Service posters, using a walkie-talkie to have conversations that build a relationship. And that makes me really, really happy.

Exit, surrounded by Muppets

Tremendous respect for David Bowie, and a theory on what seems truly alien

As somebody who really only knew David Bowie’s work “indirectly,” I find it bizarre how profoundly firmament-shaking it is to realize he’s no longer a constant, always creating new things that no one else would be able to do.

I really, really liked Labyrinth a lot (I think I was unfortunately at just the wrong age to keep from loving it), and I totally, genuinely loved his duet with Bing Crosby on “Little Drummer Boy.” But apart from that, I always thought of his music as something that cooler people than me listened to. It honestly wasn’t until The Life Aquatic with Steve Zissou that I was aware of individual songs (apart from “Space Oddity” and “China Girl”), and the versions that come to my head are still the acoustic ones in Portuguese.

But amidst all the people talking about how preternaturally talented and legendary he was, and what a huge influence he was on “weird” kids (whether LGBT or otherwise), the thing that I keep seeing over and over is how kind and funny he was in person.

It’s weird because I would’ve expected accounts of him being like an alien or an impossibly famous artist. But instead, I was surprised by that unassuming response to a fan letter in Letters of Note, the account of him making fun of his own eye color, and multiple stories of him putting people in awe of him at ease.

Even a still from Labyrinth with him smiling surrounded by goblins is a reminder that the alien, intimidating, super-powerful menace was the character he was playing. The man himself was just an impossibly famous rock star who’d decided to have fun and share his talent by being in a movie with muppets.

When you’re that cool and that universally loved, it seems like you don’t HAVE to be kind and funny, so it must’ve just come naturally.

It makes me wonder if what we perceive as alien and unapproachable is actually just being able to appreciate exactly how weird and silly we all are. And then, instead of being embarrassed by it or trying to disguise it, just completely celebrating it.

It’s a Shame About Rey

Another note about The Force Awakens, merchandising, and spoiler culture. Contains huge spoilers, obviously.

This post is partly about what an impressive job Disney and Lucasfilm did of keeping details about The Force Awakens secret until its release. So please: if you haven’t yet seen the movie, don’t read it.


While I’m thinking about Star Wars, nothing but Star Wars (if they could bar wars, why don’t they?): one of the things the internet’s decided they just will not stand for is the way that the film is being marketed.

And by “the internet,” I of course mean maybe a couple hundred people who care enough to write blog posts and start a #WheresRey hashtag. Don’t mistake this post for an “I saw this thing on Twitter and am therefore now responding to the cultural zeitgeist” piece; it’s instead a comment on a couple of aspects of this movie and its promotional campaign that I thought were interesting.

The outrage I’m talking about [note: not actually outrage] is best summed up in this post by Mike Adamick (which I saw via my friend Chris Hockabout on Facebook): “Rey is not a role model for little girls.”


It’s True. All of It.

Why I unabashedly love Star Wars: The Force Awakens, and how much I’ve missed being able to unabashedly love a movie

It’s called a “Luggabeast”

A cool thing I discovered after seeing The Force Awakens a second time is that I don’t really care about anybody else’s opinion of The Force Awakens.

Really, though, you don’t care about my opinion of it, either. If you haven’t seen it yet, then you need to stop reading this right now. I’m still somewhat amazed by how well Disney & Lucasfilm have managed to keep the movie in everybody’s consciousness for months but still keep so much of it a surprise.

If you have seen it and have some criticisms you feel need to be addressed: eh, can’t help you there. When people talk about Star Wars being a “religion” to Nerds of a Certain Age, it’s intended to be derogatory of course, but there’s some truth to it. It’s more than a series of movies and their associated merchandise; it’s a phenomenon. In my case, it literally transformed my life. So when an experience so thoroughly triggers that feeling of unbridled delight that I haven’t felt in decades, I’m going to be a little dogmatic. Either you love it as much as I do, or you’re mistaken.

But if you did love it and just want to read another fan gushing about it, you’ve come to the right place. Watching it filled me with the kind of naked, uncynical, bean-to-bar exhilaration I haven’t gotten from a movie since seeing Big Trouble in Little China or Ghostbusters for the first time in 1986. For a few hours last Friday, I was transported back to Phipps Plaza in 1980 watching the premiere of The Empire Strikes Back and tearing up at the sheer wonder of it all.

A Long Time Ago

Of course, the down side to being picked up and transported back in time to being a wide-eyed nine-year-old in 1980 is having to get dumped back into the body of a 44-year-old in 2015. It’s alarming how dyspeptic and self-important we’ve all become.

It’s not just the wet blankets. We’ve always had those. For fun, read “The Empire Strikes Out” by David Gerrold from Starlog in 1980 and marvel at how much of it has survived and spread today. Neil deGrasse Tyson is on Twitter pushing buttons as unconvincingly as any of the people operating electronics in Star Wars. (To be fair, Gerrold’s question of how a giant worm living in an uninhabitable asteroid would be able to find food is actually kind of interesting on an academic level, unlike the crusty old tired complaints about sound in space.)

And Gerrold’s whole preamble should sound hauntingly familiar to anyone who’s on the internet in 2015. It’s the words of the martyr who knows what he’s saying won’t be popular among the “fanatics,” but he’s just got to share his complaints about the movie.

I’m not sure that I’ll ever understand the mentality of the “apathetic pan,” the need to inform as many people as possible that you don’t like something that’s popular. Or that it was good but not great. Or that it was fun but you have complaints that you’ll present as a list now. I’m not sure how to react to that, either, other than with a shrug and an Ayn Rand-ian “Oh well, sucks to be you.”

The whole phenomenon of “spoiler-free reviews” really made it clear how far we’ve gotten from engaging with and analyzing arts and entertainment, and how we now just broadcast opinions as widely as possible. Last Wednesday, some kind of review embargo lifted, which meant every site on the internet was scrambling to be the first to post their spoiler-free review of The Force Awakens. It was a torrent of reviews that no one would read for a movie that everyone would see.

And I do mean every site. I’m not sure exactly why I’d want to know what the writers of a technology site or a video game blog thought about the new Star Wars movie, but I could absolutely find out, in written, video, podcast, and roundtable discussion format.

But who was the audience for these things? If anyone was on the fence about seeing the movie, they wouldn’t benefit from a positive review because tickets had sold out weeks earlier. Which also meant that people who’d already bought tickets wanted to know as little about the movie as possible. Which meant that the critics couldn’t address anything of real substance about the movie for fear of “spoilers.” It’s talking in vague abstracts about a piece of art for which I have no context. That’s more search-engine optimization than film criticism.

I’m not cynical enough to think that it’s all SEO. A lot of it is genuine enthusiasm, the same reason I’m writing this. But that almost makes it more tragic: the idea of getting excited to see a movie and then rushing home to list all the problems you had with it.

Maybe it’s a side effect of being told for years that the stuff we loved was infantilizing and shallow? So we have to somehow prove that we’re able to appreciate The Muppets on a much deeper level. It’s not enough just to enjoy something; we have to be able to deconstruct it. If we’re not being analytical enough, it shows we lack discernment.

Part of it might be a by-product of Star Wars itself. One of the side effects of Star Wars’s unprecedented popularity was a fascination with how the movies were made. We all learned about blue screens and miniatures and matte paintings, and I doubt I was the only kid who went out and banged a wrench against a telephone pole support cable in an attempt to recreate the blaster sound effects like I saw in the “making of.” But instead of inspiring us all to become movie makers, it seems to have encouraged us all to think like movie reviewers.

Whatever the reason, it’s meant that even people who loved the movie need to qualify it somehow. “It’s not perfect,” or “it’s not as good as the first two,” or “it’s fine for what it is.” Which is kind of a drag, because I wish people could just lose their minds over it like I did.

I’ve already resolved to be less reductive about movies (and other art), to stop trying to identify and compartmentalize the one thing that the entire work means. But it goes deeper than that. I’m realizing that I go into everything like an analyst instead of an audience. I’m devoting around 75% of my brain to the experience, and the other 25% to trying to think of interesting things to say about the experience later. It’s like viewing every big event through a smartphone screen instead of being in the moment.

(The rest of this is spoiler-heavy. Please don’t read it if you haven’t yet seen the movie).


Ease On Down

The internet is pretty dumb, you guys.

So here’s a dumb thing that happened.

A few weeks ago, I made a corny and obvious dad joke on Twitter, because that’s 99.9% of what I do on Twitter:


At the time, that piece of lovingly crafted artisanal comedy got exactly the amount of attention it deserved, which was almost none. I’ve been making dad jokes since before I turned gray, so I’m intimately familiar with having friends and loved ones roll their eyes and go on about their business.

Last night, surprisingly, it got retweeted by somebody asking “are you kidding me right now?” I responded that yeah I was kidding, I’d thought it was obvious; I got a polite “sorry” that was appreciated but completely unnecessary; that was the end of that.

But then I saw another retweet, and then another, and another, all of them saying some variant of “is this guy for real smh” or “lmao” or some such. No big deal. Then I noticed that one of them exclaimed “FOUND IT,” which seemed weird, like it was part of an ongoing conversation. What was going on?

As it turns out, this was going on: a piece of investigative journalism on Buzzfeed mocking all the clueless racist people on Twitter and Facebook complaining about The Wiz. (Which, in a completely fortuitous coincidence for Buzzfeed’s site traffic, is airing live on NBC tonight).

As I said on Twitter, I’m not sure which aspect of that post bugs me most: the attempt to stir up a controversy, the assumption that a gay man in his 40s would have never heard of The Wizard of Oz, or that one who grew up in the 70s had never heard of The Wiz.

If I’m honest, though, what hurts the most is the last bit, where he says my dumb joke is one that was already done on Glee. Manufacturing outrage for page views is one thing, but using screengrabs of Kristin Chenoweth in order to call me derivative is just cruel.

I won’t be even more derivative by going into a long explanation of why I think the outrage-as-engagement content mills are awful, because it’s already been covered elsewhere. There’s an entire site brilliantly parodying it (Wow). And a writer named Parker Molloy wrote “5 Things the Media Does to Manufacture Outrage”, which explains it as clearly as anything I’ve ever seen. (And is written in Buzzfeed format, which I’m assuming was a clever stylistic choice even if it wasn’t).

Here’s the thing, though: the “writer” of that Buzzfeed post blurred out photos and names — which I guess is at least something positive, or there’d be even more nonsense coming my way — but it’s still super-easy to find stuff just by searching for the body of the message. Computers, and all that. So I searched for the other ones. I only found two others, but it was immediately obvious that they were both corny jokes, too. It’s not even painstaking research, either. 90% of the time you can tell who’s a troll, who’s a batshit extremist, and who’s just making goofy jokes within 15 seconds of reading a twitter feed. It took Buzzfeed longer to blur the profile photos than it would’ve taken to do a quick scan for context.

But nobody bothers to scan for context, so it’s been a day of getting notifications of people calling me an asshole or an idiot. To be clear: it’s just been a couple dozen retweets and one cartoonishly overwrought jerk trying to pick a fight, which barely even registers on the scale of internet harassment. I am white and male after all, so I didn’t have anybody calling me fat or mocking my religion or ethnicity. Still, for somebody who gets about four or five notifications a day, it’s been a drag.

And it’s the laziness of the whole thing that gets me. Sure, making a corny and obvious joke is lazy, but short, ephemeral bits of nonsense are what Twitter’s for. It’s the laziness of serving up “content” that’s just a twitter search for a bunch of seemingly inflammatory tweets interspersed with TV show GIFs and sarcastic comments. The laziness of acting as if that’s really getting a handle on the cultural zeitgeist and making some kind of statement about social justice. The laziness of retweeting something without even taking a few seconds to look for context. The laziness of immediately assuming that people are impossibly crass, selfish, and stupid, so of course you’re going to give them a 140-characters-or-less piece of your mind. Just the willful incuriousness of not wanting to find out more about the thing you’re angrily responding to.

Of course, I do it too. I’ve been trying to do a better job of vetting stuff before I send it along, instead of just sharing and retweeting everything I see that pisses me off. But I’m still frequently happy to dive headfirst into the gears of the outrage machine, shaking my tiny fist at whatever the Blogging Illuminati have decided is to be the Controversy of the Week.

And why not? Companies have spent millions and millions of dollars to make it so easy. Reading stuff is a chore, but it just takes a fraction of a second to hit the RT or Share button. These days, they even serve up a convenient menu of stuff to be angry about. “That mostly forgotten actor from the 80s said what?! This aggression will not stand!” Engagement. Content. People may just be saying vapid, mean-spirited, insight-free nonsense, but that’s okay as long as they’re saying something.

I’m starting to think I had a better handle on it years ago, before I “learned” why Internet Activism is Important. I used to think it was futile to pull out the pitchforks and torches every time someone said something inflammatory on the internet. But over time, I was reassured that it was bringing about real social change.

Instead, though, it’s just created an environment where people treat the most vapid statements as if they were profound declarations. I’m taking a stand against hate! While I’m sure that’s a blow to the pro-hate lobby, it’s not actually doing anything.

And you don’t have to be a statistician to understand that a bunch of randos spewing shit on Twitter isn’t a representative sample of anything. Even if that Buzzfeed “story” weren’t weighted with corny liberals making clumsy attempts at satire, and it were in fact a bunch of clueless racists spewing toxic nonsense on social media… really, so what? Do you really want to amplify that crap, to act like it’s something that intelligent people should waste their time responding to?

It’s not a case of ignoring something pernicious, just hoping it’ll go away. And it’s absolutely not a case of ignoring harassment and pretending it doesn’t exist. It is recognizing the difference between meaningful engagement, and just looking for something to get pissed off about. Which is worse than a waste of time, because it gives a voice to ideas and opinions that don’t deserve it. Keep doing it long enough, and you create an environment where even the most basic human respect — like, say, not harassing women or people of a different religion — gets treated as if it were a controversial topic on which reasonable adults can disagree.

It used to be that I’d see people committing themselves to “think positive” or “promote good” or “be the change you want to see in the world” and think that they were being impossibly naive and sheltered. You can’t just ignore injustice! You’ve got to root it out, and fight it! But that just puts you in the mindset of always looking for a fight. And I mean that literally — I confess I’ve absolutely posted stuff on Facebook with the express intent of looking for someone to disagree with me, so I could feel like I’d accomplished something with my righteous conviction. And I’ve spent a depressing amount of time in my life ranting about Mike Huckabee, somebody who has no chance of ever being President or in any kind of influential position, and who just says things to get attention and piss off people like me.

So gradually, I’ve been starting to see the appeal of this whole “positivity” business. If you really want to see an end to bigotry, misogyny, and general awfulness, you could yell at awful people until they stop being awful. Or you could pledge to be as un-awful as possible, and spread that around instead of the nastiness. The latter seems a lot more fun, and a lot less error-prone. Learn to recognize who’s actually influential, and who’s just trying to manipulate people into thinking they’re more influential than they really are.

Having been both the yeller and the yelled-at in Twitter flare-ups, I can tell you that it’s completely unproductive both ways. At the risk of tanking the entire social media economy, I think it makes a lot more sense to just disengage from the outrage machine and spend more time celebrating people doing great things and ignoring the assholes until we starve them of oxygen.

Except for that pharmaceutical CEO guy who jacked up the price of that AIDS drug. He is just the worst.

iPad Cons

Reluctantly coming to the conclusion that the computer I’ve always wanted isn’t the computer I’ve always wanted

It’s a reliable source of tragicomedy to see people working themselves into an indignant rage over gadget reviews. When I was looking for reviews of the iPad Pro this Wednesday (to do my due diligence), Google had helpfully highlighted some fun guy on Twitter calling the tech journalists’ coverage of the device “shameful.” The reviews themselves had hundreds of comments from people outraged that even the notion of a larger, more expensive iPad was an assault on everything we hold dear as Americans.

The complaints about the rampant “Apple bias” are especially ludicrous in regards to the iPad Pro, since the consensus has been overwhelmingly cautious: out of all the reviews I read, there’s only one that could be considered an unqualified recommendation. Even John Gruber wasn’t interested in getting one. (But he did still believe that it’s the dawning of a new age in personal computing; it’s still Daring Fireball, after all). Every single one of the others offered some variation on “It’s nice, I don’t have any interest in it, but I’m sure it’s perfect for some people.”

Yes, I thought, I am exactly those some people.

Designed By Apple In Cupertino Specifically For Me

I’ve spent the better part of this year trying to justify getting a smaller and lighter laptop computer. I’ve spent the better part of the last decade wanting a good tablet computer for drawing. And I’ve tried — and been happy with — most of the variations on tablets and laptops that Apple’s been cranking out since the PowerBook G4. (One thing people don’t mention when they complain about how expensive Apple products are is that they also retain their resale value exceptionally well. I’ve managed to find a buyer for every Apple computer or tablet I’ve wanted to sell).

I’ve tried just about every stylus I could find for the iPad. I tried a Galaxy Note. I tried a Microsoft Surface. I got dangerously excited about that Microsoft Courier prototype video. Years ago, I tried a huge tablet PC from HP. None of them have been right, for one reason or another.

But when they announced the iPad Pro this Fall, it sounded like Apple had finally made exactly what I wanted: a thin and relatively light iPad with a high-resolution display, better support for keyboards, faster processor, and a pressure-sensitive stylus designed specifically for the device. Essentially, a “retina” MacBook Air with a removable screen that could turn into a drawing tablet. The only way it could be more exactly what I want would be if it came with a lifetime supply of Coke.

Still, I decided to show some restraint and caution for once, which meant having the calm and patience to get one a few hours into opening day instead of ordering one online the night before.

I read all the reviews, watched all the videos, paid closest attention to what artists were saying about using it. The artists at Pixar who tried it seemed to be super-happy with it. All the reviews were positive about the weight and the display and the sound and the keyboards.

I went to the Apple Store and tried one out, on its own and with the Logitech keyboard case. It makes a hell of a first impression. The screen is fantastic. The sound is surprisingly good. It is huge, but it doesn’t feel heavy or all that unwieldy when compared to the other iPads; it’s more like the difference between carrying around a clipboard vs carrying a notepad. (And it doesn’t have the problem I had with the Surface, where its aspect ratio made using it as a tablet felt awkward).

And inside the case, it gets a real, full-size keyboard that feels to me just like a MacBook Air’s. It really does do everything shown in the demo videos. I imagined it becoming the perfectly versatile personal computer: laptop for writing, sketchpad for drawing, huge display for reading comics or websites, watching video, or playing games. (I’m not going to lie: the thought of playing touchscreen XCOM on a screen this big is what finally sold me).

But Not For Me

But I don’t plan to keep it.

It’s not a case of bait-and-switch, or anything: it’s exactly what it advertises, which is a big-ass iPad. The question is whether you really need a big-ass iPad.

The iPad Pro isn’t a “hybrid” computer, and Apple’s made sure to market it as 100% an iPad first. But it’s obvious that they’re responding to the prevalence of hybrids in Windows and Android, even if not to the Surface and Galaxy Note specifically. And I think Apple’s approach is the right one: differentiating it as a tablet with optional (but strongly encouraged) accessories that add laptop-like functionality, instead of as some kind of all-in-one device that can seamlessly function as both.

But a few days of using the iPad Pro has convinced me that the hybrid approach isn’t the obviously perfect solution that common sense would tell you it is. It’s not really the best of both worlds, but the worst of each:

  • Big keyboards: The Apple-designed keyboard is almost as bad for typing as the new MacBook’s is, which is almost as bad as typing on a Timex Sinclair. Maybe some people are fine with it, and to be fair, even the on-screen keyboard on the iPad Pro is huge and full-featured and easy to use. But for me, the Logitech keyboard case is the only option. And it’s pretty great (I’m using it to type this, as a cruel final gesture before I return it), but it turns the iPad Pro from being surprisingly light and thin into something that’s almost as big and almost as heavy as a MacBook Air.
  • Big-ass tablet: Removed from the case, the iPad Pro quickly becomes just a more unwieldy iPad. The “surprisingly” part of “surprisingly light and thin” means that it’s genuinely remarkable considering its processor speed and its fantastic screen, but it still feels clumsy to do all the stuff that felt natural on the regular iPad. It really wants to be set down on a table or desktop.
  • It’s not cheap: I wouldn’t even consider it overpriced, considering how well it’s made and how much technology went into it. But it does cost about as much as a MacBook Air. That implies that it’s a laptop replacement, instead of the “supplemental computer” role of other iPads.
  • Touching laptop computer screens is weird: Nobody’s yet perfected the UI that seamlessly combines keyboards and touch input. Even just scrolling through an article makes me wish I had a laptop with a touchpad, where it’s so much more convenient. When it feels like the touchpad is conspicuously absent while you’re using a device that’s essentially a gigantic touchpad, that means that something has broken down in the user experience.
  • Aggressive Auto-correct: Because iOS was designed for touch input on much smaller screens, it was designed for clumsy typing with fat fingers. Which means it aggressively autocorrects. Which means I’ve had to re-enter every single HTML tag in this post. And it still refuses to let me type “big-ass” on the first try.
  • It’s missing much of OS X’s gesture support: Despite all the clever subtle and not-so-subtle things they’ve done to make iOS seamless, it’s still got all the rough edges as a result of never being designed for a screen this large. In fact, having your hands anchored to a keyboard goes directly against the “philosophy” of iOS, which was designed to have an unobtrusive UI that gets out of the way while you directly interact with your content. Ironically, it’s all the gesture recognition and full-screen stuff that made its way from iOS to OS X that I find myself missing the most — I wish I could just quickly swipe between full-screen apps, or get an instant overview of everything I have open.
  • No file system: This has been a long-running complaint about iOS, but I’ve frankly never had much problem with it. But now that the iPad is being positioned as a product that will help you do bigger and more sophisticated projects, it becomes more of a problem. I just have a hard time visualizing a project without being able to see the files.
  • The old “walled-garden” complaints: Apple’s restrictions aren’t nearly as draconian as they’re often made out to be, but they still exist. Occasionally I need to look at a site that still insists on using Flash. And the bigger screen size and keyboard support of the iPad Pro suggest that programming would be a lot of fun on this device, but Apple’s restrictions on distributing executable code make the idea of an IDE completely impractical.
  • Third-party support: App developers and web developers haven’t fully embraced variable-sized screens on iOS yet. (As an iOS programmer, I can definitely understand why that is, and I sympathize). So apps don’t resize themselves appropriately, or don’t support split screen. Some apps (like Instagram, for instance) still don’t have iPad versions at all. Some web sites insist I use the “mobile” version of the site, even though I’m reading it on a screen that’s as large as my laptop’s.

If You Don’t See a Stylus, They Blew It

For me, the ultimate deciding factor is simply that the Apple “Pencil” isn’t available at launch. They’re currently back-ordered for at least four weeks, and that’s past the company’s 14-day return window. Maybe they really have been convinced that the stylus is a niche product, and they weren’t able to meet the demand. Whatever the case, it seems impossible for me to really get a feel for how valuable this device is with such a significant piece missing.

The one unanimous conclusion — from both artists and laypeople — is that the Pencil is excellent. And I don’t doubt it at all. Part of what gets the tech-blog-commenters so angrily flummoxed about “Apple bias” is that Apple tends to get the details right. Their stuff just feels better, even if it’s difficult or impossible to describe exactly how or why, and even if it’s the kind of detail that doesn’t make for practical, non-“magical” marketing or points on a spec sheet.

Even though I haven’t been able to use it, I have been impressed with how Apple’s pitched the stylus. They emphasize both creativity and precision. There’s something aspirational about that: you can use this device to create great things. Microsoft has probably done more over the years to popularize “pen computing” than any company other than Wacom, but they’ve always emphasized the practical: showing it being used to write notes or sign documents. It’s as if they still need to convince people that it’s okay for “normal” people to want a stylus.

Part of the reason I like Apple’s marketing of the Pencil is that it reminds me of the good old days before the iPhone. Back when Apple was pitching computers to a niche market of “creative types.” It was all spreadsheets vs. painting and music programs, as clearly differentiated as the rich jocks vs the sloppy underdogs in an 80s movie.

I only saw a brief snippet of Microsoft’s presentation about the Surface and Surface Book. In it, the Microsoft rep was talking about the Surface’s pen as if he’d discovered the market-differentiating mic-drop finishing-move against Apple’s failed effort: unlike “the other guys,” Microsoft’s pen has an eraser. I’ve been using a Wacom stylus with an eraser for some time, and it’s always too big and clumsy to be useful, and it always ends up with me using the wrong end for a few minutes and wondering why it’s not drawing anything.

Meanwhile, Apple’s ads talk about how they’ve painstakingly redesigned the iPad screen to have per-pixel accuracy with double the sampling rate and no lag, combining their gift for plausible-sounding techno-marketing jargon with GIFs that show the pen drawing precise lines on an infinite grid. That difference seems symbolic of something, although I’m not exactly sure what.

The Impersonal Computer

I’ve been pretty critical of Microsoft in a post that’s ostensibly about how I don’t like an Apple product. To be fair, the Surface Book looks good enough to be the best option for a laptop/tablet hybrid, and it’s clear some ingenious work went into the design of it — in particular, putting the “guts” of the machine into the keyboard.

I’m just convinced now that a laptop/tablet hybrid isn’t actually what I want. And I think the reason I keep going back to marketing and symbolism and presentation and the “good old days” of Apple is that computers have developed to the point where the best computer experience has very little to do with what’s practical.

I get an emotional attachment to computers, in the same way that Arnie Cunningham loved Christine. There have been several that I liked using, but a few that I’ve straight-up loved. My first Mac was a Mac Plus that had no hard drive and was constantly having to swap floppy disks and had screen burn-in from being used as a display model and would frequently shut down in the middle of doing something important. But it had HyperCard and Dark Castle and MacPaint and the floppy drive made it look like it was perpetually smirking and it was an extravagant graduation gift from my parents, so I loved it. I liked the design of OS X and the PowerBook so much that I even enjoyed using the Finder. I tried setting up my Mac mini as a home theater PC mostly as an attempt to save money on cable, but really I just enjoyed seeing it there under the TV. Even a year into using my first MacBook Air, I’d frequently clean it, ostensibly to maintain its resale value but really because I just liked to marvel at how thin and well-designed it was.

I used to think that was pretty common (albeit to healthier and less obsessive degrees). But I get the impression that most people see computers, even underneath all their stickers and cases to “personalize” them, as ultimately utilitarian. A while ago I had a coworker ask why I bring my laptop to work every day when the company provided me with an identical-if-not-better one. The question seemed absolutely alien to me: that laptop is for work; this laptop has all my stuff.

Another friend occasionally chastises me for parading my conspicuous consumption all over the internet. I can see his point, especially since the Apple logo has gone from a symbol of “I am a creative free-thinker” to “I have enough money to buy expensive things, as I will now demonstrate in this coffee shop.” But I’ve really never understood the idea of Apple as status symbol; I’ve never thought of it as “look at this fancy thing I bought!” but “look at this amazing thing people designed!”

The iPad was the perfect manifestation of that, and the iPad mini even more so. Like a lot of people, I just got one mainly out of devotion to a brand: “If Apple made it, it’s probably pretty good.” I had no idea what I’d use it for, but I was confident enough that a use would present itself.

What’s interesting is that a use did present itself. I don’t think it’s hyperbolic to say that it created an entirely new category of device, because it became something I never would’ve predicted before I used it. And it’s not a matter of technology: what’s remarkable about it isn’t that it was a portable touch screen, since I’ve known I wanted one of those ever since I first went to Epcot Center. I think what’s ultimately so remarkable about the iPad is that it was completely and unapologetically a supplemental computer.

Since its release, people (including me) have been eager to justify the iPad by showing how productive it could be. Releasing a version called the “Pro” would seem like the ultimate manifestation of that. But I’m only now realizing that what appealed to me most about the iPad had nothing to do with productivity. I don’t need it to replace my laptop, since I’m fortunate enough to be able to have a laptop. And the iPhone has wedged itself so firmly into the culture that it’s become all but essential; at this point it just feels too useful to be a “personal” device. (Plus Apple’s business model depends on replacing it every couple of years, so it’s difficult to get too attached to one).

Apple’s been pitching the watch as “their most personal device ever,” but I wouldn’t be devastated if I somehow lost or broke the watch. My iPad mini, on the other hand, is the thing that has all my stuff. Not even the “important” stuff, which is scattered around and backed up in various places. The frivolous, inconsequential stuff that makes it as personal as a well-worn notebook.

Once I had the iPad Pro set up with all my stuff, I was demoing it to a few people who wanted to see it. And obviously with coworkers, but even, surprisingly, when showing it to my boyfriend, there was a brief moment of hesitation where I wondered if I was showing something too personal. I don’t mind anybody using my laptop or desktop, or sharing my phone with someone who needs it, but I’ve got a weird, very personal attachment to the iPad. (And not just because I treat my Tumblr app like the forbidden room in a gothic novel which no one must ever enter).

It’s entirely possible that I’m in the minority, and whatever attachment most people have to “their stuff” is to the stuff itself in some nebulous cloud, and not the device that’s currently showing it to them. It’s even more likely that there’s simply no money to be made in selling people devices that they become so attached to that they never want to give them up. It may be that Convergence is The Future of Personal Computing, and one day we’ll all have the one device that does everything.

After using the iPad Pro, I’m no longer convinced that a big iPad that also functions as a laptop is what I want. I really want a “normal”-sized iPad that’s just really good at being an iPad. Which means adding support for the Apple Pencil to the iPad Air.

So I’m back to hoping Apple’s already got one of those in the pipeline, and waiting until it’s announced at some point next year, and then ordering one the second they’re available and then trying to justify it as a rational and well-considered purchase. Next time for sure it’s going to be exactly the computer I want.

Daredevil: Gold

Marvel’s Daredevil series turns out to be even better than I first thought.

A while ago I wrote about my first impressions of the at-the-time-new Daredevil series on Netflix. I thought it was brilliant, but still only managed to get halfway through the season before having to set it aside.

The “problem” was that it was too good at the mood it was trying to establish. The tension of the series relies on the feeling of a city that’s irreparably broken, where the corruption goes so deep that it taints even the people trying to fight against it. It remains a solid series throughout, but it’s not a carefree, fun romp.

Now, I’ve finally finished watching the first season, and my opinion of it’s changed. Before, I thought it was really good. Now, I think it’s kind of a master work. If it just existed in a vacuum as a one-hour drama/action television series, it’d be really well-done if not groundbreaking; the hyperbole comes in when you consider it as an adaptation. Not just of a long-running series, but of a franchise and a format.

Really, what’s most amazing to me is that it exists at all, when you consider all the different ways it could’ve gone wrong. It could’ve collapsed under the weight of its own cliches, being unabashedly an adaptation of a comic book. It could’ve been pulled apart in any number of directions — too enamored of its fight scenes to allow for long stretches with nothing but dialogue, or too enamored of its “important” dialogue to realize how much storytelling it can accomplish with choreographed fight scenes. It could’ve quickly revealed itself as too derivative, or tried to crib too much from the Christopher Nolan version of Batman, considering that it’s based on a character that was already derivative. It could’ve suffocated from having its head too far up its own ass, being based on what’s maybe the most self-consciously “adult” of mainstream comics characters, and gone the route of “grim and gritty” comics’ facile understanding of what’s “mature.” It could’ve had performances that were too Law & Order for the comic-book stuff to read, or too comic-book for the dramatic stuff. The character of Foggy could’ve been so self-aware as to be insufferable, or the character of Karen could’ve been nothing more than a damsel in distress or a dead weight. It could’ve all been completely torn apart once they let Vincent D’Onofrio loose.

But it all works. (Almost). It’s a self-contained arc and a hero’s journey story and a tragedy and a character study and a crime drama and a martial arts series and a morality play and a franchise builder. It’s never so high-minded that it forgets to be entertaining, but it does insist that entertainment doesn’t have to be stupid. Yes, it is going to show you Daredevil fighting a ninja, but you’re also going to watch a scene that’s entirely in Mandarin, so don’t complain about having to turn the subtitles on.

If, like me, you were unfamiliar with the character other than at the most basic level — blind lawyer with super-senses who fights criminals with a cane that turns into nunchucks — then take a second to read an overview of the character’s history. And be impressed not only at how much they managed to retain, but how many horrible pitfalls they avoided.

My least favorite episode of the season — by far, since it’s really the only sour note in the entire thing that I can think of — is titled “Stick.” I had never heard of the character, but of course it’s from the comics. And of course it’s from Frank Miller, because it’s just an eyepatch and laser gun short of being the culmination of everything a testosterone-addled 12-year-old in the 80s would think is “rad.” As someone who was a testosterone-addled 12-year-old in the 80s, I can acknowledge this was a part of my past, but it’s not anything to be cherished, celebrated, or re-imagined. (Everybody was obsessed with ninjas back then. This was a time when Marvel thought they needed to make their immortal Canadian anti-hero with a metal-laced skeleton and claws that come out of his hands “more interesting” by having him go to Japan).

So the character of Stick is straight-up bullshit. It’s a perfect Alien 3-style example of not being able to handle what you’re given and instead, tearing down everything that came before in order to write about something else. Except even worse, because it tears everything down to replace it with something that is itself derivative: a sensei with a mysterious past in the form of a wise, blind martial arts master. (Except it’s the 80s, so he’s “flawed.” Which means he’s even more rad). It undermines the main character of the story by saying, “Here’s a guy who can do everything your hero can, even better than your hero can, and without the benefit of super powers.”

The makers of the series did the best they could. First, they cast Scott Glenn to come in and Scott Glenn it up. Then, they spun it the best they could, figuring out how to take the elements of the story that would fit into their own story arc: the idea that loyalty and connection to other people is a weakness, and the idea that it’s the choices Matt Murdock makes that define him as a hero, and not his super powers. (And then towards the end of the series, they have Foggy make a reference to how cliched and dumb the whole notion of a blind sensei is, so all is forgiven).

Throughout, there’s a respect for the source material that’s more skill than reverence. They understand not only how to take elements from the original and fit them into the story they’re trying to tell, but how and why they worked in the original. A lot of adaptations, especially comic book adaptations that try to move the story into “the real world,” are so obsessed with the first part that they lose sight of the second. I’m realizing now that that’s a big part of why Christopher Nolan’s Batman movies don’t work for me: they treat the characters and their origin stories as these disconnected bits of mythology floating around in the ether, without much consideration for how they originally worked and why they became so iconic. Especially with the last movie, it seemed to be more about mashing up familiar references instead of meaning. (Take that to its extreme, and you get a version of The Joker who has a panel from an iconic comic book about The Joker tattooed on his own chest).

But the Daredevil series takes stuff that was used as fairly empty symbolism in the comics — a vigilante in a Devil suit standing on top of a building overlooking a church — and pumps enough depth into it to make it meaningful again.

There’ve been so many “adult” interpretations of Batman that the whole notion of a vigilante hero has pretty much lost any tension or dramatic weight. Daredevil makes it interesting again. Even though it’s an unapologetically bleak setup, there’s still never a question that Daredevil is eventually going to win the fight. The question is what he’s going to lose in the process.

That in itself isn’t uncharted territory, and the series doesn’t attempt to explore the material by going all-in on realism. Instead, it takes all the familiar elements and symbols and fits them into a structure where they all support each other and build off of each other. We see every single character faced with temptation, and we see how each character responds to it. None of the stories are self-contained origin stories presented for their own sake; they all reflect on that idea of holding on to your soul despite any corrupting influences. Foggy isn’t just the comic relief character; he’s the constant reminder of the ideals they’re supposed to be fighting for. Karen isn’t a story of an innocent saved by a hero; she has actual agency, and she’s an example of how corruption can gradually and subtly chip away at the soul of a good person.

The villains are straight out of the Stock Gritty Urban Bad Guy warehouse, but as with the best comic book stories, they all reflect on some aspect of the hero and illustrate why the hero’s the star of the story. Some of the corrupt cops show what results when people try to appoint themselves above the law. One of the cops’ stories shows how he succumbed to corruption out of a desire to keep his loved ones safe. The Russian mobsters are depicted as people who did whatever they had to in order to overcome a horrible upbringing. The character of Madame Gao seems to be about moral relativism, a rejection of the idea that there are good people who do bad things. The Chinese drug-smuggling ring is a rejection of the idea that corruption is passive; it seems to insist that people aren’t forced to do bad things but choose to, an idea that’s reinforced by Karen’s story. And the Yakuza aren’t used for much other than a bit of exotic intrigue and a ninja fight, but there’s still some sense of how a devotion to honor above all else is itself a kind of corruption.

Of course, the first season is as much Kingpin’s origin story as Daredevil’s, so his is the most interesting. And again, it takes what could be the often simplistic moralizing of “comic book stories” and pumps depth back into it. There’s a scene in which he’s dramatically reciting the story of The Good Samaritan that keeps threatening to go over the edge into a self-important super-villain monologue, the kind of scene where the writer is a little too eager to make sure you get the point of what he’s been trying to say. But when taken as the culmination of his story, it’s the climactic moment that marks his story as a tragedy. It’s fairly typical for writers and actors to say that the most interesting villains are the ones who see themselves as the heroes, so it’s fascinating to see this series try to take that a step further. They’ve spent the entire season letting us into Fisk’s head, building up empathy if not sympathy, showing us how he became what he is. Then they say, “Wouldn’t it be even more interesting to show him accepting and embracing the fact that he’s the villain?” And it is, because it suggests that his story is just getting started.

Even more interesting to me, in a 2015 adaptation of a comic book that originated in 1964, is how it shows Kingpin as a male character created and defined by women. (Maybe not that surprising, considering that the source material is as well known for its relatively short-lived bad-ass female ninja character as it is for its hero). Every defining moment of his character — from his childhood to the climax of his story — is in reaction to something done by a man, but driven by the decision of a woman. His mother covers for him and protects him. Madame Gao intimidates him and backs him into a corner, effectively forcing him to abandon his pretense of fighting for good. And Gao insists that Vanessa is a distraction for him, when in fact she’s helping define him: all of the aspects of his character that he was trying to keep hidden and keep her shielded from are the very aspects of his character that most attracted her.

In fact, all of the female characters in Daredevil are defined by their agency, while almost all of the male characters (except Matt and possibly Foggy) are shown either as passive products of their environment or as characters simply living out their true nature. Ben Urich’s wife encourages Urich to stay true to his ideals, while acknowledging that being a reporter is simply in his nature, and there’s little he can do about it. Wilson Fisk tries to put a positive spin on his motivations, but both Vanessa and Gao encourage him to acknowledge that he’s doing it for power, not for good. Claire chooses to help Matt Murdock, and she’s ultimately the one who chooses how to define their relationship. There’s even an element of it with Foggy and Marci — he’s incorruptible by nature, while she has to actively choose to do the right thing.

When you step back and look at it as part of the overall Marvel franchise, it makes it seem even more that the freak-out over Black Widow was missing the point. The internet would have you believe that the issue comes down to the ratio of how many men she defeats vs how many times we’re shown her ass. The bigger issue (and I’m definitely not the first person to point it out!) is that the movies are so dominated by male characters that she has to represent All Women. And even in a comic book story, “strong female characters” aren’t about super powers or who’d win in a fight.

And still, the thing that impressed me the most in the first couple of episodes stayed true throughout: Daredevil is fantastic at maintaining its tone. Sure, dialogue-heavy scenes peacefully coexist with fight scenes, but it goes even deeper than that. Some of the dialogue-heavy scenes are entirely plot driven, while a fight scene is all about establishing character. Some of the scenes are about dramatic monologuing, while others are about more subtle implications and things left unsaid. There are several moments I would’ve expected to be spun out into multi-episode arcs but that are instead left lingering in the background: for instance, a particularly well-acted moment when Foggy realizes that Karen isn’t attracted to him in the same way she is to Matt. It’s fairly subtle and heartbreaking, and to the best of my memory, no character ever utters the despicable phrase “friend zone.”

Everybody knows Vincent D’Onofrio is great at playing a psychopath, but what I didn’t appreciate is that he’s so good at maintaining it. I would’ve thought that after spending so many episodes building up anticipation for his first appearance, the moment he explodes and kills a guy would use up all of that built-up tension for the rest of the season. But he keeps it going for episode after episode, filled with rage and menace and perpetually just on the verge of boiling over. And Ayelet Zurer perfectly underplays Vanessa — never trying to compete with Fisk in bombastic scene-stealing but always conveying a sense of power and control. Once she starts making her motivations perfectly clear, it’s every bit as chilling as any of Fisk’s outbursts.

And there’s a scene where Foggy and Matt are fighting because of course there is; any story about a super-hero with a secret identity demands it. I was never particularly invested in their relationship, or unsure of how it would play out, so I thought the entire thing would be a rote case of doing what it needed to for the season arc and then moving on. But it’s so well-acted (and under-written) that it actually got to me. Matt sobs in the middle of a line, and it really feels like the entire weight of the season up to that point just came crashing down on top of him.

As always, it’s another case of understanding exactly how and why a scene works, instead of simply including it because it’s supposed to be there. I’m tempted to say this should be the template for every live-action adaptation of a comic book, but I honestly don’t know how much of it is reproducible. I am excited to see how it plays out in the second season and all the spin-off series. At this point, I’d even watch a show about Cable.

Mmm, yes

On the 90s and the Internet and how Kate Bush is amazing.

For a couple of months in 1990, I was completely obsessed with The Sensual World by Kate Bush. It didn’t last for too long before I moved on to be obsessed with The Pogues and the Pixies, and I’d completely forgotten about it until just recently. A few nights ago, YouTube recommended I re-watch Noel Fielding’s brilliant parody of Wuthering Heights, and I had a vague memory that oh yeah, I used to be kind of infatuated with her.

I was trying to remember my favorite song of hers based on a few half-remembered details — what’s that one Kate Bush video where she’s against a black background and flinging gold sparkles everywhere? — and a memory of the chorus but not the actual title, Love and Anger. That meant stumbling around all her videos on YouTube trying to find the right one, and coming to a series of conclusions, in roughly this order:

  1. Watching these now is like suddenly remembering vivid details from a dream I had 20 years ago.
  2. Holy crap, Kate Bush is brilliant.
  3. Even if I’d tried, I don’t think I could’ve fully appreciated all this stuff in 1990.
  4. I’d forgotten how different the music industry was back before Napster and the ubiquitous internet.
  5. I never thought much about how important context is to appreciating a work of art.
  6. No really, she’s just the best.

Until I went to college in a city that prides itself on its music, I only listened to whatever was popular at the time. So when MTV started playing that video for Love and Anger and commenting on what a big deal Kate Bush was and how significant it was to be getting a new album, it was all lost on me. To me, she was just “that woman who sang on that Peter Gabriel song.” I had a vague memory of Running Up That Hill, but had just filed it away in the same folder as Bonnie Tyler and Total Eclipse of the Heart: a synthesizer-heavy pop song by someone who was apparently a lot more popular in the UK than in the US. I can’t remember if I was even aware of Wuthering Heights at the time; if so, I almost certainly dismissed it as someone screeching over overly-precious lyrics. Knowing myself at the time, I probably picked up The Sensual World mostly for the cover, thinking that she looked like Jane Wiedlin and sounded like Cyndi Lauper and was probably worth a listen.

Deeper Understanding

The album is a lot more interesting and varied than the pop record I’d been expecting. Rocket’s Tail in particular is fascinating; I don’t believe I’ve heard it in 25 years, but it all came back suddenly, as if it’d been looping constantly in the recesses of my brain. I also suddenly remembered why I didn’t become an obsessive fan back then: two of the most 1990s reasons imaginable.

One is just raw early 90s proto-hipsterism. I thought the song Deeper Understanding’s story about a man who retreats from human contact into his computer was facile and paranoid, the same way a lot of stories around that time talked about virtual reality and rogue AIs but didn’t seem to understand how computers actually worked, and instead panicked about a super-advanced, cold cyber-world that looked like Second Life. I dismissed it as out-of-touch and irrelevant. (And of course, I’m saying that as someone who now wakes up every morning and immediately grabs his cell phone to check on Twitter and Facebook).

The other reason is that doing a “deep dive” on anyone’s work was, at the time, an investment. In 2015, I started digging around YouTube, Wikipedia, and Apple Music, and within a couple of hours had seen and heard 90% of Bush’s artistic output since 1978. It’s hard for me now to imagine a time without YouTube, much less a time before the web and even USENET, even though I was a computer-fixated nerd back when a 300 bps Vicmodem was a novelty. But essentially, in a time without hypertext, I didn’t have much chance to appreciate what I was listening to.

There’s a 2014 documentary from the BBC called The Kate Bush Story that’s a lot better than it would seem on the surface. It’s the typical VH1 format, where a bunch of celebrities gush about Bush’s work, intercut with clips from her videos. It seems about as vapid as a Behind the Music or I Love the 80s show, right down to the sour note of ending with Steve Coogan making a pun about “bush.” And they have interviews with the usual suspects, where Elton John and Tori Amos and Neil Gaiman say that they’re huge fans of Kate Bush.

But of course they are, right? I’m not being entirely dismissive; I was an enormous fan of Amos and Gaiman (I wrote Gaiman a fan letter on the GEnie network! And he sent a personal response with a great story about seeing The Pogues in concert!). But even my vague awareness of Kate Bush as “British and feminine and lots of pianos and literary references and scarves and dancing” fits solidly and predictably into the same category as The Sandman and Little Earthquakes.

And speaking of USENET, I was aware even back during those days that there was a pretty substantial fandom around Bush’s music. But even there, it was named after a song that was relatively obscure in the US. So I thought of it in kind of the same way as Doctor Who pre-Russell Davies: super-popular in Britain and good for them! but completely inaccessible to me.

So watching that documentary and seeing St. Vincent and Johnny Rotten and Big Boi and Tricky pop up to say how much her work influenced them didn’t just grab my attention; it put everything into a context I hadn’t considered before, and made observations that I wouldn’t have come up with on my own but that seem kind of obvious once they’ve been pointed out to me.

Ooh, it gets dark!

For instance, that Noel Fielding parody of Wuthering Heights. I’d seen it years ago, back when my obsession of the moment was The Mighty Boosh. At the time, I hadn’t appreciated that it was a parody of two versions of the video: the iconic one of her dancing in a field wearing a red dress, but also the studio version with its cartwheels and 70s video-trail effects.

I also hadn’t appreciated that it’s not a mocking parody but a reverential one. The joke isn’t about how weird or fey Bush’s performance is in that video, but that she’s the only person who could pull it off without looking silly. And for that matter, what a touchstone the performance was for her fans. (Proof that it’s not mocking but instead a love letter from a fan is that Bush included Fielding in her remake of the Deeper Understanding video that year).

It’s also about how iconic the imagery is, and how indelibly it’s associated with that song. In several of her early interviews, Bush says (paraphrased) that she studied dance, mime, stage production, and eventually, filmmaking, in order to make visual extensions of her songs. To someone raised on MTV and cynicism, that could sound pretentious or disingenuous — videos are promotional material used to sell music, and only inadvertently become artistic works. But then you remember that Bush was doing this before videos were a thing. And you remember how strong the imagery is: I’m about as close to the polar opposite of “waif-like” as a person can get, but I still find it difficult to keep from making the same gesture when I hear the lyric “let me into your window.”

And — even more embarrassingly for me — I hadn’t put any thought into what the song was about. As someone with the perpetual mindset of a teenage boy rolling his eyes at “girls’ stuff” like gothic romances, I hadn’t considered that it was the voice of a dead woman appearing at her lover’s window in the night, pleading to be let inside. So what I’d dismissed as just weird screeching was, of course, completely intentional. For a female songwriter and singer in 1978, The Man with the Child in his Eyes would’ve been a much more accessible debut song. And it would’ve been successful; it’s a beautiful and memorable song that, in my opinion at least, evokes Karen Carpenter’s considerable talent and holds its own. But she deliberately chose her first appearance to be literary and otherworldly.

This Woman’s Work

And she’s done that throughout everything that I’ve seen and heard. Her stuff is clearly influenced by whatever else is going on in music at the time, but there’s a sense that she won’t bother doing anything unless it’s something she finds interesting and unique.

When I first saw the video for Eat the Music from 1993, I thought I’d figured it out: ah, here’s where she went through her World Music phase just like Peter Gabriel and the Talking Heads and pretty much everyone else in the mid 80s through early 90s. But it’s gloriously sinister right from the start, with the lyric “Split me open with devotion, put your hand in and rip my heart out.” As the video goes on, it gets even weirder and more sinister, as the spinning becomes unstoppable, and the other dancer’s eyes roll into the back of his head, and it becomes so frenzied that everyone collapses. What I’d mistaken for a novelty song or a one-off becomes (obviously, in retrospect) a crucial part of a concept album about obsession and loss.

Rubberband Girl from the same album sounds a little like an early 90s Eurythmics song, and the warehouse in which its video was filmed is the same one that supplied the backing bands and ceiling fans for countless other 90s videos. But then there’s that choreography, which suggests that her seemingly effortless grace is actually the result of her being pulled, exhausted and against her will. And then it, too, descends into a kind of frenzy that belies the “bend without breaking” sentiment of the lyrics. She’s bound into a straitjacket and is compelled to wave her arms around, all filmed with the harsh light of a Twin Peaks murder scene.

Apparently, all the videos from The Red Shoes are from a long-form video that featured Miranda Richardson (she has a lot of videos that feature British comedic actors) and one of her early mentors, Lindsay Kemp, which explains the non-sequitur beginnings and endings. Hilariously, in a 2005 interview she describes it as “a load of bollocks,” while I’m here 10 years later trying to make sense of its bizarre transitions. Removed from that context, Moments of Pleasure is even more fascinating — starting with a whispered soliloquy and then showing nothing but her spinning and tumbling over a series of backdrops. It’s at least as beautiful and powerful a song as This Woman’s Work, but what’s most remarkable to me is how conversational, almost extemporaneous, the lyrics are. It seems like the natural impulse for a song about death and loss would be to make the lyrics flowery and poetic, but having something so prosaic against such a moving orchestration just makes it all the more real.

And speaking of This Woman’s Work, I suspect that the real reason I stopped listening to The Sensual World was that it was too exhausting. Even without the video, it’s hard to hear that song without feeling emotionally drained by the end. Even cynical early-90s me only dismissed it as “maudlin” to disguise the fact that it never fails to get a sob out of me.

Same with Love and Anger, which I still love but had thought was nothing more than a product of its time with a simple “we’re all in this together!” message. Paying even a little bit of attention to the lyrics shows it to be more sophisticated than that: I think it’s about passion and empathy, about expressing even what we think of as negative emotions instead of repressing them and “waiting for a moment that will never happen.”

In an interview around the release of The Sensual World, she said that it was her first album that was written from a feminine perspective, since up until then, all her musical and artistic influences had been men. Which, I think, is selling herself short, since so much of her entire body of work is uniquely feminine. In that BBC documentary, Neil Gaiman calls out the maternal aspects of the songs Breathing and Army Dreamers. The song that Americans around my age were likely most familiar with — Running Up That Hill — is a call for empathy disguised as synth-heavy 80s pop with some terrific choreography. (With the fascinating, slightly sinister twist of making it sound selfish with “let me steal this moment from you now.”)

“It’s in the Trees! It’s Coming!”

And then there’s Hounds of Love, which is so good that it kind of makes me angry that attitudes like the one 1990s me had kept it from taking off in the US, and that I didn’t get to see and hear it until 2015.

The story I keep reading is that Bush was savvy enough to build on her early success from her first two records, to the point that she was able to free herself from the record label and do everything on her own terms. By the time of Hounds of Love, she was not only writing, singing, and producing her own music, but had built her own studio and conceived of and directed the video to the title track. It’s driving and cinematic and enigmatic, and it’s fantastic in the way that I usually think of Terry Gilliam’s movies as being. (And apparently, she collaborated with Gilliam on the video for Cloudbusting on the same album). In yet another interview, she casually mentions drawing storyboards for the video as if it were no big deal.

One of the reasons I admire St Vincent so much is that she’s able to go all-in on the conceptual art side of her work, and then in “real life” is as personable and down-to-earth as it gets. (Unlike, say, Bjork, who’s brilliant but whom I’d never, ever want to meet in person).

Kate Bush comes across the same way, as a person who pours all her imagination and idiosyncrasies into her work. This results in fantastic things like Sat in Your Lap from The Dreaming, which seems to me as early 80s prog rock as early 80s prog rock gets. And then there’s this wonderful appearance on a British children’s show, where she says she’s lucky because she got to wear roller skates in her video, and she lets a little girl in the audience wear one of the minotaur masks.

It probably goes without saying that Kate Bush is objectively, almost impossibly, beautiful. But even that aspect seems to be something she always treated as incidental — great insofar as it helps the music, but never something that should take away from the music. Experiment IV, for instance, is a sci-fi horror story where she lets a bunch of comedic actors (and her then-partner) take the focus while she takes a bit part as a harpy and a horrible monster.

Most amazing to me is Babooshka from 1980. As with Wuthering Heights, she treats dance as a crucial part of telling the story of the song. She appears both as the scorned wife and as a wild-eyed Valkyrie. The first thing that amazes me about this video is imagining the concept stage: when coming up with ideas of how this alter-ego character would look, evidently Bush saw this piece of art by Chris Achilleos and thought, “Hmm, I bet I could probably pull that off.” The second amazing thing is that she totally does pull it off. And there’s absolutely no hint of pandering and zero sign of the Male Gaze. It has the vibe of an artist completely in control of her work, her appearance, and her sexuality.

Also remarkable to a viewer first seeing it in 2015 is Bush’s interpretation of the song at the time. I doubt it was ever intended to be a “deep” song, but I would’ve taken it as an indictment of the husband for discarding his wife once she was no longer young and beautiful. Bush’s take on it was entirely from the woman’s perspective, though; the husband was mostly incidental but sympathetic. Bush describes the song as being about the wife’s self-doubt and paranoia bringing about her own downfall.

At the risk of reading too much into it, I think that’s a perfect metaphor for Bush’s career. There’s a recurring theme of empathy and love and human interaction throughout her work, but never a sense that she’s defined by anyone else. The songs are inescapably hers, and even when she’s playing a character, it’s a character that she created.

And the final thing I find fascinating about Babooshka is that it sounds so much like an ABBA song. On every album of hers that I’ve heard, the sound is all over the place, reminding me at times of ABBA, the Carpenters, The Rocky Horror Picture Show, Peter Gabriel, Kirsty MacColl, the Eurythmics, Pink Floyd, Queen, and probably dozens more that I’d recognize if I had more expansive taste in music. (Not to mention artists like St Vincent and Tori Amos, who’ve declared outright that Bush was an influence on their own work). But even when you can place it in a specific time period, it never sounds derivative or pandering.

If she were pandering, there’d be no explanation for Delius, which is beautiful and memorable and undeniably, unabashedly weird. Or for that matter, the concept album second half of Hounds of Love, which has tracks that are just as melodic as anything from her singles, not to mention an Irish reel that would’ve made me a lifelong fan if I’d only heard it when I was in the middle of my obsession with the Pogues. But it’s not at all concerned with being commercial, and only exists as a purely personal expression.

Even when that expression isn’t high-minded or cerebral, and is just putting on costumes and goofing off with a bunch of friends and collaborators.

Stepping Out of the Page

So in other words: yes I said yes, I finally get it now. And what’s more, I wouldn’t have been able to get it in 1990. If for no other reason than I didn’t have Neil Gaiman to explain to me that the title track of The Sensual World was inspired by and referenced Molly Bloom’s soliloquy at the end of Ulysses, and I didn’t have easy access to Ulysses to get the significance of that.

And even if I had, I would’ve thought that the significance of that soliloquy is just a woman’s anachronistically frank and vulgar discussion of her own sexuality. I would have — and did — come to the vapid, simple-minded conclusion that it’s just about being “sex-positive.” But the whole significance is much more than that; it’s how the stream of consciousness is an unpunctuated torrent of the entirety of her experience: vaginas and religion and landladies and chocolates and paintings and gossip and cigarettes and cleaning semen out of sheets and castles and geraniums and breasts and flowers of the mountain. And how she says yes to all of it.

And I wouldn’t have had instant access to decades of a body of work, and all the articles and documentaries and interviews that interpret it and put it in context. So I couldn’t have fully appreciated how that soliloquy would be significant to someone who’d spent years pouring all of her work and energy into sharing her experience without much thought over whether it was commercial or even accessible but just that it was genuine and uniquely hers.

For almost every one of Kate Bush’s videos, I can instantly tell roughly when it was made, whether she was responding to the “look” of the decade or whether she was helping define it. This is mid-to-late 70s, that’s clearly mid-80s, that’s absolutely a product of the early 90s. The exception is The Sensual World, which is timeless. It could’ve been made last year, or it could’ve been dropped to Earth as a response to the Voyager disc.

I’d said that seeing it again recently was like vividly remembering images from a dream, and that’s still the case. But now that I’ve caught up with the people who’ve been lifelong fans of Kate Bush, the images are even more powerful. In that documentary, St Vincent describes Hounds of Love, Stefon-like, as “that thing where it burns like wildfire and then comes alive,” and Viv Albertine describes it like repressed sexuality, as if “the whole song’s on a leash, but you know it’s gonna escape and burst and run free.”

For me, it’s that tremendous moment of release in The Sensual World where she removes her headdress and is dancing barefoot in front of a field of flames. And seeing her confidently and effortlessly dance backwards down a moonlit path in a velvet dress is the most beautiful thing.

To Apple, Love Tailored Experiences

The Apple TV sure seemed like a good idea… at first!

Universal Apps from Apple TV announcement

On the surface (sorry), it seemed like Apple had made all the right decisions with its new product announcements yesterday. [For future anthropologists: new Apple Watches, a bigger iPad with a stylus, an Apple TV with an app store, and iPhones with better cameras and pressure-sensitive input. Also, the title of this blog post is a reference to something that happened a few months ago that nobody cares about now. — Ed.]

I’ve wanted an iPad with a stylus since before the iPad was even announced, so long ago that my image links don’t even work anymore! And I’ve been wanting a lighter laptop to use as purely a “personal computer” in the strictest sense — email, social media, writing, whatever stuff I need to get done on the web — and keep finding myself thinking “something like a MacBook Air that doubles as a drawing tablet would be perfect!” In fact, the iPad Pro is pretty close to what I’d described years ago as my dream machine but cheaper than what I’d estimated it to cost.

There’s been a lot of grousing online about how Apple’s acting like it invented all of this stuff, when other companies have had it for years. On the topic of pen computing, though, I can unequivocally say no they haven’t. Because over the years, I’ve tried all of them, from Tablet PCs to the Galaxy Note to the Microsoft Surface to the various Bluetooth-enabled styluses for iOS. (I’ve never been able to rationalize spending the money for a Cintiq, because I’m just not that great an artist). I haven’t tried the iPad Pro — and I’ll be particularly interested in reading Ray Frenden’s review of it — but I know it’s got to be at least worth investigation, because Apple simply wouldn’t release it if it weren’t.

Even if you roll your eyes at the videos with Ive talking about Apple’s commitment to design, and even if you like talking about Kool-Aid and cults whenever the topic of Apple comes up, the fact is that Apple’s not playing catch-up to anyone right now. They’ve got no incentive to release something that they don’t believe is exceptional; there’d be no profit in it. The company innovates when it needs to, but (and I’m not the first to say it): they don’t have to be the first to do something; they just have to be the first to do it right. And they’ve done exactly that, over and over again. The only reason I may break precedent and actually wait a while to get a new Apple device is because I’m not convinced I need a tablet that big — it’d be interesting to see if they’ll release a pen-compatible “regular-sized” iPad.

And if I’ve been wanting a pen-compatible iPad for almost a decade, I’ve been wanting a “real” Apple-driven TV set-top box for even longer. The first time I tried to ditch satellite and cable in favor of TV over internet, I used a bizarre combination of the first Intel Mac mini with Bootcamp to run Windows Media Center, a Microsoft IR remote adapter, a third party OTA adapter, and various third party drivers for remotes and such, all held together with palm fronds and snot. I’ve also tried two versions of the “hobby” Apple TV, relics of a time when Apple was known for glossy overlays, Cover Flow, and an irrational fear of physical buttons. Basically, any update would’ve been welcome.

But the announcement yesterday was a big deal, obviously, because they announced an App Store and an SDK. Which turned it from “just a set-top box” into a platform. That’s as big a deal for customers as it is for developers, since it means you don’t have to wait for Apple to make a new software release to get new stuff, content providers can make their own apps instead of having to secure some byzantine backroom deal with Apple to become a content channel, and some developers will come up with ways to innovate with the device. (Look to Loren Brichter’s first Twitter client as a great example of UI innovation that became standard. Or for that matter, Cover Flow).

And for games: I don’t think it’s an exaggeration to say that the iOS App Store has done more to democratize game development than anything, including Steam as a distribution platform and Unity as a development tool. Whether it was by design or a lucky accident, all the pieces of device, software, market, and audience came together: it was feasible to have casual games ideally played in short bursts, that could be made by small teams or solo developers, and have them reach so many millions of people at once that it was practical and (theoretically) sustainable.

I hope nobody expects that the Apple TV will become anywhere near as ubiquitous as the iPhone (or even the iPad, for that matter), but still: opening up development creates the potential for independents to finally have an audience in the console game space. It’d be like the Xbox Live Indie Games and XNA, if all the games weren’t relegated to a difficult-to-find ghetto separate from the “real” games. Or like the Ouya, if they’d made a device that anyone actually wanted to buy.

Game developers love saying that Apple doesn’t care about games and doesn’t get how games work — as if they’d just inadvertently stumbled into making a handheld gaming device that was more popular than Nintendo’s and Sony’s. You could look at the new Apple TV the same way, and guess that while trying to secure deals with big content providers and compete with Amazon or “Smart” TV manufacturers, they’d accidentally made a Wii without even trying.

There’ve been enough game-focused developments in the SDK, and in the company’s marketing as a whole, to suggest that Apple really does get it. (Aside from calling Disney Infinity “my favorite new Star Wars game”). But there are a couple of troubling things about the setup that suggest they expect everything on the TV to play out exactly the same way that it has on smartphones and tablets.

First is that the Apple TV relies heavily on cloud storage and streaming of data, with a pretty severe limitation on the maximum size of your executable. They’ve demoed smartphone games on stage (Infinity Blade) that were 1 GB downloads, so it’s not inspiring to see a much smaller limit on downloadable size for games that are intended to run on home theater-sized screens. Maybe it’s actually not that big a problem; only developers who’ve made complete games for the Apple TV would be able to say for sure. But for now, it seems to suggest either very casual games, or else forcing players to sit through very long loading times. The latter’s been enough of a factor to kill some games and give a bad reputation to entire platforms.
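If I’m reading the early tvOS documentation right, the mechanism Apple seems to be pushing developers toward is on-demand resources: you tag assets so they stay out of the small initial download and get fetched in the background when the game actually needs them. Here’s a minimal sketch in Swift of what that might look like; the class name and the “level-2” tag are my own hypothetical examples, not anything from Apple’s demos:

```swift
import Foundation

// A sketch (mine, not Apple's sample code) of on-demand resources:
// assets are tagged in Xcode, excluded from the initial download,
// and fetched when the game needs them. "level-2" is a hypothetical tag.
final class LevelAssetLoader {
    private var request: NSBundleResourceRequest?

    func loadLevel(completion: @escaping (Bool) -> Void) {
        let request = NSBundleResourceRequest(tags: ["level-2"])
        self.request = request  // keep the request alive while the assets are in use
        request.beginAccessingResources { error in
            if let error = error {
                print("Couldn't fetch level assets: \(error)")
                completion(false)
                return
            }
            // The tagged assets are now available through Bundle.main as usual.
            completion(true)
        }
    }

    func unloadLevel() {
        request?.endAccessingResources()  // tell the system it can purge the assets
        request = nil
    }
}
```

Which seems fine for trickling in the levels of a casual game; whether it works for something pushing a gigabyte of assets onto a living-room screen, or just means hiding a long load behind a progress bar, is exactly the open question.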

Second is the emphasis on universal apps. They mentioned it at the event and just kind of moved on. I didn’t really think much of it until I saw this from Neven Mrgan:


You could take the most mercenary possible interpretation of that, which is what people always do once the economics of software development comes up: “Big deal! Having one app is what’s best for consumers! What’s best for consumers always wins, and it’s the developers’ responsibility to adjust their business model to enable that!” Also “Information wants to be Free!!!”

Except what’s best for consumers is that the people making great stuff can stay in business to keep making great stuff. And we’ve already seen on iOS exactly what happens when developers “adjust their business models” to account for a market that balks at paying anything more than 99 cents for months to years of development. Some big publishers (and a few savvy independents, like Nimblebit) came in and made everything free-to-play with in-app purchases. Maybe there is a way to make a free-to-play game that doesn’t suck (and again, Nimblebit’s are some of the least egregious). But I can’t see anybody making a believable case that the glut of opportunistic games hasn’t been a blight on the industry. I was out of work for a long time at the beginning of this year, and it was overwhelmingly depressing to see so many formerly creative jobs in game development in the Bay Area that now put “monetization” in the job title.

Believe me, I’d love it if one of these publishers went all-in on the Apple TV, and then lost everything because they didn’t take into account that they were pandering to a different audience. But that’s not what would happen, of course. What would happen is that a couple of the big names would see that they can’t just fart out a “plays on your TV screen!!!” version of the same casual game and still make a fortune off of it, so they’d declare the entire platform not worth the effort. And then smaller studios who are trying to make stuff that takes specific advantage of the Apple TV “space” would be out of luck, because there are no big publisher-style marketing blitzes driving people to the platform. You need a combination of big names and smaller voices for a platform to work: again, see XBLIG.

It just seems as if there’s no recognition of the fact that there’s a lot more differentiating a game you play on your phone from one you play on your television than just the screen size. It seems especially tone-deaf coming from a company like Apple, which has made a fortune out of understanding how hardware and software work together and what makes the experience unique. (Part of the reason that iOS has had so much success is that they didn’t try to cram the same operating system into a laptop and a smartphone).

At least the games on display showed evidence that they “get it.” The game demoed by Harmonix took advantage of the stuff that was unique to the Apple TV — a motion-sensitive controller and (presumably) a home theater-quality audio system. And even Crossy Road, which would seem like the worst possible example of shoveling a quick-casual game onto a TV screen and expecting the same level of success, showed some awareness of what makes the TV unique: someone sitting next to you playing the game, or at least having other people in the room all able to see something goofy happening on your screen.

I haven’t seen enough about tvOS to know if Universal apps are actually a requirement, or just a marketing bullet point and a “strong recommendation” from Apple. (Frankly, since I’m trying to make an iPad-only game, I’m ignorant of the existing requirements for iOS, and whether they restrict developers from releasing separate iPad-only or iPhone-only versions of the same software). So maybe there’ll be a market for separate versions? And somehow, magically, a developer will be able to release a longer, more complex game suitable for a home entertainment system, and he won’t be downvoted into oblivion for being “greedy” by asking more than ten bucks for the effort.

And there’s been some differentiation on the iPad, too. Playing XCOM on the iPad, for example, is glorious. That’s not a “casual” game — I’ve had sessions that lasted longer than my patience for most recent Xbox games — but it’s still better on the iPad because you can reach in and interact with the game directly. I could see something like that working — I’d pay for a game with lower visual fidelity than I’d get on Xbox/PS4/PC, if it had the added advantage that I could take it with me and play on a touchscreen.

So I could just be reactionary or overly pessimistic. But it’s enough to take what first seemed like a slam-dunk on Apple’s part, and turn it into an Ill Portent for The Future Viability Of Independent Game Development. As somebody who’s seen how difficult it was to even make a game in The Before Times, much less sell one, the democratization of game development over the past ten years has been phenomenal. And as somebody who’s finally realized how much some game studios like to exploit their employees, it’s incredible to be in an environment where you can be free of that, and still be able to realize your passion for making games.

What first made me want to learn programming was being at a friend’s house, watching them type something into their VIC-20, and seeing it show up on screen. It was like a little spark that set me down a path for the next 40 years: “Wait, you mean I can make the stuff that shows up there, instead of just sitting back and watching it?” It’d be heartbreaking to see all the potential we’re enjoying right now get undermined and undone by a series of business decisions that make it impractical to keep making things.

Worst case, it’ll be another box that lets me watch Hulu. I was down to only eight.