Where Did We Go Wrong?

Ongoing experiments in social media

When I first heard about Twitter, I — along with a lot of other luddites — thought it sounded like the stupidest thing imaginable. But I decided to give it a shot, followed a bunch of people from different areas of interest, and started to get a real sense of community. Within a few months, I’d come to depend on it, not just as a social outlet but as my primary source of news, from global to hyper-local.

And then, after using it for a while longer, I realized that I’d been right the first time. There’s something rotten about Twitter, and it was there long before Musk bought it, and long before the “alt-right” discovered it as the perfect harassment platform.

When the tech media started discovering Mastodon, I heard a lot of people say that there’s no way it would become a Twitter replacement. I think it was meant to be a slam, but I’d say now that I agree, and that it’s a good thing.

Continue reading “Where Did We Go Wrong?”

You Can Quote Me On This

Mastodon, Quote-Tweets, and why some people (like me) care a lot about how they’re used

Thanks to my friends having the forethought to set up a friendly Mastodon server for their podcast community, I’ve gone all-in on the platform.[1] I really like it so far; it’s got almost everything I wanted from Twitter — “almost” because the chances of getting into a conversation with a non-tech celebrity I’m a fan of are about zero[2] — and eliminates most of the things I hated about Twitter, aspects that were present long before some asshole bought the platform and made it impossible to keep using it in good conscience.

So I’ve gotten invested in Mastodon and how it functions. Instead of just re-inventing Twitter, this seems like the chance to take the lessons we’ve learned from other social media, and do it right this time. But it’s not just a question of technology, or even ownership, but of social engineering: being more mindful of how we use social media and what we’re choosing to put out there.

In particular, there’s the issue of quote-posts or quote-toots, or QTs so I don’t keep having to say “quote-toot.” They were frequently used on Twitter, but were deliberately not implemented in Mastodon, because of the potential to be used for harassment.

It’s a frequent topic of conversation on Mastodon, with people insisting that it should obviously be implemented, and that people are going to do it anyway, so what’s the problem?

And there’s a definite undercurrent of arrogance that suggests the higher-profile proponents are obviously thinking of it as a publishing platform more than any sense of community. Objections to the feature are just dismissed as overblown or unimportant. There’s an automatic assumption that those of us who don’t want QTs on Mastodon have to come up with a satisfying justification for why the feature shouldn’t exist, but there’s no sense that proponents are obligated to justify why the feature is necessary.

Personally, I’m not even completely opposed to QTs. If implemented correctly and used responsibly, they could be fine. My annoyance comes from people not taking the time to stop and think about these things and their implications, or how they fit in with the “core values” of the platform and what other users are trying to achieve. Anybody voicing an opinion on this one way or the other needs to at least demonstrate that they’ve put some thought into what QTs actually are, what they’re doing in a social setting, and how they will subtly or not-so-subtly affect how people interact with each other.

Continue reading “You Can Quote Me On This”
  • 1
    Well, and this blog, too, obviously. But for idle thoughts too mundane even for this blog to tolerate.
  • 2
    Whereas on Twitter, I got responses from Neko Case twice!

Long Live the Smarm

The Queen is dead. Now is the time of warring edgelords.

Update 09/14/22: I’d hope it wouldn’t need to be said, but “showing grace after someone’s death” doesn’t include “screwing over people needing essential services in a display of extended performative wealth-hoarding.” This post seems increasingly tone-deaf the more I hear about how England is handling the mourning rituals, but I was exclusively talking about people debasing themselves on Twitter either for yuks or narcissistic self-righteous indignation. Also: nothing in this post applies to the recent death of Ken Starr. Make fun of him all you want, because that guy was a really irredeemable bag of shit.

When I wrote about Michael Schur’s book How to Be Perfect, I mentioned how I was disappointed that he’d chosen to praise two of my least favorite essays ever published on the internet. One of those was John Scalzi’s probably-well-intentioned but disingenuously tone-deaf analogy for the concept of “privilege” as playing a video game on the easiest difficulty. The other was an absolute piece of garbage from Gawker[1] titled “On Smarm.”

I’m not a good enough writer to describe the visceral reaction I had to reading that essay; it was as if the concentrated nugget of evil from Time Bandits had been converted to HTML and was actually being praised online by seemingly dozens of people who should’ve known better. If I remember correctly, my eyes widened and I impotently screamed and pointed at the obscenity, like Carrie White’s mom, then ran away and took a shower, knowing that I’d never be able to wash myself clean of the stain of it.

Ostensibly, the essay was about the tendency of politicians, pundits, corporate media, and “authority figures” in general to stifle criticism or opposition via insincere, overwrought tone-policing. We saw a perfect example of this recently, when Beto O’Rourke called out some trash in the audience who was laughing while O’Rourke was talking about the Uvalde children who’d been murdered in their school. Plenty of people — including NPR, in their desperate attempt to both-sides everything — instantly began deflecting from the epidemic of gun violence in the US, instead running to their fainting couches and worrying whether it was appropriate for a prominent gubernatorial candidate to be using the f-bomb.[2]

On the surface, that sounds fine, even if it’s too shallow to qualify as significant social commentary. That kind of smarminess is abundantly obvious, and it doesn’t actually fool anyone who isn’t already eager to be fooled. Call out that nonsense if you want, but it’s not a genuine threat because nobody of substance is actually buying it.

The problem is that that trivially-true observation was used as the vehicle for defending the awful mission statement of Gawker, the candy coating wrapped around the poison pill that had passed undigested through Nick Denton’s intestinal tract. The morally bankrupt notion of shittiness as a public service. The disingenuous idea that being gossipy, crass, petty, bitter, hypocritical, and narcissistic is okay as long as you can make the case that you’re “punching up.”

If you had the misfortune of being on Twitter last Friday and the following weekend, seeing the reaction to the death of Queen Elizabeth II, you’d quickly see that although original Gawker is dead,[3] that mindset is still alive and in full force.

In both varieties, too. There was the predictable wave of smarmy and insincere In Memoriams, from politicians eager to distract attention away from whatever they’re doing wrong at the moment, and from corporations too eager to show how reverential they are. Even here in the US, they were excessive, so I can only imagine what a barrage of insincerity it was for people in the UK and other places that still have the Queen on their money. And in addition to being predictable, they were pretty transparent.

What stood out a lot more to me, though, were the people practically stumbling over themselves to be shitty about the death of a much-beloved woman. A lot of it was dumb and obvious but probably harmless; I’ve fallen hard off the wagon and have almost returned to my 2009-era levels of posting nonsense on Twitter, but even I knew enough not to make hacky jokes about chess, bees, or Freddie Mercury. But I was more struck by how many people were doing the virtual equivalent of dancing on her grave.

And then, the part that reminded me of Gawker and “On Smarm”: throwing a days-long shit fit when they got called out on it. They will not be tone-policed by royalists! “Don’t speak ill of the dead” is not just wrong, it is anti-journalism! Repeated comments on how this was a joyous occasion for Irish or Scottish people. And of course, all the variations on how people of European descent have no right to be telling “folks”[4] from places that had been colonized by Europeans how they should be reacting to the death of their oppressor.

Now, I’m an extremely white guy from the United States, but I’m pretty confident in saying that one thing that unites us across cultures and nations is that talking trash about someone who just died is petty and shitty. Different people have their own ideas of when it’s justified, but that doesn’t make it any less petty or shitty.

I can think of at least five Americans alive today whose deaths will fill me with glee, because death is the only way they’ll ever face any consequences for all the terrible stuff they’ve done. I fully admit that I did feel satisfaction and vindication hearing of Rush Limbaugh’s death, for instance, and also Ronald Reagan’s, and being reminded that there were probably, somehow, people who loved them in life and were sad at their passing didn’t affect that feeling of petty satisfaction at all. That doesn’t make it any less petty, though; it’s just a level of personal shittiness on my part that I’m willing to accept and won’t try to defend by claiming it’s justified or at all righteous.

As adults, we can acknowledge that two things can be true at the same time: 1) The Queen served as a kindly, grandmotherly face on centuries of atrocities done in the name of the British Empire; and 2) That kindly, familiar face of stability was hugely important to millions of people. Of course it’s true that the image of a nice old woman who loved her dogs and had a pretty good sense of humor is inseparable from the history of stolen wealth, colonialism, and scandals, both decades old and recent. Both as a figurehead, and as someone who was complicit to one degree or another.[5] But if they’re inseparable, that means that you have to accept both.

There was a video going around on Friday, in which a member of the royal guard was telling a pretty charming story about the Queen’s sense of humor: An American tourist encountered them in passing, but didn’t recognize the Queen. When the guard said that he’d met the Queen before, the tourist got excited and asked Elizabeth to take a picture of the two of them together. It’s cute, but it’s also an example of how even a story intended to humanize her is entirely based on her being the Queen of England. The role was universally recognized even if the person wasn’t, and the vast majority of people in the world will never know the difference, or even if a tangible difference exists. (How much of a unique person is left when you’re born into a role and spend your entire life publicly serving it? Do I need to watch The Crown to know the answers?)

Anyway, for anyone trying to turn this into a teachable moment about the history of colonialism, imperialism, stolen crown jewels, or any of the other evils from a century’s worth of world history: your meme of the Queen meeting Margaret Thatcher in Hell ain’t it. Neither is your video of Irish dancing in front of Buckingham Palace. But then, they’re not truly intended to be teachable moments; they’re narcissistic displays that people try to dress up as being more righteous when they get called out for being vulgar or lacking grace.

That’s the part that reminded me so vividly of “On Smarm” and Gawker in general: the lengths to which people will defend their right to be shitty and awful. Mocking the death of an elderly, much-beloved woman is not just my right, but my duty! The thing I found most repulsive about the whole mentality of “On Smarm” was that it was so deeply cynical to the point of nihilism; it didn’t just reject insincere displays of false compassion or sympathy, it refused to even entertain the idea that public compassion or sympathy — or just plain good taste and grace — could ever be genuine. All of Gawker Media was rooted in the assertion that every one of you is as awful as we are, you’re just not brave enough to admit it! It’s an ethos that somehow manages to be more repulsive than Randian Objectivism, because it so frequently sucks in people whose opinions I actually care about.

And that’s not even getting to the rancid, rainbow-colored oil slick of hypocrisy floating on the top of it: it’s its own kind of smarminess, evident in the sheer outrage at being tone-policed by white Europeans who can’t understand the history of oppression that’s embedded in shitty, opportunistic mockery of somebody who just died. It’s still disingenuous self-righteousness, but at least the people who are publicly performing their Reverence For Her Majesty as a distraction are aware at some level that they’re being disingenuous.

Personally, I’m anti-imperialist (in both the British and American flavors), and I think the monarchy should be abolished. Those aren’t in any way bold or controversial claims; I think they’re just table stakes for being a decent person in the 21st century. Which is what the whole question of “grace” ultimately comes down to: being a decent person. You don’t have to respect the United Kingdom, or the monarchy, to still be able to respect the people who are affected by it and who lived their entire lives surrounded by it. A lot of people, including myself, could be better educated about the details of the history of imperialism and colonialism, not from the point of view of the colonists, but of the people affected.[6] But there’s a time for that, and it isn’t when someone is really sad because it feels like their grandma just died. Even if they’re sad because they’re focusing on the positive aspects of a public persona, and choosing not to focus on the bad while they’re in mourning. If you’re the type of person to stand outside of a funeral and shout “Your grandma is in Hell because of the British Raj!” there’s a word for you, and it’s not “activist” or “educator.”

  • 1
    Redundant?
  • 2
    My self-censoring might seem like a hypocritical example of smarm, but the fact is simply that I promised my mother I’d stop using the word in public.
  • 3
    Apparently a version of the former site has been brought back to life by people almost as adept at being self-righteously shitty and awful? I haven’t read it, and won’t.
  • 4
    My new least-favorite thing on the internet is preachy, self-righteous types using “folk” instead of “people” as some bizarre signifier of community or identity, ignoring that the word “folks” has decades of connotations that are even more othering, making it sound like you’re talking about people from Appalachia or The Shire.
  • 5
    I am one of those people who believes that how complicit you are in wrong-doing makes a huge difference. Are you actively making people’s lives worse, are you knowingly benefitting from it and refusing to make reparations, or are you just a representation of it?
  • 6
    In college I took a course in African History, because it was a topic I knew almost nothing about. It was essentially a course about Europeans, with almost nothing about the cultures apart from how they were affected — or outright devastated — by colonialism.

The Ineffable Subtleties of “Ow! My Balls”

I get annoyed with a vlogbrother and defend a movie I thought was just okay

Well, I’ve already broken my pledge several times over: not only did I start a new Twitter account, but I’ve gotten to reading it habitually and even actually writing replies to strangers.[1]

What set me off today was this tweet from Hank Green:

The movie “Idiocracy” is, at minimum, implicitly pro-eugenics.

And I mean, come on, man. It’s tough because I usually like (and occasionally really like) Hank and John Green; and I think they’re generally a force for good on the internet, both for helping make complex topics accessible, and for encouraging kindness, charity, and perpetual learning.

But that’s such a shallow and disappointing take that it seems like it was carefully formulated to irritate me as much as possible. It’s not even that I’m a particularly big fan of Idiocracy — I thought it was fine but not particularly deep or memorable past its core premise. Which, it pains me to have to explain, was satire. It’s as much “pro-eugenics” as A Modest Proposal is “pro-infanticide” and “pro-cannibalism.” And it’s not even that subtle about it.

We shouldn’t have to be explaining satire to grown-ups. And of course, I realize that “No but you see it’s actually satire!” has become the go-to defense whenever anyone says or makes something that makes them look like an asshole. But just because it’s been mis-used so often is no reason to throw out the concept altogether.

Maybe what’s needed is the YouTube IDIOCRACY EXPLAINED! approach, complete with an attention-grabbing thumbnail with big red circles and yellow arrows.[2] How about we start with the opening, which sets the tone and makes one thing clear almost immediately: The movie is making fun of everyone.

The “High IQ” couple isn’t being put forward as a role model. They’re self-centered and petty. As the woman explicitly says that they don’t want to have children with “the market” the way it is, they’re shown against a background of increasingly fancy and expensive homes (while the children of the “Low IQ” couple live in chaos and disarray). To spell it out: it’s a criticism of socioeconomics, not genetics. One couple is too focused on accumulating wealth for themselves to be willing to devote any of that wealth to children.[3]

And even if you can’t let go of the over-literal, extremely-online mindset, and are still convinced that Mike Judge and Etan Cohen were sneaking in a sincere pro-eugenics manifesto and disguising it as a silly comedy, then you could consider the entire rest of the movie. The whole story is about a thoroughly average person who’s forced to make an effort for the first time in his life, because he’s held up as superior to everyone else by a completely arbitrary metric. The movie makes fun of the whole concept of intelligence and wealth as signifiers of actual aptitude. It’s chastising early 2000s society for racing to the bottom, settling for the least amount of effort, and appealing to the lowest common denominator.[4]

I hate it when people act like there’s one correct interpretation of any piece of art, but I mean, again: this movie is not that subtle. Which is why it’s so frustratingly ironic to see this movie in particular hit with such a shallow and dismissive analysis, since it’s so stridently criticizing us all for settling for less. It shows what happens if we keep lazily declining to engage with anything of depth, until we’re all buried under trash.

There are a couple of reasons this set me off. First is that I spend too much time online. I’ve seen too many examples of people gradually (and eagerly) descending into idiocracy, since so much of online media favors immediate engagement over thoughtful consideration. Blog posts like this one are an anachronism, and I feel very silly as I’m writing it, because it’s just not cool in 2022 to be devoting so much time to anything so inconsequential.

Instead, they’ve been replaced by explainers: web articles or video essays that aim to take everything from topics in social or natural sciences to the current most-SEO-friendly movie release, pick all of the meat off of them, and encapsulate them into an easily-digestible conclusion. The Green brothers in particular were among the first to popularize the short-and-accessible explainer format, and in a lot of cases, I think they’re great. I appreciate it when someone can take a complex topic and present it so that the basics are easy to understand, without scolding me for not already knowing them, while still acknowledging that there’s much more complexity than can be easily explained.

But while it’s great for sciences and history, it’s just deadly for art and entertainment. The art itself is the explainer.

Which leads to another thing that set me off: I’m wondering how culpable I am in all this, since I tend to be such a proponent of accessible media. (By which I mean accessible to interpretation. I’m also a strong believer in accessibility for people with disabilities, but I’m not as vocal a proponent of it as I probably should be.) I love writing about the MCU and Star Wars — and invite anyone who claims it’s shallow or juvenile to piss right off — because it’s fun and easy. They’re designed to be widely accessible but still have just enough depth that they don’t end up feeling like empty calories.

So I’m all over it when someone wants to point out easter eggs or bits of lore that I’m not enough of a True Superfan to have recognized, but I can feel the soul seeping out of my body when that turns into “explaining” the show or the movie itself. Especially when it just restates the most obvious interpretation of a work. Usually, this stuff isn’t all that ambiguous, so all you’re doing is restating the obvious in a much less elegant way.[5]

I guess like everyone else who’s ever entered middle age and seen the culture being increasingly driven by younger people, I can’t escape the anxiety that they’re doing it all wrong and ruining everything. I’m generally for the resurgence in earnestness and rejection of unnecessary irony, but not if it’s at the expense of having everything dumbed down and over-simplified.

I get that there’s a lot more noise than there ever has been, and it’s increasingly hard to have patience for people who won’t just say what they mean. There’s a preponderance of people out there actively lying, obfuscating, and disingenuously arguing about things with malicious intent.[6] In fact, I spent some time wondering if Hank Green were pulling some kind of prank with his tweet, but a) that doesn’t seem like his style, and b) it doesn’t really do anything with the idea, because there’s no twist apart from restating the satirical premise of the movie and calling it a “hot take.” (If that were indeed the “joke” then… okay I guess?)

But if it means that there’s no obligation to analyze a creative work at any level apart from what it says on the surface, and that there’s no obligation to consider whether your first interpretation might not be the one correct interpretation, then we’re heading towards shallower and shallower art. It starts with people believing that the “Twin Pines Mall” becoming the “Lone Pine Mall” in Back to the Future is some delightfully obscure easter egg that only a select few had picked up on. Continue for a few hundred years, and you get “Ow! My Balls!”[7]

  • 1
    But deleting them quickly afterwards. Maybe there’s still hope?
  • 2
    I tried my best, but couldn’t figure out how to make the arrows with the latest version of Photoshop before I lost interest in the gag. I guess I shouldn’t have gotten my graphic design degree from Costco.
  • 3
    On IMDb, at least, they’re credited as “Yuppie Wife” and “Yuppie Husband,” and if you believe that Mike Judge was pro-Yuppie and was advocating having more of them in society, then I don’t know what to tell you apart from “watch literally anything else that Mike Judge has made.”
  • 4
    And yes, we are all aware that we saw exactly that play out in the late 2010s, everybody can stop saying “it was a documentary!” now.
  • 5
    One of the things I like about Nope is that it throws out a bunch of ideas and fits them altogether, leaving the overall theme just ambiguous enough to allow for multiple interpretations. I saw somebody had made an explainer video for the shoe in the Gordy’s Home scenes, which just restated the most obvious things then insisted that everybody else was wrong and that this was the “real” meaning of the scene. Don’t be like that guy.
  • 6
    I really wish people would stop trying to engage with anyone complaining about women or marginalized people in media. Whether you’re trying to make a point or just dunk on them, you’re not accomplishing anything because they’re always being made in bad faith. All you’re doing by engaging is helping them make basic kindness and common sense seem like something still subject to differing opinions and debate.
  • 7
    But on the brighter side: fewer thinkpieces and blog posts like this one!

Bargaining

Notes from a post-Twitter lifestyle (again)

I deleted my second Twitter account over a week ago, in response to the news that the Twitter board had agreed to sell the company to Elon Musk. Which makes it the second time an obnoxious Trump supporter drove me off the platform.

Actually, the buy-out was the kick in the pants I needed to leave, but it had been getting increasingly clear how much I disliked Twitter long before the news. I had started to realize that I was checking it unnecessarily — just to see what was “news” — and worse, that I was finding myself having vehemently strong opinions about stuff that just doesn’t matter. And being cranky and irritable to people for no reason. The Twitter algorithm seems designed to keep me upset and on edge.

It’s kind of a drag, because I was looking forward to having a read-only account so I could check in on responses to Sasquatchers when it comes out on the Playdate next month. I have to admit it was a lot of fun to see reactions to the Playdate during its launch week, after following the work the Panic team has mostly-secretly been doing on the device and its development environment for years. I liked the idea of Twitter not as a social platform, but as a crass promotional platform.

Which honestly is just another excuse. There are plenty of independent developers who are plenty successful without having social media accounts. The “I need this account for work” idea is pervasive, but it’s not actually true for most people who don’t work directly in social media or PR.

And I saw so many of those types of excuses in my timeline that it made me kind of sad. It reminded me too much of all the times I’ve quit smoking, when my brain starts coming up with tortured justifications for why it wouldn’t be that bad if I just bought a pack and had only one. On Twitter, I kept seeing these variants:

  • I need this account for my career: I definitely understand how this seems true, but I’m increasingly skeptical that it’s actually the case. I’m in no position to judge, because I’ve most often worked on projects that have other people dedicated to promoting them (and sometimes promoting me as part of it). But if it is true, then it seems like it should be the perfect spark to try and build a community that isn’t so dependent on a company you have absolutely no control over.
  • Wait-and-see: “I’ll wait to see if the deal goes through.” Or “I’ll wait to see if Musk institutes any changes.” Or “If he allows Trump back onto the platform, then I’m gone,” etc.
  • Much ado about nothing: “It’s not actually going to change anything.” I saw a ton of these, and I couldn’t tell if they were supposed to be reassuring, or scolding people for making a big deal? In any case, if one of the crappiest billionaires alive takes over a communications platform to take it private, and you can’t tell the difference, then maybe that’s a sign it’s already a terrible place to be?
  • You’re no better than the rest of us: “None of you threatening to quit because of Musk will actually leave.” “You’re going to be back here within a month,” etc. These were the most pitiful, because they sound the most like dependence. After all, even cynical, performatively self-aware dependence (“This place is garbage, but it’s my garbage!”) is still dependence.

Last time, I tried both Micro.blog and Mastodon to wean myself off of Twitter. Micro.blog isn’t for me, and I’m skeptical that Mastodon is, either. (Although I do have a Mastodon account for anyone interested.) I kind of hate to say it, but I think Mastodon really is Twitter without “the algorithm,” which makes it just as pointless as I first thought Twitter was back in 2007.

For now at least, Instagram remains my deeply problematic centralized social media platform of choice. It’s astounding just how much they’ve abandoned the pretense of providing a service to users of the platform, but still, it’s nice to have the outlet. Until that becomes intolerable, I’ll keep cranking away on this blog, hoping that RSS feed readers and Web 2.0 come back in a big way.

Edited for hypocrisy: I got tired of having to run the gauntlet of sign-up requests every time I followed a twitter link. My re-activated account is read-only for real this time, so if you see me commenting or doing memes or getting in arguments and such, feel free to mock me ruthlessly.

The Sasquatchers

My favorite team of paranormal adventurers

I’ve mentioned before that I’m doing a game for the Panic Playdate — coming soon! — but I don’t think I’ve ever mentioned the inspiration for it.

A few years ago, I fell down a rabbit hole watching YouTube videos, and I landed on the most fascinating channel. It seemed to be a couple of guys (and an at-the-time unknown photographer) wandering through the woods at night, trying to get photos of a Sasquatch.

And I mean, that’s not all that weird on its own. Where it got weird is that I actually saw a Sasquatch, in the background of their video! At first I figured it must be one of those elaborate prank videos, or some kind of demo reel for a CGI compositing house or something. But to be honest, it didn’t look good enough to be either one of those. The way it looked uncannily real and not-real — plus the fact that they were so nonchalant about it — convinced me it could only be the real thing!

Anyway, the team is called The Sasquatchers. Their channel seems to have disappeared, and the website was down forever until they got some kind of legal issues squared away, but it’s back up as of the time I’m writing this. They’ve been doing this kind of work for years, but never got the recognition I think they deserve. It’s a shame that the only photo of theirs that still exists online is the one I put at the top of this post, which they said was a rare double-sighting of the Willow Creek Wailer.

Their videos are (or were, anyway) full of never-before-seen animal sightings, but the guys are completely nonchalant about it. They’re all about media impressions, and getting them in selfies and such. But they’ve had some funding issues on top of (and because of) the legal stuff, so they’re eager to get a little bit more exposure so they can get out and start spotting more dangerous and more obscure cryptids.

I had just left Telltale and had some free time, so I decided I had to meet the guys. I was able to talk with them for a little bit when they were in San Francisco researching some kind of video project,[1] and they seemed stoked to have a video game made about their adventures! It’s a simplified and highly-abstracted version of the real thing, of course, but I’m hoping that if people enjoy the game, they’ll be interested in checking out the team’s real work.

Oh yeah one thing: I don’t know how it happened, but somehow they got the impression that I’m a famous game designer at an AAA studio and had a team of dozens of people working on the game. So everybody just be cool and don’t tell them, okay?

  • 1
    I could never figure out whether they were saying that the Zodiac Killer was a Sasquatch himself, or just that he used Sasquatches to move around silently and commit murders.

Boba Fett and the Road Less Traveled

Reconsidering both The Book of Boba Fett and how “sophisticated” Star Wars needs to be

It’s only been a month since the finale of The Book of Boba Fett, which would be too early to go back and give it a second look, except Ben Chinapen made a pretty good video essay about the series, presenting Boba Fett’s character arc mostly independent of everything else in the show.

The video does exactly what it sets out to do: recap Boba Fett’s story in chronological order, to call out how the series managed to take what was essentially a dozen or so lines of dialogue and a cool suit, and turn it into an actual character with real motivations and such. There aren’t any shockingly surprising new takes in the video, but that isn’t a knock on the video at all. It’s just an acknowledgement that the series wasn’t really about ambiguity or layers. All of its meaning was floating there on the surface, keeping all the action scenes from being purely empty calories.

It did make me realize, though, that the series did have a little more thematic resonance than I originally gave it credit for. My main complaint about The Book of Boba Fett stands, and it’s the most obvious one: the series just suddenly loses interest in its main character and goes back to making The Mandalorian. I was willing to give the fifth episode (“Oops, All Mandalorians”) the benefit of the doubt, since it didn’t just continue Din Djarin’s story, but established it as a parallel for Boba Fett’s. But I thought the sixth episode (“How Grogu Got His Groove Back”) was a complete non-sequitur.

It seemed like the series hadn’t just lost interest in Boba Fett’s story, but stopped it completely to show us some fan-favorite characters doing predictable stuff that could’ve happened off-screen. Meanwhile, the Mandalorian chose a new spaceship completely inappropriate for bounty hunting, as if the filmmakers knew the scene they wanted to see at the end (and the toys they wanted to sell) and worked backwards from that, instead of giving it any genuine motivation. Worst of all, the ultimatum Luke Skywalker presented at the end seemed hypocritical and completely out of character; he’d seen more than anyone else how the old Jedi rule of “no attachments” always ended in tragedy, so why was he making Baby Yoda choose one or the other?

But if you reconsider that episode as an intentional part of The Book of Boba Fett instead of a clumsily-shoehorned interlude, it makes more sense. It’s yet another story of a character who has a path clearly laid out for him, but he chooses to define his own path and his own clan. Grogu didn’t even have a name until midway through the second season of The Mandalorian; until then, he was “Baby Yoda.” So of course he was going to end up following the same path as Yoda, training to be a powerful Jedi. (How that would fit in with the timeline of The Last Jedi was going to be an interesting exercise for the writers). I felt like the series was showing me stuff I already knew was going to happen, because it hadn’t even occurred to me that it could play out a different way.

In that context, the end of that episode feels less like an ultimatum, and more like Luke offering the freedom of choice. And the character appearances are meaningful, instead of just being cameos for the fans: Ahsoka chose to leave the Jedi and make her own way, while Luke speaks more like he’s doing what he thinks he’s supposed to do instead of “trusting his instincts.” Even Din Djarin’s new spaceship feels less forced; he wasn’t choosing a ship for being a bounty hunter, because he was redefining himself as something else. He didn’t need room for bounties, but for his new family.

To be clear: I still don’t think it all works. I think the series would’ve been a lot stronger if they’d spent that time developing the characters and plot threads they left hanging, like the Rancor, and Jennifer Beals’s character, and the Hutts, and the other crime lords, and Fett’s history with Cad Bane and other bounty hunters. But at least I can understand why they thought the two episodes of The Mandalorian fit into The Book of Boba Fett without being completely arbitrary.

It seems like I spend a lot of time insisting that Star Wars works best when it doesn’t try for nuance or layers or ambiguity, and just sticks to Good Guys vs Bad Guys with spaceships and lasers. The reason the stories resonate isn’t because they’re complex or open to multiple interpretations, but because they take straightforward ideas about morality and free will, and present them in interesting ways. It’s best kept in the realm of parable, which is why it feels facile to look for too much in the way of philosophy or thematic complexity, and why it feels tone deaf to try to work in too much moral ambiguity or “mature” content. But that’s also why I refuse to just reject all of it as being frivolous or just for kids; having all of the “meaning” floating on the surface, ready for interpretation, is a feature instead of a bug. The simplicity and accessibility makes it universal, not necessarily juvenile.

This is a franchise that has more archetypes than fully-realized characters — outside of the comics and some of the animated series, Boba Fett was the ultimate example of a “character” who had no actual characterization apart from “a bad-ass who has a cool spaceship and a jetpack.” I’m currently reading a book of short stories that recount events from the movies from the point of view of an incidental or background character, and it includes one from the perspective of Boba Fett. It’s written by Paul Dini, who’s extremely talented, but having to work with the version of the character as it exists in the movies. And it shows just how little there is to work with; it’s difficult to make music when you’ve only got one note to play.

So I respect what a big swing it was for The Mandalorian and The Book of Boba Fett to take one-note characters and spin them into genuine character arcs about loyalty, identity, and self-determination. And I doubly respect that they did it while keeping everything in the realm of parable, instead of trying to take the Rogue One approach, trying to turn stories of Good vs Evil into “more mature” stories of politics and morally-compromised heroes. I’d expected The Book of Boba Fett to be a story about an anti-hero, with all the double-crosses and dirty deals of a mob story — Star Wars trying to bring spaceships and lasers to a more action-heavy version of The Sopranos. As frustrating as the series often was, I really like that they rejected that idea. Instead of asking me to identify or even empathize with an anti-hero, they took a pretty shallow non-character and let him become a hero.

Rumors of the Author’s Death Have Been Greatly Exaggerated

The state of lazy media analysis in the age of Twitter

As I’ve been trying (with varying success) to wean myself off of social media, it’s been a little easier to recognize that internet discourse has probably been a net positive. For as awful as it often is, it has changed the way I think about a lot of things. I tend to think about diversity and representation with more empathy instead of just sympathy, and I’m better at being mindful of my implicit biases and my own tendency to assume white, middle-class male by default.

I have to keep reminding myself of that, because so often I’ll read something that triggers my reactionary The Internet Is Irredeemably Broken, Shut it All Down Now response. The most recent trigger has been the corruption of the idea of “the death of the author,” turning it from something potentially expansive and democratic, into a regressive, lazy, arrogant, and willfully incurious way to approach art.

It’s been annoying me for a couple of years, as I’ve seen the regressive version gain traction and eventually become just taken for granted. When I first encountered the assertion “intent doesn’t matter,” I’d assumed that it was just a typical case of over-simplified hyperbole. Of course they realize that intent matters, I thought. They’re just being provocative, to make the point that a negative or stereotypical depiction can still be harmful, even if it isn’t intended as such.

As I’ve been seeing increasingly literal and shallow interpretations of art and entertainment, I’m not so sure. Especially since it’s so often used in conjunction with my other most hated, regressive trend in popular media analysis, the bullshit idea of “punching up” vs “punching down.” It perpetuates this idea that art and entertainment isn’t actually a dialogue between authors and audiences, but an environment in which powerful creators make products for people to consume or reject.

If you take “intent doesn’t matter” to its extreme, you make it impossible for camp, black comedy, and satire to exist. Or at least, if it still exists, it’s been rendered so toothless as to be inert.

(I should probably mention that I’m talking about actual satire, and not the version in which anybody who’s been called out for being an asshole immediately and invariably shouts “it was satire!” as their first line of defense. Because come on, nobody actually believes it).

Even if that’s an exaggeration of “intent doesn’t matter,” the idea is arrogant and reductive at its core. It assumes that an audience’s interpretation — or more often, a hypothetical audience’s interpretation, since it too often looks for potential offense instead of responding to actual offense — takes precedence over the author’s, instead of being on equal footing with it.

That reduces your media analysis to your own assumptions and your own experience, without ever challenging those assumptions. If you assume that a negative or stereotypical depiction is negative or stereotypical regardless of intent, you ignore the potential for an artist to use that depiction to say something that’s not completely literal. Literal in the same sense as putting disclaimers before cartoons that have racist caricatures, for instance: having to explicitly acknowledge “this is bad and we, the artists who created this material or the publishers responsible for releasing it, know that it is bad” in a way that can’t possibly be misinterpreted by even the most stubborn person in the audience.

Even if it’s being used to establish a time or place, to consider themes of racial or cultural identity, or to comment on the stereotypical depiction itself. Or all three, like for instance, all of the anti-semitic (and anti-Italian, and anti-Irish, and misogynist, and homophobic) material in Miller’s Crossing. Removing any of that from the movie would cheapen it irreparably. It’s as impassive as its protagonist when it comes to questions of loyalty and morality, and it defiantly resists a literal interpretation, a declaration of who’s good and who’s bad and what it all means.

If you’ve only got the one hammer and approach every piece of art looking for nails, you’re shutting out the potential for art to change how you think. Treating every negative depiction as interchangeable imposes a new sort of Hays Code on art: context is irrelevant, only the depiction matters. Eventually, you end up with a cargo cult going through the motions of progressive representation instead of making actual progress. It becomes a list of approved and taboo depictions, instead of more thoughtful consideration of what makes a depiction negative or how it actually affects people.

And even if you don’t believe in — or don’t care about — the potential chilling effect, it’s still just an extremely shallow and ignorant way to approach art. I don’t understand watching something with such a lack of humility that you refuse to consider that it’s challenging your assumptions instead of just reinforcing them. If you genuinely believe in diversity of representation, then excluding anyone’s voice from the conversation goes against that.

Libby, Get Your Ebooks Here

I’m late to the party on checking out ebooks from the local library.

Likely old news to everyone, but since I didn’t hear about it until a week or so ago, maybe it’ll benefit someone out there:

The Libby app for iOS, Android, and web browsers lets you use your library card to download ebooks and audiobooks. I always had a vague idea that this was possible, but I assumed that it would involve going to a local branch to set everything up, or at best going to an archaic website and using QR codes or something to get books locked to a proprietary, inferior e-reader.

After a week, here’s what’s impressed me most about using Libby:

  • They start by helping you get set up with a library card, if you don’t already have one. Here in Oakland, I did the whole process on my phone and got a digital card within 24 hours, on a weekend.
  • The app is really good-looking and pleasant to use, completely unlike the outdated experience I’d been dreading. It’s odd to see such a polished app not being used to sell stuff or make me angry.
  • The app has an interesting design not quite like anything I’ve seen before. It seems to combine a library-style interface with the AI messenger fad that blew up a couple of years ago, but in a way that actually works.
  • You can choose the format in which you want to borrow the book, including Kindle, the app’s built-in e-reader, or in some cases downloading as an EPUB. This is the main draw for me, since reading on the Kindle has honestly gotten me to read more.
  • I haven’t yet used the in-app reader, since I’ve gone all-in on Kindle, but from what I’ve seen on the website, it looks professional. (Compared to less-than-great experiences I’ve had with other readers, or badly-formatted books on the Kindle).
  • Once delivered to the Kindle, a book borrowed from the library is treated identically to ones that I’d bought. Synced across devices, readable from multiple versions of the Kindle app, integrated with Goodreads, and so on.
  • Placing a book on hold, when it’s not immediately available, is very easy. You’re given an estimate of how long it’ll take for the book to become available, and how many other readers are waiting for how many available “copies.” In my case, a book became available weeks before the estimate, and it was easy for me to reschedule it for later.

I’ve been living in Oakland for years, but I’ve just never been able to drag my ass to the library to get a library card. (I never got one for San Francisco, either, come to think of it.) I don’t usually read enough to warrant one, plus I’m spoiled and don’t have the patience to wait if a book I want isn’t immediately available. I worry that my years of laziness and eagerness to take the path of least resistance have ended up paying for Jeff Bezos’s in-flight magazine on his peen rocket or something.

Maybe reading library books delivered online isn’t as novel (sorry) for everyone else as it is for me, but I can’t help feeling as if I’d unlocked a hidden secret I haven’t been taking advantage of for decades. This system isn’t perfect, of course; it’s got artificial scarcity built in, to mimic borrowing a physical book. And there are going to be plenty of titles that aren’t available at all.

But in just over a week, I’ve already finished one book and am a quarter of the way through another one. Both were books that I was curious about, but hesitant to commit to if it meant buying them outright. It seems dumb and obvious written out, but having to pay publisher prices for everything imposed this bar on anything I read: it had to be good enough that I’d be willing to “own” it. And that was lurking in the back of my mind while I read everything, making me a little more subconsciously hyper-critical.

If I’m just borrowing from the library, though, I can go back to reading trash without guilt or remorse!

They… they ASSURED me there was PEANUT BRITTLE in that can!

Get a load of the whiny sons o’ bitches at The Verge!

[Image caption: I have it on very good authority that this is the new mascot for the Volkswagen Group. Image from D23.com.]

Given all the genuine stuff to get stressed out or worried about, I’ve got to thank The Verge for giving me something completely inconsequential to be irrationally annoyed by.

The story in brief was that Volkswagen did a beginning-of-April marketing stunt announcing that they were changing their US branding to “Voltswagen,” to reaffirm their commitment to electric vehicles. The Verge chomped on that like a starving bass, running it as a top story on the site. Now, after finding out that the obvious marketing stunt was, in fact, a marketing stunt, they edited their story from press release regurgitation into a long-form tantrum.

Normally, I’d do the Nelson Muntz point-and-laugh and then move on, but the Verge writers’ histrionics have actually made me kind of angry. First, instead of being good-natured — or even taking the wet-blanket-but-appropriately-skeptical approach that Ars Technica took — they changed the headline to say that Volkswagen lied about their rebranding! Here showing the same understanding of “lying” as the aliens in Galaxy Quest.

Worse, they made repeated references — in the byline of the rewritten article, and on Twitter — to “Dieselgate.” Because, obviously, fooling a couple of gullible and clickbait-seeking internet writers is equivalent to a multi-billion dollar, years-long, massive environmental scandal.

But now we know the rebrand was nothing more than another lie from a company that’s become known for something else: lying.

A butt-hurt, insufferably whiny baby

The reason this makes me so irrationally angry — apart from putting me in a position where I’m not just defending Volkswagen, but defending an April Fools prank — is that it’s another reminder of how embarrassingly low journalistic standards are in 2021. Actually, that’s a third strike against it: it makes me want to put “journalist” in sarcastic quotes, but I can’t do that, because that’s the province of all the knuckle-dragging losers on the internet complaining about Brie Larson and Kathleen Kennedy.

The writers can clutch their pearls and stand aghast at VW all they want, but the truth is that they simply didn’t do due diligence for their non-story. They valued page views over newsworthiness. “They published a press release!” insists the article, ignoring the obvious fact that you’re not obliged to run every press release as front-page news.

The undeniable fact of all this is that this stunt was not news. Even if it had been 100% real. Even for a company as gigantic as the Volkswagen Group. It was so obviously as much a non-story as the results of the Puppy Bowl or the war between Left Twix and Right Twix. It’s depressing that they think the problem is a company fooling them with a lame (and clearly publicity-grabbing) stunt, instead of how eager they were to “report” on the stunt in the first place.

Arrogance Persevering

Thoughts about jackasses on the internet and how much of my life I’ve wasted responding to them.

Yet another thing that I have to thank WandaVision for: maybe I can finally stop feeling the need to respond to arrogant dipshits on the internet? Last week’s excellent episode had an extremely well-written and well-performed scene in which Vision reminded a grieving Wanda that what she was feeling wasn’t just sorrow and emptiness. “What is grief, if not love persevering?”

An objectively good line in an objectively good scene in an objectively good show. ‘Nuff said!

Except Twitter’s gonna Twit, so the whole weekend was filled with some people gushing about what a well-written moment that was… followed by an assload of trolls, snobs, condescending misogynist dolts, insufferable anti-corporate twits, and generally arrogant and awful people mocking it — and the series as a whole — as being insultingly beneath them.

Continue reading “Arrogance Persevering”

Spoiler Warning: Human Beings Continue to Disappoint

When I first heard that Disney+ was going to release its original series as real series, meaning waiting a week between episodes instead of dumping an entire season online at once, I was very happy to hear it. The Netflix model makes sense for what they’re trying to do — be a repository for hours and hours and hours of programming available whenever you want it — but it turns out that even in the over-stimulated 21st century, there’s a lot to be said for that week of speculation and anticipation between episodes. It feels more like a shared communal experience.

Or at least, it would feel like that, if there weren’t so many selfish a-holes out there.

As much as I’ve been loving The Mandalorian, I’m not watching new episodes at midnight the night before they’re officially released. But I’ve seen people not even waiting an hour to start posting spoilers online.

Now granted, I didn’t see many direct spoilers, probably because I’ve managed to weed out the worst offenders from my social media by now. But there were enough people proud of themselves for talking around the spoilers that by the time I watched the episode at a reasonable time tonight, I already had a rough idea of what was going to happen.[1] It reminded me of The Crying Game, when I’d seen so many people so deliberately talking around the spoiler that I could tell what the spoiler was within a few minutes.

Most surprising to me, though, was how many people I saw on Twitter defending their right to post whatever they want. “If you don’t want to be spoiled, you shouldn’t be on Twitter!” was the claim. One particularly asinine person started mocking somebody who was complaining about spoilers, then said that if you’re reading Twitter in the morning you’re clearly not working, so you could just as well be watching the episode. Because taking two minutes to scroll through Twitter at work is exactly the same as taking 45 minutes to watch television during work, I guess.

I started to break my read-only policy to call the guy out for not only being stupid, but also being such a jack-ass that he’d go out of his way to defend carelessly and selfishly ruining the experience for other people, instead of showing the barest minimum amount of consideration by demonstrating the barest minimum amount of impulse control for a couple of hours until everyone got a chance to watch it. But then I realized three things.

One is that the people I was about to yell at were people I didn’t know, and one of them is apparently a contributor to a notoriously asinine Disney “news” site, so I had no idea why I’d been following them in the first place.

Two is that once someone’s selfishness has gotten to that point, calling them out on it isn’t going to have any effect at all. If there’s ever any question, the best course of action is always to block them and move on.

And lastly, no matter how selfish their intention, their advice was “you shouldn’t be on Twitter.” Which is impossible to argue with.

Apart from just bitching about a social media platform I should never have signed back onto, this also has me wondering about building anticipation and buzz and community when distribution gets wider and audiences get more and more fractured. The Mandalorian in particular has been, since its first episode, full of revelations that it’s tried to keep under wraps. Surprisingly, it’s succeeded more often than not. Obviously, people are super-eager to talk about it, or there wouldn’t be so many people eager to spoil it, so they’ve built (and earned) a dedicated audience. I’d be interested to see if there are ways to preserve that communal experience of the old broadcast TV days, that don’t just depend on people not being jerks.

  • 1
    The biggest spoiler was a coy, roundabout tweet from one of the guest stars of the episode, which more or less revealed that they were going to be a guest star of the episode.