Last night, we watched the first episode of The Acolyte. I don’t yet have a strong opinion of the show, one way or the other, so this post isn’t about that. Instead, it’s about how I went to IMDb to look something up about the series, and I saw that it had been review-bombed. As I read through the surprisingly lengthy outraged 1-to-3-star reviews, a few trends emerged: apparently, it has the worst writing of any Star Wars project to date, Disney and Lucasfilms [sic] should be ashamed, and the existence of a fire in outer space was the last straw for many, many dedicated fans.
Earlier this week, XKCD ran a comic about the objective superiority of electric motors over internal combustion engines. Comments across social media platforms were full of disappointment in cartoonist Randall Munroe for ignoring the environmental hazards of mining rare earth materials for EV batteries, the obvious fact that electric vehicles only benefit the automotive industry and prevent the adoption of better mass transit, and the necessity of gas engines in the desert and/or the event of an apocalypse.
Earlier this month, collectible toy retailer Super7 posted a message on their Instagram in honor of Pride month, asserting that they’ve promoted inclusion and diversity ever since the company’s founding in San Francisco. There were dozens of comments in response, calling them out for pandering and going woke, and informing them that they’d lost a loyal customer.
The one thing all these comments have in common is that they’re garbage. Not just in the sense that they’re wrong, but in that they provide nothing of value and just weigh down the target with useless noise. As they’re most often intended to do.
Anyone who’s been paying any attention at all knows that public comment sections and centralized social media — which already had a well-deserved reputation for being toxic waste — fell off a cliff in terms of usefulness around 2018–2019, when people with enough money finally recognized their value as tools for misinformation and disinformation. At least for me, that was the point when it all turned from “unreliable source” to “useless cesspit.”
I don’t even put in the effort to be discerning anymore. I used to look for stuff that seemed suspicious, but now I just assume that everything posted online from any source I don’t recognize was made in bad faith. Not even in the sense of not trusting the “wisdom of the crowd,” since there are always going to be cranks and fools out there; but in the sense that we can no longer assume that the crowd even exists at all.
This latest round of noise-over-signal stood out to me because of its predictability. Even though I don’t put any stock in the actual comments, I still take it as a given that modern Star Wars, electric vehicles, and LGBT equality are going to be treated as “controversial” topics on the internet. Even when we think we’re being discerning, we’ve already been “infected” by the torrent of garbage, conditioned to assume that there’s controversy where none actually exists.
And with increasing access to “generative AI,” it’s going to be even easier to generate tons and tons of bad-faith speech to drown out anything of value.
We’ve been cautioned to be wary of video and audio deepfakes, to look for signs that a piece of writing was ML-generated, and to spot the tells in “AI art.” Which is fine, since those are good skills to have. But I think the more pervasive damage goes beyond any individual piece of media. Even if a faked piece of media is identified and debunked, it’s already done its damage. It’s wasted our time, undermined our faith in the media, and drawn attention away from something more deserving.
There’s a really good episode of the PBS “Otherwords” series about the linguistics of cults and cult leaders. I recommend it even beyond the seemingly narrow topic.
What stood out to me was how many of the techniques of cult leaders are easily recognizable not only in contemporary politics (I mean, duh), but also in social media. Two in particular helped articulate why I get frustrated and stressed out even in the midst of people who are ostensibly on “my side.”
One is the prevalence of “loaded language.” This played out a short time ago on Mastodon, when several posters started casually using the word “genocide” to describe the disgusting anti-trans policies taking hold in state governments across the US. When people pointed out that it wasn’t “genocide” — because words mean things — the response was tantrums about how non-trans people didn’t care about any issue that didn’t affect them directly, would rather “argue semantics” than do anything to help, etc.
Interestingly, I haven’t seen this pop up again since the US started supplying arms to help a nation engage in actual genocide in Gaza. I’d hope that people finally caught on that using loaded language around a serious issue doesn’t actually help draw attention to it; it more often helps trivialize it. And obfuscate it under arguments about “tone policing” and such, which get further and further away from the actual issue, and seem mostly intended to solidify the divisions between Us and Them. It’s more about manipulation than information, or even persuasion. (I’d hope that people caught on, but it’s a lot more likely that I just muted all the worst offenders.)
The other observation that stood out to me was about the sheer amount of language that cult leaders use. In the video, it’s about assuming authority over people: our tendency to believe that the person who’s talking the most about a topic must know the most about it. For social media and comments sections, it’s more about reframing a conversation. People tend to look for consensus, so a torrent of noise in a comment section not only overwhelms the original point, but creates a sense that the original point is more charged or contentious than it actually is.
I like to think that I’ve gotten somewhat proficient at spotting fake comments, comments made in bad faith, and other worthless noise or trolling. But I’m only now realizing that I haven’t been as vigilant about the side effects of all this noise. I’ve got pretty strong opinions about Star Wars, EVs, and LGBTQ equality, so the chances of my being persuaded to change my mind by an online comment or post are pretty much zero. But I’ve already been persuaded over time, taught to recognize these immediately as controversial topics with a lot of strong opinions on “both sides.”
Whenever I see a post on Instagram that’s about something I know is going to get a lot of stupid comments, I spend a few minutes reading through the comments and blocking the idiots, spam accounts, and trolls. I always thought it was a mindless but harmless activity, like popping bubble wrap. But it’s also taught me to anticipate controversy where there should be none.
It’s made me think, “Oh well of course a movie about a toy doll for little girls is going to generate a lot of controversy among grown adult men. That’s just how things are in the 21st century.” Even though there’s not a part of that sentence that makes any sense at all.
The growing ubiquity of LLMs means we’re going to be seeing more bad-faith arguments passed off as cogent ones, and so much time is going to be wasted trying to counter them. I’m not sure what the “correct” solution is, but mine is just to pledge to stop engaging with any of it. The Discourse will be fine without me, and anybody who’s got an opinion actually worth considering can take a minute to get a blog or something.