Gabriel L. Helman

Delicious Library Crosses Over

Okay, this one hurts. Amazon has finally turned off the API Delicious Library relied on, so the app has been discontinued.

Over at Daring Fireball, Gruber has a really nice writeup about how Delicious Library was the exemplar for the era of apps as art in their own right that seems to have mostly passed: Daring Fireball: The End of the Line for Delicious Library

Personally? I have a Delicious Library that contains (almost) everything I own. “Dad, did you scan these yet” was a standard phrase of any new purchase or gift. Books, games, toys, whatever: the thing where it could pull data on anything with a UPC or EAN from Amazon was amazing. Even more amazing was the barcode scanner app for the iPhone that used the camera. Just walk around the house: zip, zip, zip.

You know how they say you should have a list of everything you own for insurance purposes? I had that! Plus, the ability to see when something was bought was surprisingly useful. “What birthday was that?” You could answer those questions! I loved it.

It was clear DL was on its way out: Wil Shipley has been an Apple employee for years now, the app hasn’t been updated in ages, the Amazon link was getting… “sketchy.” But there just isn’t a replacement.

This is the sort of thing where all you can do is throw your hands up and make a sort of “ecchhhhh” sound. One more great thing we used to have that’s gone because the easiest way for some middle manager somewhere to make one of their KPIs was to break it.

Although, maybe the most maddening thing about this one is that I would have absolutely paid some kind of subscription fee to Amazon (directly or indirectly) to keep this working, and… nope. Not an option.

Gabriel L. Helman

Happy Bell Riots to All Who Celebrate

Stay safe out there during one of the watershed events of the 21st century! I was going to write something about how the worst dystopia Star Trek could imagine in the mid-90s is dramatically, breathtakingly better than the future we actually got, but jwz has the roundup of people who already did.

Can you imagine the real San Francisco of 2024 setting aside a couple of blocks for homeless people to live? To hand out ration cards? For there to be infrastructure?

Like all good Science Fiction, Deep Space Nine doesn’t say a lot about the future, but it sure says an awful lot about the time in which it was written.

Gabriel L. Helman

More Musings on the Starcruiser

Over the various overlapping illnesses and convalescences of the last two months I finally caught up with the rest of the western hemisphere and made my way through Jenny Nicholson’s remarkable four-hour review/post-mortem of Disney’s “Galactic Starcruiser”—The Spectacular Failure of the Star Wars Hotel.

It’s an outstanding piece of work, not only reviewing her own trip, but also providing context and some attempted root-cause analysis for the whole misbegotten project. Go carve out some time to watch it if you haven’t already. It’s the definitive work on the subject.

We’ve covered the Star Wars Hotel on Icecano before, but that was based on a trip report from someone whose trip went well. Nicholson’s trip didn’t go so well, and the ways systems fail often shed a lot more light on how they really function than when they work as intended.

The thing that has always struck me about the Starcruiser is that it was so clearly three different attractions:

  • A heavily-themed hotel with a direct “side-door” connection to the park
  • A collection of low-barrier-to-entry interactive “games” somewhere between an arcade and an escape room
  • A 2-day LARP summer camp with a stage show at the end

Those are all pretty good ideas, but why did they do them as one thing? All those ideas would have been so much cooler as an actual fancy hotel connected to the park, and then separately an EPCOT-style “Star Wars Pavilion”, in the style of the current space restaurant or the old The Living Seas “submarine base” you got into via the “Hydrolator”.

I thought Nicholson’s sharpest insight about the whole debacle was that all the “features” of the hotel were things originally promoted as being part of the main Star Wars Land, but the hotel allowed them to put them behind an extra paywall.

I maintain the belief I alluded to last summer that the hotel was never meant to last very long; it really does feel like a short-term experiment to try out a bunch of ideas and tech in a way where they can charge through the nose for access to the “beta”. So many strange decisions make more sense if you assume it was never meant to last for more than about 2 years. (But still! Why build it way out there instead of something you could turn into a more-permanent fixture?)

But that’s all old news; that was stuff we were speculating on before the thing even opened. No, what I’ve been stewing on since I watched this video was the LARP aspect. Nicholson’s video was the first thing I’d seen or read that really dug into what the “role play” aspect of the experience was like and how that worked—or didn’t. And I can’t believe how amateur-hour it was.

Credit where credit was due, Disney was going for something interesting: an open-to-the-public Diet LARP that still had actual NPC characters played by paid actors with storylines and semi-scripted events. Complexity-wise, not all the way up to a “real” LARP, but certainly up above an escape room or a murder mystery party or a ren faire or something of that ilk. Plus, you have to assume basically everyone who will ever play it is doing so for the first time, no veteran players. And at a premium price.

One would think this would come with a fairly straightforward set of rules or guidelines; I imagined an email with a title along the lines of “To ensure you have the best possible experience…” And instead, they just… didn’t?

For example, the marketing made a big deal about “starring in your own story” and guests were strongly encouraged to dress up. But they really didn’t want guests to use character names. That seems mostly logistical, with guest profiles and whatnot tied to their real names. That’s the sort of obvious-in-retrospect but not-so-much ahead of time detail that is the reason Session Zero exists! This isn’t Paranoia, it’s not cheating to tell the players how to play the game, just tell them! For $6000, I’d expect to be told ahead of time “please wear costumes but please don’t use a fake name.”

But it’s the lack of any sort of GameMaster/StoryTeller that stunned me. The just-shy-of-40-year DM in me kept watching those video clips going “no, no, no, someone put your thumb on the scale there.” The interaction that really got me is the part of the video where she’s trying and failing to get pulled into the First Order story, and is attempting to have a conversation with the Officer actor to make this happen, and they are just talking past each other. And this made my skin crawl, because this is a perfect example of a moment where you need to be able to make the “out of game” hand sign and just tell someone what’s happening. I can’t believe there wasn’t a way to break out of kayfabe and ask for help. Again, this is basic session zero safety tools shit. This is shit my 12-year-old figured out on his own with his friends. Metaphorically, and maybe literally, there should always be a giant handle you can pull that means “this isn’t working for me”.

Look, this is not an original view, but for 6 grand, you should be able to do everything wrong and still get a killer experience. You shouldn’t be begging an underpaid SoCal improv actor to let you play the game you paid for halfway through your trip.

I get that they were trying to do something new for Disney, but The Mind's Eye Theatre for Vampire came out in 1993. Running a safe and fun LARP is a solved problem.

I get wanting to make something that’s as mainstream and rookie-friendly as possible, and that you don’t want to just appeal to the sort of folks that can tell you who the seven founding clans of the Camarilla were. But something we talk about a lot in tabletop RPGs is “calling for buy-in”, and holy shit clicking CONFIRM ORDER on a screen with a juicy four digit number of dollars on it is the most extreme RPG buy-in I’ve ever heard of.

I know I keep coming back to the price, and that’s partly because for a price that premium you should get an equivalently premium experience, but more importantly: there was no-one casual at this thing. No one “impulse-bought” a trip on the Starcruiser. Everyone there was as bought-in as anyone ever has been, and they couldn’t figure out how to deliver an experience as good as any random night in the park with the other vampires in the sleepy NorCal farming town I went to college in.

It’s tempting to attribute all that to general Disney arrogance, but I don’t think so. It all feels so much stupider than that. Arrogance would be ignoring the prior art; this feels more like no one could be bothered to find out if there was any? The most expensive piece of half-assed work I have ever seen. This all could have worked? Beyond the obvious budget cuts and trying to scale down, this could have worked. It’s wild to me that they’d spend that much money, and energy, and marketing mindshare, and then not make sure it did. I mean, really, no one employed by Imagineering used to be the Prince of Glendale or something? Unlikely. I don’t think anyone intentionally sandbagged this project, but it sure doesn’t look like anyone involved cared if it was successful.

Weird.

Gabriel L. Helman

Un-Reviewed Code Is Tech Debt

Reading code is hard. It’s one of those universal truths that regardless of language or platform, writing code is always easier than reading it. Joel on Software has a now 24-year-old post on this topic which is still as relevant now as it was at the turn of the century: Reading Code is Like Reading the Talmud.

Mostly this is due to lack of context: there’s so much stuff around any piece of code that you really need to hold in your head for that piece to make any sense. All the really useful code reviews I’ve been a part of involve asking the original developer questions: “what’s going on here”, “why did you make this decision”, and so on. These don’t always result in changes, but they’re necessary to make sure the context is shared amongst everyone.

One of the reasons I tend to be militant about software comments is to try and get that initial programmer context preserved with the code itself—to run with Joel’s Talmud analogy, to see if we can include the first batch of commentary along with the source right at the start.
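
To make that concrete, here’s a tiny, entirely hypothetical sketch of the kind of comment I mean. The point isn’t to narrate what the code does, but to pin the original context and the decision to the code itself; the vendor, the failure rate, and the function names here are all made up for illustration.

    import random
    import time

    class TransientUpstreamError(Exception):
        """Stand-in for a vendor SDK's transient failure exception."""

    def fetch_invoice_batch():
        # Stand-in for the real vendor call; flaky in roughly the same way.
        if random.random() < 0.002:
            raise TransientUpstreamError("upstream dropped the request")
        return ["invoice-1", "invoice-2"]

    # Retry the upstream call a few times with backoff.
    #
    # Context from the original change: the vendor's API drops roughly 1 in 500
    # requests during their nightly failover window, and their support said a
    # short client-side retry is the expected behavior. A circuit breaker was
    # considered and rejected because only this nightly batch job hits the
    # endpoint. If the flakiness ever gets fixed upstream, this whole block
    # can go away.
    batch = []
    for attempt in range(3):
        try:
            batch = fetch_invoice_batch()
            break
        except TransientUpstreamError:
            time.sleep(2 ** attempt)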

Which brings us to code reviews. I’ve been thinking about code reviews a lot lately, for various reasons. One of the things I keep thinking about is that I can’t believe how hard they still are to do well, but they’re hard for many of the same reasons that reading code is hard. But of course, that’s one of the reasons they’re so important: not only to “make sure the code is right”, but also to spread that context around to the team.

I’ve taken to thinking of un-reviewed code as a kind of tech debt—context debt, if you will. And this is the worst kind of debt, in that it’ll build up in the background while you’re not paying attention, and then a couple people get promoted or leave, and you realize you have a whole-ass application that no one on the team has ever seen the insides of. This is kind of rephrasing the “bus factor” problem, but I like treating it as a class of debt because it gives us an existing framework to pay it down.

But that doesn’t solve the basic problem that code review is hard to do, and most of our tools don’t really help. I mean, one of the reasons XP went all-in on pair programming is that it was easier than figuring out how to make code easier to read and reason about.

And so given all that, I’ve also been stewing on how it’s very (not) funny to me that we keep finding new ways to replace “writing code” with “code review.”

One of them is that on top of all the other reasons not to let the Plagiarism Machine condense you some code out of the æther, now you still have to review that code, but that original context not only isn’t available, it doesn’t even exist. So code reviews become even more important at the same time as they get impossibly harder. Sort of instant-deploy tech debt. It’s the copy-paste from Stack Overflow, only amped way up. But, okay, that’s this new toy burning off the fad, hopefully people will knock that off at some point.

The thing I’ve really been thinking about is all that un-reviewed code we’ve been dragging around in the form of open source libraries. This musing, of course, brought to you by that huge near-miss last month (Did 1 guy just stop a huge cyberattack?), along with the various other issues going on over in NPM and PyPI, and then the follow-up discussion like: Bullying in Open Source Software Is a Massive Security Vulnerability

I mean, this whole thing should be a real wakeup call to the entire OSS world in a “hang on, what the hell are we doing” sort of way. Turns out that sure, with enough eyes all bugs are shallow, but you still have to have someone look. And the fact that it was a guy from Microsoft who found the bug? Because something was too slow? Delicious, but terrifying.

Everyone links to the xkcd about Dependencies with a sort of head-shake “that’s just how it is”. But sooner or later, that guy is going to leave, or need better insurance. You might not be paying the volunteers, but you can bet someone else would be willing to.

Like all of us, I wonder how many of these are out there in the wild? I’m glad I don’t run a Software Dev team that handles sensitive data currently, because at this point you have to assume any FOSS package has a >0% chance of hosting something you don’t want running on your servers.

And to bring it back around to the subject at hand, the real solution is “we need a way to audit and review open source packages”, but after a generation of externalizing that cost, no one even knows how to do that?

But what would I be doing if I was still in charge of something that handled PHI or other sensitive or valuable data? My initial reaction was I’d be having some serious conversations about “what would it take to remove all the open source. No, give me an estimate for all of it. All.”

(And to be clear, it’s not like commercial software is immune either, but that’s a different set of risk vectors and liability.)

I’d want a list of all the FOSS packages in the system, sorted into these buckets:

  1. Stuff we’re barely using, that we could probably replace in a day or two. (The CSV formatter library that we only use to write one file in one place.)
  2. Bigger things that we’re using more of, but could still get our arms around what a replacement looks like. (We pulled in Apache Commons collections because it was easy to use, but we’re using less than 10% of it.)
  3. Big, foundational stuff: Spring, Tomcat, Linux, language standard libraries. Stuff you aren’t going to rewrite.
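
Building the raw list is the easy part; the bucketing is the judgment call. Here’s a minimal sketch, assuming a Python shop (other stacks have equivalents like “npm ls --json” or “mvn dependency:list”), that dumps every installed package into a file a human can then sort into buckets 1, 2, or 3. The file name and bucket labels are arbitrary.

    import json
    import subprocess

    # Ask pip for every installed package as JSON: name and version.
    raw = subprocess.run(
        ["pip", "list", "--format=json"],
        capture_output=True, text=True, check=True,
    ).stdout

    # Start everything in an "unsorted" bucket; moving each package into
    # bucket 1, 2, or 3 is the part that needs an actual human.
    inventory = [
        {"name": pkg["name"], "version": pkg["version"], "bucket": "unsorted"}
        for pkg in json.loads(raw)
    ]

    with open("foss_inventory.json", "w") as f:
        json.dump(inventory, f, indent=2)

    print(f"{len(inventory)} packages written to foss_inventory.json")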

That third category needs a light audit to make sure there’s an actual entity in charge of it with safety practices and the like. Probably a conversation with legal about liability and whatnot.

But for those first two buckets, I’d want to see an estimated cost to replace. And then I want to see a comparison of “how many hours of effort converted to salary dollars” vs “worst-case losses if our servers got pwned”. Because the hell of it is, those numbers probably make it a slam dunk to do the rewrite.
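
Back-of-the-envelope, with numbers that are pure placeholders, that comparison looks something like this:

    # All of these numbers are made up for illustration; plug in your own.
    hours_to_replace = 3 * 2 * 40       # 3 devs, 2 weeks, 40 hours/week
    loaded_hourly_cost = 150            # salary plus overhead, dollars/hour
    rewrite_cost = hours_to_replace * loaded_hourly_cost

    # Worst case if a malicious dependency ends up on the servers:
    # incident response, notifications, fines, lost business.
    worst_case_breach_loss = 5_000_000

    print(f"Estimated rewrite cost:  ${rewrite_cost:,}")
    print(f"Worst-case breach loss:  ${worst_case_breach_loss:,}")
    print(f"Breach / rewrite ratio:  {worst_case_breach_loss / rewrite_cost:.0f}x")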

But look! I’m doing that same fallacy—it’s easier to write than review, so let’s just rewrite it. And this has been sitting in the drafts folder for a month now because… I don’t know!

The current situation seems untenable, and all the solutions seem impossible. But that review debt is still there, waiting.

Gabriel L. Helman

The Dam

Real blockbuster from David Roth on the Defector this morning, which you should go read: The Coffee Machine That Explained Vice Media

In a large and growing tranche of wildly varied lines of work, this is just what working is like—a series of discrete tasks of various social function that can be done well or less well, with more dignity or less, alongside people you care about or don't, all unfolding in the shadow of a poorly maintained dam.

It goes on like that until such time as the ominous weather upstairs finally breaks or one of the people working at the dam dynamites it out of boredom or curiosity or spite, at which point everyone and everything below is carried off in a cleansing flood.

[…]

That money grew the company in a way that naturally never enriched or empowered the people making the stuff the company sold, but also never went toward making the broader endeavor more likely to succeed in the long term.

Depending on how you count, I’ve had that dam detonated on me a couple of times now. He’s talking about media companies, but everything he describes applies to a lot more than just that. More than once I’ve watched a functional, successful, potentially sustainable outfit get dynamited because someone was afraid they weren’t going to cash out hard enough. And sure, once you realize that to a particular class of ghoul “business” is a synonym for “high-stakes gambling”, a lot of the decisions make more sense, at least on their own terms.

But what always got me, though, was this:

These are not nurturing types, but they are also not interested in anything—not creating things or even being entertained, and increasingly not even in commerce.

What drove me crazy was that these people didn’t use the money for anything. They all dressed badly, drove expensive but mediocre cars—mid-list Acuras or Ford F-250s—and didn’t seem to care about their families, didn’t seem to have any recognizable interests or hobbies. This wasn’t a case of “they had bad taste in music”, it was “they don’t listen to music at all.” What occasional flickers of interest there were—fancy bicycles or golf clubs or something—were always more about proving they could spend the money, not that they wanted whatever it was.

It’s one thing if the boss cashes out and drives up to lay everyone off in a Lamborghini, but it’s somehow more insulting when they drive up in the second-best Acura, you know?

I used to look at these people and wonder, what did you dream about when you were young? And now that you could be doing whatever that was, why aren’t you?

Gabriel L. Helman

Implosions

Despite the fact that basically everyone likes movies, video games, and reading things on websites, every company that does one of those seems to continue to go out of business at an alarming rate?

For the sake of future readers, today I’m subtweeting Vice and Engadget both getting killed by private equity vampires in the same week, but also Coyote vs Acme, and all the video game layoffs, and Sports Illustrated becoming an AI slop shop and… I know “late stage capitalism” has been a meme for years now, and the unsustainable has been wrecking out for a while, but this really does feel like we’re coming to the end of the whole neoliberal project.

It seems like we’ve spent the whole last two decades hearing about how something valuable or well-liked went under because “their business model wasn’t viable”, but on the other hand, it sure doesn’t seem like anyone was trying to find a viable one?

Rusty Foster asks What Are We Dune 2 Journalism? while Josh Marshall asks over at TPM: Why Is Your News Site Going Out of Business?. Definitely click through for the graph on TPM’s ad revenue.

What I find really wild is that all these big implosions are happening at the same time as folks are figuring out how to make smaller, subscription-based coöps work.

Heck, just looking in my RSS reader alone, you have: Defector, 404 Media, Aftermath, Rascal News, 1900HOTDOG, a dozen other substacks or former substacks. Achewood has a Patreon!

It’s more possible than ever to actually build a (semi?) sustainable business out there on the web if you want to. Of course, all those sites combined employ fewer people than Sports Illustrated ever did. Because we’re talking less about “scrappy startups”, and more “survivors of the disaster.”

I think those Defector-style coöps, and substacks, and patreons are less about people finding viable business models than they are the kind of organisms that survive a major plague or extinction event, having evolved specifically around increasing their resistance to that threat. The only thing left as the private equity vultures turn everything else and each other into financial gray goo.

It’s tempting to see some deeper, sinister purpose in all this, but Instapot wasn’t threatening the global order, Sports Illustrated really wasn’t speaking truth to power, and Adam Smith’s invisible hand didn’t shutter everyone’s favorite toy store. Batgirl wasn’t going to start a socialist revolution.

But I don’t think the ghouls enervating everything we care about have any sort of viewpoint beyond “I bet we could loot that”. If they were creative enough to have some kind of super-villain plan, they’d be doing something else for a living.

I’ve increasingly taken to viewing private equity as the economic equivalent of Covid: a mindless disease ravaging the unfortunate, or the unlucky, or the insufficiently supported, one that we’ve failed as a society to put sufficient public health protections in place against.

Gabriel L. Helman

“Hanging Out”

For the most recent entry in asking if ghosts have civil rights, the Atlantic last month wonders: Why Americans Suddenly Stopped Hanging Out.

And it’s an almost perfect Atlantic article, in that it looks at a real trend, finds some really interesting research, and then utterly fails to ask any obvious follow-up questions.

It has all the usual howlers of the genre: it recognizes that something changed in the US somewhere around the late 70s or early 80s without ever wondering what that was, it recognizes that something else changed about 20 years ago without wondering what that was, it displays no curiosity whatsoever around the lack of “third places” and where, exactly, kids are supposed to actually go when they try to hang out. It’s got that thing where it has a chart of (something, anything) social over time, and you can perfectly pick out Reagan’s election and the ’08 recession, and not much else.

There are lots of passive-voice sentences about how “Something’s changed in the past few decades,” coupled with an almost perverse refusal to look for a root cause, or connect any social or political actions to this. You can occasionally feel the panic around the edges as the author starts to suspect that maybe the problem might be “rich people” or “social structures”, so instead of talking to people he inspects a bunch of data about what people do, instead of why they do it. It’s the exact opposite of that F1 article; this has nothing in it that might cause the editor to pull it after publication.

In a revelation that will shock no one, the author instead decides that the reason for all this change must be “screens”, without actually checking to see what “the kids these days” are actually using those screens for. (Spoiler: they’re using them to hang out). Because, delightfully, the data the author is basing all this on tracks only in-person socializing, and leaves anything virtual off the table.

This is a great example of something I call “Alvin Toffler Syndrome”, where you correctly identify a really interesting trend, but are then unable to get past the bias that your teenage years were the peak of human civilization and so therefore anything different is bad. Future Shock.

I had three very strong reactions to this, in order:

First, I think that header image is accidentally more revealing than they thought. All those guys eating alone at the diner look like they have a gay son they cut off; maybe we live in an age where people have lower tolerance for spending time with assholes?

Second, I suspect the author is just slightly younger than I am, based on a few of the things he says, but also the list of things “kids should be doing” he cites from another expert:

“There’s very clearly been a striking decline in in-person socializing among teens and young adults, whether it’s going to parties, driving around in cars, going to the mall, or just about anything that has to do with getting together in person”.

Buddy, I was there, and “going to the mall, driving around in cars” sucked. Do you have any idea how much my friends and I would have rather hung out in a shared Minecraft server? Are you seriously telling me that eating a Cinnabon or drinking too much at a high school house party full of college kids home on the prowl was a better use of our time? Also: it’s not the 80s anymore, what malls?

(One of the funniest giveaways is that unlike these sorts of articles from a decade ago, “having sex” doesn’t get listed as one of the activities that teenagers aren’t doing anymore. Like everyone else between 30 and 50, the author grew up in a world where sex with a stranger can kill you, and so that’s slipped out of the domain of things “teenagers ought to be doing, like I was”.)

But mostly, though, I disagree with the fundamental premise. We might have stopped socializing the same ways, but we certainly didn’t stop. How do I know this? Because we’re currently entering year five of a pandemic that became uncontrollable because more Americans were afraid of the silence of their own homes than they were of dying.

Gabriel L. Helman

Even Further Behind The Velvet Curtain Than We Thought

Kate Wagner, mostly known around these parts for McMansion Hell, but who also does sports journalism, wrote an absolutely incredible piece for Road & Track on F1, which was published and then unpublished nearly instantly. Why yes, the Internet Archive does have a copy: Behind F1's Velvet Curtain. It’s the sort of thing where if you start quoting it, you end up reading the whole thing out loud, so I’ll just block quote the subhead:

If you wanted to turn someone into a socialist you could do it in about an hour by taking them for a spin around the paddock of a Formula 1 race. The kind of money I saw will haunt me forever.

It’s outstanding, and you should go read it.

But, so, how exactly does a piece like this get all the way to being published out on Al Gore’s Internet, and then spiked? The Last Good Website tries to get to the bottom of it: Road & Track EIC Tries To Explain Why He Deleted An Article About Formula 1 Power Dynamics.

Road & Track’s editor’s response to the Defector is one of the most brazen “there was no pressure because I never would have gotten this job if I waited until they called me to censor things they didn’t like” responses since, well, the Hugos, I guess?


Edited to add: Today in Tabs—The Road & Track Formula One Scandal Makes No Sense

Gabriel L. Helman

March Fifth. Fifth March.

Today is Tuesday, March 1465th, 2020, COVID Standard Time.

Three more weeks to flatten the curve!

What else even is there to say at this point? Covid is real. Long Covid is real. You don’t want either of them. Masks work. Vaccines work.

It didn’t have to be like this.

Every day, I mourn the futures we lost because our civilization wasn’t willing to put in the effort to try to protect everyone. And we might have even pulled it off, if the people who already had too much had been willing to make a little less money for a while. But instead, we're walking into year five of this thing.

Gabriel L. Helman

Monday Snarkblog

I spent the weekend at home with a back injury letting articles about AI irritate me, and I’m slowly realizing how useful Satan is as a societal construct. (Hang on, this isn’t just the painkillers talking). Because, my goodness, I’m already sick of talking about why AI is bad, and we’re barely at the start of this thing. I cannot tell you how badly I want to just point at ChatGPT and say “look, Satan made that. It's evil! Don't touch it!”

Here’s some more open tabs that are irritating me, and I’ve given myself a maximum budget of “three tweets” each to snark on them:

Pluralistic: American education has all the downsides of standardization, none of the upsides (16 Jan 2024)

Wherein Cory does a great job laying out the problems with common core and how we got here, and then blows a fuse and goes Full Galaxy Brain, freestyling a solution where computers spit out new tests via some kind of standards-based electronic mad libs. Ha ha, fuck you man, did you hear what you just said? That’s the exact opposite of a solution, and I’m only pointing it out because this is the exact crap he’s usually railing against. Computers don’t need to be all “hammer looking for new nails” about every problem. Turn the robots off and let the experts do their jobs.

I abandoned OpenLiteSpeed and went back to good ol’ Nginx | Ars Technica

So wait, this guy had a fully working stack, and then was all “lol yolo” and replaced everything with no metrics or testing—twice??

I don’t know what the opposite of tech debt is called, but this is it. There’s a difference between “continuous improvement” and “the Winchester Mystery House” and boy oh boy are we on the wrong side of the looking glass.

The part of this one that got me wasn’t where he sat on his laptop in the hotel on his 21st wedding anniversary trip fixing things, it was the thing where he had already decided to bring his laptop on the trip before anything broke.

Things can just be done, guys. Quit tinkering to tinker and spend time with your family away from screens. Professionalism means making the exact opposite choices as this guy.

Gabriel L. Helman

End Of Year Open Tab Bankruptcy Roundup Jamboree, Part 1: Mastodon

I’m declaring bankruptcy on my open tabs; these are all things I’ve had open on my laptop or phone over the last several months, thinking “I should send this to someone? Or is this a blog post?” Turns out the answer is ‘no’ to both of those, so here they are. Day 1: The Twitter to Mastodon migration, or lack thereof.

I was going to do a whole piece on mastodon last summer. At first, the second half of what became What was happening: Twitter, 2006-2023 was going to be a roundup of alternatives and why none of them were replacements for what I missed, then that got demoted to a bunch of footnotes, then its own thing, but I just can’t be bothered because the moment was lost.

The short, short version is that from the end of ’21 and through the first half of ’22 there was a really interesting moment where we could have built an open alternative to twitter. Instead, they built another space for white male nerds, and the internet didn’t need one more of those.

Mastodon somehow manages to embody every flaw of open source projects: they brought a protocol to a product fight, no one’s even thinking about it as a product, misfeatures dressed up as virtue, and a weirdly hostile userbase that’s too busy patting each other on the back to notice who’s been left out. A whole bunch of gatekeeping bullshit dressed up as faux-libertarian morality. Mixed with this very irritating evangelistic true-believer vibe where Mastodon is the “morally correct” one to use, so there’s no reason to listen to reasons why people don’t use it, because axiomatically, they must be wrong.

And look, this is coming from a guy with an active subscription to Ivory. But the communities on twitter I cared about just aren’t there. They went elsewhere, or just dispersed into the web. I understand that there are people who found what they had on twitter on mastodon, and as before I can only quote former-President Beeblebrox: “Zowee, here we are at the End of the Universe and you haven't even lived yet. Did you miss out.”

One of the key works of our time is that episode of 30 Rock where Liz Lemon realizes that actually, she was the bully in high school. Mastodon was built entirely by people that never had that realization.

Michael Tsai - Blog - Why Has Mastodon Adoption Stalled?

Blue Skies Over Mastodon

jwz: Mastodon stampede. Pretty great comment thread on this one, including this amazing line, re the mastodon developers:

And for this it’s a general Stallman problem. “I don’t understand this. Therefore you are wrong.”

jwz: Mastodon's Mastodon'ts

The nerd rage response to jwz’s entirely reasonable (and positive!) critique is everything you need to know. I’ve got enough weird HOA scolds in my real life without going on Al Gore’s internet and seeking them out.

Drivers of social influence in the Twitter migration to Mastodon | Scientific Reports

(I could have sworn I had more open tabs here, but I think I’m remembering frustrated twitter threads instead)

Gabriel L. Helman

Fully Automated Insults to Life Itself

In 20 years’ time, we’re going to be talking about “generative AI” in the same tone of voice we currently use to talk about asbestos. A bad idea that initially seemed promising which ultimately caused far more harm than good, and that left a swathe of deeply embedded pollution across the landscape that we’re still cleaning up.

It’s the final apotheosis of three decades of valuing STEM over the Humanities, in parallel with the broader tech industry being gutted and replaced by a string of venture-backed pyramid schemes, casinos, and outright cons.

The entire technology is utterly without value and needs to be scrapped, legislated out of existence, and the people involved need to be forcibly invited to find something better to spend their time on. We’ve spent decades operating under the unspoken assumption that just because we can build something, that means it’s inevitable and we have to build it first before someone else does. It’s time to knock that off, and start asking better questions.

AI is the ultimate form of the joke about the restaurant where the food is terrible and also the portions are too small. The technology has two core problems, both of which are intractable:

  1. The output is terrible
  2. It’s deeply, fundamentally unethical

Probably the definite article on generative AI’s quality, or profound lack thereof, is Ted Chiang’s ChatGPT Is a Blurry JPEG of the Web; that’s almost a year old now, and everything that’s happened in 2023 has only underscored his points. Fundamentally, we’re not talking about vast cyber-intelligences, we’re talking Sparkling Autocorrect.

Let me provide a personal anecdote.

Earlier this year, a coworker of mine was working on some documentation, and had worked up a fairly detailed outline of what needed to be covered. As an experiment, he fed that outline into ChatGPT, intending to publish the output, and I offered to look over the result.

At first glance it was fine. Digging in, though, it wasn’t great. It wasn’t terrible either—nothing in it was technically incorrect, but it had the quality of a high school book report written by someone who had only read the back cover. Or like documentation written by a tech writer who had a detailed outline they didn’t understand and a word count to hit? It repeated itself, it used far too many words to cover very little ground. It was, for lack of a better word, just kind of a “glurge”. Just room-temperature tepidarium generic garbage.

I started to jot down some editing notes, as you do, and found that I would stare at a sentence, then the whole paragraph, before crossing the paragraph out and writing “rephrase” in the margin. To try and be actually productive, I took a section and started to rewrite it in what I thought was a better, more concise manner—removing duplicates, omitting needless words. De-glurgifying.

Of course, I discovered I had essentially reconstituted the outline.

I called my friend back and found the most professional possible way to tell him he needed to scrap the whole thing and start over.

It left me with a strange feeling, that we had this tool that could instantly generate a couple thousand words of worthless text that at first glance seemed to pass muster. Which is so, so much worse than something written by a junior tech writer who doesn’t understand the subject, because this was produced by something that you can’t talk to, you can’t coach, that will never learn.

On a pretty regular basis this year, someone would pop up and say something along the lines of “I didn’t know the answer, and the docs were bad, so I asked the robot and it wrote the code for me!” and then they would post some screenshots of ChatGPT’s output full of a terribly wrong answer. Humane’s AI Pin demo was full of wrong answers, for heaven’s sake. And so we get this trend where ChatGPT manages to be an expert in things you know nothing about, but a moron about things you’re an expert in. I’m baffled by the responses to the GPT-n “search” “results”; they’re universally terrible and wrong.

And this is all baked in to the technology! It’s a very, very fancy set of pattern recognition based on a huge corpus of (mostly stolen?) text, computing the most probable next word, but not in any way considering if the answer might be correct. Because it has no way to; that’s totally outside the bounds of what the system can achieve.
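
To be clear about what “computing the most probable next word” means, here’s a toy sketch. It is nothing like the scale or architecture of a real model, but the objective is the same shape: emit a likely continuation. Notice that nowhere in it is there anything that checks whether the output is true.

    from collections import Counter, defaultdict

    # Toy "language model": count which word follows which in a tiny corpus,
    # then always emit the most probable next word.
    corpus = "the cat sat on the mat the cat ate the fish the dog sat on the rug".split()

    follows = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev][nxt] += 1

    word = "the"
    output = [word]
    for _ in range(6):
        word = follows[word].most_common(1)[0][0]   # most probable next word
        output.append(word)

    # Prints plausible-looking nonsense like "the cat sat on the cat sat";
    # probable is not the same thing as correct.
    print(" ".join(output))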

A year and a bit later, and the web is absolutely drowning in AI glurge. Clarkesworld had to suspend submissions for a while to get a handle on blocking the tide of AI garbage. Page after page of fake content with fake images, content no one ever wrote and only meant for other robots to read. Fake articles. Lists of things that don’t exist, recipes no one has ever cooked.

And we were already drowning in “AI” “machine learning” glurge, and it all sucks. The autocorrect on my phone got so bad when they went from the hard-coded list to the ML one that I had to turn it off. Google’s search results are terrible. The “we found this answer for you” thing at the top of the search results is terrible.

It’s bad, and bad by design; it can’t ever be more than a thoughtless mashup of material it pulled in. Or even worse, it’s not wrong so much as it’s all bullshit. Not outright lies, but vaguely truthy-shaped “content”, freely mixing copied facts with pure fiction, speech intended to persuade without regard for truth: Bullshit.

Every generated image would have been better and funnier if you gave the prompt to a real artist. But that would cost money—and that’s not even the problem, the problem is that would take time. Can’t we just have the computer kick something out now? Something that looks good enough from a distance? If I don’t count the fingers?

My question, though, is this: what future do these people want to live in? Is it really this? Swimming in a sea of glurge? Just endless mechanized bullshit flooding every corner of the Web? Who looked at the state of the world here in the Twenties and thought “what the world needs right now is a way to generate Infinite Bullshit”?

Of course, the fact that the results are terrible-but-occasionally-fascinating obscure the deeper issue: It’s a massive plagiarism machine.

Thanks to copyleft and free & open source, the tech industry has a pretty comprehensive—if idiosyncratic—understanding of copyright, fair use, and licensing. But that’s the wrong model. This isn’t about “fair use” or “transformative works”, this is about Plagiarism.

This is a real “humanities and the liberal arts vs technology” moment, because STEM really has no concept of plagiarism. Copying and pasting from the web is a legit way to do your job.

(I mean, stop and think about that for a second. There’s no other industry on earth where copying other people’s work verbatim into your own is a widely accepted technique. We had a sign up a few jobs back that read “Expert level copy and paste from stack overflow” and people would point at it when other people had questions about how to solve a problem!)

We have this massive cultural disconnect that would be interesting or funny if it wasn’t causing so much ruin. This feels like nothing so much as the end result of valuing STEM over the Humanities and Liberal Arts in education for the last few decades. Maybe we should have made sure all those kids we told to “learn to code” also had some, you know, ethics? Maybe had read a couple of books written since they turned fourteen?

So we land in a place where a bunch of people convinced they’re the princes of the universe have sucked up everything written on the internet and built a giant machine for laundering plagiarism; regurgitating and shuffling the content they didn’t ask permission to use. There’s a whole end-state libertarian angle here too; just because it’s not explicitly illegal, that means it’s okay to do it, ethics or morals be damned.

“It’s fair use!” Then the hell with fair use. I’d hate to lose the wayback machine, but even that respects robots.txt.

I used to be a hard core open source, public domain, fair use guy, but then the worst people alive taught a bunch of if-statements to make unreadable counterfeit Calvin & Hobbes comics, and now I’m ready to join the Butlerian Jihad.

Why should I bother reading something that no one bothered to write?

Why should I bother looking at a picture that no one could be bothered to draw?

Generative AI and its ilk are the final apotheosis of the people who started calling art “content”, and meant it.

These are people who think art or creativity are fundamentally a trick, a confidence game. They don’t believe or understand that art can be about something. They utterly reject the concept of “about-ness”; the basic concept of “theme” is beyond their comprehension. The idea that art might contain anything other than its most surface qualities never crosses their mind. The sort of people who would say “Art should soothe, not distract”. Entirely about the surface aesthetic over anything.

(To put that another way, these are the same kind people who vote Republican but listen to Rage Against the Machine.)

Don’t respect or value creativity.

Don’t respect actual expertise.

Don’t understand why they can’t just have what someone else worked for. It’s even worse than not wanting to pay for it, these creatures actually think they’re entitled to it for free because they know how to parse a JSON file. It feels like the final end-point of a certain flavor of free software thought: no one deserves to be paid for anything. A key cultural and conceptual point past “information wants to be free” and “everything is a remix”. Just a machine that endlessly spits out bad copies of other work.

They don’t understand that these are skills you can learn, that you have to work at, become an expert in. Not one of these people who spend hours upon hours training models or crafting prompts ever considered using that time to learn how to draw. Because if someone else can do it, they should get access to that skill for free, with no compensation or even credit.

This is why those machine generated Calvin & Hobbes comics were such a shock last summer; anyone who had understood a single thing about Bill Watterson’s work would have understood that he’d be utterly opposed to something like that. It’s difficult to fathom someone who liked the strip enough to do the work to train up a model to generate new ones while still not understanding what it was about.

“Consent” doesn’t even come up. These are not people you should leave your drink uncovered around.

But then you combine all that with the fact that we have a whole industry of neo-philes, desperate to work on something New and Important, terrified their work might have no value.

(See also: the number of abandoned javascript frameworks that re-solve all the problems that have already been solved.)

As a result, tech has an ongoing issue with cool technology that’s a solution in search of a problem, but ultimately is only good for some kind of grift. The classical examples here are the blockchain, bitcoin, NFTs. But the list is endless: so-called “4th generation languages”, “rational rose”, the CueCat, basically anything that ever got put on the cover of Wired.

My go-to example is usually bittorrent, which seemed really exciting at first, but turned out to only be good at acquiring TV shows that hadn’t aired in the US yet. (As they say, “If you want to know how to use bittorrent, ask a Doctor Who fan.”)

And now generative AI.

There’s that scene at the end of Fargo, where Frances McDormand is scolding The Shoveler for “all this for such a tiny amount of money”, and that’s how I keep thinking about the AI grift carnival. So much stupid collateral damage we’re gonna be cleaning up for years, and it’s not like any of them are going to get Fuck You(tm) rich. No one is buying an island or founding a university here, this is all so some tech bros can buy the deluxe package on their next SUV. At least crypto got some people rich, and was just those dorks milking each other; here we all gotta deal with the pollution.

But this feels weirdly personal in a way the dunning-krugerrands were not. How on earth did we end up in a place where we automated art, but not making fast food, or some other minimum wage, minimum respect job?

For a while I thought this was something along the lines of one of the asides in David Graeber’s Bullshit Jobs, where people with meaningless jobs hate it when other people have meaningful ones. The phenomenon of “If we have to work crappy jobs, we want to pull everyone down to our level, not pull everyone up”. See also: “waffle house workers shouldn’t make 25 bucks an hour”, “state workers should have to work like a dog for that pension”, etc.

But no, these are not people with “bullshit jobs”, these are upper-middle class, incredibly comfortable tech bros pulling down a half a million dollars a year. They just don’t believe creativity is real.

But because all that apparently isn’t fulfilling enough, they make up ghost stories about how their stochastic parrots are going to come alive and conquer the world, how we have to build good ones to fight the bad ones, but they can’t be stopped because it’s inevitable. Breathless article after article about whistleblowers worried about how dangerous it all is.

Just the self-declared best minds of our generation failing the mirror test over and over again.

This is usually where someone says something about how this isn’t a problem and we can all learn to be “prompt engineers”, or “advisors”. The people trying to become a prompt advisor are the same sort who would be proud they convinced Immortan Joe to strap them to the back of the car instead of the front.

This isn’t about computers, or technology, or “the future”, or the inevitability of change, or the march of progress. This is about what we value as a culture. What do we want?

“Thus did a handful of rapacious citizens come to control all that was worth controlling in America. Thus was the savage and stupid and entirely inappropriate and unnecessary and humorless American class system created. Honest, industrious, peaceful citizens were classed as bloodsuckers, if they asked to be paid a living wage. And they saw that praise was reserved henceforth for those who devised means of getting paid enormously for committing crimes against which no laws had been passed. Thus the American dream turned belly up, turned green, bobbed to the scummy surface of cupidity unlimited, filled with gas, went bang in the noonday sun.” ― Kurt Vonnegut, God Bless You, Mr. Rosewater

At the start of the year, the dominant narrative was that AI was inevitable, this was how things are going, get on board or get left behind.

That’s… not quite how the year went?

AI was a centerpiece in both Hollywood strikes, and both the Writers and Actors basically ran the table, getting everything they asked for, and enshrining a set of protections from AI into a contract for the first time. Excuse me, not protection from AI, but protection from the sort of empty suits that would use it to undercut working writers and performers.

Publisher after publisher has been updating their guidelines to forbid AI art. A remarkable number of other places that support artists instituted guidelines to ban or curtail AI. Even Kickstarter, which plunged into the blockchain with both feet, seemed to have learned their lesson and rolled out some pretty stringent rules.

Oh! And there’s some actual high-powered lawsuits bearing down on the industry, not to mention investigations of, shall we say, “unsavory” material in the training sets?

The initial shine seems to be off; where last year was all about sharing goofy AI-generated garbage, there’s been a real shift in the air as everyone gets tired of it and starts pointing out that it sucks, actually. And that the people still boosting it all seem to have some kind of scam going. Oh, and in a lot of cases, it’s literally the same people who were hyping blockchain a year or two ago, and who seem to have found a new use for their warehouses full of GPUs.

One of the more heartening and interesting developments this year was the (long overdue) start of a re-evaluation of the Luddites. Despite the popular stereotype, they weren’t anti-technology, but anti-technology-being-used-to-disenfranchise-workers. This seems to be the year a lot of people sat up and said “hey, me too!”

AI isn’t the only reason “hot labor summer” rolled into “eternal labor september”, but it’s pretty high on the list.

There’s an argument that’s sometimes made that we don’t have any way as a society to throw away a technology that already exists, but that’s not true. You can’t buy gasoline with lead in it, or hairspray with CFCs, and my late lamented McDLT vanished along with the Styrofoam that kept the hot side hot and the cold side cold.

And yes, asbestos made a bunch of people a lot of money and was very good at being children’s pyjamas that didn’t catch fire, as long as that child didn’t need to breathe as an adult.

But, we've never done that for software.

Back around the turn of the century, there was some argument around if cryptography software should be classified as a munition. The Feds wanted stronger export controls, and there was a contingent of technologists who thought, basically, “Hey, it might be neat if our compiler had first and second amendment protection”. Obviously, that didn’t happen. “You can’t regulate math! It’s free expression!”

I don’t have a fully developed argument on this, but I’ve never been able to shake the feeling like that was a mistake, that we all got conned while we thought we were winning.

Maybe some precedent for heavily “regulating math” would be really useful right about now.

Maybe we need to start making some.

There’s a persistent belief in computer science, going back to when computers were invented, that brains are really fancy, powerful computers, and if we can just figure out how to program them, intelligent robots are right around the corner.

There’s an analogy that floats around that says if the human mind is a bird, then AI will be a plane: flying, but a very different application of the same principles.

The human mind is not a computer.

At best, AI is a paper airplane. Sometimes a very fancy one! With nice paper and stickers and tricky folds! But the key is that a hand has to throw it.

The act of a person looking at a bunch of art and trying to build their own skills is fundamentally different than a software pattern recognition algorithm drawing a picture from pieces of other ones.

Anyone who claims otherwise has no concept of creativity other than as an abstract concept. The creative impulse is fundamental to the human condition. Everyone has it. In some people it’s repressed, or withered, or undeveloped, but it’s always there.

Back in the early days of the pandemic, people posted all these stories about the “crazy stuff they were making!” It wasn’t crazy, that was just the urge to create, it’s always there, and capitalism finally got quiet enough that you could hear it.

“Making Art” is what humans do. The rest of society is there so we stay alive long enough to do so. It’s not the part we need to automate away so we can spend more time delivering value to the shareholders.

AI isn’t going to turn into skynet and take over the world. There won’t be killer robots coming for your life, or your job, or your kids.

However, the sort of soulless goons who thought it was a good idea to computer automate “writing poetry” before “fixing plumbing” are absolutely coming to take away your job, turn you into a gig worker, replace whoever they can with a chatbot, keep all the money for themselves.

I can’t think of anything more profoundly evil than trying to automate creativity and leaving humans to do the grunt manual labor.

Fuck those people. And fuck everyone who ever enabled them.

Gabriel L. Helman

It’ll Be Worth It

An early version of this got worked out in a sprawling Slack thread with some friends. Thanks for helping me work out why my perfectly nice neighbor’s garage banner bugs me, fellas.

There’s this house about a dozen doors down from mine. Friendly people, I don’t really know them, but my son went to school with their youngest kid, so we kinda know each other in passing. They frequently have the door to the garage open, and they have some home gym equipment, some tools, and a huge banner that reads in big block capital letters:

NOBODY CARES WORK HARDER

My reaction is always to recoil slightly. Really, nobody? Even at your own home, nobody? And I think “you need better friends, man. Maybe not everyone cares, but someone should.” I keep wanting to say “hey man, I care. Good job, keep it up!” It feels so toxic in a way I can’t quite put my finger on.

And, look, I get it. It’s a shorthand to communicate that we’re in a space where the goal is what matters, and the work is assumed. It’s a very sports-oriented worldview, where the message is that the complaints don’t matter, only the results matter. But my reaction to things like that from coaches in a sports context was always kinda “well, if no one cares, can I go home?”

(Well, that, and I would always think “I’d love to see you come on over to my world and slam into a compiler error for 2 hours and then have me tell you ‘nobody cares, do better’ when you ask for help and see how you handle that. Because you would lose your mind”)

Because that’s the thing: if nobody cared, we wouldn’t be here. We’re here because we think everyone cares.

The actual message isn’t “nobody cares,” but:

“All this will be worth it in the end, you’ll see”

Now, that’s a banner I could get behind.

To come at it another way, there’s the goal and there’s the work. Depending on the context, people care about one or the other. I used to work with some people who would always put the number of hours spent on a project as the first slide of their final read-outs, and the rest of us used to make terrible fun of them. (As did the execs they were presenting to.)

And it’s not that the “seventeen million hours” wasn’t worth celebrating, or that we didn’t care about it, but that the slide touting it was in the wrong venue. Here, we’re in an environment where we only care about the end goal. High fives for working hard go in a different meeting, you know?

But what really, really bugs me about that banner specifically, and things like it, is that they’re so fake. If you really didn’t think anyone cared, you wouldn’t hang a banner up where all your neighbors could see it over your weight set. If you really thought no one cared, you wouldn’t even own the exercise gear, you’d be inside doing something you want to do! Because no one has to hang a “WORK HARDER” banner over a Magic: The Gathering tournament, or a plant nursery, or a book club. But no, it’s there because you think everyone cares, and you want them to think you’re cool because you don’t have feelings. A banner like that is just performative; you hang something like that because you want others to care about you not caring.

There’s a thing where people try and hold up their lack of emotional support as a kind of badge of honor, and like, if you’re at home and really nobody cares, you gotta rethink your life. And if people do care, why do you find pretending they don’t motivating? What’s missing from your life such that pretending you’re on your own is better than embracing your support?

The older I get, the less tolerance I have for people who think empathy is some kind of weakness, that emotional isolation is some kind of strength. The only way any of us are going to get through any of this is together.

Work Harder.

Everyone Cares.

It’ll Be Worth It.

Gabriel L. Helman

Friday Linkblog, don’t-be-evil edition

I’ve been meaning to link to these for a while, but keeping some thematic unity with this week, the Verge has a whole series of articles on Google at 25. My favorites were: The end of the Googleverse and The people who ruined the internet.

(Also, that second article links to Ed Zitron’s excellent The Internet is Already Broken, which I also recommend)

As someone who was both a legal adult and college graduate before Google happened, it’s deeply strange to realize that I lived through the entire era where Google “worked”; before it choked out on SEO content-farm garbage, advertising conflicts of interest, and general enshittification.

And then, Google lost the antitrust case against Epic; see: The Verge, Ars.

(As an aside a certain class of nerd are losing their damn minds that Google lost but Apple didn’t. The Ars comment thread in particular is a showcase of Dunning-Kruger hallucinations of what they wish the law was instead of what it really is.)

I bring this all up so I can tell this story:

Back in the early 2000s, probably 2003 or 4 based on where I was and who I was talking to, I remember a conversation I had about the then-new “don’t be evil” Google. The people I was talking to were very enthusiastic about them. Recall, there was still the mood in the room that “we” had finally beat Microsoft, they’d lost the antitrust case, the web was going to defeat Windows, and so on.

And I distinctly remember saying something like “Microsoft just operated like an old railroad monopoly, so we already knew how to be afraid of them. We haven’t learned how to be afraid of companies like google yet.”

And, reader: “LOL”. “LMAO”, even. Because, go back and read the stuff in Epic’s lawsuit against Google—Google was doing the exact same stuff that Microsoft got nailed for twenty years ago. To call myself out here on main, we already DID know how to be afraid of google, we just bought their marketing hook, line, and sinker.

We were all so eager to get past Microsoft’s stranglehold on computers that we just conned ourselves into handing even more control to an even worse company. We were unable to imagine computers not being dominated by a company, so hey, at least this one isn’t Microsoft, or IBM, or AT&T!

(This is the same fallacy that bugs me about Satanists—they want to rebel, but buy into all the same fundamental assumptions about the universe, but they just root for the other team. Those people never actually go outside the box they started in, and become, say, Buddhists.)

A decade ago this is where I would have 800 words endorsing FOSS as the solution, but I think at this point, deep down, we all know that isn’t the answer either.

Maybe this time, let’s try regulating the hell out of all of this, and then try hard to pay attention and not get scammed by the next big company that comes along and flirts with us? Let's put some energy into getting out of the box instead of just finding one with nicer branding.

Gabriel L. Helman

Layoff Season(s)

Well, it’s layoff season again, which pretty much never stopped this year? I was going to bury a link or two to an article in that last sentence, but you know what? There’s too many. Especially in tech, or tech-adjacent fields, it’s been an absolute bloodbath this year.

So, why? What gives?

I’ve got a little personal experience here: I’ve been through three layoffs now, lost my job once, shoulder-rolled out of the way for the other two. I’ve also spent the last couple decades in and around “the tech industry”, which here we use as shorthand for companies that are either actually a Silicon Valley software company, or a company run by folks that used to/want to be from one, with a strong software development wing and at least one venture capital–type on the board.

In my experience, Tech companies are really bad at people. I mean this holistically: they’re bad at finding people, bad at hiring, and then when they do finally hire someone, they’re bad at supporting those people—training, “career development”, mentoring, making sure they’re in the right spot, making sure they’re successful. They’re also bad at any kind of actual feedback cycle, either to reward the excellent or terminate underperformers. As such, they’re also bad at keeping people. This results in the vicious cycle that puts the average time in a tech job at about 18 months—why train them if they’re gonna leave? Why stay if they won’t support me?

There are pockets where this isn’t true, of course; individual managers, or departments, or groups, or even glue-type individuals holding the scene together that handle this well. I think this is all a classic “don’t attribute to malice what you can attribute to incompetence” situation. I say this with all the love in the world, but people who are good at tech-type jobs tend to be low-empathy lone wolf types? And then you spend a couple decades promoting the people from that pool, and “ask your employees what they need” stops being common sense and is suddenly some deep management koan.

The upshot of all this is that most companies with more than a dozen or two employees have somewhere between 10–20% of the workforce that isn’t really helping out. Again—this isn’t their fault! The vast majority of those people would be great employees in a situation that’s probably only a tiny bit different than the one you’re in. But instead you have the one developer who never seems to get anything done, the other developer whose work always fails QA and needs a huge amount of rework, the person who only seems to check hockey scores, the person who’s always in meetings, the other person who’s always in “meetings.” That one guy who always works on projects that never seem to ship.1 The extra managers that don’t seem to manage anyone. And, to be clear, I’m talking about full-time salaried people. People with a 401(k) match. People with a vesting schedule.

No one doing badly enough to get fired, but not actually helping row the boat.

As such, at basically any point any large company—and by large I mean over about 50 people—can probably do a 10% layoff and actually move faster afterwards, and do a 20% layoff without any significant damage to the annual goals—as long as you don’t have any goals about employee morale or well-being. Or want to retain the people left.

The interesting part—and this is the bad interesting, to be clear—is if you can fire 20% of your employees at any time, when do you do that?

In my experience, there’s two reasons.

First, you drop them like a submarine blowing the ballast tanks. Salaries are the biggest expense center, and in a situation where the line isn’t going up right, dropping 20% of the cost is the financial equivalent of the USS Dallas doing an emergency surface.

Second, you do it to discipline labor. Is the workforce getting a little restless? Unhappy about the stagnant raises? Grumpy about benefits costing more? Is someone waving around a copy of Peopleware?2 Did the word “union” float across the courtyard? That all shuts down real fast if all those people are suddenly sitting between two empty cubicles. “Let’s see how bad they score the engagement survey if the unemployment rate goes up a little!” Etc.

Again—this is all bad! This is very bad! Why do any of this?

The current wave feels like a combo plate of both reasons. On the one hand, we have a whole generation of executive leaders that have never seen interest rates go up, so they’re hitting the one easy panic button they have. But mostly this feels like a tantrum by the c-suite class reacting to “hot labor summer” becoming “eternal labor september.”

Of course, this is where I throw up my hands and have nothing to offer except sympathy. This all feels so deeply baked into the world we live in that it seems unsolvable short of a solution that ends with us all wearing leather jackets with only one sleeve.

So, all my thoughts with everyone unexpectedly jobless as the weather gets cold. Hang on to each other, we’ll all get through this.


  1. At one point in my “career”, the wags in the cubes next to mine made me a new nameplate that listed my job as “senior shelf-ware engineer.” I left it up for months, because it was only a little bit funny, but it was a whole lot true.

  2. That one was probably me, sorrryyyyyy (not sorry)

Gabriel L. Helman

Wednesday linkblog, Twitter reminisces edition

The reminisces are starting to flow now, as it really starts to settle in that the thing we used to call twitter is gone and won’t be back. As such, I’d like to call your attention to The Verge’s truly excellent The Year Twitter Died. This is probably the best “what it was, and what we lost, for both good and ill” piece I’ve read. Especially don’t miss The Great Scrollback of Alexandria. I’m glad someone is putting the work into saving some part of what used to be there.

Also, this week in folks talking about twitter, I enjoyed John Scalzi’s check-in a month after finally walking away: Abandoning the Former Twitter: A Four-Week Check-In. Scalzi was one of the strongest “I was here before he got here, and I’ll be here after he leaves” voices I saw a year ago, and the last year beat him like the rest of us.

There’s, of course, the usual blowback to stuff like this, with at least one article I saw in response to that Verge piece claiming that no, twitter always sucked, here’s all the problems it had, I always had a bad time there, so on and so on. I won’t link to it because why give them the attention, but I spent the whole time reading it thinking of this quote from former-President Beeblebrox: “Zowee, here we are at the End of the Universe and you haven't even lived yet. Did you miss out.”

Gabriel L. Helman

Re-Capturing the Commons

The year’s winding down, which means it’s time to clear out the drafts folder. Let me tell you about a trend I was watching this year.

Over the last couple of decades, a business model has emerged that looks something like this:

  1. A company creates a product with a clear sales model, but one that doesn’t have value without a strong community
  2. The company then fosters such a community, which then steps in and shoulders a fair amount of the work of running said community
  3. The community starts creating new things on top of the original work of the parent company, and—this is important—those new things belong to the community members, not the company
  4. This works well enough that the community starts selling additional things to each other—critically, these aren’t competing with that parent company, instead we have a whole “third party ecosystem”.

(Hang on, I’ll list some examples in a second.)

These aren’t necessarily “open source” from a formal OSI “Free & Open Source Software” perspective, but they’re certainly open source–adjacent, if you will. Following the sprit, if not the strict legal definition.

Then, this year especially, a whole bunch of those types of companies decided that they wouldn’t suffer anyone else making things they don’t own in their own backyard, and tried to reassert control over the broader community efforts.

Some specific examples of what I mean:

  • The website formerly known as Twitter eliminating 3rd party apps, restricting the API to nothing, and blocking most open web access.
  • Reddit does something similar, effectively eliminates 3rd party clients and gets into an extended conflict with the volunteer community moderators.
  • StackOverflow and the rest of the StackExchange network also gets into an extended set of conflicts with its community moderators, tries to stop releasing the community-generated data for public use, revises license terms, and descends into—if you’ll forgive the technical term—a shitshow.
  • Hasbro tries to not only massively restrict the open license for future versions of Dungeons and Dragons, but also makes a move to retroactively invalidate the Open Game License that covered material created for the 3rd and 5th editions of the game over the last 20 years.

And broadly, this is all part of the Enshittification Curve story. And each of these examples have a whole set of unique details. Tens, if not hundreds of thousands of words have been written on each of these, and we don’t need to re-litigate those here.

But there’s a specific sub-trend here that I think is worth highlighting. Let’s look at what those four have in common:

  • Each had, by all accounts, a successful business model. After-the-fact grandstanding notwithstanding, none of those four companies was in financial trouble, and all had a clear story about how they got paid. (Book sales, ads, etc.)
  • They all had a product that was absolutely worthless without an active community. (The D&D player’s handbook is a pretty poor read if you don’t have people to play with, reddit with no comments is just an ugly website, and so on)
  • Community members were doing significant heavy lifting that the parent company was literally unable to do. (Dungeon Mastering, community moderating. Twitter seems like the outlier here at first glance, but recall that hashtags, threads, the word “tweet” and literally using a bird as a logo all came from people not on twitter’s payroll.)
  • There were community members that made a living from their work in and around the community, either directly or indirectly. (3rd party clients, actual play streams, turning a twitter account about things your dad says into a network sitcom. StackOverflow seems like the outlier on this one, until you remember that many, many people use their profiles there as a kind of auxiliary outboard resume.)
  • They’ve all had recent management changes; more to the point, the people who designed the open source–adjacent business model are no longer there.
  • These all resulted in huge community pushback

So we end up in a place where a set of companies decided that no one but them could make money in their domains, and set their communities on fire. There was a lot of handwaving about AI as an excuse, but mostly that’s just “we don’t want other people to make money” with extra steps.

To me, the most enlightening one here is Hasbro, because it’s not a tech company and D&D is not a tech product, so the usual tech excuses for this kind of behavior don’t fly. So let’s poke at that one for an extra paragraph or two:

When the whole OGL controversy blew up back at the start of the year, certain quarters made a fair amount of noise about how this was a good thing, because actually, most of what mattered about D&D wasn’t restrict-able, or was in the public domain, and good old fair use was a better deal than the overly-restrictive OGL, and that the community should never have taken the deal in the first place. And this is technically true, but only in the ways that don’t matter.

Because, yes. The OGL, as written, is more restrictive than fair use, and strict adherence to the OGL prevents someone from doing things that should otherwise be legal. But that misses the point.

Because what we’re actually talking about is an industry with one multi-billion dollar company—the only company on earth that has literal Monopoly money to spend—and a whole bunch of little tiny companies with less than a dozen people. So the OGL wasn’t a crummy deal offered between equals, it was the entity with all the power in the room declaring a safe harbor.

Could your two-person outfit selling PDFs online use stuff from Hasbro’s book without permission legally? Sure. Could you win the court case when they sue you before you lose your house? I mean, maybe? But not probably.

And that’s what was great about it. For two decades, that was the deal: accept these slightly more restrictive terms, and you could operate with the confidence that your business, and your house, was safe. And an entire industry formed inside that safe harbor.

Then some mid-level suit at Hasbro decided they wanted a cut?

And I’m using this as the example partly because it’s the most egregious. But 3rd party clients for twitter and reddit were a good business to be in, until they suddenly were not.

And I also like using Hasbro’s Bogus Journey with D&D as the example because that’s the only one where the community won. With the other three here, the various owners basically leaned back in their chairs and said “yeah, okay, where ya gonna go?” and after much rending of cloth, the respective communities of twitter, and reddit, and StackOverflow basically had to admit there wasn’t an alternative; they were stuck on that website.

Meanwhile, Hasbro asked the same question, and the D&D community responded with, basically, “well, that’s a really long list, how do you want that organized?”

So Hasbro surrendered utterly, to the extent that more of D&D is now under a more irrevocable and open license than it was before. It feels like there’s a lesson in competition being healthy here? But that would be crass to say.

Honestly, I’m not sure what all this means; I don’t have a strong conclusion here. Part of why this has been stuck in my drafts folder since June is that I was hoping one of these would pop in a way that would illuminate the situation.

And maybe this isn’t anything more than just what corporate support for open source looks like when interest rates start going up.

But this feels like a thing. This feels like it comes from the same place as movie studios making record profits while saying their negotiation strategy is to wait for underpaid writers to lose their houses?

Something is released into the commons, a community forms, and then someone decides they need to re-capture the commons because if they aren’t making the money, no one can. And I think that’s what stuck with me. The pettiness.

You have a company that’s making enough money, bills are paid, profits are landing, employees are taken care of. But other people are also making money. And the parent company stops being a steward and burns the world down rather than suffer someone else to make a dollar they were never going to see. Because there’s no universe where a dollar spent on Tweetbot was going to go to twitter, or one spent on Apollo was going to go to reddit, or one spent on any “3rd party” adventure was going to go to Hasbro.

What can we learn from all this? Probably not a lot we didn’t already know, but: solidarity works, community matters, and we might not have anywhere else to go, but at the same time, they don’t have any other users. There’s no version where they win without us.

Gabriel L. Helman

2023’s strange box office

Weird year for the box office, huh? Back in July, we had that whole rash of articles about the “age of the flopbuster” as movie after movie face-planted. Maybe things hadn’t recovered from the pandemic like people hoped?

And then, you know, Barbenheimer made a bazillion dollars.

And really, nothing hit like it was supposed to all year. People kept throwing out theories. Elemental did badly, and it was “maybe kids are done with animation!” Ant-Man did badly, and it was “Super-Hero fatigue!” Then Spider-Verse made a ton of money disproving both. And Super Mario made a billion dollars. And then Elemental recovered on the long tail and ended up making half a billion? And Guardians 3 did just fine. But The Marvels flopped. Harrison Ford came back for one more Indiana Jones and no one cared.

Somewhere around the second weekend of Barbenheimer everyone seemed to throw up their hands as if to say “we don’t even know what’ll make money any more”.

Where does all that leave us? Well, we clearly have a post-pandemic audience that’s willing to show up and watch movies, but sure seems more choosy than they used to be. (Or choosy about different things?)

Here’s my take on some reasons why:

The Pandemic. I know we as a society have decided to act like COVID never happened, but it’s still out there. Folks may not admit it, but it’s still influencing decisions. Sure, it probably won’t land you in the hospital, but do you really want to risk your kid missing two weeks of school just so you can see the tenth Fast and the Furious in the theatre? It may not be the key decision input anymore, but that’s enough potential friction to give you pause.

Speaking of the theatre, the actual theater experience sucks most of the time. We all like to wax poetic about the magic of the shared theatre experience, but in actual theaters, not the fancy ones down in LA, that “experience” is kids talking, the guy in front of you on his phone, the couple behind you being confused, gross floors, and half an hour of the worst commercials you’ve ever seen before the picture starts out of focus and too dim.

On the other hand, you know what everyone did while they were stuck at home for that first year of COVID? Upgrade their home theatre rig. I didn’t spend a whole lot of money, but the rig in my living room is better than every mall theatre I went to in the 90s, and I can put the volume where I want it, stop the show when the kids need to go to the bathroom, and my snacks are better, and my chairs are more comfortable.

Finally, and I think this is the key one—The value proposition has gotten out of wack in a way I don’t think the industry has reckoned with. Let me put my cards down on the table here: I think I saw just about every movie released theatrically in the US between about 1997 and maybe 2005. I’m pro–movie theatre. It was fun and I enjoyed it, but also that was absolutely the cheapest way to spend 2-3 hours. Tickets were five bucks, you could basically fund a whole day on a $20 bill if you were deliberate about it.

But now, taking a family of four to a movie is in the $60-70 range. And, that’s a whole different category. That’s what a new video game costs. That’s what I paid for the new Zelda, which the whole family is still playing and enjoying six months later, hundreds of hours in. That’s Mario Kart with all the DLC, which we’ve also got about a million hours in. You’re telling me that I should pay the same amount of money that got me all that for one viewing of The Flash? Absolutely Not. I just told the kids we weren’t going to buy the new Mario before Christmas, but I’m supposed to blow that on… well, literally anything that only takes up two hours?

And looking at that from the other direction, I’m paying twelve bucks a month for Paramount +, for mostly Star Trek–related reasons. But that also has the first six Mission: Impossible movies on it right now. Twelve bucks, you could cram ‘em all in a long weekend if you were serious about it. And that’s not really even a streaming thing, you could have netted six not-so-new release movies for that back in the Blockbuster days too. And like I said, I have some really nice speakers and a 4k projector, those movies look great in my living room. You’re trying to tell me that the new one is so much better that I need to pay five times what watching all the other movies cost me, just to see it now? As opposed to waiting a couple of months?

And I think that’s the key I’m driving towards here: movies in the theatre have found themselves with a premium price without offering a premium product.

So what’s premium even mean in this context? Clicking back and forth between Box Office Mojo’s domestic grosses for 2023 and 2019, this year didn’t end up being that much worse, it just wasn’t the movies people were betting on that made money.

There’s a line I can’t remember the source of that goes something to the effect of “hollywood doesn’t have a superhero movie problem, it has a ‘worse copy of movies we’ve already seen’ problem.” Which dovetails nicely with John Scalzi’s twitter quip about The Flash bombing: “…the fact is we’re in the “Paint Your Wagon” phase of the superhero film era, in which the genre is played out, the tropes are tired and everyone’s waiting for what the next economic engine of movies will be.”

Of course, when we say “Superhero”, we mostly mean Marvel Studios, since the recent DC movies have never been that good or successful. And Marvel did one of the dumbest things I’ve ever seen, which was to give everyone an off-ramp. For a decade they had everyone in a groove to go see two or three movies a year and keep up on what those Avengers and their buddies were up to. Sure, people would skip one or two here or there, a Thor, an Ant-Man, but everyone would click back in for one of the big team up movies. And then they made Endgame, and said “you’re good, story is over, you can stop now!” And so people did! The movie they did right after Endgame needed to be absolutely the best movie they had ever done, and instead it was Black Widow. Which was fine, but didn’t convince anyone they needed to keep watching.

And I’d extend all this out to not just superheroes, but also “superhero adjacent” movies, your Fast and Furious, Mission: Impossible, Indiana Jones. Basically all the “big noise” action blockbusters. I mean, what’s different about this one versus the other half-dozen I’ve already seen?

(Indiana Jones is kind of funny for other reasons, because I think Disney dramatically underestimated how much the general audience knew or cared about Spielberg. His name on those movies mattered! The guy who made “The Wolverine” is fine and all, but I’m gonna watch that one at home. I’m pretty sure if Steve had directed it instead of going off to do West Side Story it would have made a zillion dollars.)

But on the other hand, the three highest grossing movies that weren’t Barbenheimer were Super Mario Bros, Spider-Verse, and Guardians of the Galaxy 3, so clearly superheroes and animation are still popular, just the right superheroes and animation. Dragging the superhero-movies-are-musicals metaphor to the limit, there were plenty of successful musicals after Paint Your Wagon, but they were the ones that did something interesting or different. They stopped being automatically required viewing.

At this point, I feel like we gotta talk about budgets for a second, only for a second because it is not that interesting. If you don’t care about this, I’ll meet you down on the other side of the horizontal line.

Because the thing is, most of those movies that, ahem, “underperformed” cost a ton. The new M:I movie paid the salaries for everyone working on it through the whole COVID lockdown, so they get a pass. (Nice work, Tom Cruise!) Everyone else, though, what are you even doing? If you spend so much money making a movie that you need to be one of the highest grossing films of all time just to break even, maybe that’s the problem right there? Dial of Destiny cost 300 million dollars. Last Crusade cost forty eight. Adjusted for inflation, that’s (checks wolfram alpha) …$116 million? Okay, that amount of inflation surprised me too, but the point stands: is Dial three times as much movie as Last Crusade? Don’t bother answering that, no it is not, and that’s even before pointing out the cheap one was the one with Sean friggin’ Connery.
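If you want to check that math yourself, here’s the back-of-the-envelope version (a quick sketch in Python; the CPI figures below are rough approximations I’m assuming for illustration, not numbers from the post or from Wolfram Alpha):

    # Rough inflation adjustment for the Last Crusade budget (1989 dollars -> 2023 dollars).
    # The CPI values are approximate annual averages, assumed here for illustration only.
    CPI_1989 = 124.0
    CPI_2023 = 305.0

    last_crusade_1989 = 48_000_000   # $48 million, per the post
    dial_of_destiny = 300_000_000    # $300 million, per the post

    adjusted = last_crusade_1989 * (CPI_2023 / CPI_1989)
    print(f"Last Crusade in 2023 dollars: ~${adjusted / 1e6:.0f} million")                 # ~$118 million
    print(f"Dial of Destiny vs. adjusted Last Crusade: {dial_of_destiny / adjusted:.1f}x")  # ~2.5x

That lands in the same ballpark as the $116 million above; the exact figure depends on which CPI series and which year you pick, but the point stands either way.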

This is where everyone brings up Sound of Freedom. Let’s just go ahead and ignore, well, literally everything else about the movie and just point out that it made just slightly more money than the new Indiana Jones movie, but also only cost, what, 14 million bucks? Less than five percent of what Indy cost?

There’s another much repeated bon mot I can’t seem to find an origin for that goes something along the lines of “They used to make ten movies hoping one would be successful enough to pay for the other nine, but then decided to just make the one that makes money, which worked great until it didn’t.” And look, pulpy little 14 million dollar action movies are exactly the kind of movie they’re talking about there. Sometimes they hit a chord! Next time you’re tempted to make a sequel to a Spielberg/Lucas movie without them, maybe just scrap that movie and make twenty one little movies instead.

So, okay. What’s the point, what can we learn from this strange year in a strange decade? Well, people like movies. They like going to see movies. But they aren’t going to pay to see a worse version of something they can already watch at home on their giant surround-sound-equipped TV for “free”. Or risk getting sick for the privilege.

Looking at the movies that did well this year, it was the movies that had something to say, that had a take, movies that had ambitions beyond being “the next one.”

Hand more beloved brand names to indie film directors and let them do whatever they want. Or, make a movie based on something kids love that doesn’t already have a movie. Or make a biography about how sad it is that the guy who invented the atomic bomb lost his security clearance because iron man hated him. That one feels less applicable, but you never know. If you can build a whole social event around an inexplicable double-feature, so much the better.

And, look, basically none of this is new. The pandemic hyper-charged a whole bunch of trends, but I feel like I could have written a version of this after Thanksgiving weekend for any year in the past decade.

That’s not the point. This is:

My favorite movie of the year was Asteroid City. That was only allegedly released into theatres. It made, statistically speaking, no money. Those kinds of movies never do! They make it up on the long tail.

I like superhero/action movies as much as the next dork who knew who “Rocket Raccoon” was before 2014, but I’m not about to pretend they’re high art or anything. They’re junk food, sometimes well made very entertaining junk food, but let’s not kid ourselves about the rest of this complete breakfast.

“Actually good” movies (as opposed to “fun and loud”) don’t do well in the theatre, they do well on home video.

Go back and look at that 2019 list I linked above. On my monitor, the list cuts off at number fifteen before you have to scroll, and every one of those fifteen movies is garbage. Fun garbage, in most cases! On average, well made, popular, very enjoyable. (Well, mostly, Rise of Skywalker is the worst movie I’ve ever paid to see.)

That’s what was so weird about Barbenheimer, and Spider-Verse, and 2023’s box office. For once, objectively actually good movies made all the money.

Go watch Asteroid City at home, that’s what I’m saying.

Gabriel L. Helman

What was happening: Twitter, 2006-2023

Twitter! What can I tell ya? It was the best of times, it was the worst of times. It was a huge part of my life for a long time. It was so full of art, and humor, and joy, and community, and ideas, and insight. It was also deeply flawed and profoundly toxic, but many of those flaws were fundamental to what made it so great.

It’s almost all gone now, though. The thing called X that currently lives where twitter used to be is a pale, evil, corrupted shadow of what used to be there. I keep trying to explain what we lost, and I can’t, it’s just too big.1 So let me sum up. Let me tell you why I loved it, and why I left. As the man2 said, let me tell you of the days of high adventure.


I can’t now remember when I first heard the word “twitter”. I distinctly remember a friend complaining that this “new twitter thing” had blown out the number of free SMS messages he got on his nokia flip phone, and that feels like a very 2006 conversation.

I tend to be pretty online, and have been since the dawn of the web, but I’m not usually an early adopter of social networks, so I largely ignored twitter for the first couple of years. Then, for reasons downstream of the Great Recession, I found myself unemployed for most of the summer of 2009.3 Suddenly finding myself with a surfeit of free time, I worked my way down that list of “things I’ll do if I ever get time,” including signing up for “that twitter thing.” (I think that’s the same summer I lit up my now-unused Facebook account, too.) Smartphones existed by then, and it wasn’t SMS-based anymore, but had a website, and apps.4

It was great. This was still in its original “microblogging” configuration, where it was essentially an Instant Messenger status with history. You logged in, and there were the statuses of the people you followed, in chronological order, and nothing else.

It was instantly clear that this wasn’t a replacement for something that already existed—this wasn't going to do away with your LiveJournal, or Tumblr, or Facebook, or blog. This was something new, something extra, something yes and. The question was, what was it for? Where did it fit in?

Personally, at first I used my account as a “current baby status” feed, updating extended family about what words my kids had learned that day. The early iteration of the site was perfect for that—terse updates to and from people you knew.

Over time, it accumulated various social & conversational features, not unlike a Katamari rolling around Usenet, BBSes, forums, discussion boards, other early internet communication systems. It kept growing, and it became less useful as a micro-blogging system and more of a free-wheeling world-wide discussion forum.

It was a huge part of my life, and for a while there, everyone’s life. Most of that time, I enjoyed it an awful lot, and got a lot out of it. Everyone had their own take on what it was Twitter had that set it apart, but for me it was three main things, all of which reinforced each other:

  1. It was a great way to share work. If you made things, no matter how “big” you were, it was a great way to get your work out there. And, it was a great way to re-share other people’s work. As a “discovery engine” it was unmatched.

  2. Looking at that the other way, it was an amazing content aggregator. It essentially turned into “RSS, but Better”; at the time RSS feeds had pretty much shrunk to just “google reader’s website”. It turns out that sharing things from your RSS feed into the feeds of other people, plus a discussion thread, was the key missing feature. If you had work of your own to share, or wanted to talk about something someone else had done elsewhere on the internet, twitter was a great way to share a link and talk about it. But, it also worked equally well for work native to twitter itself. Critically, the joke about the web shrinking to five websites full of screenshots of the other four5 was posted to twitter, which was absolutely the first of those five websites.

  3. Most importantly, folks who weren’t anywhere else on the web were on twitter. Folks with day jobs, who didn’t consider themselves web content people were there; these people didn’t have a blog, or facebook, or instagram, but they were cracking jokes and hanging out on twitter.

There is a type of person whom twitter appealed to in a way that no other social networking did. A particular kind of weirdo that took Twitter’s limitations—all text, 140 or 280 characters max—and turned them into a playground.

And that’s the real thing—twitter was for writers. Obviously it was text based, and not a lot of text at that, so you had to be good at making language work for you. As much as the web was originally built around “hypertext”, most of the modern social web is built around photos, pictures, memes, video. Twitter was for people who didn’t want to deal with that, who could make the language sing in a few dozen words.

It had the vibe of getting to sit in on the funniest people you know’s group text, mixed with this free-wheeling chaos energy. On its best days, it had the vibe of the snarky kids at the back of the bus, except the bus was the internet, and most of the kids were world-class experts in something.

There’s a certain class of literary writer goofballs that all glommed onto twitter in a way none of us did with any other “social network.” Finally, something that rewarded what we liked and were good at!

Writers, comedians, poets, cartoonists, rabbis, just hanging out. There was a consistent informality to the place—this wasn’t the show, this was the hotel bar after the show. The big important stuff happened over in blogs, or columns, or novels, or wherever everyone’s “real job” was, this was where everyone let their hair down and cracked jokes.

But most of all, it was weird. Way, way weirder than any other social system has ever been or probably ever will be again, this was a system that ran on the same energy you use to make your friends laugh in class when you’re supposed to be paying attention.

It got at least one thing exactly right: it was no harder to sign into twitter and fire off a joke than it was to fire a message off to the group chat. Between the low bar to entry and the emphasis on words over everything else, it managed to attract a crowd of folks that liked computers, but didn’t see them as a path to self-actualization.

But what made twitter truly great were all the little (and not so little) communities that formed. It wasn’t the feature set, or the website, or the tech, it was the people, and the groups they formed. It’s hard to start making lists, because we could be here all night and still leave things out. In no particular order, here’s the communities I think I’ll miss the most:

  • Weird Twitter—Twitter was such a great vector for being strange. Micro-fiction, non-sequiturs, cats sending their mothers to jail, dispatches from the apocalypse.
  • Comedians—professional and otherwise, people who could craft a whole joke in one sentence.
  • Writers—A whole lot of people who write for a living ended up on twitter in a way they hadn’t anywhere else on the web.
  • Jewish Twitter—Speaking as a Jew largely disconnected from the local Jewish community, it was so much fun to get to hang out with the Rabbis and other Jews.

But also! The tech crowd! Legal experts! Minorities of all possible interpretations of the word sharing their experiences.

And the thing is, other than the tech crowd,6 most of those people didn’t go anywhere else. They hadn’t been active on the previous sites, and many of them drifted away again as the wheels started coming off twitter. There was a unique alchemy on twitter for forming communities that no other system has ever had.

And so the real tragedy of twitter’s implosion is that those people aren’t going somewhere else. That particular alchemy doesn’t exist elsewhere, and so the built up community is blowing away on the wind.


Because all that’s almost entirely gone now, though. I miss it a lot, but I realize I’ve been missing it for a year now. There had been a vague sense of rot and decline for a while. You can draw a pretty straight line from gamergate, to the 2016 Hugos, to the 2016 election, to everything around The Last Jedi, to now, as the site rotted out from the inside; a mounting sense that things were increasingly worse than they used to be. The Pandemic saw a resurgence of energy as everyone was stuck at home hanging out via tweets, but in retrospect that was a final gasp.7

Once The New Guy took over, there was a real sense of impending closure. There were plenty of accounts that made a big deal out of Formally Leaving the site and flouncing out to “greener pastures”, either to make a statement, or (more commonly) to let their followers know where they were. There were also plenty of accounts saying things like “you’ll all be back”, or “I was here before he got here and I’ll be here after he leaves”, but over the last year mostly people just drifted away. People just stopped posting and disappeared.

It’s like the loss of a favorite restaurant—the people who went there already know, and when people who weren’t there express disbelief, the response is to tell them how sorry you are they missed the party!

The closest comparison I can make to the decayed community is my last year of college. (Bear with me, this’ll make sense.) For a variety of reasons, mostly good, it took me 5 years to get my 4 year degree. I picked up a minor, did some other bits and bobs on the side, and it made sense to tack on an extra semester, and at that point you might as well do the whole extra year.

I went to a medium sized school in a small town.8 Among the many, many positive features of that school was the community. It seemed like everyone knew everyone, and you couldn’t go anywhere without running into someone you knew. More than once, when I didn’t have anything better to do, I’d just hike downtown and inevitably I’d run into someone I knew and the day would vector off from there.9

And I’d be lying if I said this sense of community wasn’t one of the reasons I stuck around a little longer—I wasn’t ready to give all that up. Of course, what I hadn’t realized was that not everyone else was doing that. So one by one, everyone left town, and by the end, there I was in downtown surrounded by faces I didn’t know. My lease had an end-date, and I knew I was moving out of town on that day no matter what, so what, was I going to build up a whole new peer group with a short-term expiration date? That last six months or so was probably the weirdest, loneliest time of my whole life. When the lease ended, I couldn’t move out fast enough.

The point is: twitter got to be like that. I was only there for the people, and nearly all the people I was there for had already gone. Being the one to close out the party isn’t always the right move.


One of the things that made it so frustrating was that it always had problems, but it had the same problems that any under-moderated semi-anonymous internet system had. “How to stop assholes from screwing up your board” is a four-decade-old playbook at this point, and twitter consistently failed to actually deploy any of the solutions, or at least deploy them at a scale that made a difference. The maddening thing was always that the only unique thing about twitter’s problems was the scale.

I had a soft rule that I could only read Twitter when using my exercise bike, and a year or two ago I couldn’t get to the end of the tweets from people I followed before I collapsed from exhaustion. Recently, I’d run out of things to read before I was done with my workout. People were posting less, and less often, but mostly they were just… gone. Quietly fading away as the site got worse.

In the end, though, it was the tsunami of antisemitism that got me. “Seeing only what you wanted to see” was always a skill on twitter, but the unfolding disaster in Israel and Gaza broke that. Not only did you have the literal nazis showing up and spewing their garbage without check, but you had otherwise progressive liberal leftists (accidentally?) doing the same thing, without pushback or attempt at discussion, because all the people that would have done that are gone. So instead it’s just a nazi sludge.10


There was so much great stuff on there—art, ideas, people, history, jokes. Work I never would have seen, things I wouldn’t have learned, books I wouldn’t have read, people I wouldn’t know about. I keep trying to encompass what’s been lost, make lists, but it’s too big. Instead, let me tell you one story about the old twitter:

One of the people I follow(ed) was Kate Beaton, originally known for the webcomic Hark A Vagrant!, most recently the author of Ducks (the best book I read last year). One day, something like seven years ago, she started enthusing about a book called Tough Guys Have Feelings Too. I don’t think she had a connection to the book? I remember it being an unsolicited rave from someone who had read it and was struck by it.

The cover is a striking piece of art of a superhero, head bowed, eyes closed, a tear rolling down his cheek. The premise of the book is what it says on the cover—even tough guys have feelings. The book goes through a set of stereotypical “tough guys”—pirates, ninjas, wrestlers, superheroes, race car drivers, lumberjacks, and shows them having bad days, breaking their tools, crashing their cars, hurting themselves. The tough guys have to stop, and maybe shed a tear, or mourn, or comfort themselves or each other, and the text points out, if even the tough guys can have a hard time, we shouldn’t feel bad for doing the same. The art is striking and beautiful, the prose is well written, the theme clearly and well delivered.

I bought it immediately. You see, my at-the-time four-year-old son was a child of Big Feelings, but frequently had trouble handling those feelings. I thought this might help him. Overnight, this book became almost a mantra. For years after this, when he was having Big Feelings, we’d read this book, and it would help him calm down and take control of what he was feeling.

It’s not an exaggeration to say this book changed all our lives for the better. And in the years since then, I’ve often been struck that despite all the infrastructure of modern capitalism—marketing, book tours, reviews, blogs—none of those ever got that book into my hands. There’s only been one system where an unsolicited rave from a web cartoonist being excited about a book outside their normal professional wheelhouse could reach someone they’ve never met or heard of and change that person’s son’s life.

And that’s gone now.


  1. I’ve been trying to write something about the loss of twitter for a while now. The first draft of this post has a date back in May, to give you some idea.

  2. Mako.

  3. As an aside, everyone should take a summer off every decade or so.

  4. I tried them all, I think, but settled on the late, lamented Tweetbot.

  5. Tom Eastman: I’m old enough to remember when the Internet wasn't a group of five websites, each consisting of screenshots of text from the other four.

  6. The tech crowd all headed to mastodon, but didn’t build that into a place that any of those other communities could thrive. Don’t @-me, it’s true.

  7. In retrospect, getting Morbius to flop a second time was probably the high point, it was all downhill after that.

  8. CSU Chico in Chico, California!

  9. Yes, this is what we did back in the 90s before cellphones and texting, kids.

  10. This is out of band for the rest of the post, so I’m jamming all this into a footnote:

    Obviously, criticizing the actions of the government of Israel is no more antisemitic than criticizing Hamas would be islamophobic. But objecting to the actions of Israel’s government with “how do the Jews not know they’re the bad guys” sure as heck is, and I really didn’t need to see that kind of stuff being retweeted by the eve6 guy.

    A lot of things are true. Hamas is not Palestine is not “The Arabs”, and the Netanyahu administration is not Israel is not “The Jews.” To be clear, Hamas is a terror organization, and Israel is on the functional equivalent of Year 14 of the Trump administration.

    The whole disaster hits at a pair of weird seams in the US—the Israel-Palestine conflict maps very strangely to the American political left-right divide, and the US left has always had a deep-rooted antisemitism problem. As such, what really got me was watching all the comments criticizing “the Jews” for this conflict come from literally the same people who spent four years wearing “not my president” t-shirts and absolving themselves from any responsibility for their government’s actions because they voted for “the email lady”. They get the benefit of infinite nuance, but the Jews are all somehow responsible for Bibi’s incompetent choices.

Gabriel L. Helman

Congrats, SAG-AFTRA!

Good work, everyone! Solidarity always wins.

(As of this writing the terms aren’t public for those of us on the outside looking in, but I’m very curious to see what form the AI protections take, among other things.)
