Gabriel L. Helman

Retweeting Feeds

There’s a feature I miss from the old twitter, or rather, there’s a use case that twitter filled better than anything else.

The use case is this: RSS feeds are a great way to publish content, but that’s where it stops—there’s no intrinsic way to (re)share an item from a feed you’re subscribed to with anyone else, to retweet it, if you will. I’d love to have an easy way to reshare content from feeds I’m subscribed to.

I think one of the reasons that twitter sucked the air out of the RSS ecosystem was that not only was it trivial to set up a twitter account that worked just like an RSS feed, with links to your blog or podcast or whatever, but everyone who followed your feed could re-share it with their followers with a single action, optionally adding a comment. I cannot tell you how much great stuff I found out about because someone I followed on twitter quote-tweeted it.

Here in the post-twitter afterscape, I keep wishing NetNewsWire had a retweet button. I’ve spent enough time in product design and development to know why that doesn’t exist; something that could re-share an item from an RSS feed out into a new feed, with a connection back to the original, requires about 80% of twitter. And if you’re going to build all that, you might as well add the ability to post your own content and go all the way to a social network/microblogging system/twitter clone. And I guess the solution is either bluesky or mastodon.
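For what it’s worth, part of the plumbing already exists: RSS 2.0 has a `<source>` element whose whole job is to point an item back at the feed it originally came from. Here’s a minimal, stdlib-only Python sketch of what a “reshare feed” could emit; all the function names and URLs are mine, purely illustrative:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from email.utils import format_datetime

def reshare_item(title, link, comment, source_url, source_title):
    """Build an RSS 2.0 <item> that re-shares an entry from another feed.

    RSS 2.0's <source> element names the channel an item came from,
    with a url attribute pointing back at that feed.
    """
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "link").text = link
    if comment:  # the optional quote-tweet-style note
        ET.SubElement(item, "description").text = comment
    src = ET.SubElement(item, "source", url=source_url)
    src.text = source_title
    ET.SubElement(item, "pubDate").text = format_datetime(datetime.now(timezone.utc))
    return item

def build_reshare_feed(feed_title, items):
    """Wrap re-shared items in a minimal RSS 2.0 channel."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = feed_title
    for item in items:
        channel.append(item)
    return ET.tostring(rss, encoding="unicode")
```

The hard part was never the XML, of course; it’s the other 80% of twitter (identity, subscriptions, discovery) that a reader app would have to grow to make the button worth pressing.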

But I keep thinking there has to be something between turn of the century–style RSS feeds and a full-blown social network. And I am putting this out in the universe because I absolutely do not want to build or work on this myself, I want to use it. (All my actual startup ideas are going to be buried with me.)

Somebody go figure this out!


Edited to add: I am reminded by an Alert Reader that Google Reader (RIP) had a similar feature, you could “share” things from your feed. But google’s gonna google and the share list was basically your gmail contacts list? Which would lead to some really bizarre results like suddenly seeing a thing in your feed that got there because a friend of a friend shared it, because you were both on the same party invite a couple of years ago. Cool idea, but again, something twitter improved on by making those connections obvious.


No One is Talking About This by Patricia Lockwood (2021)

I’ve been Extremely Online pretty much since that was a thing you could be. Being Online is a condition that’s not well described or represented Offline. Most books or movies about scenes I was a part of, either directly or tangentially, tend not to be very accurate, or to get the vibe right. I read books about computer games, say, and tend to leave with a sense of “huh, that’s not how it was for me at all.” Online is even worse; this is probably because Online is always describing itself to itself, and there’s no room for a slow, non-networked, Offline description.

Patricia Lockwood, who apparently dodged a thousand years of jail, used to be fairly active on the outer edges of what used to be called “weird twitter.” It turns out, poets were really good at twitter’s strange limitations, go figure. She wrote a book a few years ago called No One is Talking About This, which I had been looking forward to very much, but only just now finally had a chance to sit down and read.

This book is the single best description I’ve ever read of what it’s like to be Extremely Online. Specifically, it’s simply the best description of what it was like to read twitter too much in the late twenty-teens. The timing is accidentally perfect: it’s the ideal eulogy for that phase of the internet that existed between the recession and the pandemic; the “five websites full of screenshots of the other four” era, before the Disaster of the Twenties really got rolling.

But more generally, it perfectly encapsulates the Online Condition. The way The Online expands and consumes all your mental and emotional bandwidth, and the way Real Life sort of falls away, unable to match the dopamine flow. The way your head is full of all this stuff that no one else around you knows, or recognizes, or cares about. The Online doesn’t become more real than The Real, exactly, just more present, and faster, and louder.

But this book isn’t about any of that. This book is about what it’s like to be Online when Real Life suddenly becomes Extremely Real. And the result isn’t that suddenly Real Life becomes real again, it’s that neither seems real, and you float in this twilight realm between the two spaces, unable to engage with or believe either of them.

The way neither space can act as an escape valve for the other, and the realities continue to diverge past the point where you can hold both in your head, and you find yourself in both places, gasping out, for different reasons, No One is Talking About This.

I’m generally a fast reader. I don’t intend to humblebrag here, despite leaving this sentence in—I’ve always read fast, I tend to gulp books down. (I also walk fast and talk fast, and should probably do something about my caffeine intake.) This is a short book, but it took me a long time to read, because I couldn’t make it very far before I had to put it down and just sort of process the last couple of pages. It was very, very funny, but it got much further under my skin than I was expecting.

I enjoyed it very much. Strongly recommended.


Last Week In Good Sentences

It’s been a little while since I did an open tab balance transfer, so I’d like to tell you about some good sentences I read last week.

Up first, old-school blogger Jason Kottke links to The Art of Work in the Age of AI Production, a podcast conversation between Ezra Klein and Nilay Patel. Kottke quotes a couple of lines that I’m going to re-quote here because I like them so much:

EZRA KLEIN: You said something on your show that I thought was one of the wisest, single things I’ve heard on the whole last decade and a half of media, which is that places were building traffic thinking they were building an audience. And the traffic, at least in that era, was easy, but an audience is really hard. Talk a bit about that.

NILAY PATEL: Yeah first of all, I need to give credit to Casey Newton for that line. That is something — at The Verge, we used to say that to ourselves all the time just to keep ourselves from the temptations of getting cheap traffic. I think most media companies built relationships with the platforms, not with the people that were consuming their content.

“Building traffic instead of an audience” sums up the last decade and change of the web perfectly. I don’t even have anything to add there, just a little wave and “there you go.”

Kottke ends by linking out to The Revenge of the Home Page in The New Yorker, talking about the web starting to climb back towards a pre-social media form. And that’s a thought that’s clearly in the air these days, because other old-school blogger Andy Baio linked to We can have a different web.

I have this theory that we’re slowly reckoning with the amount of cognitive space that was absorbed by twitter. Not “social media”, but twitter, specifically. As someone who still mostly consumes the web via his RSS reader, and has the whole time, I’ve had to spend a lot of time re-working my feeds over the last several months; I hadn’t noticed how many feeds had rotted away, because those sites had switched to posting updates as tweets.

Twitter absorbed so much oxygen, and so much stuff migrated from “other places” onto twitter in a way that didn’t happen with other social media systems. Now that twitter is mostly gone, all that creativity and energy is out there looking for new places to land.

If you’ll allow me a strained metaphor, last summer felt like last call before the party at twitter fully shut down; this summer really feels like the next morning, where we’ve all shaken off the hangover and now everyone is looking at each other over breakfast asking “okay, what do you want to go do now?”


Jumping back up the stack to Patel talking about AI for a moment, a couple of extra sentences:

But these models in their most reductive essence are just statistical representations of the past. They are not great at new ideas. […] The human creativity is reduced to a prompt, and I think that’s the message of A.I. that I worry about the most, is when you take your creativity and you say, this is actually easy. It’s actually easy to get to this thing that’s a pastiche of the thing that was hard, you just let the computer run its way through whatever statistical path to get there. Then I think more people will fail to recognize the hard thing for being hard.

(The whole interview is great, you should go read it.)

But that bit about ideas and reducing creativity to a prompt brings me to my last good sentences, in this depressing-but-enlightening article over on 404 media: Flood of AI-Generated Submissions ‘Final Straw’ for Small 22-Year-Old Publisher

A small publisher for speculative fiction and roleplaying games is shuttering after 22 years, and the “final straw,” its founder said, is an influx of AI-generated submissions. […] “The problem with AI is the people who use AI. They don't respect the written word,” [founder Julie Ann] Dawson told me. “These are people who think their ‘ideas’ are more important than the actual craft of writing, so they churn out all these ‘ideas’ and enter their idea prompts and think the output is a story. But they never bothered to learn the craft of writing. Most of them don't even read recreationally. They are more enamored with the idea of being a writer than the process of being a writer. They think in terms of quantity and not quality.”

And this really gets to one of the things that bothers me so much about The Plagiarism Machine—the sheer, raw entitlement. Why shouldn’t they get to just have an easy copy of something someone else worked hard on? Why can’t they just have the respect of being an artist, while bypassing the work it takes to earn it?

My usual metaphor for AI is that it’s asbestos, but it’s also the art equivalent of steroids in pro sports. Sure, you hit all those home runs or won all those races, but we don’t care, we choose to live in a civilization where those don’t count, where those are cheating.

I know several people who have become enamored with the Plagiarism Machines over the last year—as I imagine all of us do now—and I’m always struck by a couple of things whenever they accidentally show me their latest works:

First, they’re always crap, just absolute dogshit garbage. And I think to myself, how did you make it to adulthood without being able to tell what’s good or not? There’s a basic artistic media literacy that’s just missing.

Second, how did we get to the point where you’ve got the nerve to be proud that you were cheating?


The Sky Above The Headset Was The Color Of Cyberpunk’s Dead Hand

Occasionally I poke my head into the burned-out wasteland where twitter used to be, and while doing so stumbled over this thread by Neal Stephenson from a couple years ago:

Neal Stephenson: "The assumption that the Metaverse is primarily an AR/VR thing isn't crazy. In my book it's all VR. And I worked for an AR company--one of several that are putting billions of dollars into building headsets. But..."

Neal Stephenson: "...I didn't see video games coming when I wrote Snow Crash. I thought that the killer app for computer graphics would be something more akin to TV. But then along came DOOM and generations of games in its wake. That's what made 3D graphics cheap enough to reach a mass audience."

Neal Stephenson: "Thanks to games, billions of people are now comfortable navigating 3D environments on flat 2D screens. The UIs that they've mastered (e.g. WASD + mouse) are not what most science fiction writers would have predicted. But that's how path dependency in tech works."

I had to go back and look it up, and yep: Snow Crash came out the year before Doom did. I’d absolutely have stuck this fact in Playthings For The Alone if I’d remembered, so instead I’m gonna “yes, and” my own post from last month.

One of the oft-remarked-on aspects of the 80s cyberpunk movement was that the majority of the authors weren’t “computer guys” beforehand; they were coming at computers from a literary/artist/musician worldview, which is part of why cyberpunk hit the way it did; it wasn’t the way computer people thought about computers—it was the street finding its own use for things, to quote Gibson. But a less remarked-on aspect was that they also weren’t gamers. Not just computer games, but any sort of games: board games, tabletop RPGs.

Snow Crash is still an amazing book, but it was written at the last possible second where you could imagine a multi-user digital world and not treat “pretending to be an elf” as a primary use-case. Instead the Metaverse is sort of a mall? And what “games” there are aren’t really baked in, they’re things a bored kid would do at a mall in the 80s. It’s a wild piece of context drift from the world in which it was written.

In many ways, Neuromancer has aged better than Snow Crash, if for no other reason than that it’s clear the part of The Matrix that Case is interested in is a tiny slice, and it’s easy to imagine Wintermute running several online game competitions off camera, whereas in Snow Crash it sure seems like The Metaverse is all there is; a stack of other big on-line systems next to it doesn’t jibe with the rest of the book.

But, all that makes Snow Crash really useful as a point of reference, because depending on who you talk to it’s either “the last cyberpunk novel”, or “the first post-cyberpunk novel”. Genre boundaries are tricky, especially when you’re talking about artistic movements within a genre, but there’s clearly a set of work that includes Neuromancer, Mirrorshades, Islands in the Net, and Snow Crash, that does not include Pattern Recognition, Shaping Things, or Cryptonomicon; the central aspect probably being “books about computers written by people who do not themselves use computers every day”. Once the authors in question all started writing their novels in Word and looking things up on the web, the whole tenor changed. As such, Snow Crash unexpectedly found itself as the final statement for a set of ideas, a particular mix of how near-future computers, commerce, and the economy might all work together—a vision with strong social predictive power, but unencumbered by the lived experience of actually using computers.

(As the old joke goes, if you’re under 50, you weren’t promised flying cars, you were promised a cyberpunk dystopia, and well, here we are, pick up your complimentary torment nexus at the front desk.)

The accidental predictive power of cyberpunk is a whole media thesis on its own, but it’s grimly amusing that in all the places where cyberpunk gets the future wrong, it’s usually because the author wasn’t being pessimistic enough. The Bridge Trilogy is pretty pessimistic, but there’s no indication that a couple million people died of a preventable disease because the immediate ROI on saving them wasn’t high enough. (And there’s at least two diseases I could be talking about there.)

But for our purposes here, one of the places the genre overshot was this idea that you’d need a 3d display—like a headset—to interact with a 3d world. And this is where I think Stephenson’s thread above is interesting, because it turns out it really didn’t occur to him that 3d on a flat screen would be a thing, and he assumed that any sort of 3d interface would require a head-mounted display. As he says, that got stomped the moment Doom came out. I first read Snow Crash in ’98 or so, and even then I was thinking none of this really needs a headset, this would all work fine on a decently-sized monitor.

And so we have two takes on the “future of 3d computing”: the literary tradition from the cyberpunk novels of the 80s, and then actual lived experience from people building software since then.

What I think is interesting about the Apple Cyber Goggles, in part, is that it feels like that earlier, literary take on how futuristic computers would work is re-emerging and directly competing with the last four decades of actual computing that have happened since Neuromancer came out.

In a lot of ways, Meta is doing the funniest and most interesting work here, as the former Oculus headsets are pretty much the cutting edge of “what actually works well with a headset”, while at the same time, Zuck’s “Metaverse” is blatantly an older millennial pointing at a dog-eared copy of Snow Crash saying “no, just build this” to a team of engineers desperately hoping the boss never searches the web for “second life”. They didn’t even change the name! And this makes a sort of sense, there are parts of Snow Crash that read less like fiction and more like Stephenson is writing a pitch deck.

I think this is the fundamental tension behind the reactions to Apple Vision Pro: we can now build the thing we were all imagining in 1984. The headset is designed by cyberpunk’s dead hand; after four decades of lived experience, is it still a good idea?


End Of Year Open Tab Bankruptcy Roundup Jamboree, Part 1: Mastodon

I’m declaring bankruptcy on my open tabs; these are all things I’ve had open on my laptop or phone over the last several months, thinking “I should send this to someone? Or is this a blog post?” Turns out the answer is ‘no’ to both of those, so here they are. Day 1: The Twitter to Mastodon migration, or lack thereof.

I was going to do a whole piece on mastodon last summer. At first, the second half of what became What was happening: Twitter, 2006-2023 was going to be a roundup of alternatives and why none of them were replacements for what I missed; then that got demoted to a bunch of footnotes, then to its own thing, but I just can’t be bothered because the moment was lost.

The short, short version is that from the end of ’21 through the first half of ’22 there was a really interesting moment where we could have built an open alternative to twitter. Instead, they built another space for white male nerds, and the internet didn’t need one more of those.

Mastodon somehow manages to embody every flaw of open source projects: they brought a protocol to a product fight, no one’s even thinking about it as a product, misfeatures are dressed up as virtue, and there’s a weirdly hostile userbase that’s too busy patting each other on the back to notice who’s been left out. A whole bunch of gatekeeping bullshit dressed up as faux-libertarian morality. Mixed with this very irritating evangelistic true-believer vibe where Mastodon is the “morally correct” one to use, so there’s no reason to listen to reasons why people don’t use it, because axiomatically, they must be wrong.

And look, this is coming from a guy with an active subscription to Ivory. But the communities on twitter I cared about just aren’t there. They went elsewhere, or just dispersed into the web. I understand that there are people who found what they had on twitter on mastodon, and as before I can only quote former-President Beeblebrox: “Zowee, here we are at the End of the Universe and you haven't even lived yet. Did you miss out.”

One of the key works of our time is that episode of 30 Rock where Liz Lemon realizes that actually, she was the bully in high school. Mastodon was built entirely by people that never had that realization.

Michael Tsai - Blog - Why Has Mastodon Adoption Stalled?

Blue Skies Over Mastodon

jwz: Mastodon stampede. Pretty great comment thread on this one, including this amazing line, re the mastodon developers:

And for this it’s a general Stallman problem. “I don’t understand this. Therefore you are wrong.”

jwz: Mastodon's Mastodon'ts

The nerd-rage responses to jwz’s entirely reasonable (and positive!) critique are everything you need to know. I’ve got enough weird HOA scolds in my real life without going on Al Gore’s internet and seeking them out.

Drivers of social influence in the Twitter migration to Mastodon | Scientific Reports

(I could have sworn I had more open tabs here, but I think I’m remembering frustrated twitter threads instead)


Wednesday linkblog, Twitter reminisces edition

The reminiscences are starting to flow now, as it really starts to settle in that the thing we used to call twitter is gone and won’t be back. As such, I’d like to call your attention to The Verge’s truly excellent The Year Twitter Died. This is probably the best “what it was, and what we lost, for both good and ill” piece I’ve read. Especially don’t miss The Great Scrollback of Alexandria. I’m glad someone is putting in the work to save some part of what used to be there.

Also, this week in folks talking about twitter, I enjoyed John Scalzi’s check-in a month after finally walking away: Abandoning the Former Twitter: A Four-Week Check-In. Scalzi was one of the strongest “I was here before he got here, and I’ll be here after he leaves” voices I saw a year ago, and the last year beat him like the rest of us.

There’s, of course, the usual blowback to stuff like this, with at least one article I saw in response to that Verge piece claiming that no, twitter always sucked, here’s all the problems it had, I always had a bad time there, so on and so on. I won’t link to it, because why give them the attention, but I spent the whole time reading it thinking of this quote from former-President Beeblebrox: “Zowee, here we are at the End of the Universe and you haven't even lived yet. Did you miss out.”


Re-Capturing the Commons

The year’s winding down, which means it’s time to clear out the drafts folder. Let me tell you about a trend I was watching this year.

Over the last couple of decades, a business model has emerged that looks something like this:

  1. A company creates a product that has a clear sales model, but doesn’t have value without a strong community
  2. The company then fosters such a community, which then steps in and shoulders a fair amount of the work of running said community
  3. The community starts creating new things on top of the parent company’s original work, and—this is important—those new things belong to the community members, not the company
  4. This works well enough that community members start selling additional things to each other—critically, these don’t compete with the parent company; instead we get a whole “third party ecosystem”

(Hang on, I’ll list some examples in a second.)

These aren’t necessarily “open source” from a formal OSI “Free & Open Source Software” perspective, but they’re certainly open source–adjacent, if you will. Following the spirit, if not the strict legal definition.

Then, this year especially, a whole bunch of those types of companies decided that they wouldn’t suffer anyone else making things they didn’t own in their own backyard, and tried to reassert control over the broader community’s efforts.

Some specific examples of what I mean:

  • The website formerly known as Twitter eliminating 3rd party apps, restricting the API to nothing, and blocking most open web access.
  • Reddit does something similar, effectively eliminates 3rd party clients and gets into an extended conflict with the volunteer community moderators.
  • StackOverflow and the rest of the StackExchange network also gets into an extended set of conflicts with its community moderators, tries to stop releasing the community-generated data for public use, revises license terms, and descends into—if you’ll forgive the technical term—a shitshow.
  • Hasbro tries to not only massively restrict the open license for future versions of Dungeons and Dragons, but also makes a move to retroactively invalidate the Open Game License that covered material created for the 3rd and 5th editions of the game over the last 20 years.

And broadly, this is all part of the Enshittification Curve story. And each of these examples has a whole set of unique details. Tens, if not hundreds of thousands of words have been written on each of them, and we don’t need to re-litigate those here.

But there’s a specific sub-trend here that I think is worth highlighting. Let’s look at what those four have in common:

  • Each had, by all accounts, a successful business model. After-the-fact grandstanding notwithstanding, none of those four companies was in financial trouble, and each had a clear story about how it got paid. (Book sales, ads, etc.)
  • They all had a product that was absolutely worthless without an active community. (The D&D player’s handbook is a pretty poor read if you don’t have people to play with, reddit with no comments is just an ugly website, and so on)
  • Community members were doing significant heavy lifting that the parent company was literally unable to do. (Dungeon Mastering, community moderating. Twitter seems like the outlier here at first glance, but recall that hashtags, threads, the word “tweet” and literally using a bird as a logo all came from people not on twitter’s payroll.)
  • There were community members that made a living from their work in and around the community, either directly or indirectly. (3rd party clients, actual play streams, turning a twitter account about things your dad says into a network sitcom. StackOverflow seems like the outlier on this one, until you remember that many, many people use their profiles there as a kind of auxiliary outboard resume.)
  • They’ve all had recent management changes; more to the point, the people who designed the open source–adjacent business model are no longer there.
  • These all resulted in huge community pushback.

So we end up in a place where a set of companies decided that no one but them could make money in their domains, and set their communities on fire. There was a lot of handwaving about AI as an excuse, but mostly that’s just “we don’t want other people to make money” with extra steps.

To me, the most enlightening one here is Hasbro, because it’s not a tech company and D&D is not a tech product, so the usual tech excuses for this kind of behavior don’t fly. So let’s poke at that one for an extra paragraph or two:

When the whole OGL controversy blew up back at the start of the year, certain quarters made a fair amount of noise about how this was a good thing, because actually, most of what mattered about D&D wasn’t restrictable, or was in the public domain, and good old fair use was a better deal than the overly-restrictive OGL, and the community should never have taken the deal in the first place. And this is technically true, but only in the ways that don’t matter.

Because, yes. The OGL, as written, is more restrictive than fair use, and strict adherence to the OGL prevents someone from doing things that would otherwise be legal. But that misses the point.

Because what we’re actually talking about is an industry with one multi-billion dollar company—the only company on earth that has literal Monopoly money to spend—and a whole bunch of little tiny companies with less than a dozen people. So the OGL wasn’t a crummy deal offered between equals, it was the entity with all the power in the room declaring a safe harbor.

Could your two-person outfit selling PDFs online use stuff from Hasbro’s book without permission legally? Sure. Could you win the court case when they sue you before you lose your house? I mean, maybe? But not probably.

And that’s what was great about it. For two decades, it was the deal, accept these slightly more restrictive terms, and you can operate with the confidence that your business, and your house, is safe. And an entire industry formed inside that safe harbor.

Then some mid-level suit at Hasbro decided they wanted a cut?

And I’m using this as the example partly because it’s the most egregious. But 3rd party clients for twitter and reddit were a good business to be in, until they suddenly were not.

And I also like using Hasbro’s Bogus Journey with D&D as the example because it’s the only one where the community won. With the other three, the various owners basically leaned back in their chairs and said “yeah, okay, where ya gonna go?” and after much rending of cloth, the respective communities of twitter, and reddit, and StackOverflow basically had to admit there wasn’t an alternative; they were stuck on that website.

Meanwhile, Hasbro asked the same question, and the D&D community responded with, basically, “well, that’s a really long list, how do you want that organized?”

So Hasbro surrendered utterly, to the extent that more of D&D is now under a more irrevocable and open license than it was before. It feels like there’s a lesson about competition being healthy here? But that would be crass to say.

Honestly, I’m not sure what all this means; I don’t have a strong conclusion here. Part of why this has been stuck in my drafts folder since June is that I was hoping one of these would pop in a way that would illuminate the situation.

And maybe this isn’t anything more than just what corporate support for open source looks like when interest rates start going up.

But this feels like a thing. This feels like it comes from the same place as movie studios making record profits while saying their negotiation strategy is to wait for underpaid writers to lose their houses?

Something is released into the commons, a community forms, and then someone decides they need to re-capture the commons because if they aren’t making the money, no one can. And I think that’s what stuck with me. The pettiness.

You have a company that’s making enough money, bills are paid, profits are landing, employees are taken care of. But other people are also making money. And the parent company stops being a steward and burns the world down rather than suffer someone else making a dollar they were never going to see. Because there’s no universe where a dollar spent on Tweetbot was going to go to twitter, or one spent on Apollo was going to go to reddit, or one spent on any “3rd party” adventure was going to go to Hasbro.

What can we learn from all this? Probably not a lot we didn’t already know, but: solidarity works, community matters, and we might not have anywhere else to go, but at the same time, they don’t have any other users. There’s no version where they win without us.


What was happening: Twitter, 2006-2023

Twitter! What can I tell ya? It was the best of times, it was the worst of times. It was a huge part of my life for a long time. It was so full of art, and humor, and joy, and community, and ideas, and insight. It was also deeply flawed and profoundly toxic, but many of those flaws were fundamental to what made it so great.

It’s almost all gone now, though. The thing called X that currently lives where twitter used to be is a pale, evil, corrupted shadow of what used to be there. I keep trying to explain what we lost, and I can’t, it’s just too big.1 So let me sum up. Let me tell you why I loved it, and why I left. As the man2 said, let me tell you of the days of high adventure.


I can’t now remember when I first heard the word “twitter”. I distinctly remember a friend complaining that this “new twitter thing” had blown out the number of free SMS messages he got on his nokia flip phone, and that feels like a very 2006 conversation.

I tend to be pretty online, and have been since the dawn of the web, but I’m not usually an early adopter of social networks, so I largely ignored twitter for the first couple of years. Then, for reasons downstream of the Great Recession, I found myself unemployed for most of the summer of 2009.3 Suddenly finding myself with a surfeit of free time, I worked my way down that list of “things I’ll do if I ever get time,” including signing up for “that twitter thing.” (I think that’s the same summer I lit up my now-unused Facebook account, too.) Smartphones existed by then, and it wasn’t SMS-based anymore, but had a website, and apps.4

It was great. This was still in its original “microblogging” configuration, where it was essentially an Instant Messenger status with history. You logged in, and there were the statuses of the people you followed, in chronological order, and nothing else.

It was instantly clear that this wasn’t a replacement for something that already existed—this wasn't going to do away with your LiveJournal, or Tumblr, or Facebook, or blog. This was something new, something extra, something yes and. The question was, what was it for? Where did it fit in?

Personally, at first I used my account as a “current baby status” feed, updating extended family about what words my kids had learned that day. The early iteration of the site was perfect for that—terse updates to and from people you knew.

Over time, it accumulated various social & conversational features, not unlike a Katamari rolling around Usenet, BBSes, forums, discussion boards, and other early internet communication systems. It kept growing, and it became less useful as a micro-blogging system and more of a free-wheeling world-wide discussion forum.

It was a huge part of my life, and for a while there, everyone’s life. Most of that time, I enjoyed it an awful lot, and got a lot out of it. Everyone had their own take on what it was Twitter had that set it apart, but for me it was three main things, all of which reinforced each other:

  1. It was a great way to share work. If you made things, no matter how “big” you were, it was a great way to get your work out there. And, it was a great way to re-share other people’s work. As a “discovery engine” it was unmatched.

  2. Looking at that the other way, it was an amazing content aggregator. It essentially turned into “RSS, but Better”; at the time RSS feeds had pretty much shrunk to just “google reader’s website”. It turns out that sharing things from your RSS feed into the feeds of other people, plus a discussion thread, was the key missing feature. If you had work of your own to share, or wanted to talk about something someone else had done elsewhere on the internet, twitter was a great way to share a link and talk about it. But, it also worked equally well for work native to twitter itself. Critically, the joke about the web shrinking to five websites full of screenshots of the other four5 was posted to twitter, which was absolutely the first of those five websites.

  3. Most importantly, folks who weren’t anywhere else on the web were on twitter. Folks with day jobs, who didn’t consider themselves web content people were there; these people didn’t have a blog, or facebook, or instagram, but they were cracking jokes and hanging out on twitter.

There is a type of person whom twitter appealed to in a way that no other social network did. A particular kind of weirdo that took Twitter’s limitations—all text, 140 or 280 characters max—and turned them into a playground.

And that’s the real thing—twitter was for writers. Obviously it was text based, and not a lot of text at that, so you had to be good at making language work for you. As much as the web was originally built around “hypertext”, most of the modern social web is built around photos, pictures, memes, video. Twitter was for people who didn’t want to deal with that, who could make the language sing in a few dozen words.

It had the vibe of getting to sit in on the funniest people you know’s group text, mixed with this free-wheeling chaos energy. On its best days, it had the vibe of the snarky kids at the back of the bus, except the bus was the internet, and most of the kids were world-class experts in something.

There’s a certain class of literary writer goofballs that all glommed onto twitter in a way none of us did with any other “social network.” Finally, something that rewarded what we liked and were good at!

Writers, comedians, poets, cartoonists, rabbis, just hanging out. There was a consistent informality to the place—this wasn’t the show, this was the hotel bar after the show. The big important stuff happened over in blogs, or columns, or novels, or wherever everyone’s “real job” was, this was where everyone let their hair down and cracked jokes.

But most of all, it was weird. Way, way weirder than any other social system has ever been or probably ever will be again, this was a system that ran on the same energy you use to make your friends laugh in class when you’re supposed to be paying attention.

It got at least one thing exactly right: it was no harder to sign into twitter and fire off a joke than it was to fire a message off to the group chat. Between the low barrier to entry and the emphasis on words over everything else, it managed to attract a crowd of folks that liked computers, but didn’t see them as a path to self-actualization.

But what made twitter truly great were all the little (and not so little) communities that formed. It wasn’t the feature set, or the website, or the tech, it was the people, and the groups they formed. It’s hard to start making lists, because we could be here all night and still leave things out. In no particular order, here’s the communities I think I’ll miss the most:

  • Weird Twitter—Twitter was such a great vector for being strange. Micro-fiction, non-sequiturs, cats sending their mothers to jail, dispatches from the apocalypse.
  • Comedians—professional and otherwise, people who could craft a whole joke in one sentence.
  • Writers—A whole lot of people who write for a living ended up on twitter in a way they hadn’t anywhere else on the web.
  • Jewish Twitter—Speaking as a Jew largely disconnected from the local Jewish community, it was so much fun to get to hang out with the Rabbis and other Jews.

But also! The tech crowd! Legal experts! Minorities of all possible interpretations of the word sharing their experiences.

And the thing is, other than the tech crowd,6 most of those people didn’t go anywhere else. They hadn’t been active on the previous sites, and many of them drifted away again as the wheels started coming off twitter. There was a unique alchemy on twitter for forming communities that no other system has ever had.

And so the real tragedy of twitter’s implosion is that those people aren’t going somewhere else. That particular alchemy doesn’t exist elsewhere, and so the built up community is blowing away on the wind.


And all that’s almost entirely gone now. I miss it a lot, but I realize I’ve been missing it for a year now. There had been a vague sense of rot and decline for a while. You can draw a pretty straight line from gamergate, to the 2016 Hugos, to the 2016 election, to everything around The Last Jedi, to now, as the site rotted out from the inside; a mounting sense that things were increasingly worse than they used to be. The Pandemic saw a resurgence of energy as everyone was stuck at home hanging out via tweets, but in retrospect that was a final gasp.7

Once The New Guy took over, there was a real sense of impending closure. There were plenty of accounts that made a big deal out of Formally Leaving the site and flouncing out to “greener pastures”, either to make a statement, or (more commonly) to let their followers know where they were. There were also plenty of accounts saying things like “you’ll all be back”, or “I was here before he got here and I’ll be here after he leaves”, but over the last year mostly people just drifted away. People just stopped posting and disappeared.

It’s like the loss of a favorite restaurant—the people who went there already know, and when people who weren’t there express disbelief, the response is to tell them how sorry you are they missed the party!

The closest comparison I can make to the decayed community is my last year of college. (Bear with me, this’ll make sense.) For a variety of reasons, mostly good, it took me five years to get my four-year degree. I picked up a minor, did some other bits and bobs on the side, and it made sense to tack on an extra semester, and at that point you might as well do the whole extra year.

I went to a medium-sized school in a small town.8 Among the many, many positive features of that school was the community. It seemed like everyone knew everyone, and you couldn’t go anywhere without running into someone you knew. More than once, when I didn’t have anything better to do, I’d just hike downtown and inevitably I’d run into someone I knew and the day would vector off from there.9

And I’d be lying if I said this sense of community wasn’t one of the reasons I stuck around a little longer—I wasn’t ready to give all that up. Of course, what I hadn’t realized was that not everyone else was doing that. So one by one, everyone left town, and by the end, there I was in downtown surrounded by faces I didn’t know. My lease had an end date, and I knew I was moving out of town on that day no matter what, so what, was I going to build up a whole new peer group with a short-term expiration date? That last six months or so was probably the weirdest, loneliest time of my whole life. When the lease ended, I couldn’t move out fast enough.

The point is: twitter got to be like that. I was only there for the people, and nearly all the people I was there for had already gone. Being the one to close out the party isn’t always the right move.


One of the things that made it so frustrating was that it always had problems, but it had the same problems that any under-moderated semi-anonymous internet system had. “How to stop assholes from screwing up your board” is a four-decade-old playbook at this point, and twitter consistently failed to actually deploy any of the solutions, or at least deploy them at a scale that made a difference. The maddening thing was always that the only unique thing about twitter’s problems was the scale.

I had a soft rule that I could only read Twitter when using my exercise bike, and a year or two ago I couldn’t get to the end of the tweets from people I followed before I collapsed from exhaustion. Recently, I’d run out of things to read before I was done with my workout. People were posting less, and less often, but mostly they were just… gone. Quietly fading away as the site got worse.

In the end, though, it was the tsunami of antisemitism that got me. “Seeing only what you wanted to see” was always a skill on twitter, but the unfolding disaster in Israel and Gaza broke that. Not only did you have the literal nazis showing up and spewing their garbage without check, but you had otherwise progressive liberal leftists (accidentally?) doing the same thing, without pushback or attempt at discussion, because all the people that would have done that are gone. So instead it’s just a nazi sludge.10


There was so much great stuff on there—art, ideas, people, history, jokes. Work I never would have seen, things I wouldn’t have learned, books I wouldn’t have read, people I wouldn’t know about. I keep trying to encompass what’s been lost, make lists, but it’s too big. Instead, let me tell you one story about the old twitter:

One of the people I follow(ed) was Kate Beaton, originally known for the webcomic Hark! A Vagrant, most recently the author of Ducks (the best book I read last year). One day, something like seven years ago, she started enthusing about a book called Tough Guys Have Feelings Too. I don’t think she had a connection to the book? I remember it being an unsolicited rave from someone who had read it and was struck by it.

The cover is a striking piece of art of a superhero, head bowed, eyes closed, a tear rolling down his cheek. The premise of the book is what it says on the cover—even tough guys have feelings. The book goes through a set of stereotypical “tough guys”—pirates, ninjas, wrestlers, superheroes, race car drivers, lumberjacks—and shows them having bad days, breaking their tools, crashing their cars, hurting themselves. The tough guys have to stop, and maybe shed a tear, or mourn, or comfort themselves or each other, and the text points out that if even the tough guys can have a hard time, we shouldn’t feel bad for doing the same. The art is striking and beautiful, the prose is well written, the theme clearly and warmly delivered.

I bought it immediately. You see, my at-the-time four-year-old son was a child of Big Feelings, but frequently had trouble handling those feelings. I thought this might help him. Overnight, this book became almost a mantra. For years after this, when he was having Big Feelings, we’d read this book, and it would help him calm down and take control of what he was feeling.

It’s not an exaggeration to say this book changed all our lives for the better. And in the years since then, I’ve often been struck that despite all the infrastructure of modern capitalism—marketing, book tours, reviews, blogs—none of those ever got that book into my hands. There’s only been one system where an unsolicited rave from a web cartoonist being excited about a book outside their normal professional wheelhouse could reach someone they’ve never met or heard of and change that person’s son’s life.

And that’s gone now.


  1. I’ve been trying to write something about the loss of twitter for a while now. The first draft of this post has a date back in May, to give you some idea.

  2. Mako.

  3. As an aside, everyone should take a summer off every decade or so.

  4. I tried them all, I think, but settled on the late, lamented Tweetbot.

  5. Tom Eastman: I’m old enough to remember when the Internet wasn't a group of five websites, each consisting of screenshots of text from the other four.

  6. The tech crowd all headed to mastodon, but didn’t build that into a place that any of those other communities could thrive. Don’t @-me, it’s true.

  7. In retrospect, getting Morbius to flop a second time was probably the high point, it was all downhill after that.

  8. CSU Chico in Chico, California!

  9. Yes, this is what we did back in the 90s before cellphones and texting, kids.

  10. This is out of band for the rest of the post, so I’m jamming all this into a footnote:

    Obviously, criticizing the actions of the government of Israel is no more antisemitic than criticizing Hamas would be islamophobic. But objecting to the actions of Israel’s government with “how do the Jews not know they’re the bad guys” sure as heck is, and I really didn’t need to see that kind of stuff being retweeted by the eve6 guy.

    A lot of things are true. Hamas is not Palestine is not “The Arabs”, and the Netanyahu administration is not Israel is not “The Jews.” To be clear, Hamas is a terror organization, and Israel is on the functional equivalent of Year 14 of the Trump administration.

    The whole disaster hits at a pair of weird seams in the US—the Israel-Palestine conflict maps very strangely to the American political left-right divide, and the US left has always had a deep-rooted antisemitism problem. As such, what really got me was watching all the comments criticizing “the Jews” for this conflict come from _literally_ the same people who spent four years wearing “not my president” t-shirts and absolving themselves from any responsibility for their government’s actions because they voted for “the email lady”. They get the benefit of infinite nuance, but the Jews are all somehow responsible for Bibi’s incompetent choices.

Gabriel L. Helman

With enough money, you don’t have to be good at anything

Following up on our previous coverage, I’ve been enjoying watching the reactions to Isaacson’s book on twitter’s new owner.

My favorite so far has been Dave Karpf’s mastodon live-toot turned substack post. Credit where credit is due, I saw this via a link on One Foot Tsunami, and I’m about to jump on the same quote that both Dave Karpf and Paul Kafasis did:

[Max] Levchin was at a friend’s bachelor pad hanging out with Musk. Some people were playing a high-stakes game of Texas Hold ‘Em. Although Musk was not a card player, he pulled up to the table. “There were all these nerds and sharpsters who were good at memorizing cards and calculating odds,” Levchin says. “Elon just proceeded to go all in on every hand and lose. Then he would buy more chips and double down. Eventually, after losing many hands, he went all in and won. Then he said, ‘Right, fine, I’m done.’” It would be a theme in his life: avoid taking chips off the table; keep risking them.

That would turn out to be a good strategy. (page 86)

And, man, that’s just “Masterful gambit, sir”, but meant sincerely.

But this quote is it. Here’s a guy who found the closest thing to the infinite money cheat in Sim City that exists in real life, and he’s got a fleet of people who think that’s the same as being smart. And then he finds himself a biographer possessed of such infinite credulity that he can’t tell the difference between being actually good at poker and being someone who found the poker equivalent of typing IDDQD before playing.

With enough money, you don’t have to be good at anything. With infinite lives, you’ll eventually win.

My other favorite piece of recent days is How the Elon Musk biography exposes Walter Isaacson by Elizabeth Lopatto. The subhead sums it up nicely: “One way to keep Musk’s myth intact is simply not to check things out.”

There’s too much good stuff to pull out a single quote, but it does a great job outlining not only the book’s reflexive responses of “Masterful gambit”, but also the way Isaacson breezes past the labor issues, racism, sexism, transphobia, right-wing turn, or anything vaguely “political”, seeming to treat those things as beside the story. They’re not! That IS the story!

To throw one more elbow at the Steve Jobs book, something that was really funny about it was that Isaacson clearly knew that Jobs had a “reality distortion field” that let him talk people into things, so when Jobs told Isaacson something, Isaacson would go find someone else to corroborate or refute that thing. The problem was, Isaacson would take whatever that other person said as the unvarnished truth, never seeming to notice that he was talking to heavily biased people, like, say, Bill Gates.

With this book, he doesn’t even go that far, just writing down whatever Elno Mork tells him without checking it out, totally looking past the fact that he’s talking to a guy who absolutely loves to just make stuff up all the time.

Like Lopatto points out, this is maddening for many reasons, not the least of which is that Isaacson has been handed a great story: it turns out the vaunted business technical genius spaceships & cars guy is a jerk who’s been dining out on infinite money, a compliant press, and other people’s work for his whole life. “How in the heck did he get this far?” would have been a hell of a book. Unfortunately, the guy with access failed to live up to the moment.

The tech/silicon valley-focused press has always had a problem with an enthusiasm for new tech and charismatic tech leaders that trends towards the gullible. Why check things out if this new startup is claiming something you really want to be true? (This isn’t a new problem; I still have the CueCat Wired sent me.)

But even more than recent reporting failures like Theranos or the Crypto collapse, Musk’s last year in the wreckage of twitter really seems to be forcing some questions around “Why did you all elevate someone like this for so long? And why are people still carrying water for him?”
