40 years of…
Just about 40 years ago, my Dad brought something home that literally changed my life. It was a computer—a home computer, which was still on the edge of being science fiction—but more than that, it was a portal. It was magic, a box of endless possibilities. It’s not even remotely hyperbole to say that bringing that computer home, which had just been released into the world, utterly changed the entire trajectory of my life.
I am, of course, talking about the Tandy 1000.
That’s not how you expected that sentence to end, is it? Because this year is also the 40th anniversary of the Mac. But I want to spend a beat talking about the other revolutionary, trend-setting computer from 1984, before we talk about the ancestor of the computer I’m writing this on now.
I’ve been enjoying everyone’s lyrical memories of the original Mac very much, but most have a slightly revisionist take that once the original Mac landed in ’84, it was obviously the future. Well, it was obviously a future. It’s hard to remember now how unsettled the computer world was in the mid-80s. The Tandy 1000, the IBM AT, and the Mac all landed in ’84. The first Amiga would come out the next year. The Apple IIgs and original Nintendo the year after that. There were an absurd number of other platforms; Commodore 64s were selling like hotcakes, Atari was still making computers, heck, look at the number of platforms Infocom released their games for. I mean, the Apple ][ family still outsold the Mac for a long time.
What was this Tandy you speak of, then?
Radio Shack started life as a company supplying parts to ham radio operators, and expanded into things like hi-fi audio components in the 50s. In one of the greatest “bet it all on the big win” moves I can think of, the small chain was bought by—of all people—the Tandy Leather Company in the early 60s. They made leather goods for hobbyists and crafters, and wanted to expand into other hobby markets. Seeing no meaningful difference between leather craft hobbyists and electronics ones, Charles Tandy bought the chain, and reworked and expanded the stores, re-envisioning them as, basically, craft stores for electronics.
I want to hang on that for a second. Craft stores, but for amateur electronics engineers.
It’s hard to express now, in this decayed age, how magical a place Radio Shack was. It seems ridiculous to even type. If you were the kind of kid who was in any way into electronics, or phones in the old POTS Ma Bell sense, or computery stuff, Radio Shack was the place. There was one two blocks from my house, and I loved it.
When home computers started to become a thing, they came up through the hobbyist world; Radio Shack was already making their own parts and gizmos, so it was a short distance to making their own computers. Their first couple of swings, the TRS-80 and friends, were not huge hits, but not exactly failures either. Apple came out of this same hobbyist world, and then IBM got involved: they were already making “big iron”, so could they also make “little iron”?
For reasons that seem deeply, deeply strange four decades later, when IBM built their first PC, instead of writing their own operating system, they chose to license one from a little company outside of Seattle called Microsoft—maybe you’ve heard of them—with terms that let Gates and friends sell their new OS to other manufacturers. Meanwhile, for other, equally strange reasons, the only part of the IBM PC actually exclusive to IBM was the BIOS; the rest was free to be copied. So this whole little market formed where someone could build a computer that was “IBM Compatible”—again, maybe you’ve heard of this—buy the OS from that outfit up in Redmond, and take advantage of the software and hardware that was already out there. The basic idea that software should work on more than one kind of computer was starting to form.
One of the first companies to take a serious swing at this was Tandy, with the Tandy 2000. In addition to stretching the definition of “compatible” to the breaking point, it was one of the very few computers to ever use the Intel 80186, and was bought by almost no one, except, through a series of events no one has ever been able to adequately explain to me, my grandmother. (I feel I have to stress this isn’t a joke: Grandma wrote a sermon a week on that beast well into the late 90s. Continuing her track record for picking technology, she was also the only person I knew with a computer that ran Windows Me.)
As much as the original IBM PC was a “home computer”, it was really a small office computer, so IBM tried to make a cut-down, cheaper version of the PC for home use, for real this time. I am, of course, talking about the infamous flop the IBM PCjr, also 40 years old this year, and deserving of its total lack of retrospective articles.
Tandy, meanwhile, had scrambled a “better PCjr” to market, the Tandy 1000. When the PCjr flopped, Tandy pivoted, and found they had the only DOS-running computer on the market with all the positives of the PCjr, but with a keyboard that worked.
Among these positives, the Tandy 1000 had dramatically better graphics and sound than anything IBM was selling. “Tandy Graphics” was a step up from CGA but not quite EGA, and “Tandy Sound” could play three notes at once! Meanwhile, the Tandy also came with something called DeskMate, an app suite / operating environment with a text-character-based GUI that included a text editor, a spreadsheet, a calendar, and a basic database.
So they found themselves in a strange new market: a computer that could do “business software”, both with what was built-in and what was already out there, but could also do, what are those called? Oh yeah, games.
The legend goes that IBM commissioned the nascent Sierra On-Line to write the first King’s Quest to show off the PCjr; when that flopped, Sierra realized that Tandy was selling the only computer that could run their best game, and Tandy realized there was a hit game out there that could only run on their rigs. So they both leaned all the way in.
But of course, even the Tandy couldn’t match “arcade games”, so its capabilities and limits helped define what a “PC game” was: adventure games, flight sims, RPGs. And, it must be said, both the words “operating” and “system” in MS-DOS were highly aspirational. But what it lacked in features it made up for in being easy to sweep to the side and access the hardware directly, which is exactly what you want if you’re trying to coax game-quality performance out of the stone knives and bearskins of 80s home computers. Even when the NES came along a couple of years later and cemented the “home console” market that Atari had sketched out, “PC games” had already developed their own identity versus “console games”.
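To make that concrete, here’s a minimal sketch of the sort of thing a DOS game would do, assuming a real-mode compiler of the era like Borland’s Turbo C: skip past the operating system entirely and poke characters straight into text-mode video memory.

```c
/* A sketch of the classic DOS trick: ignore the OS and write directly to
 * text-mode video memory, which lives at segment 0xB800. Assumes a
 * real-mode DOS compiler such as Turbo C, where MK_FP (from dos.h)
 * builds a far pointer out of a segment and an offset. */
#include <dos.h>

int main(void)
{
    /* Each cell of the 80x25 text screen is two bytes:
       the character, then its color attribute. */
    unsigned char far *screen = (unsigned char far *) MK_FP(0xB800, 0);

    screen[0] = 'A';   /* character at row 0, column 0 */
    screen[1] = 0x1E;  /* attribute: yellow text on a blue background */
    return 0;
}
```

No system call, no driver, no asking DOS for permission. That’s why it was fast, and why so much game code of the era ended up welded directly to the specific hardware of the day.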
Radio Shacks got a whole corner, or more, turned over to their new computers. They had models out running programs you could play with, peripherals you could try, and most critically, a whole selection of software. I can distinctly remember the Radio Shack by my house with a set of bookstore-like shelves holding what was, at the time, every game yet made by Sierra, Infocom, and everyone else. Probably close to every DOS game out there. I have such clear memories of poring over the box to Starflight, or pulling Hitch-hiker’s Guide off the shelf, or playing Lode Runner on the demo computer.
A home computer with better graphics and sound than its contemporaries, pre-loaded with most of what you need to get going, and supported by its very own retail store? Does that sound familiar?
I’m cheating the timeline a little here; the Tandy 1000 didn’t release until November, and we didn’t get ours until early ’85. I asked my Dad once why he picked the one he did, of all the choices available, and he said something to the effect that he’d asked the “computer guy” at work which one he should get, and that guy said to get the Tandy, since it would let you do the most different kinds of things.
Like I said at the top, it was magic. We’re so used to computers now that it’s hard to remember, but I was amazed that here was this thing, and it would do different things based on what you told it to do! I was utterly, utterly fascinated.
One of the apps Dad bought with the computer was that first King’s Quest, and I was absolutely transfixed that you could drive this little guy around on the screen. I’d played arcade games—I’d probably already sunk a small fortune into Spy Hunter—but this was different. You could do things. Type what you thought of! Pushing the rock aside to find a hole underneath was one of those “the universe was never the same again” moments for me. I could barely spell, and certainly couldn’t type, but I was hooked. Somewhere, and this still exists, my Mom wrote me a list of words on a sheet of paper so I could reference how to spell them: look, take, shove.
And I wasn’t the only one; both of my parents were as fascinated as I was. My mom sucked down every game Infocom and Sierra ever put out. The Bard’s Tale landed a year later, and my parents played that obsessively.
It was a family obsession, this weird clunky beige box in the kitchen. Portals to other worlds, the centerpiece of our family spending time together. Four decades on, my parents still come over for dinner once a week, and we play video games together. (Currently, we’re still working on Tears of the Kingdom, because we’re slow.)
Everyone has something they lock onto between about 6 and 12 that’s their thing from that point on. Mine was computers. I’ve said many, many times how fortunate I feel that I lived at just the right time for my thing to turn into a pretty good paying career by the time I was an adult. What would I be doing to pay this mortgage if Dad hadn’t brought that Tandy box into the house 40 years ago? I literally have no idea.
Time marched on.
Through a series of tremendous own-goals, Radio Shack and Tandy failed to stay on top of, or even competitive in, the PC market they helped create, until, as The Onion put it: “Even CEO Can’t Figure Out How RadioShack Still In Business.”
Meanwhile, through a series of maneuvers that, it has to be said, were not entirely legal, Microsoft steadily absorbed most of the market, with the unsettled market of the 80s really coalescing into the Microsoft-Intel “IBM Compatible” platform with the release of Windows 95.
Of all the players I mentioned way back at the start, the Mac was the only other one that remained; even the Apple ][, originally synonymous with home computers, had faded away. Apple had carved out a niche for the Mac in things that could really take advantage of the UI: mainly desktop publishing, graphic design, and your one friend’s Dad.
Over the years, I’d look over at the Mac side of the house with something approaching jealousy. Anyone who was “a computer person” in the 90s ended up “bilingual”, more-or-less comfortable on both Windows and Mac Classic. I took classes in graphic design, so I got pretty comfortable with Illustrator and Aldus PageMaker on the Mac.
I was always envious of the hardware of the old Mac laptops, which developed into open lust when those colored iBooks came out. The one I wanted the most, though, was the iMac G4 with the “Pixar lamp” design.
But the thing is, they didn’t do what I was mostly using a computer for. I played games, and lots of them, and for a whole list of reasons, none of those games came out for the Mac.
If ’84 saw the release of both the first Mac, and one of the major foundation stones of the modern Windows PC platform, and I just spent all that time singing the praises of my much missed Tandy 1000, why am I typing this blog post on a MacBook Pro? What happened?
Let me spin you my framework for understanding the home computer market. Invoking the Planescape Rule-of-Threes, there are basically three demographics of people who buy computers:
- Hobbyists. Tinkerers. People who are using computers as a source of self-actualization. Hackers, in the classical sense, not the Angelina Jolie sense.
- People who looked at the computer market and thought, “I bet I could make a lot of money off of this”.
- People who had something else to do, and thought, “I wonder if I could use a computer to help me do that?”
As the PC market got off the ground, it was just that first group, but then the other two followed along. And, of course, the people in the second group quickly realized that the real bucks were to be made selling stuff to that first group.
As the 80s wound on, the first and second groups clustered on computers running Microsoft’s software, and the third group bought Macs. Once we got into the late 90s, the hobbyist group split between Microsoft and Linux.
(As an absolutely massive aside, this is the root of the weird cultural differences between “Apple people” and “Linux people”. The kind of people who buy Apples do so specifically so they don’t have to tinker, and the kinds of people who build Linux boxes do so specifically so that they can. If you derive a sense of self from being able to make computers do things, Apples are nanny-state locked-down morally suspect appliances, and if you just want to do some work and get home on time and do something else, Linux boxes are massively unreliable Rube Goldberg toys for people who aren’t actually serious.)
As for me? What happened was, I moved from being in the first group to the third. No, that’s a lie. What actually happened was I had a kid, and realized I had always been in the third group. I loved computers, but not for their own sake; I loved them for the other things I could do with them. Play games, write text, make art, build things; they were tools, the means to my ends, not ends in themselves. I was always a little askew from most of the other “computer guys” I was hanging out with; I didn’t want to spend my evening recompiling sound drivers, I wanted to do something that required the computer to play sound, and I always slightly resented it when the machine required me to finish making the sausage myself. But that’s just how it was, the price of doing business. Want to play Wing Commander with sound? You’d better learn how HIMEM.SYS works.
As time passed and we rolled into the 21st century, the Mac moved to the BSD-based OS X, and then again to Intel processors, and I kept raising my eyebrows. The Mac platform was slowly converging on something that might do what I wanted it to do?
The last Windows PC I built for myself unceremoniously gave up the ghost sometime in 2008 or ’09, I can’t remember. I needed a new rig, but our first kid was on the way, and I realized my “game playing” time had already shrunk to essentially nil. And by this time I had an iPhone, and trying to make that work with my Windows XP machine was… trying. So I said, what the hell, and bought a refurbed 2009 polycarb MacBook. And I never looked back.
I couldn’t believe how easy it was to use. Stuff just worked! The built-in apps all did what they were supposed to do! Closing the laptop actually put the computer to sleep! It still had that sleep light that looked like breathing. The UI conventions were different from what I was used to on Windows, for sure, but unlike what I was used to, they were internally consistent, and had an actual conceptual design behind them. You could actually learn how “the Mac” worked, instead of having to memorize a zillion snowflakes like on Windows. And the software! Was just nice. There’s a huge difference in the culture of software design, and it was like I could finally relax once I changed teams. It wasn’t life-changing quite the way that original Tandy was, but it was a fundamental recalibration of my relationship with computers. To paraphrase all those infomercials, it turns out there really was a better way.
So, happy birthday, to both of my most influential computers of the last forty years. Here’s to the next forty.
But see if you can pick up some actual games this time.
Pascal
Niklaus Wirth has passed away. A true giant of the field, and he’s already missed. I don’t have much to add to the other obituaries and remembrances out there, but perhaps an anecdote.
Like a lot of Gen-X programmers, I learned Pascal as my first “real” programming language in school. (I’m probably one of the youngest people out there to have gotten all three “classic” educational programming languages in school: Pascal, Logo, and BASIC.) This was at the tail end of Pascal’s reign as a learning language, and past the point where it was considered a legitimate language for real work. (I missed the heyday of Turbo Pascal and the Mac Classic.) Mostly, we looked forward to learning C.
Flash forward a bit. Early in my career, I got a job as a Delphi programmer. Delphi was the Pokémon-style final evolution of Borland’s Turbo Pascal, incorporating the Object Pascal that Wirth worked out with Apple for the early Mac, along with Borland’s UI design tools. But under the object-oriented extensions and forms, it was “just” Pascal.
The job in question was actually half-and-half Delphi and C. The product had two parts: a Windows desktop app, and an attached device that ran the C programs on an embedded microcontroller.
For years, whenever this job came up in interviews or what have you, I always focused on the C part of the work. Mostly this was practical—focus on the language that’s still a going concern, move past the dead one with a joke without dwelling on it. But also, I had a couple of solid interview stories based on the wacky behavior of C with semi-custom compilers on highly-constrained embedded hardware.
I don’t have any good stories about the Delphi part of the job. Because it was easy.
I probably wrote as much code in Delphi as C in that job, and in all honesty, the Delphi code was doing harder, more complex stuff. But the C parts were hard; I probably spent three times as long writing the same number of lines of code.
Delphi, on the other hand, was probably the nicest programming environment I’ve ever used. It was a joy to use every day, and no matter how goofy a feature I was trying to add, the language always made it possible. Some of that was Borland’s special sauce, but mostly that was Wirth’s Pascal, getting the job done.
These days, I’m not sure you could convince me to go back to C, but I’d go back to Delphi in a hot second. Thanks, Professor Wirth. You made younger me’s life much easier, and I appreciate it.
Wednesday linkblog, Twitter reminiscences edition
The reminiscences are starting to flow now, as it really starts to settle in that the thing we used to call twitter is gone and won’t be back. As such, I’d like to call your attention to The Verge’s truly excellent The Year Twitter Died. This is probably the best “what it was, and what we lost, for both good and ill” piece I’ve read. Especially don’t miss The Great Scrollback of Alexandria. I’m glad someone is putting the work into saving some part of what used to be there.
Also, this week in folks talking about twitter, I enjoyed John Scalzi’s check-in a month after finally walking away: Abandoning the Former Twitter: A Four-Week Check-In. Scalzi was one of the strongest “I was here before he got here, and I’ll be here after he leaves” voices I saw a year ago, and the last year beat him like the rest of us.
There’s, of course, the usual blowback to stuff like this, with at least one article I saw in response to that Verge piece claiming that no, twitter always sucked, here are all the problems it had, I always had a bad time there, so on and so on. I won’t link to it, because why give them the attention, but I spent the whole time reading it thinking of this quote from former President Beeblebrox: “Zowee, here we are at the End of the Universe and you haven’t even lived yet. Did you miss out.”
What was happening: Twitter, 2006-2023
Twitter! What can I tell ya? It was the best of times, it was the worst of times. It was a huge part of my life for a long time. It was so full of art, and humor, and joy, and community, and ideas, and insight. It was also deeply flawed and profoundly toxic, but many of those flaws were fundamental to what made it so great.
It’s almost all gone now, though. The thing called X that currently lives where twitter used to be is a pale, evil, corrupted shadow of what used to be there. I keep trying to explain what we lost, and I can’t; it’s just too big.[1] So let me sum up. Let me tell you why I loved it, and why I left. As the man[2] said, let me tell you of the days of high adventure.
I can’t remember now when I first heard the word “twitter.” I distinctly remember a friend complaining that this “new twitter thing” had blown out the number of free SMS messages he got on his Nokia flip phone, and that feels like a very 2006 conversation.
I tend to be pretty online, and have been since the dawn of the web, but I’m not usually an early adopter of social networks, so I largely ignored twitter for the first couple of years. Then, for reasons downstream of the Great Recession, I found myself unemployed for most of the summer of 2009.[3] Suddenly finding myself with a surfeit of free time, I worked my way down that list of “things I’ll do if I ever get time,” including signing up for “that twitter thing.” (I think that’s the same summer I lit up my now-unused Facebook account, too.) Smartphones existed by then, and it wasn’t SMS-based anymore, but had a website, and apps.[4]
It was great. This was still in its original “microblogging” configuration, where it was essentially an Instant Messenger status with history. You logged in, and there were the statuses of the people you followed, in chronological order, and nothing else.
It was instantly clear that this wasn’t a replacement for something that already existed—this wasn't going to do away with your LiveJournal, or Tumblr, or Facebook, or blog. This was something new, something extra, something yes and. The question was, what was it for? Where did it fit in?
Personally, at first I used my account as a “current baby status” feed, updating extended family about what words my kids had learned that day. The early iteration of the site was perfect for that—terse updates to and from people you knew.
Over time, it accumulated various social and conversational features, not unlike a Katamari rolling around Usenet, BBSes, forums, discussion boards, and other early internet communication systems. It kept growing, and it became less useful as a micro-blogging system and more of a free-wheeling, world-wide discussion forum.
It was a huge part of my life, and for a while there, everyone’s life. Most of that time, I enjoyed it an awful lot, and got a lot out of it. Everyone had their own take on what it was Twitter had that set it apart, but for me it was three main things, all of which reinforced each other:
It was a great way to share work. If you made things, no matter how “big” you were, it was a great way to get your work out there. And, it was a great way to re-share other people’s work. As a “discovery engine” it was unmatched.
Looking at it the other way, it was an amazing content aggregator. It essentially turned into “RSS, but better”; at the time, RSS feeds had pretty much shrunk to just “Google Reader’s website.” It turns out that sharing things from your RSS feed into the feeds of other people, plus a discussion thread, was the key missing feature. If you had work of your own to share, or wanted to talk about something someone else had done elsewhere on the internet, twitter was a great way to share a link and talk about it. But it also worked equally well for work native to twitter itself. Critically, the joke about the web shrinking to five websites full of screenshots of the other four[5] was posted to twitter, which was absolutely the first of those five websites.
Most importantly, folks who weren’t anywhere else on the web were on twitter. Folks with day jobs, who didn’t consider themselves web content people, were there; these people didn’t have a blog, or Facebook, or Instagram, but they were cracking jokes and hanging out on twitter.
There is a type of person whom twitter appealed to in a way that no other social networking did. A particular kind of weirdo that took Twitter’s limitations—all text, 140 or 280 characters max—and turned them into a playground.
And that’s the real thing—twitter was for writers. Obviously it was text based, and not a lot of text at that, so you had to be good at making language work for you. As much as the web was originally built around “hypertext”, most of the modern social web is built around photos, pictures, memes, video. Twitter was for people who didn’t want to deal with that, who could make the language sing in a few dozen words.
It had the vibe of getting to sit in on the funniest people you know’s group text, mixed with this free-wheeling chaos energy. On its best days, it felt like the snarky kids at the back of the bus, except the bus was the internet, and most of the kids were world-class experts in something.
There’s a certain class of literary writer goofball, and we all glommed onto twitter in a way we never did with any other “social network.” Finally, something that rewarded what we liked and were good at!
Writers, comedians, poets, cartoonists, rabbis, just hanging out. There was a consistent informality to the place—this wasn’t the show, this was the hotel bar after the show. The big important stuff happened over in blogs, or columns, or novels, or wherever everyone’s “real job” was, this was where everyone let their hair down and cracked jokes.
But most of all, it was weird. Way, way weirder than any other social system has ever been or probably ever will be again, this was a system that ran on the same energy you use to make your friends laugh in class when you’re supposed to be paying attention.
It got at least one thing exactly right: it was no harder to sign into twitter and fire off a joke than it was to fire a message off to the group chat. Between the low bar to entry and the emphasis on words over everything else, it managed to attract a crowd of folks who liked computers, but didn’t see them as a path to self-actualization.
But what made twitter truly great were all the little (and not so little) communities that formed. It wasn’t the feature set, or the website, or the tech, it was the people, and the groups they formed. It’s hard to start making lists, because we could be here all night and still leave things out. In no particular order, here’s the communities I think I’ll miss the most:
- Weird Twitter—Twitter was such a great vector for being strange. Micro-fiction, non-sequiturs, cats sending their mothers to jail, dispatches from the apocalypse.
- Comedians—professional and otherwise, people who could craft a whole joke in one sentence.
- Writers—A whole lot of people who write for a living ended up on twitter in a way they hadn’t anywhere else on the web.
- Jewish Twitter—Speaking as a Jew largely disconnected from the local Jewish community, it was so much fun to get to hang out with the Rabbis and other Jews.
But also! The tech crowd! Legal experts! Minorities of all possible interpretations of the word sharing their experiences.
And the thing is, other than the tech crowd,[6] most of those people didn’t go anywhere else. They hadn’t been active on the previous sites, and many of them drifted away again when the wheels started coming off twitter. There was a unique alchemy on twitter for forming communities that no other system has ever had.
And so the real tragedy of twitter’s implosion is that those people aren’t going somewhere else. That particular alchemy doesn’t exist elsewhere, and so the built-up community is blowing away on the wind.
All that’s almost entirely gone now, though. I miss it a lot, but I realize I’ve been missing it for a year now. There had been a vague sense of rot and decline for a while. You can draw a pretty straight line from gamergate, to the 2016 Hugos, to the 2016 election, to everything around The Last Jedi, to now, as the site rotted out from the inside; a mounting sense that things were increasingly worse than they used to be. The Pandemic saw a resurgence of energy as everyone was stuck at home hanging out via tweets, but in retrospect that was a final gasp.[7]
Once The New Guy took over, there was a real sense of impending closure. There were plenty of accounts that made a big deal out of Formally Leaving the site and flouncing out to “greener pastures”, either to make a statement, or (more commonly) to let their followers know where they were going. There were also plenty of accounts saying things like “you’ll all be back”, or “I was here before he got here and I’ll be here after he leaves”, but over the last year mostly people just drifted away. People just stopped posting and disappeared.
It’s like the loss of a favorite restaurant—the people who went there already know, and when people who weren’t there express disbelief, the response is to tell them how sorry you are that they missed the party!
The closest comparison I can make to the decayed community is my last year of college. (Bear with me, this’ll make sense.) For a variety of reasons, mostly good, it took me 5 years to get my 4-year degree. I picked up a minor, did some other bits and bobs on the side, and it made sense to tack on an extra semester, and at that point you might as well do the whole extra year.
I went to a medium-sized school in a small town.[8] Among the many, many positive features of that school was the community. It seemed like everyone knew everyone, and you couldn’t go anywhere without running into someone you knew. More than once, when I didn’t have anything better to do, I’d just hike downtown, and inevitably I’d run into someone I knew and the day would vector off from there.[9]
And I’d be lying if I said this sense of community wasn’t one of the reasons I stuck around a little longer—I wasn’t ready to give all that up. Of course, what I hadn’t realized was that not everyone else was doing that. So one by one, everyone left town, and by the end, there I was downtown surrounded by faces I didn’t know. My lease had an end date, and I knew I was moving out of town on that day no matter what, so what, was I going to build up a whole new peer group with a short-term expiration date? That last six months or so was probably the weirdest, loneliest time of my whole life. When the lease ended, I couldn’t move out fast enough.
The point is: twitter got to be like that. I was only there for the people, and nearly all the people I was there for had already gone. Being the one to close out the party isn’t always the right move.
One of the things that made it so frustrating was that it always had problems, but they were the same problems that any under-moderated, semi-anonymous internet system has. “How to stop assholes from screwing up your board” is a four-decade-old playbook at this point, and twitter consistently failed to actually deploy any of the solutions, or at least to deploy them at a scale that made a difference. The maddening thing was always that the only unique thing about twitter’s problems was the scale.
I had a soft rule that I could only read Twitter when using my exercise bike, and a year or two ago I couldn’t get to the end of the tweets from people I followed before I collapsed from exhaustion. Recently, I’d run out of things to read before I was done with my workout. People were posting less, and less often, but mostly they were just… gone. Quietly fading away as the site got worse.
In the end, though, it was the tsunami of antisemitism that got me. “Seeing only what you wanted to see” was always a skill on twitter, but the unfolding disaster in Israel and Gaza broke that. Not only did you have the literal nazis showing up and spewing their garbage without check, but you had otherwise progressive liberal leftists (accidentally?) doing the same thing, without pushback or attempt at discussion, because all the people who would have done that are gone. So instead it’s just nazi sludge.[10]
There was so much great stuff on there—art, ideas, people, history, jokes. Work I never would have seen, things I wouldn’t have learned, books I wouldn’t have read, people I wouldn’t know about. I keep trying to encompass what’s been lost, make lists, but it’s too big. Instead, let me tell you one story about the old twitter:
One of the people I follow(ed) was Kate Beaton, originally known for the webcomic Hark! A Vagrant, and most recently the author of Ducks (the best book I read last year). One day, something like seven years ago, she started enthusing about a book called Tough Guys Have Feelings Too. I don’t think she had a connection to the book; I remember it being an unsolicited rave from someone who had read it and was struck by it.
The cover is a striking piece of art of a superhero, head bowed, eyes closed, a tear rolling down his cheek. The premise of the book is what it says on the cover—even tough guys have feelings. The book goes through a set of stereotypical “tough guys”—pirates, ninjas, wrestlers, superheroes, race car drivers, lumberjacks—and shows them having bad days, breaking their tools, crashing their cars, hurting themselves. The tough guys have to stop, and maybe shed a tear, or mourn, or comfort themselves or each other, and the text points out that if even the tough guys can have a hard time, we shouldn’t feel bad for doing the same. The art is striking and beautiful, the prose is well written, and the theme is clearly and well delivered.
I bought it immediately. You see, my at-the-time four-year-old son was a child of Big Feelings, but frequently had trouble handling those feelings. I thought this might help him. Overnight, this book became almost a mantra. For years after this, when he was having Big Feelings, we’d read this book, and it would help him calm down and take control of what he was feeling.
It’s not an exaggeration to say this book changed all our lives for the better. And in the years since then, I’ve often been struck that despite all the infrastructure of modern capitalism—marketing, book tours, reviews, blogs—none of those ever got that book into my hands. There’s only been one system where an unsolicited rave from a web cartoonist, excited about a book outside their normal professional wheelhouse, could reach someone they’ve never met or heard of and change that person’s son’s life.
And that’s gone now.
[1] I’ve been trying to write something about the loss of twitter for a while now. The first draft of this post has a date back in May, to give you some idea.

[2] Mako.

[3] As an aside, everyone should take a summer off every decade or so.

[4] I tried them all, I think, but settled on the late, lamented Tweetbot.

[6] The tech crowd all headed to mastodon, but didn’t build it into a place where any of those other communities could thrive. Don’t @-me, it’s true.

[7] In retrospect, getting Morbius to flop a second time was probably the high point; it was all downhill after that.

[8] CSU Chico in Chico, California!

[9] Yes, this is what we did back in the 90s before cellphones and texting, kids.

[10] This is out of band for the rest of the post, so I’m jamming all this into a footnote:

Obviously, criticizing the actions of the government of Israel is no more antisemitic than criticizing Hamas would be islamophobic. But objecting to the actions of Israel’s government with “how do the Jews not know they’re the bad guys” sure as heck is, and I really didn’t need to see that kind of stuff being retweeted by the eve6 guy.

A lot of things are true. Hamas is not Palestine is not “The Arabs,” and the Netanyahu administration is not Israel is not “The Jews.” To be clear, Hamas is a terror organization, and Israel is on the functional equivalent of Year 14 of the Trump administration.

The whole disaster hits at a pair of weird seams in the US—the Israel-Palestine conflict maps very strangely onto the American political left-right divide, and the US left has always had a deep-rooted antisemitism problem. As such, what really got me was watching all the comments criticizing “the Jews” for this conflict come from _literally_ the same people who spent four years wearing “not my president” t-shirts and absolving themselves of any responsibility for their government’s actions because they voted for “the email lady”. They get the benefit of infinite nuance, but the Jews are all somehow responsible for Bibi’s incompetent choices.
The whole disaster hits at a pair of weird seams in the US—the Israel-Palestine conflict maps very strangely to the American political left-right divide, and the US left has always had a deep-rooted antisemitism problem. As such, what really got me was watching all the comments criticizing “the Jews” for this conflict come from _literally_ the same people who spent four years wearing “not my president” t-shirts and absolving themselves from any responsibility for their governments actions because they voted for “the email lady”. They get the benefit of infinite nuance, but the Jews are all somehow responsible for Bibi’s incompetent choices.