Gabriel L. Helman

“Hanging Out”

For the most recent entry in asking if ghosts have civil rights, the Atlantic last month wonders: Why Americans Suddenly Stopped Hanging Out.

And it’s an almost perfect Atlantic article, in that it looks at a real trend, finds some really interesting research, and then utterly fails to ask any obvious follow-up questions.

It has all the usual howlers of the genre: it recognizes that something changed in the US somewhere around the late 70s or early 80s without ever wondering what that was, it recognizes that something else changed about 20 years ago without wondering what that was, and it displays no curiosity whatsoever about the lack of “third places” and where, exactly, kids are supposed to actually go when they try to hang out. It’s got that thing where it has a chart of (something, anything) social over time, and you can perfectly pick out Reagan’s election and the ’08 recession, and not much else.

There’s lots of passive-voice sentences about how “Something’s changed in the past few decades,” coupled with an almost perverse refusal to look for a root cause, or to connect any social or political actions to this. You can occasionally feel the panic around the edges as the author starts to suspect that maybe the problem might be “rich people” or “social structures”, so instead of talking to people he inspects a bunch of data about what people do, instead of why they do it. It’s the exact opposite of that F1 article; this has nothing in it that might cause the editor to pull it after publication.

In a revelation that will shock no one, the author instead decides that the reason for all this change must be “screens”, without actually checking to see what “the kids these days” are actually using those screens for. (Spoiler: they’re using them to hang out). Because, delightfully, the data the author is basing all this on tracks only in-person socializing, and leaves anything virtual off the table.

This is a great example of something I call “Alvin Toffler Syndrome”, where you correctly identify a really interesting trend, but are then unable to get past the bias that your teenage years were the peak of human civilization, and therefore anything different is bad. Future Shock.

I had three very strong reactions to this, in order:

First, I think that header image is accidentally more revealing than they thought. All those guys eating alone at the diner look like they have a gay son they cut off; maybe we live in an age where people have lower tolerance for spending time with assholes?

Second, I suspect the author is just slightly younger than I am, based on a few of the things he says, but also the list of things “kids should be doing” he cites from another expert:

“There’s very clearly been a striking decline in in-person socializing among teens and young adults, whether it’s going to parties, driving around in cars, going to the mall, or just about anything that has to do with getting together in person”.

Buddy, I was there, and “going to the mall, driving around in cars” sucked. Do you have any idea how much my friends and I would have rather hung out in a shared Minecraft server? Are you seriously telling me that eating a Cinnabon or drinking too much at a high school house party full of college kids home on the prowl was a better use of our time? Also: it’s not the 80s anymore, what malls?

(One of the funniest giveaways is that unlike these sorts of articles from a decade ago, “having sex” doesn’t get listed as one of the activities that teenagers aren’t doing anymore. Like everyone else between 30 and 50, the author grew up in a world where sex with a stranger can kill you, and so that’s slipped out of the domain of things “teenagers ought to be doing, like I was”.)

But mostly, though, I disagree with the fundamental premise. We might have stopped socializing the same ways, but we certainly didn’t stop. How do I know this? Because we’re currently entering year five of a pandemic that became uncontrollable because more Americans were afraid of the silence of their own homes than they were of dying.

Gabriel L. Helman

Even Further Behind The Velvet Curtain Than We Thought

Kate Wagner, mostly known around these parts for McMansion Hell, but who also does sports journalism, wrote an absolutely incredible piece for Road & Track on F1, which was published and then unpublished nearly instantly. Why yes, the Internet Archive does have a copy: Behind F1's Velvet Curtain. It’s the sort of thing where if you start quoting it, you end up reading the whole thing out loud, so I’ll just block quote the subhead:

If you wanted to turn someone into a socialist you could do it in about an hour by taking them for a spin around the paddock of a Formula 1 race. The kind of money I saw will haunt me forever.

It’s outstanding, and you should go read it.

But, so, how exactly does a piece like this get all the way to being published out on Al Gore’s Internet, and then spiked? The Last Good Website tries to get to the bottom of it: Road & Track EIC Tries To Explain Why He Deleted An Article About Formula 1 Power Dynamics.

Road & Track’s editor’s response to the Defector is one of the most brazen “there was no pressure because I never would have gotten this job if I waited until they called me to censor things they didn’t like” responses since, well, the Hugos, I guess?


Edited to add: Today in Tabs—The Road & Track Formula One Scandal Makes No Sense

Gabriel L. Helman

March Fifth. Fifth March.

Today is Tuesday, March 1465th, 2020, COVID Standard Time.

Three more weeks to flatten the curve!

What else even is there to say at this point? Covid is real. Long Covid is real. You don’t want either of them. Masks work. Vaccines work.

It didn’t have to be like this.

Every day, I mourn the futures we lost because our civilization wasn’t willing to put in the effort to try to protect everyone. And we might have even pulled it off, if the people who already had too much had been willing to make a little less money for a while. But instead, we're walking into year five of this thing.

Gabriel L. Helman

The Sky Above The Headset Was The Color Of Cyberpunk’s Dead Hand

Occasionally I poke my head into the burned-out wasteland where twitter used to be, and while doing so stumbled over this thread by Neal Stephenson from a couple years ago:

Neal Stephenson: "The assumption that the Metaverse is primarily an AR/VR thing isn't crazy. In my book it's all VR. And I worked for an AR company--one of several that are putting billions of dollars into building headsets. But..."

Neal Stephenson: "...I didn't see video games coming when I wrote Snow Crash. I thought that the killer app for computer graphics would be something more akin to TV. But then along came DOOM and generations of games in its wake. That's what made 3D graphics cheap enough to reach a mass audience."

Neal Stephenson: "Thanks to games, billions of people are now comfortable navigating 3D environments on flat 2D screens. The UIs that they've mastered (e.g. WASD + mouse) are not what most science fiction writers would have predicted. But that's how path dependency in tech works."

I had to go back and look it up, and yep: Snow Crash came out the year before Doom did. I’d absolutely have stuck this fact in Playthings For The Alone if I’d remembered, so instead I’m gonna “yes, and” my own post from last month.

One of the oft-remarked-on aspects of the 80s cyberpunk movement was that the majority of the authors weren’t “computer guys” beforehand; they were coming at computers from a literary/artist/musician worldview, which is part of why cyberpunk hit the way it did; it wasn’t the way computer people thought about computers—it was the street finding its own use for things, to quote Gibson. But a less remarked-on aspect was that they also weren’t gamers. Not just computer games—they weren’t into board games or tabletop RPGs either.

Snow Crash is still an amazing book, but it was written at the last possible second where you could imagine a multi-user digital world and not treat “pretending to be an elf” as a primary use-case. Instead the Metaverse is sort of a mall? And what “games” there are aren’t really baked in, they’re things a bored kid would do at a mall in the 80s. It’s a wild piece of context drift from the world in which it was written.

In many ways, Neuromancer has aged better than Snow Crash, if for no other reason than that it’s clear the part of The Matrix that Case is interested in is a tiny slice, and it’s easy to imagine Wintermute running several online game competitions off camera, whereas in Snow Crash it sure seems like The Metaverse is all there is; a stack of other big on-line systems next to it doesn’t jibe with the rest of the book.

But, all that makes Snow Crash a really useful as a point of reference, because depending on who you talk to it’s either “the last cyberpunk novel”, or “the first post-cyberpunk novel”. Genre boundaries are tricky, especially when you’re talking about artistic movements within a genre, but there’s clearly a set of work that includes Neuromancer, Mirrorshades, Islands in the Net, and Snow Crash, that does not include Pattern Recognition, Shaping Things, or Cryptonomicon; the central aspect probably being “books about computers written by people who do not themselves use computers every day”. Once the authors in question all started writing their novels in Word and looking things up on the web, the whole tenor changed. As such, Snow Crash unexpectedly found itself as the final statement for a set of ideas, a particular mix of how near-future computers, commerce, and the economy might all work together—a vision with strong social predictive power, but unencumbered by the lived experience of actually using computers.

(As the old joke goes, if you’re under 50, you weren’t promised flying cars, you were promised a cyberpunk dystopia, and well, here we are, pick up your complimentary torment nexus at the front desk.)

The accidental predictive power of cyberpunk is a whole media thesis on its own, but it’s grimly amusing that in all the places where cyberpunk gets the future wrong, it’s usually because the author wasn’t being pessimistic enough. The Bridge Trilogy is pretty pessimistic, but there’s no indication that a couple million people died of a preventable disease because the immediate ROI on saving them wasn’t high enough. (And there’s at least two diseases I could be talking about there.)

But for our purposes here, one of the places the genre overshot was this idea that you’d need a 3d display—like a headset—to interact with a 3d world. And this is where I think Stephenson’s thread above is interesting, because it turns out it really didn’t occur to him that 3d on a flat screen would be a thing, and he assumed that any sort of 3d interface would require a head-mounted display. As he says, that got stomped the moment Doom came out. I first read Snow Crash in ’98 or so, and even then I was thinking none of this really needs a headset, this would all work fine on a decently-sized monitor.

And so we have two takes on the “future of 3d computing”: the literary tradition from the cyberpunk novels of the 80s, and then actual lived experience from people building software since then.

What I think is interesting about the Apple Cyber Goggles, in part, is that it feels like that earlier, literary take on how futuristic computers would work re-emerging and directly competing with the last four decades of actual computing that have happened since Neuromancer came out.

In a lot of ways, Meta is doing the funniest and most interesting work here, as the former Oculus headsets are pretty much the cutting edge of “what actually works well with a headset”, while at the same time, Zuck’s “Metaverse” is blatantly an older millennial pointing at a dog-eared copy of Snow Crash saying “no, just build this” to a team of engineers desperately hoping the boss never searches the web for “second life”. They didn’t even change the name! And this makes a sort of sense, there are parts of Snow Crash that read less like fiction and more like Stephenson is writing a pitch deck.

I think this is the fundamental tension behind the reactions to Apple Vision Pro: we can now build the thing we were all imagining in 1984. The headset is designed by cyberpunk’s dead hand; after four decades of lived experience, is it still a good idea?

Gabriel L. Helman

Friday Linkblog, best-in-what-way-edition

Back in January, Rolling Stone posted The 150 Best Sci-Fi Movies of All Time. And I almost dropped a link to it back then, with a sort of “look at these jokers” comment, but ehhh, why. But I keep remembering it and laughing, because this list is deranged.

All “best of” lists have the same core structural problem, which is they never say what they mean by “best”. Most likely to want to watch again? Cultural impact? Most financially successful? None of your business. It’s just “best”. So you get to infer what the grading criteria are, an exercise left to the reader.

And then this list has an extra problem, in that it also doesn’t define what they mean by “science fiction”. This could be an entire media studies thesis, but briefly and as far as movies are concerned, “science fiction” means roughly two things: first, “fiction” about “science”, usually in the future and concerned about the effects some possible technology has on society, lots of riffing on and manipulating big ideas, and second, movies that use science fiction props and iconography to tell a different story, usually as a subset of action-adventure movies. Or, to put all that another way, movies like 2001 and movies like Star Wars.

This list goes with “Option C: all of the above, but no superheroes” and throws everything into the mix. That’s not all that uncommon, and it gives a solid way to handicap what approach the list is taking by comparing the relative placements of those two exemplars. Spoiler: 2001 is at №1, and Star Wars is at №9, which, okay, that’s a pretty wide spread, so this is going to be a good one.

And this loops back to “what do you mean by best”? From a recommendation perspective, I’m not sure what value a list has that includes Star Wars and doesn’t include Raiders of the Lost Ark, and ditto 2001 and, say, The Seventh Seal. But, okay, we’re doing a big-tent, anything-possibly-science-fiction list, in addition to pulling from much deeper into the back catalog than you usually see in a list like this. There’s a bunch of really weird movies on here that you don’t usually see on best-of lists, but I was glad to see them! There are no bad movies on there, but there are certainly some spicy ones you need to be in the right headspace for.

These are all good things, to be clear! But the result is a deeply strange list.

The top 20 or so are the movies you expect, and the order feels almost custom designed to get the most links on twitter. But it’s the bottom half of the list that’s funny.

Is Repo Man better than Bill and Ted’s Excellent Adventure? Is Colossus: The Forbin Project better than Time Bandits? Are both of those better than The Fifth Element? Is Them! better than Tron? Is Zardoz better than David Lynch’s Dune? I don’t know! (Especially that last one.) I think the answer to all of those is “no”, but I can imagine a guy who doesn’t think so? I’ve been cracking up at the mental image of a group of friends wanting to watch Jurassic Park, and there’s the one guy who says “you want to watch the good Crichton movie? Check this out!” and drops Westworld on them.

You can randomly scroll to almost anywhere and compare two movies, and go “yeah, that sounds about right”, and then go about ten places in either direction and compare that movie to those first two and go “wait, what?”

And look, we know how these kinds of lists get written: you get a group of people together, and they bring with them movies they actually like, movies they want people to think they like, and how hipster contrarian they’re feeling, and that all gets mashed up into something that’ll drive “engagement”, and maybe generate some value for the shareholders.

But what I think is funny about these kinds of lists in general, and this list specifically, is imagining the implied author. Imagine this really was one guy’s personal ranking of science fiction movies. The list is still deranged, but in a delightful way, sort of a “hell yeah, man, you keep being you!”

It’s less of a list of best movies, and more your weirdest friend’s Id out on display. I kind of love it! The more I scrolled, I kept thinking “I don’t think I agree with this guy about anything, but I’m pretty sure we’d be friends.”

Gabriel L. Helman

Time Zones Are Hard

In honor of leap day, my favorite story about working with computerized dates and times.

A few lifetimes ago, I was working on a team that was developing a wearable. It tracked various telemetry about the wearer, including steps. As you might imagine, there was an option to set a Step Goal for the day, and there was a reward the user got for hitting the goal.

Skipping a lot of details, we put together a prototype to do a limited alpha test, a couple hundred folks out in the real world walking around. For reasons that aren’t worth going into, and are probably still under NDA, we had to do this very quickly; on the software side we basically had to start from scratch and have a fully working stack in 2 or 3 months, for a feature set that was probably at minimum 6-9 months worth of work.

There were a couple of ways to slice what we meant by “day”, but we went with the most obvious one, midnight to midnight. Meaning that the user had until midnight to hit their goal, and then at 12:00 their steps for the day reset to 0.

Dates and Times are notoriously difficult for computers. Partly, this is because Dates and Times are legitimately complex. Look at a full date: “February 29th 2024, 11:00:00 am”. Every value there has a different base, a different set of legal values. Month lengths, 24 vs 12 hour times, leap years, leap seconds. It’s a big tangle of arbitrary rules. If you take a date and time, and want to add 1000 minutes to it, the value of the result is “it depends”. This gets even worse when you add time zones, and the time zone’s angry sibling, daylight saving time. Now, the result of that addition also depends on where you were when it happened. It’s gross!
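To make the “it depends” concrete, here’s a minimal sketch in Python (assuming Python 3.9+, where the standard zoneinfo module and an IANA time zone database are available). Adding 1000 minutes to a time shortly before the US spring-forward gives two different answers, depending on whether you do the arithmetic on the wall clock or on elapsed time:

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

pacific = ZoneInfo("America/Los_Angeles")

# 1:30 am Pacific on March 10, 2024 -- half an hour before the clocks jump forward.
start = datetime(2024, 3, 10, 1, 30, tzinfo=pacific)

# "Wall clock" arithmetic: add to the local time, then re-derive the UTC offset.
wall = start + timedelta(minutes=1000)

# "Elapsed time" arithmetic: convert to UTC, add, convert back.
elapsed = (start.astimezone(timezone.utc) + timedelta(minutes=1000)).astimezone(pacific)

print(wall)     # 2024-03-10 18:10:00-07:00
print(elapsed)  # 2024-03-10 19:10:00-07:00 -- same "1000 minutes", an hour apart
```

Neither answer is wrong; which one you want depends entirely on what the feature is for, which is the whole problem.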

But the other reason it’s hard to use dates and times in computers is that they look easy. Everyone does this every day! How hard can it be?? So developers, especially developers working on platforms or frameworks, tend to write new time handling systems from scratch. This is where I link to this internet classic: Falsehoods programmers believe about time.

The upshot of all that is that there’s no good standard way to represent or transmit time data between systems, the way there is with, say, floating point numbers, or even Unicode multi-language strings. It’s a stubbornly unsolved problem. Java, for example, has three separate systems for representing dates and times built into the language, none of which solve the whole problem. They’re all terrible, but in different ways.

Which brings me back to my story. This was a prototype, built fast. We aggressively cut features; anything that wasn’t absolutely critical went by the wayside. One of the things we cut was Time Zone Support, and we chose to run the whole thing in Pacific Time. We were talking about a test that was going to run about three months, which didn’t cross a DST boundary, and 95% of the testers were on the west coast. There were a handful of folks on the east coast, but, okay, they could handle their “day” starting and ending at 3am. Not perfect, but it made things a whole lot simpler. They can handle a reset-to-zero at 3 am, sure.

We get ready to ship, to light the test run up.

“Great news!” someone says. “The CEO is really excited about this, he wants to be in the test cohort!”

Yeah, that’s great! There’s “executive sponsorship”, and then there’s “the CEO is wearing the device in meetings”. Have him come on down, we’ll get him set up with a unit.

“Just one thing,” gets casually mentioned days later, “this probably isn’t an issue, but he’s going to be taking them on his big publicized walking trip to Spain.”

Spain? Yeah, Spain. Turns out he’s doing this big charity walk thing with a whole bunch of other exec types across Spain, wants to use our gizmo, and at the end of the day show off that he’s hit his step count.

Midnight in California is 9am in Spain. This guy gets up early. Starts walking early. His steps are going to reset to zero every day somewhere around second breakfast.

Oh Shit.

I’m not sure we even said anything else in that meeting, we all just stood without a word, acquired a concerning amount of Mountain Dew, and proceeded to spend the next week and change hacking in time zone support to the whole stack: various servers, database, both iOS and Android mobile apps.

It was the worst code I have ever written, and of course it was so much harder to hack in after the fact as a sidecar instead of building it in from day one. But the big boss got his step count reward at the end of the day every day, instead of just after breakfast.

From that point on, whenever something was described as hard, the immediate question was “well, is this just hard, or is it ’time zone’ hard?”

Gabriel L. Helman

Today in Star Wars Links

Time marches on, and it turns out Episode I is turning 25 this year. As previously mentioned, the prequels have aged interestingly, and Episode I the most. It’s not any better than it used to be, but the landscape around them has changed enough that they don’t look quite like they used to. They’re bad in a way that no other movies were before or have been since, and it’s easier to see that now than it was at the time. As such, I very much enjoyed Matt Zoller Seitz’s I Used to Hate The Phantom Menace, but I Didn’t Know How Good I Had It:

Watching “The Phantom Menace,” you knew you were watching a movie made by somebody in complete command of their craft, operating with absolute confidence, as well as the ability to make precisely the movie they wanted to make and overrule anyone who objected. […] But despite the absolute freedom with which it was devised, “The Phantom Menace” seemed lifeless somehow. A bricked phone. See it from across the room, you’d think that it was functional. Up close, a paperweight.

[…]

Like everything else that has ever been created, films are products of the age in which they were made, a factor that’s neither here nor there in terms of evaluating quality or importance. But I do think the prequels have a density and exactness that becomes more impressive the deeper we get into the current era of Hollywood, wherein it is not the director or producer or movie star who controls the production of a movie, or even an individual studio, but a global megacorporation, one that is increasingly concerned with branding than art.

My drafts folder is full of still-brewing Star Wars content, but for the moment I’ll say I largely vibe with this? Episode I was not a good movie, but whatever else you can say about it, it was the exact movie one guy wanted to make, and that’s an increasingly rare thing. There have been plenty of dramatically worse movies and shows in the years since, and they don’t even have the virtue of being one artist’s insane vision. I mean, jeeze, I’ll happily watch TPM without complaint before I watch The Falcon & The Winter Soldier or Iron Fist or Thor 2 again.

And, loosely related, I also very much enjoyed this interview with prequel-star Natalie Portman:

Natalie Portman on Striking the Balance Between Public and Private Lives | Vanity Fair

Especially this part:

The striking thing has been the decline of film as a primary form of entertainment. It feels much more niche now. If you ask someone my kids’ age about movie stars, they don’t know anyone compared to YouTube stars, or whatever.

There’s a liberation to it, in having your art not be a popular art. You can really explore what’s interesting to you. It becomes much more about passion than about commerce. And interesting, too, to beware of it becoming something elitist. I think all of these art forms, when they become less popularized, you have to start being like, okay, who are we making this for anymore? And then amazing, too, because there’s also been this democratization of creativity, where gatekeepers have been demoted and everyone can make things and incredible talents come up. And the accessibility is incredible. If you lived in a small town, you might not have been able to access great art cinema when I was growing up. Now it feels like if you’ve got an internet connection, you can get access to anything. It’s pretty wild that you also feel like at the same time, more people than ever might see your weird art film because of this extraordinary access. So it’s this two-sided coin.

I think this is the first time that I’ve seen someone admit that not only is 2019 never going to happen again, but that it’s a thing to embrace, not fear.

Gabriel L. Helman

Zoozve

This is great. Radiolab cohost Latif Nasser notices that the solar system poster in his kid’s room has a moon for Venus—which doesn’t have a moon—and starts digging. It’s an extremely NPR (complimentary) slow burn solving of the simple mystery of what this thing on the poster was, with a surprisingly fun ending! Plus: “Quasi-moons!”

Space.com has a recap of the whole thing with links: Zoozve — the strange 'moon' of Venus that earned its name by accident. (See also: 524522 Zoozve - Wikipedia.)

One of the major recurring themes on Icecano is STEM people who need to take more humanities classes, but this is a case where the opposite is true?

Gabriel L. Helman

Playthings For The Alone

Previously on Icecano: Space Glasses, Apple Vision Pro: New Ways to be Alone.

I hurt my back at the beginning of January. (Wait! This isn’t about to become a recipe blog, or worse, a late-90s AICN movie review. This is gonna connect, bear with me.) I did some extremely unwise “lift with your back, not your knees” while trying to get our neutronium-forged artificial Christmas tree out of the house. As usual for this kind of injury, that wasn’t the problem; the problem was when I got overconfident while full of ibuprofen and hurt it a second time at the end of January. I’m much improved now, but that’s the one that really messed me up. I have this back brace thing that helps a lot—my son has nicknamed it The Battle Harness, and yeah, if there was a He-Man character that was also a middle-aged programmer, that’s what I look like these days. This is one of those injuries where what positions are currently comfortable is an ever-changing landscape, but the consistent element has been that none of them allow me to use a screen and a keyboard at the same time. (A glance back at the Icecano archives since the start of the year will provide a fairly accurate summary of how my back was doing.)

One of the many results of this is that I’ve been reading way, way more reviews of the Apple Cyber Goggles than I was intending to. I’ve very much been enjoying the early reactions and reviews, and the part I enjoy the most has been that I no longer need to have a professional opinion about them.

That’s a lie, actually my favorite part has been that as near as I can tell everyone seems to have individually invented “guy driving a cybertruck wearing vision pro”, which if I were Apple I’d treat as a 5-alarm PR emergency. Is anyone wearing these outside for any reason other than Making Content? I suspect not. Yet, anyway. Is this the first new Apple product that “normal people” immediately treated as being douchebag techbro trash? I mean, usually it gets dismissed as overly expensive hipster gear, but this feels different, somehow. Ed Zitron has a solid take on why: How Tech Outstayed Its Welcome.

Like jetpacks and hover cars before it, computers replacing screens with goggles is something science fiction decided was Going To Happen(tm). There’s a sense of inevitability about it, in the future we’re all going to use ski goggles to look at our email. If I was feeling better I’d go scare up some clips, but we’re all picturing the same things: Johnny Mnemonic, any cover art for any William Gibson book, that one Michael Douglas / Demi Moore movie, even that one episode of Murder She Wrote where Mrs. Potts puts on a Nintendo Power Glove and hacks into the Matrix.

(Also, personal sidebar: something I learned about myself the last couple of weeks is somehow I’ve led a life where in my mid-40s I can type “mnemonic” perfectly the first time every time, and literally can never write “johnny” without making a typo.)

But I picked jetpacks and hover cars on purpose—this isn’t something like Fusion Power that everyone also expects but stubbornly stays fifty years into the future, this is something that keeps showing up and turning out to not actually be a good idea. Jetpacks, it turns out, are kinda garbage in most cases? Computers, but for your face, was a concept science fiction latched on to long before anyone writing in the field had used a computer for much, and the idea never left. And people kept trying to build them! I used to work with a guy who literally helped build a prototype face-mounted PC during the Reagan administration, and he was far from the first person to work on it. Heck, I tried out a prototype VR headset in the early 90s. But the reality on the ground is that what we actually use computers for tends to be a bad fit for big glasses.

I have a very clear memory of watching J. Mnemonic the first time with an uncle who was probably the biggest science fiction fan of all time. The first scene where Mr. Mnemonic logs into cyberspace, and the screen goes all stargate as he puts on his ski goggles and flies into the best computer graphics that a mid-budget 1995 movie can muster, I remember my uncle, in full delight and without irony, yelling “that’s how you make a phone call!” He’d have ordered one of these on day one. But you know what’s kinda great about my actual video phone I have in my pocket he didn’t live to see? It just connects to the person I’m calling instead of making me fly around the opening credits of Tron first.

Partly because it’s such a long-established signifier of the future, and lots of various bits of science fiction have made them look very cool, there’s a deep, built-up reservoir of optimism and good will about the basic concept, despite many of the implementations being not-so-great. And because people keep taking swings at it, face-mounted displays, like jetpacks, have carved out a niche where they excel. (For VR: games, mostly. For jetpacks: the opening of James Bond movies.)

And those niches have gotten pretty good! Just to be clear: Superhot VR is one of my favorite games of all time. I do, in fact, own an Oculus from back when they still used that name, and there’s maybe a half-dozen games where the head-mounted full-immersion genuinely results in a different & better experience than playing on a big screen.

And I think this has been one of the fundamental tensions around this entire space for the last 40 years: there’s a narrow band of applications where they’re clearly a good idea, but science fiction promised that these are the final form. Everyone wants to join up with Case and go ride with the other Console Jockeys. Someday, this will be the default, not just the fancy thing that mostly lives in the cabinet.

The Cyber Goggles keep making me think about the Magic Leap. Remember them? Let me tell you a story. For those of you who have been living your lives right, Magic Leap is a startup who’ve been working on “AR goggles” for some time now. Back when I had to care about this professionally, we had one.

The Leap itself was a pair of thick goggles. I think there was a belt pack with a cable? I can’t quite remember now. The lenses were circular and mostly clear. It had a real steampunk dwarf quality, minus the decorative gears. The central feature was that the lenses were clear, but had an embedded transparent display, so the virtual elements would overlay with real life. It was heavy, heavier than you wanted it to be, and the lenses had the quality of looking through a pair of very dirty glasses, or walking out of a 3D movie without giving the eyewear back. You could see, but you could tell the lenses were there. But it worked! It did a legitimately amazing job of drawing computer-generated images over the real world, and they really looked like they were there, and then you could dismiss them, and you were back to looking at the office through Thorin’s welding goggles.

What did it actually do, though? It clearly wasn’t sure. This was shortly after Google Glass, and it had a similar approach to being a stand-alone device running “cut down” apps. Two of them have stuck in my memory.

The first was literally Angry Birds. But it projected the level onto the floor of the room you were in, blocks and pigs and all, so you could walk around it like it was a set of discarded kids’ toys. Was it better than “regular” Angry Birds? No, absolutely not, but it was a cool demo. The illusion was nearly perfect; the scattered blocks really stayed in the same place.

The second was the generic productivity apps. Email, calendar, some other stuff. I think there was a generic web browser? It’s been a while. They lived in these virtual screens, rectangles hanging in air. You could position them in space, and they’d stay there. I took a whole set of these screens and arrayed them in the air about and around my desk. I had three monitors in real life, and then another five or six virtual ones, each of the virtual ones showing one app. I could glance up, and see a whole array of summary information. The resolution was too low to use them for real work, and the lenses themselves were too cloudy to use my actual computer while wearing them.

You can guess the big problem, though: I couldn’t show them to anybody. Anything I needed to share with a coworker needed to be on one of the real screens, not the virtual ones. There’s nothing quite so isolating as being able to see things no one else can see.

That’s stuck with me all these years. Those virtual screens surrounding my desk. That felt like something. Now, Magic Leap was clearly not the company to deliver a full version of that, even then it was clear they weren’t the ones to make a real go of the idea. But like I said, it stuck with me. Flash forward, and Apple goes and makes that one of the signature features of their cyber goggles.

I can’t ever remember a set of reviews for a new product that were trying this hard to get past the shortcomings and like the thing they were imagining. I feel like most reviews of this thing can be replaced with a clip from that one Simpsons episode where Homer buys the first hover car, but it sucks, and Bart yells over the wind “Why’d you buy the first hover car ever made?”, to which Homer gleefully responds with “I know! It’s a hover car!” as the car rattles along, bouncing off the ground.

It’s clear what everyone suspected back over the summer is true; this isn’t the “real” product, this is a sketch, a prototype, a test article that Tim Apple is charging 4 grand to test. And, good for him, honestly. The “real” Apple Vision is probably rev 3 or 4, coming in 2028 or thereabouts.

I’m stashing the links to all the interesting ones I read here, mostly so I can find them again. (I apologize to those of you to whom this is a high-interest open tabs balance transfer.)

But my favorite summary was from Today in Tabs The Business Section:

This is all a lot to read and gadget reviews are very boring so my summary is: the PDF Goggles aren’t a product but an expensive placeholder for a different future product that will hypothetically provide augmented reality in a way that people will want to use, which isn’t this way. If you want to spend four thousand dollars on this placeholder you definitely already know that, and you don’t need a review to find out.

The reviews I thought were the most interesting were the ones from Thompson and Zitron; they didn’t get review units, they bought them with their own money and with very specific goals of what they wanted to use them for, which then slammed into the reality of what the device actually can do:

Both of those guys really, really wanted to use their cyber goggles as a primary productivity tool, and just couldn’t. It’s not unlike the people who tried really hard to make their iPad their primary computer—you know who you are; hey man, how ya doing?—it’s possible, but barely.

There’s a definite set of themes across all that:

  • The outward-facing “eyesight” is not so great
  • The synthetic faces for video calls are just as uncanny-valley weird as we thought
  • Watching movies is really, really cool—everyone really seems to linger on this one
  • Big virtual screens for anything other than movies—i.e., for actual work—seem like a mixed bag
  • But why is this, though? “What’s the problem this is solving?”

Apple, I think at this point, has earned the benefit of the doubt when they launch new things, but I’ve spent the last week reading this going, “so that’s it, huh?” Because okay, it’s the thing from last summer. There’s no big “aha, this is why they made this”.

There’s a line from the first Verge hands-on, before the full review, where the author says:

I know what I saw, but I’m still trying to figure out where this headset fits in real life.

Ooh, ooh, pick me! I know! It’s a Plaything For The Alone.

They know—they know—that headsets like this are weird and isolating, but seem to have decided to just shrug and accept that as the cost of doing business. As much as they pay lip service to being able to dial into the real world, this is more about dialing out, about tuning out the world, about having an experience only you can have. They might not be by themselves, but they’re Alone.

A set of sci-fi sketches as placeholders for features, entertaining people who don’t have anyone to talk to.

My parents were over for the Super Bowl last weekend. Sports keep being cited as a key feature for the cyber goggles, cool angles, like you’re really there. I’ve never in my life watched sports by myself. Is the idea that everyone will have their own goggles for big games? My parents come over with their own his-and-hers headsets, and we all sit around with these things strapped to our faces and pass snacks around? Really? There’s no on-court angle that’s better than gesturing wildly at my mom on the couch next to me.

How alone do you have to be to even think of this?

With all that said, what’s this thing for? Not the future version, not the science fiction hallucination, this thing you can go out and buy today? A Plaything for the Alone, sure, but to what end?

The consistent theme I noticed, across nearly every article I read, was that just about everyone stops and mentions that thing where you can sit on top of Mount Hood and watch a movie by yourself.

Which brings me back around to my back injury. I’ve spent the last month-and-change in a variety of mostly inclined positions. There’s basically nowhere to put a screen or a book that’s convenient for where my head needs to be, so in addition to living on ibuprofen, I’ve been bored out of my mind.

And so for a second, I got it. I’m wedged into the couch, looking at the TV at a weird angle, and, you know what? Having a screen strapped to my face that just oriented to where I could look instead of where I wanted to look sounded pretty good. Now that you mention it, I could go for a few minutes on top of Mount Hood.

All things considered, as back injuries go, I had a pretty mild one. I’ll be back to normal in a couple of weeks, no lasting damage, didn’t need professional medical care. But being in pain/discomfort all day is exhausting. I ended up taking several days off work, mostly because I was just too tired to think or talk to anyone. There’d be days where I was done before sundown; not that I didn’t enjoy hanging out with my kids when they got home from school, but I just didn’t have any gas left in the tank to, you know, say anything. I was never by myself, but I frequently needed to be alone.

And look, I’m lucky enough that when I needed to just go feel sorry for myself by myself, I could just go upstairs. And not everyone has that. Not everyone has something making them exhausted that gets better after a couple of weeks.

The question everyone keeps asking is “what problem are these solving?”

Ummm, is the problem cities?

When they announced this thing last summer, I thought the visual of a lady in the cramped IKEA demo apartment pretending to be out on a mountain was a real cyberpunk dystopia moment, life in the burbclaves got ya down? CyberGoggles to the rescue. But every office I ever worked in—both cube farms and open-office bullpens—was full of people wearing headphones. Trying to blot out the noise, literal and figurative, of the environment around them and focus on something. Or just be.

Are these headphones for the eyes?

Maybe I’ve been putting the emphasis in the wrong place. They’re a Plaything for the Alone, but not for those who are by themselves, but for those who wish they could be.

Gabriel L. Helman

Wait, Who Answers The Phone?

Everyone is dunking on the lady who got called by “Amazon” customer support and ended up putting a shoebox with 50 grand in it in the back seat of a car, and rightly so! But there’s been a spate of “I got scammed and you can too” articles recently, and lots of “you don’t know what might happen to you in the right/wrong situation”.

But here’s the thing, they all start the same way: the person being scammed answered their phone. Not to be indelicate, but who in the fuck does that anymore? No one, and I mean no one does legitimate business over the phone anymore. If someone has a reason to get a hold of you, you’ll get an email, or a text or something. I haven’t answered the phone from a cold call in ages. Let it go to voicemail, read the transcript, maybe call back if it sounds important. Spoiler: it ain’t important.

Look, the fact that the phone system is entirely populated by robots and scammers is the stupidest possible flavor of this cyberpunk dystopia we’re living in, and we, as a society, should probably do something about it?

But for the moment, here’s a flowchart of what to do if the phone rings and it’s not a person you personally know who you were expecting to call:

  1. It’s a scam! Don’t answer it!

If it was important, they’d have sent you an email.

One of the running themes on Icecano is “STEM people need more Humanities classes”, but this feels like the opposite? There’s a delightful detail in that article that the caller claimed to be Amazon’s customer service group, and I don’t know how to break this to you, but Amazon doesn’t have anyone who talks on the phone. Not to mention the “government numbers can’t be spoofed” howler, or the “FTC badge number”. It’s like phones have some kind of legitimacy reverse-debt? If nothing else, paying someone to cold-call a customer is expensive. No business is going to do that for anything they don’t have a chance of turning into a sale. Letting you know about fraud? Email at best. They want you to call them. I’m baffled that anyone still picks up when the phone rings.

Gabriel L. Helman

Rough Loss

Woof, rough loss for my 49ers. The Super Bowl is always more fun when it’s close, but having the game snatched away in overtime—after you already pulled ahead—is absolutely brutal. One of those games where it’s not so much that one team played better than the other, but that one played slightly less worse. Ouch. Ouch, ouch, ouch.

Gabriel L. Helman

Not customers, partners

Over the weekend, Rusty Foster wrote up how things went migrating Today in Tabs off of substack: Tabs Migration Report.

There’s an observation towards the end that stuck with me all weekend:

All of us who make a living doing creative work on the internet inevitably find ourselves in a partnership with some kind of tech platform.

Whenever a platform starts acting the fool, there’s always a contingent of folks saying this is why you should run your own systems, and so on. And, speaking as someone who has literally run an entire website from a server in a closet, it’s great that that option exists. It’s great that the structure of the internet is such that you can run your own webserver, or email system, or mastodon instance. But if you’re putting creative work out on the web, you shouldn’t have to, any more than writers should have to know how to operate a printing press. Part of the web and personal computers “growing up” is that you don’t have to be an all-in expert to use this stuff anymore.

There’s always a lot of chatter about whether big systems are “platforms” or “publishers”, and what the responsibilities of those big systems are. In actual grown-up media, it’s understood that, say, a writer and a publisher are in a partnership—a partnership with some asymmetry, but a partnership. It’s not a customer-service relationship with upside-down power dynamics.

I know it’s nowhere close to that simple, but maybe we need to start saying “partner” instead of “platform”.

Gabriel L. Helman

Kirk Drift

I read this years ago, but didn’t save a bookmark or anything, and never managed to turn it up again—until this weekend, when I stumbled across it while procrastinating as hard as I could from doing something else:

Strange Horizons - Freshly Remember'd: Kirk Drift By Erin Horáková:

There is no other way to put this: essentially everything about Popular Consciousness Kirk is bullshit. Kirk, as received through mass culture memory and reflected in its productive imaginary (and subsequent franchise output, including the reboot movies), has little or no basis in Shatner’s performance and the television show as aired. Macho, brash Kirk is a mass hallucination.

I’m going to walk through this because it’s important for ST:TOS’s reception, but more importantly because I believe people often rewatch the text or even watch it afresh and cannot see what they are watching through the haze of bullshit that is the received idea of what they’re seeing. You “know” Star Trek before you ever see Star Trek: a ‘naive’ encounter with such a culturally cathected text is almost impossible, and even if you manage it you probably also have strong ideas about that period of history, era of SF, style of television, etc to contend with. The text is always already interpolated by forces which would derange a genuine reading, dragging such an effort into an ideological cul de sac which neither the text itself nor the viewer necessarily have any vested interest in. These forces work on the memory, extracting unpaid labour without consent. They interpose themselves between the viewer and the material, and they hardly stop at Star Trek.

It’s excellent, and well worth your time.

(Off topic, I posted this to my then-work Slack, and this was the article that caused a coworker to wish that Slack’s link previews came with an estimated reading time. So, ah, get a fresh coffee and go to the bathroom before reading this.)

This is from 2017, and real life has been “yes, and”-ing it ever since. This provides a nice framework to help understand such other modern bafflements as “who are these people saying Star Trek is woke now” and “wait, do they not know that Section 31 are villains?”

And this is different from general cultural context drift, or “reimaginings”; this is a cultural mis-remembering played back against the source material. And it’s… interesting which way the mis-remembering always goes.

Gabriel L. Helman

Saturday Stray Thoughts

No one ever looks back at their life and thinks, “I wish I had spent less time petting cats.”

Gabriel L. Helman

A construction site! We need that good feminine energy: Barbie (2023)

Most of last year’s big movies “finally” hit streaming at the end of the year, so I’ve been working my way through them. Spoilers ahoy, but I promise not to give away any jokes, at least other than the one I used in the title.

There’s an almost infinite number of sentences you can start with “What’s really brilliant about Barbie is…” Here’s mine: What’s really brilliant about Barbie is how smart Barbie, the character, is. She’s not a ditz, or uneducated, or even really naïve—she’s an incredibly smart, accomplished person from a totally alien culture. The challenges she faces in “the real world” aren’t those of Buzz Lightyear grappling with existential crisis, they’re an immigrant realizing that the stories she heard back home are all wrong, and adapting.

There’s a difference between “naïve” and “misinformed”, and this movie starts all of its characters—the Barbies, Kens, business people, and students—solidly in the second category. Amongst the many joys of the film is watching all the characters adapt and learn.

In the center of this, Margot Robbie does one of the most subtle pieces of acting I can remember seeing. At the start of the movie, she looks and moves like a living doll; the way she stands, the way she moves her hands, the way she holds the not-quite-human expressions on her face. But by the end of the movie, she looks like a regular person—a regular person that looks like Margot Robbie, sure, but human. The expressions are natural, her gait has changed, her hands are real hands. But at no point is there a big change; she doesn’t get zapped with a “human ray”, she never shows off the difference by inflating and deflating while Lois Lane is in the other room. She just quietly changes over the course of the movie, subtly so that you never notice the change from scene to scene, but by the end she’s transformed. It’s a remarkable physical performance.

The movie has gotten an… interesting critical reaction.

The modern critical apparatus—pro and “internet”—has a hard time with movies that don’t fit in a comfortable category, and with movies about women. This one was both, and you could tell it really ran into some weird seams in the structure of how movie reviews even work.

Just about every professional review was positive, some extremely so, and others with a sort of grudging quality where you could tell the writer was grouchy they hadn’t gotten to use the negative review they’d half-finished ahead of time.

Because the cognitive dissonance of “they made a movie about Barbie—Barbie, of all things—and it’s good” was clearly a struggle for many. My favorite way this manifested—and by favorite I mean the sort of laugh to keep you from crying favorite—was when everyone did their top ten best movies of the year lists in December. About half of the ones I read had Barbie on them, the other half not. But every single one that didn’t have Barbie on the list talked about it, pondering if it should have been there, justifying to themselves why it wasn’t. One, which I have lost the tab for and refuse to go digging for, had a long, thoughtful postscript after the end of the top ten list itself about how Barbie had made her cry four different times while watching it, but somehow these other movies were better. She spent more words on Barbie than her ten “best” movies combined. That’s a real calls are coming from inside the house moment, I think, maybe you’re trying to tell yourself something? If nothing else, maybe step back and define your terms: what do you even mean by “best”?

Roger Ebert (hang on, I know he wasn’t around to review this, but this is gonna connect, bear with me) has a bit in his 2003 four-star Great Movies re-review of The Good, the Bad and the Ugly where he talks about how he reacted to it when he saw it the first time in 1968, and says:

Looking up my old review, I see I described a four-star movie but only gave it three stars, perhaps because it was a "spaghetti Western" and so could not be art.

Barbie feels like a movie that’s going to cause a lot of similar reflections 35 years from now.

That said, my favorite review I read was jwz’s capsule description of watching it with his mother, wherein his mom just can’t engage with the fact that the movie called “Barbie” has the same opinion of Barbies that she does.

On the one hand, you clearly have the set of critics unable to comprehend that a movie based on Barbie could possibly be good, but also it’s hard to have an extended critical discourse about a movie where America Ferrera looks the camera dead in the eye and says the central thesis of the movie out loud.

I was just talking about the modern style of Tell over Show, and this movie is a perfect example of that done well. We live in an era where subtlety is overrated; sometimes it’s okay to just say what you mean.

And—ha ha ha—everything you’ve read so far I wrote before the Oscar nominations were announced. I don’t have much to add to that particular discourse except to point out that regardless of context, any time a movie gets nominated for Best Picture and not Best Director, something is hinky.

Gerwig and Robbie being snubbed by the Oscars while Gosling was not is, of course, infuriating, but it’s also the new grand champion of “real life comes along after the fact and proves the movie’s point better than the movie ever could.” (The previous all-timer was when Debbie Reynolds died the day after Carrie Fisher, so Carrie couldn’t even have her own funeral, which is a fact that should be tacked on to the end of every print of Postcards From The Edge.) The post-credits scene should just be Margot Robbie reading tweets about how Barbie should be grateful for the 8 nominations it got against Oppenheimer’s 12.

One of the other things that makes the movie hard to talk about is that all the movies you want to compare it to are made by and about men, and that feels a little strange.

The movie’s satire and takedown of toxic masculinity is as pointed and full-throated as any I’ve ever seen. It’s a David Fincher movie, but backwards and in high heels. But where Fincher tends to stop at “look at this sad weirdo”, this movie knows the Kens are victims too, and wants things to work out for them as well. But it, you know, tells a story about toxic men from the perspective of the women, which shouldn’t be rare, but there you go.

It’s worth acknowledging that as a straight cis man who didn’t have Barbies as a central feature of his childhood, I probably responded to different things than maybe the intended core audience. (And, by the way, please someone do a metafictional satire of marketing and gender roles with The Transformers. I’m available to pitch.) But on the other hand maybe not? It’s dangerous to try and attribute any part of a work with shared authorship to a specific contributor, but this movie was written by both Greta Gerwig and her husband, Noah Baumbach, and there were definitely parts where I thought I could hear the voice of a fellow Alan. The Kens are a satire of a very particular Kind of Guy, and I can’t stand those guys either. The joke about “The Godfather” felt like it was written for me, personally. Alan trying to sneak out of Barbieland to get away from the Kens was basically my entire High School experience summed up.

There are so many nice little touches. The way that the “real world” is portrayed as realistically as possible, but the inside of the Mattel offices is a realm as strange and alien as Barbieland. The way those realms play as having their own internal rules that are totally different from each other. How casual the people in the know are with beings crossing between those realms, and the way they interact with each other. The fact that there’s a ghost, which is treated as being not supernatural at all, until it is. (In some ways, this is the best Planescape movie we’re ever going to get.) The way the Ken dance fight resolves. The choice of song the Kens use as a serenade. There’s a Doctor Who in it!

And, I don’t want to give anything away, but the thing Barbie says after being scolded by a student? When she’s crying on that bench? That’s the hardest I’ve laughed at a movie in I don’t know how long.

Mostly, though, what I like is how kind this movie is. It’s rare to see a satire that’s actually funny and loves both all of its characters and the thing that it’s satirizing—in fact, I’m not sure I can think of another one. Galaxy Quest, maybe?

This is a movie that wants everyone in it to do well. It’s a little disappointed in some of them, but no one really gets “a comeuppance” or gets “punished”, or really even a Second Chance; instead everyone gets an opportunity to do better, and they all take it.

There’s a saying that inside every cynic is a disappointed idealist, and this is a movie made by some very disappointed people. But it’s a movie that spends as much time showing a way forward as it does attacking the problem.

The resolution manages to simultaneously hit “Immigrant assimilating while still being true to themselves”, “Pinocchio”, and “the end of 2001”, and do it with a joke. Brilliant.

Absolutely the best movie I saw last year.

Gabriel L. Helman

40 years of…

Just about 40 years ago, my Dad brought something home that literally changed my life. It was a computer—a home computer, which was still on the edge of being science fiction—but more than that, it was a portal. It was magic, a box of endless possibilities. It’s not even remotely hyperbole to say that bringing that computer home, which had just been released into the world, utterly changed the entire trajectory of my life.

I am, of course, talking about the Tandy 1000.

That’s not how you expected that sentence to end, is it? Because this year is also the 40th anniversary of the Mac. But I want to spend a beat talking about the other revolutionary, trend-setting computer from 1984, before we talk about the ancestor of the computer I’m writing this on now.

I’ve been enjoying everyone’s lyrical memories of the original Mac very much, but most have a slightly revisionist take that once the original Mac landed in ’84, it was obviously the future. Well, it was obviously a future. It’s hard to remember now how unsettled the computer world was in the mid-80s. The Tandy 1000, the IBM AT, and the Mac all landed in ’84. The first Amiga would come out the next year. The Apple IIgs and original Nintendo the year after that. There were an absurd number of other platforms; Commodore 64s were selling like hotcakes, Atari was still making computers, heck, look at the number of platforms Infocom released their games for. I mean, the Apple ][ family still outsold the Mac for a long time.

What was this Tandy you speak of, then?

Radio Shack started life as a company supplying parts to amateur (ham) radio operators, and expanded into things like hi-fi audio components in the 50s. In one of the greatest “bet it all on the big win” moves I can think of, the small chain was bought by—of all people—the Tandy Leather Company in the early 60s. They made leather goods for hobbyists and crafters, and wanted to expand into other hobby markets. Seeing no meaningful difference between leather craft hobbyists and electronics ones, Charles Tandy bought the chain, and reworked and expanded the stores, re-envisioning them as, basically, craft stores for electronics.

I want to hang on that for a second. Craft stores, but for amateur electronics engineers.

It’s hard to express now, in this decayed age, how magical a place Radio Shack was. It seems ridiculous to even type now. If you were the kind of kid who was in any way into electronics, or phones in the old POTS Ma Bell sense, or computery stuff, Radio Shack was the place. There was one two blocks from my house, and I loved it.

When home computers started to become a thing, they came up through the hobbyist world; Radio Shack was already making their own parts and gizmos, so it was a short distance to making their own computers. Their first couple of swings, the TRS-80 and friends, were not huge hits, but not exactly failures either. Apple came out of this same hobbyist world, and then IBM got involved because they were already making “big iron”; could they also make “little iron”?

For reasons that seem deeply, deeply strange four decades later, when IBM built their first PC, instead of writing their own operating system, they chose to license one from a little company outside of Seattle called Microsoft—maybe you’ve heard of them—with terms that let Gates and friends sell their new OS to other manufacturers. Meanwhile, for other reasons, equally strange, the only part of the IBM PC actually exclusive to IBM was the BIOS; the rest was free to be copied. So this whole little market formed where someone could build a computer that was “IBM Compatible”—again, maybe you’ve heard of this—buy the OS from that outfit up in Redmond, and take advantage of the software and hardware that was already out there. The basic idea that software should work on more than one kind of computer was starting to form.

One of the first companies to take a serious swing at this was Tandy, with the Tandy 2000. In addition to stretching the definition of “compatible” to the breaking point, it was one of the very few computers to ever use the Intel 80186, and was bought by almost no one, except, through a series of events no one has ever been able to adequately explain to me, my grandmother. (I feel I have to stress this isn’t a joke, Grandma wrote a sermon a week on that beast well into the late 90s. Continuing her track record for picking technology, she was also the only person I knew with a computer that ran Windows Me.)

As much as the original IBM PC was a “home computer”, it was really a small office computer, so IBM tried to make a cut down, cheaper version of the PC for home use, for real this time. I am, of course, talking about that infamous flop, the IBM PCjr, also 40 years old this year and deserving of its total lack of retrospective articles.

Tandy, meanwhile, had scrambled a “better PCjr” to market, the Tandy 1000. When the PCjr flopped, Tandy pivoted, and found they had the only DOS-running computer on the market with all the positives of the PCjr, but with a keyboard that worked.

Among these positives, the Tandy 1000 had dramatically better graphics and sound than anything IBM was selling. “Tandy Graphics” was a step up from CGA but not quite to EGA, and the “Tandy Sound” could play three notes at once! Meanwhile, the Tandy also came with something called DeskMate, an app suite / operating environment that included a text editor, spreadsheet, calendar, and basic database, all with a text-character-based GUI.

So they found themselves in a strange new market: a computer that could do “business software”, both with what was built-in and what was already out there, but could also do, what are those called? Oh yeah, games.

The legend goes that IBM commissioned the nascent Sierra On-Line to write the first King’s Quest to show off the PCjr; when that flopped, Sierra realized that Tandy was selling the only computer that could run their best game, and Tandy realized there was a hit game out there that could only run on their rigs. So they both leaned all the way in.

But of course, even the Tandy couldn’t match “arcade games”, so the capabilities and limits helped define what a “PC game” was. Adventure games, flight sims, RPGs. And, it must be said, both the words “operating” and “system” in MS-DOS were highly aspirational. But what it lacked in features it made up for in being easy to sweep to the side and access the hardware directly, which is exactly what you want if you’re trying to coax game-quality performance out of the stone knives and bearskins of 80s home computers. Even when, a couple years later, the NES cemented the “home console” market that Atari had sketched out, “PC games” had already developed their own identity vs “console games”.
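(For the curious, here’s a minimal sketch—mine, not anything from an actual 80s game—of what “sweeping DOS aside” looked like in practice. Instead of asking DOS or the BIOS to print a character, a game would write character-and-attribute pairs straight into the text-mode video buffer at segment 0xB800. This assumes a real-mode DOS compiler like Turbo C, and put_char_fast is just a made-up name for illustration.)

    #include <dos.h>   /* MK_FP lives here in Turbo C / Borland C */

    /* Write one character directly into 80x25 color text-mode video memory,
       bypassing DOS and the BIOS entirely. */
    void put_char_fast(int row, int col, char ch, unsigned char attr)
    {
        /* The color text buffer sits at segment 0xB800; each screen cell is
           two bytes: the character, then its color attribute. */
        unsigned char far *screen = (unsigned char far *) MK_FP(0xB800, 0);
        unsigned int offset = (row * 80 + col) * 2;   /* 80-column text mode */

        screen[offset]     = ch;    /* character byte */
        screen[offset + 1] = attr;  /* foreground/background color */
    }

Multiply that trick out across the graphics modes, the timer chip, and the sound hardware, and that’s most of what an 80s PC game was actually doing under the hood.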

Radio Shacks got a whole corner, or more, turned over to their new computers. They had models out running programs you could play with, peripherals you could try, and most critically, a whole selection of software. I can distinctly remember the Radio Shack by my house with a set of bookstore-like shelves holding what was, at the time, every game yet made by Sierra, Infocom, and everyone else. Probably close to every DOS game out there. I have such clear memories of poring over the box to Starflight, or pulling Hitch-hiker’s Guide off the shelf, or playing Lode Runner on the demo computer.

A home computer with better graphics and sound than its contemporaries, pre-loaded with most of what you need to get going, and supported by its very own retail store? Does that sound familiar?

I’m cheating the timeline a little here: the Tandy 1000 didn’t release until November, and we didn’t get ours until early ’85. I asked my Dad once why he picked the one he did, of all the choices available, and he said something to the effect that he’d asked the “computer guy” at work which one he should get, and that guy indicated he’d get the Tandy, since it would let you do the most different kinds of things.

Like I said at the top, it was magic. We’re so used to them now that it’s hard to remember, but I was so amazed that here was this thing, and it would do different things based on what you told it to do! I was utterly, utterly fascinated.

One of the apps Dad bought with the computer was that first King's Quest, and I was absolutely transfixed that you could drive this little guy around on the screen. I’d played arcade games—I’d probably already sunk a small fortune into Spy Hunter—but this was different. You could do things. Type what you thought of! Pushing the rock aside to find a hole underneath was one of those “the universe was never the same again” moments for me. I could barely spell, and certainly couldn’t type, but I was hooked. Somewhere (and it still exists) there’s a sheet of paper where my Mom wrote a list of words for me to reference: look, take, shove.

And I wasn’t the only one; both of my parents were as fascinated as I was. My mom sucked down every game Infocom and Sierra ever put out. The Bard's Tale landed a year later, and my parents played that obsessively.

It was a family obsession, this weird clunky beige box in the kitchen. Portals to other worlds, the centerpiece of our family spending time together. Four decades on, my parents still come over for dinner once a week, and we play video games together. (Currently, we’re still working on Tears of the Kingdom, because we’re slow.)

Everyone has something they lock onto between about 6 and 12 that’s their thing from that point on. Mine was computers. I’ve said many, many times how fortunate I feel that I lived at just the right time for my thing to turn into a pretty good paying career by the time I was an adult. What would I be doing to pay this mortgage if Dad hadn’t brought that Tandy box into the house 40 years ago? I literally have no idea.

Time marched on.

Through a series of tremendous own-goals, Radio Shack and Tandy failed to stay on top of, or even competitive in, the PC market they helped create, until, as the Onion said: Even CEO Can't Figure Out How RadioShack Still In Business.

Meanwhile, through a series of maneuvers that, it has to be said, were not entirely legal, Microsoft steadily absorbed most of the market, with the unsettled market of the 80s really coalescing into the Microsoft-Intel “IBM Compatible” platform with the release of Windows 95.

Of all the players I mentioned way back at the start, the Mac was the only other one that remained; even the Apple ][, originally synonymous with home computers, had faded away. Apple had carved out a niche for the Mac for things that could really take advantage of the UI, mainly desktop publishing, graphic design, and your one friend’s Dad.

Over the years, I’d look over at the Mac side of the house with something approaching jealousy. Anyone who was “a computer person” in the 90s ended up “bilingual”, more-or-less comfortable on both Windows and Mac Classic. I took classes in graphic design, so I got pretty comfortable with Illustrator and Aldus PageMaker on the Mac.

I was always envious of the hardware of the old Mac laptops, which developed into open lust when those colored iBooks came out. The one I wanted the most, though, was that iMac G4 with the “Pixar lamp” design.

But the thing is, they didn’t do what I was mostly using a computer for. I played games, and lots of them, and for a whole list of reasons, none of those games came out for the Mac.

If ’84 saw the release of both the first Mac, and one of the major foundation stones of the modern Windows PC platform, and I just spent all that time singing the praises of my much missed Tandy 1000, why am I typing this blog post on a MacBook Pro? What happened?

Let me spin you my framework for understanding the home computer market. Invoking the Planescape Rule-of-Threes, there are basically three demographics of people who buy computers:

  1. Hobbyists. Tinkerers. People who use computers as a source of self-actualization. Hackers, in the classical sense, not the Angelina Jolie sense.
  2. People who looked at the computer market and thought, “I bet I could make a lot of money off of this”.
  3. People who had something else to do, and thought, “I wonder if I could use a computer to help me do that?”

As the PC market got off the ground, it was just that first group, but then the other two followed along. And, of course, the people in the second group quickly realized that the real bucks were to be made selling stuff to that first group.

As the 80s wound on, the first and second group clustered on computers running Microsoft, and the third group bought Macs. Once we get into the late 90s the hobbyist group gets split between Microsoft and Linux.

(As an absolutely massive aside, this is the root of the weird cultural differences between “Apple people” and “Linux people”. The kind of people who buy Apples do so specifically so they don’t have to tinker, and the kinds of people who build Linux boxes do so specifically so that they can. If you derive a sense of self from being able to make computers do things, Apples are nanny-state locked-down morally suspect appliances, and if you just want to do some work and get home on time and do something else, Linux boxes are massively unreliable Rube Goldberg toys for people who aren’t actually serious.)

As for me? What happened was, I moved from being in the first group to the third. No, that’s a lie. What actually happened was I had a kid, and realized I had always been in the third group. I loved computers, but not for their own sake, I loved them for the other things I could do with them. Play games, write text, make art, build things; they were tools, the means to my ends, not an end in themselves. I was always a little askew from most of the other “computer guys” I was hanging out with; I didn’t want to spend my evening recompiling sound drivers, I wanted to do something that required the computer to play sound, and I always slightly resented it when the machine required me to finish making the sausage myself. But that’s just how it was, the price of doing business. Want to play Wing Commander with sound? You’d better learn how HIMEM works.

As time passed, and we rolled into the 21st century, and the Mac moved to the BSD-based OS X, and then again to Intel processors, I kept raising my eyebrows. The Mac platform was slowly converging into something that might do what I wanted it to do?

The last Windows PC I built for myself unceremoniously gave up the ghost sometime in 2008 or ’09, I can’t remember. I needed a new rig, but our first kid was on the way, and I realized my “game playing” time had already shrunk to essentially nil. And, by this time I had an iPhone, and trying to make that work with my Windows XP machine was… trying. So, I said, what the hell, and bought a refurbed 2009 polycarb MacBook. And I never looked back.

I couldn’t believe how easy it was to use. Stuff just worked! The built-in apps all did what they were supposed to do! Closing the laptop actually put the computer to sleep! It still had that sleep light that looked like breathing. The UI conventions were different from what I was used to on Windows for sure, but unlike what I was used to, they were internally consistent, and had an actual conceptual design behind them. You could actually learn how “the Mac” worked, instead of having to memorize a zillion snowflakes like Windows. And the software! Was just nice. There’s a huge difference in culture of software design, and it was like I could finally relax once I changed teams. It wasn’t life-changing quite the way that original Tandy was, but it was a fundamental recalibration in my relationship with computers. To paraphrase all those infomercials, it turns out there really was a better way.

So, happy birthday, to both of my most influential computers of the last forty years. Here’s to the next forty.

But see if you can pick up some actual games this time.

Gabriel L. Helman

Monday Snarkblog

I spent the weekend at home with a back injury letting articles about AI irritate me, and I’m slowly realizing how useful Satan is as a societal construct. (Hang on, this isn’t just the painkillers talking). Because, my goodness, I’m already sick of talking about why AI is bad, and we’re barely at the start of this thing. I cannot tell you how badly I want to just point at ChatGPT and say “look, Satan made that. It's evil! Don't touch it!”

Here’s some more open tabs that are irritating me, and I’ve given myself a maximum budget of “three tweets” each to snark on them:

Pluralistic: American education has all the downsides of standardization, none of the upsides (16 Jan 2024)

Wherein Cory does a great job laying out the problems with Common Core and how we got here, and then blows a fuse and goes Full Galaxy Brain, freestyling a solution where computers spit out new tests via some kind of standards-based electronic mad libs. Ha ha, fuck you man, did you hear what you just said? That’s the exact opposite of a solution, and I’m only pointing it out because this is the exact crap he’s usually railing against. Computers don’t need to be all “hammer LFG new nails” about every problem. Turn the robots off and let the experts do their jobs.

I abandoned OpenLiteSpeed and went back to good ol’ Nginx | Ars Technica

So wait, this guy had a fully working stack, and then was all “lol yolo” and replaced everything with no metrics or testing—twice??

I don’t know what the opposite of tech debt is called, but this is it. There’s a difference between “continuous improvement” and “the Winchester Mystery House”, and boy oh boy are we on the wrong side of the looking glass.

The part of this one that got me wasn’t where he sat on his laptop in the hotel on his 21st wedding anniversary trip fixing things, it was the thing where he had already decided to bring his laptop on the trip before anything broke.

Things can just be done, guys. Quit tinkering to tinker and spend time with your family away from screens. Professionalism means making the exact opposite choices as this guy.

Gabriel L. Helman

X-Wing Linkblog Friday

The Aftermath continues to be one of the few bright spots in the cursed wasteland of digital media. And yes, I realize Icecano is slowly devolving into just a set of links to the Aftermath that I gesture enthusiastically towards.

Today’s case in point: X-Wing Is Video Gaming's Greek Fire

And, oh man, Yes, And.

The gist here is that not only were the X-Wing games the peak of the genre in terms of mechanics, and not only has no one been able to reproduce them, no one has really even tried. It’s wild to me that space fighter “sims” were a big deal for the whole of the nineties, and then… nope, we don’t do that anymore. Like he says, it’s remarkable that in the current era of “let’s clone a game from the old days with a new name”, no one has touched the X-Wings. Even Star Wars: Squadrons, which was a much better Wing Commander than it was an X-Wing, didn’t quite get there.

And yeah! You boot up TIE Fighter today, and it still genuinely plays better than anything newer in the genre. It’s insane to me that the whole genre just… doesn’t exist any more? It feels like the indie scene should be full of X-Wing-alikes. Instead we got Strike Suit Zero, the two Rebel Galaxys, and that’s it? Yes, I know, you want to at-me and say No Man’s Sky or Elite and buddy, those could not be less what I’m talking about. That goes double for whatever the heck is going on down at Star Citizen.

(Although, speaking of recapturing old gameplay mechanics, I am going to take this opportunity to remind everyone that Descent 4 came out, it was just called Overload.)

A while back my kid asked me what video game I’d turn into a movie, and without missing a beat and not as a joke, I answered “TIE Fighter.”

“Dad!!” he yelled. “That already has a movie!”

And, obviously, but the Star War I keep wishing someone would make is the X-Wing pilot show; Top Gun or Flight of the Intruder, but with R2 units. I don’t understand why you spend a quarter billion dollars to set some kid up to fail with his bad Harrison Ford impression before you do this.

So, he said as an artful segue, in other X-Wing news, remember Star Wars fan films?

There was a whole fan film bubble around the turn of the century, during the iMac DV era. The bubble didn’t pop exactly, but now that energy mostly gets channeled into 3 hour Lore Breakdown Videos on youtube that explain how the next 200 million dollar blockbuster is going to fail because it isn’t consistent with the worst book you read 20 years ago. ( *Puts finger to earpiece* I’m sorry, I’ve just been informed that The Crystal Star was, in fact, thirty years ago. We regret the error.)

But! People are still out there making their indie Star Wars epics, and so I’d like to call to your attention to: Wingman - An X-Wing Story | Star Wars Fan Film | 2023 - YouTube

(Attention conservation notice: it’s nearly an hour long, but all the really good ideas are in the first 15 minutes, and in classic fan film fashion it just kinda keeps… going…)

It’s the fan-filmsiest possible version of the X-Wing movie idea. It’s about 20 guys filming an X-Wing movie in their basement for something like 4 grand. They only have one set: the cockpit, and some really nice replica helmets. They use the sound effects from the cockpit controls in the X-Wing games! The group that made this is all in Germany, so the squadron looks like it’s made up entirely of the backup keyboard players from Kraftwerk. It’s an interesting limit case in “how can we tell a story when the characters can’t even stand up or be on screen at the same time.”

I was going to put some more snark here, but you know what? It’s a hell of an impressive thing, considering. I’d never show this to someone who wasn’t already completely bought in at “surprisingly good X-Wing fan film”, but I think it demonstrates that the basic premise is sound?

This feels like the ceiling for how good a fan film should get. Could it be better? Sure. But if you put any more effort into an indie movie than this, you need to pivot and go make Clerks or El Mariachi or A Fistful of Fingers or something. The only person who got a ticket into Hollywood from a fan film was the guy who made Troops, and even he never got to direct a big-boy movie.

So, okay, what have we learned from today’s program:

  1. Someone needs to make an X-Wing-alike.
  2. If you’re about to spend a second thousand dollars on your X-Wing cockpit, maybe try and make an indie festival film instead?
  3. Now that Patty Jenkins' Rogue Squadron movie is cancelled, someone pitch a fighter pilot show to Disney+.

I should probably have another paragraph here where I tie things up? But it’s Friday, don’t worry about it.

Gabriel L. Helman

Rebel Moon (2023)

This is as close to a review-proof movie as has ever been made: at this point, everyone on earth knows what to expect from “Zack Snyder does a sci-fi remake of Seven Samurai” and whether they’re in the target audience. And yeah, it’s exactly the movie you imagined when you read that sentence.

At this point, Snyder’s tics and interests as a filmmaker are well established. They shouldn’t be a surprise to anyone, and there’s a shot right towards the start where the main character is planting seeds and the seeds flying through the air go into slow motion, so it’s not like they’re a surprise to him either. This is solidly playing to the part of the crowd that’s already bought in.

There’s guns that shoot blobs of molten lead. There’s a prison robot that looks like one of the statues from Beetlejuice. There’s another robot that looks like Daft Punk’s forgotten third brother and is voiced by Anthony Hopkins of all people. I’m pretty sure if this had come out when I was fourteen, I’d have loved it. Everyone seems like they’re having a good time, and I think the number of people in this movie who also worked on Justice League says a lot.

It’s not a good movie, but it’s a hard movie to genuinely dislike. I feel like this is where I should work up some irritation that something this banal got this much money, but you know what? I just can’t summon the energy. Are there things to complain about? Sure. But criticizing it feels like walking into a Mexican restaurant and complaining that they won’t sell you a hamburger; the correct response is to ask “what part of the sign out front was confusing?” It’s exactly the movie it said it was going to be; if you watched it and didn’t like it, well, your media illiteracy is not the movie’s problem. If I was in charge of spending Netflix’s money, would I have bought this movie? Probably not. But in a world where Netflix also keeps throwing money at the Dave Chappelle transphobia carnival, it hardly seems worth being irked by.

So, a few stray observations:

Before the movie came out, Star Wars was always cited as the big influence, but I suspect the movie owes a much bigger debt to Warhammer 40k. The whole design style and look of the thing screams Games Workshop. Like Event Horizon, this feels like it could cleanly slot in as an unofficial prequel. (Warhammer 15k?)

Speaking of Star Wars, Luke Skywalker's first outing is the mother of all “making hard things look easy” movies. Lucas does some very subtle, tricky things with tone, and pacing, and exposition (more about that in a second), and at first glance it looks like anyone could do the same thing. Much better filmmakers than Snyder have stepped on the rake of “well, how hard could it be to make a Star War?” to the point where there’s almost no shame in it. Hell, Lucas himself only has about a 50% hit rate for “like Star Wars, and good.”

Right after I saw it, my initial zinger was “So you know, if *I* was going to remake Seven Samurai in the style of Star Wars, I'd try to make sure I understood how at least one of those movies worked.” Mostly what I was talking about was the different approaches to exposition.

For a long time, one of the established rules for telling stories in a visual medium was Show Don’t Tell. But this was always more of a guideline than a rule. Author & extensive sex-scandal-haver Warren Ellis has a great riff on this: Show Don’t Tell Is A Tool Not A Rule.

That said, the style over the last decade or two has swung the other way. To pick one example at semi-random, author & divorced-but-no-sex-scandals-as-far-as-we-know-haver Neil Gaiman basically built an entire career out of Tell Don’t Show. The 21st century Doctor Who has been a heavy Tell Don’t Show work (although that’s as much working around the budget as anything).

What’s funny about this is that the two major sources for Rebel Moon, The Seven Samurai and Star Wars, are as Show Don’t Tell as it gets. Kurosawa, especially in his historical samurai movies, would make movies that just dropped the audience into a fully operational world and let them catch up. Lucas, heavily inspired by Kurosawa, turned around and applied the same technique to fictional settings. In both cases, all the characters know how the world works, they talk casually and in slang, and there’s no character whose job is to ask questions. Supposedly, Lucas originally wanted to be a documentarian, and he shot his science fantasy movies like it. Star Wars especially I describe as having a “vibes-over-lore” approach to worldbuilding; you learn how things work by how the characters react to them, not what they say about them.

Flashing forward fifty years, Rebel Moon is all the way on the other end of the spectrum. Everywhere Kurosawa would have cut to a wide shot and let the wind howl a little, or Lucas would have pointed the camera at the sunset and let John Williams take over for a little bit, here a character looks directly into the camera and dictates another entry for the Lore Wiki. It’s not actually that strange for a genre movie here in the early Twenties, but the contrast between this and the style of the direct source material is vast.

I feel a little bad doing this in a review, but the best illustration of the difference is the Auralnauts’ Zack Snyder's STAR WARS: Part 1 - A New Hope, which reworks the original Star Wars to work like Rebel Moon does.

Like Zack, I’m gonna save the wrapup and resolution for the second part.
