Gabriel L. Helman

We Need to Form An Alliance! Right Now!!

Today is the fourth anniversary of my single favorite piece of art to come out of the early-pandemic era, this absolute banger by the Auralnauts:

Back when we still thought this was all going to “blow over” in a couple of weeks, my kids were planning to do this song for the talent show at the end of that school year.

(Some slight context; the Auralnauts Star Wars Saga started as kind of a bad lip-reading thing, and then went its own way into an alternate version of Star Wars where the jedi are frat-bro jerks and the sith are just trying to run a chain of family restaurants. The actual villain of the series is “Creepio”, who has schemes of his own. I’m not normally a re-edit mash-up guy, but those are amazing.)

Gabriel L. Helman

Apple Report Card, Early 2024

So, let’s take a peek behind the ol’ curtain here at icecano dot com. I’ve got a drafts folder on my laptop which—for reasons that are unlikely to become clear at the moment—is named “on deck”, and when I come across something that might be a blog post but isn’t yet, I make a new file and toss it in. Stray observations, links, partially-constructed jokes, whatever. A couple lifetimes back, these probably would have just been tweets? Instead, I kind of let them simmer in the background until I have something to do with them. For example, this spent two months as a text file containing only the phrase “that deranged rolling stone list”. I have a soft rule that after about three months they get moved to another folder named “never mind.”

And so, over the last couple of months that drafts pile had picked up a fair number of stray Apple-related observations. There’s been a lot going on! Lawsuits, the EU, chat protocols, shenanigans galore. So I kept noting down bits and bobs, but no coherent takes or anything. There was no structure. Then, back in February, Six Colors published their annual “Apple Report Card”.

Every year for the last decade or so, Six Colors has done an “Apple Report Card”, where they poll a panel on a variety of Apple-related topics and get a sense of how the company is doing, or at least is perceived as doing. This year’s was Apple in 2023: The Six Colors report card. There’s a series of categories where the panel grades the company on an A-to-F scale and adds commentary as desired.

The categories are a little funny, because they made a lot of sense a decade ago but aren’t quite what I’d make them in 2024, but having longitudinal data is more interesting than revising the buckets. And it’s genuinely interesting to see how the aggregate scores have changed over the years.

And, so, I think, aha, there’s the structure, I can wedge all these japes into that shape, have a little fun with it. Alert readers will note this was about when I hurt my back, so I wasn’t in shape to sit down and write a longer piece for a while there, and the lashed-together draft kinda floated along. Finally, this week, I said to myself, I said, look, just wrap that sucker up and post it, it’s not like there’s gonna be any big Apple news this week!

Let me take a big sip of coffee and check the news, and then let’s go ahead and add a category up front so we can talk about the antitrust lawsuit, I guess?

The Antitrust Lawsuit, I Guess: F

This has been clearly coming for a while, as the antitrust and regulatory apparatus continues to slowly awaken from its long slumber.

At first glance, I have a very similar reaction to this as I had to the Microsoft Antitrust thing back in the 90s, in that:

  1. This action is long overdue
  2. The company in question should have seen this coming and easily dodged, but instead they’re sucking claw to the face
  3. The DOJ has their attention pointed at all the wrong things, and the legal action is either gonna ricochet off or cause more harm than good. They actually mention the bubble colors in the filing, for chrissakes. Mostly they seem determined to go after the things people actually like about Apple’s gear?

But this is all so much dumber than last time, mostly because Microsoft wasn’t living in a world where the Microsoft lawsuit had already happened. This was so, so avoidable; a few performative rule changes, cut some prices, form a few industry working groups, maybe start a Comics Code Authority, whatever. Instead, Apple’s entire response to the whole situation has been somewhere between a little kid refusing to leave the toy store and an old guy yelling “it’s better to reign in hell than serve in heaven!” at the Arby’s drive-through. I’m not sure I can think of another example where a company blew their own legs off because deep down they really don’t believe that regulators are real. That said, the entire post-dot-com Big Tech world only exists because the entire regulatory system has been hibernating out of sight. Well, it’s roused now, baby.

You know the part at the start of that Mork Movie where The Martian keeps getting into streetfights, but then keeps getting himself out of trouble because he knows some obscure legal technicality, but then the judge that looks exactly like Kurt Vonnegut says something like “I don’t care, you hit a cop and you’re going down,” so that the rest of the movie can happen? I think Apple is about to learn some cool life lessons as a janitor, is what I’m saying.

I may have let that metaphor get away from me. Let’s move on, I’m sure there aren’t any other sections here where the recent news will cause me to have to revise from future-tense to past-tense!

Mac: A-

Honestly, as a product line, the mac is probably as coherent and healthy as it’s ever been. Now that they’re fully moved over to their own processors and no longer making “really nice Intel computers”, we’re starting to see some real action.

A line of computers where they all have the same “guts” and the form factor is an aesthetic & functional choice, but not a performance one, is something no one’s ever done before? It seems like they’re on the verge of getting into an annual or annual-and-a-half regular upgrade cycle like the iOS devices have, and that’ll really be a thing when it lands.

Well, all except The Mac Pro, which still feels like a placeholder?

Are they expensive? Yes they are. Pound-for-pound, they’re not as expensive as they seem, because they don’t make anything on the lower-end of the spectrum, and so a lot of complaints about price have the same tenor as complaining that BMWs cost more than an entry-level Toyota Camry, where you just go “yeah, man, they absolutely do.” Then I go look at what it costs to upgrade to a usable amount of RAM and throw my hands up in disgust. How is that not the lead on the DOJ action? They want HOW much to get up to 16 gigs of RAM?

macOS as a platform is evolving well beyond “BSD, but with a nice UI” the way it was back when it was named OS X. I’m not personally crazy about a lot of the design moves, but I’d be hard pressed to call most of them “objectively bad”, as opposed to “not to my taste”. Except for that new settings panel, that’s garbage.

All that said… actually, I’m going to finish this sentence down under “SW Quality”. I’ll meet ya down there.

iPhone: 🔥

On the one hand, the iPhone might be the most polished and refined tech product of all time. Somewhere around the iPhone 4 or 5, it was “done”; all major features were there, the form factor was locked in. And Apple kept refining, polishing. These supercomputers-in-slabs-of-glass are really remarkable.

On the other hand, that’s not what anyone means when they talk about the iPhone in 2024. Yeah, let’s talk about the App Store.

I had a whole thing here about how the app store was clearly the thing that was going to summon the regulators, which I took out partly because it’s superfluous now, but also because apparently it was actually the bubble colors?

There’s a lot of takes on the nature of software distribution, and what kind of device the phone is, and how the ecosystem around it should work, and “what Apple customers want.” Well, okay, I’m an Apple user. Mostly a fairly satisfied one. And you know what I want? I want the app store they fucking advertise.

Instead, I had to keep having conversations with my kids about “trick games”, and explain that no, the thing called “the Oregon trail” in the app store isn’t the game they’ve heard about, but is actually a fucking casino. I like Apple’s kit quite a bit, and I keep buying it, but never in a million years will I forgive Tim Apple for the conversations I had to keep having with them about one fucking scam app after another.

Because this is what drives me the most crazy in all the hoopla around the app stores: if it worked like they claim it works, none of this would be happening. Instead, we have bizarre and inconsistent app review, apps getting pulled after being accepted, openly predatory in-app purchases, and just scam after casino after should-be-illegal-knock-off-clone after scam.

The idea is great: Phones for people who don’t use computers as a source of self-actualization. Phones and Macs are different products, with a different deal. Part of the deal is that with the iPhone you can do “less”, but what you can do is rock solid, safe, and you don’t ever have to worry about what your mom or your kid might download on their device.

I know the deal, I signed up for that deal on purpose! I want them to hold up their side of the bargain.

Which brings me to my next point. One of the metaphors people use for iOS devices—which I think is a good one!—is that they’re more like game consoles than general purpose computers. They’re “app consoles”. And I like that, that’s a solid way of looking at the space. It’s Jobs’ “cars vs trucks” metaphor but with a slightly less-leaky abstraction.

But you know who doesn’t have these legal and developer relations problems, and who isn’t currently having their ass handed to them by the EU and the DOJ? Nintendo.

This is what kills me. You can absolutely sell a computer where every piece of software is approved by you, where you get a cut, where the store isn’t choked by garbage, where everyone is basically happy. Nintendo has been doing that for checks notes 40 years now?

Hell, Nintendo even kept the bottom from falling out of the prices by enforcing that while you could sell for any price, you had to sell the physical and digital copies at the same amount, and then left all their stuff at 60-70 bucks, giving air cover to the small guys to charge a sustainable price.

Apple and Nintendo are very similar companies, building their own hardware and software, at a slightly different angle from the rest of their industries. They both have a solid “us first, customers second, devs third” world-view. But Nintendo has maintained a level of control over their platform that Apple could only dream of. And I’m really oversimplifying here, but mostly they did this by just not being assholes? Nintendo is not a perfect company, because none of them are, but you know what? I can play Untitled Goose Game on my Switch.

In the end, Apple was so averse to games, they couldn’t even bring themselves to use Nintendo’s playbook to keep the Feds off their back.

iPad: C

I’m utterly convinced that somewhere around 1979 Steve Jobs had a vision—possibly chemically assisted—of The Computer. A device that was easy to use, fully self-contained, an appliance, not a specialist’s tool. Something kids could pick up and use.

Go dial up his keynote where he introduces the first iPad. He knows he’s dying, even if he wasn’t admitting it, and he knows this is the last “new thing” he’s going to present. The look on his face is satisfaction: he made it. The thing he saw in the desert all those years ago, he’s holding it in his hands on stage. Finally.

So, ahhhh, why isn’t it actually good for anything?

I take that back; it’s good for two things: watching video, and casual video games. Anything else… not really?

I’m continually baffled by the way the iPad just didn’t happen. It’s been fourteen years; fourteen years after the first Mac, the Mac Classic was basically over, all the stuff the Mac opened up was well in the past-tense. I’m hard-pressed to think of anything that happened because the iPad existed. Maybe in a world with small laptops and big-ass phones, the iPad just doesn’t have a seat at the big-kids table?

Watch & Wearables: C

I like my watch enough that when my old one died, I bought a new one, but not so much that I didn’t have to really, really think about it.

Airpods are pretty cool, except they make my ears hurt so I stopped using them.

Is this where we talk about the Cyber Goggles?

AppleTV

Wait, which Apple TV?

AppleTV (the hardware): B

For the core use-case, putting internet video on my TV, it’s great. Great picture, the streaming never stutters, even the remote is decent now. It’s my main way to use my TV, and it’s a solid, reliable workhorse.

But look, that thing is a full-ass iPhone without a screen. It’s got more compute power than all of North America in the 70s! Is this really all we’re going to use this for? This is an old example, but the AppleTV feels like it could easily slide into being the 3rd or 2nd best game console with almost no effort, and it just… doesn’t.

AppleTV (the service): B

Ted Lasso notwithstanding, this is a service filled exclusively with stuff I have no interest in. I’m not even saying it’s bad! But a pass from me, chief.

AppleTV (the app): F

Absolute total garbage, just complete trash. I’ll go to almost any length to avoid using it.

Services: C

What’s left here?

iCloud drive? Works okay, I guess, but you’ll never convince me to rely on it.

Apple Arcade? It’s fine, other than it shouldn’t have to exist.

Apple Fitness? No opinion.

Apple News? Really subpar, with the trashiest ads I’ve seen in a while.

Apple Music? The service is outstanding, no notes. The app, however, manages to keep getting worse every OS update, at this point it’s kind of remarkable.

Apple Music Classical? This was the best you could do, really?

iTunes Match? I’m afraid to cancel. Every year I spend 15 bucks so I don’t have to learn which part of my cloud library will vanish.

There’s ones I’m not remembering, right? That’s my review of them.

Homekit: F

I have one homekit device in my house—a smart lightbulb. You can set the color temperature from the app! There is no power in the universe that would convince me to add a second.

HW Reliability: A

I don’t even have a joke about this. The hardware works. I mean, I still have to turn my mouse over to charge it, like it’s a defeated Koopa Troopa, but it charges every time!

SW Quality: D

Let me tell you a story.

For the better part of a decade, my daily driver was a 2013 15-inch MacBook Pro. In that time, I’m pretty sure it ran every OS X flavor from 10.9 to 10.14; we stopped at Mojave because there was some 32-bit only software we needed for work.

My setup was this: I had the laptop itself in the center of the desk, on a little stand, with the clamshell open. On either side, I had an external monitor. Three screens, where the built-in laptop one in the middle was smaller but effectively higher resolution, and the ones on the sides were physically larger but had slightly chunkier pixels. (Eventually, I got a smokin’ deal on some 4k BenQs on Amazon, and that distinction ceased.) A focus monitor in the center for what I was working on that was generally easier to read, and then two outboard monitors for “bonus content.”

The monitor on the right plugged into the laptop’s right-side HDMI port. The monitor on the left plugged into one of the Thunderbolt ports—this was the original thunderbolt when it still looked like firewire or mini-displayport—via a thunderbolt-to-mini-displayport cable. In front of the little stand, I had a wired Apple keyboard with the numeric keypad that plugged into the USB-A port on the left side. I had a wireless Apple mouse. Occasionally, I’d plug into the wired network connection with a thunderbolt-to-Cat6 adapter I kept in my equipment bag. The magsafe power connection clicked in on the left side. Four, sometimes five cables, each clicking into their respective port.

Every night, I’d close the lid, unplug the cables in a random order, and take the laptop home. The next morning, I’d come in, put the laptop down, plug those cables back in via another random order, open the lid, and—this is the important part—every window was exactly where I had left it. I had a “workspace” layout that worked for me—email and slack on the left side, web browser and docs on the right, IDE or text editor in the center. Various Finder windows on the left side pointing at the folders holding what I was working on.

I’d frequently, multiple times a week, unplug the laptop during the middle of the day, and head over to another building, or a conference room, or the coffee shop. Sometimes I’d plug into another monitor, or a projector? Open the lid, those open windows would re-arrange themselves to what was available. It was smart enough to recognize that if there was only one external display, that was probably a projector, and not put anything on it except the display view of PowerPoint or Keynote.

Then, I’d come back to my desk, plug everything back in, open the lid, and everything was exactly where it was. It worked flawlessly, every time. I was about to type “like magic”, but that’s wrong. It didn’t work like magic, it worked like an extremely expensive professional tool that was doing the exact job I bought it to do.

My daily driver today is a 16-inch 2021 M1 MacBook Pro running, I think, macOS 12. The rest of my peripherals are the same: same two monitors, same keyboard, same mouse. Except now, I have a block of a dock on the left side of my desk for the keyboard and wired network drop.

In the nineteen months I’ve had this computer, let me tell you how many times I plugged the monitors back in and had the desktop windows in the same places they were before: Literally Never.

In fact, the windows wouldn’t even stay put when it went to sleep, much less when I closed the lid. The windows would all snap back to the central monitor, the desktops of the two side monitors would swap places. This never happened on the old rig over nearly a decade, and happens every time with the new one.

Here is what I had to do so that my email is still on the left monitor when I come back from lunch:

  1. I have a terminal window running caffeinate all the time (see the sketch after this list). Can’t let it go to sleep!
  2. The cables from the two monitors are plugged into the opposite side of the computer from where they sit: the cables cross over in the back and plug into the far side
  3. Most damning of all, I can’t use the reintroduced HDMI port; both monitors have to be plugged in via USB-C cables. The cable on the right, which needed an adapter to turn the HDMI cable into a USB-C/Thunderbolt connection, is plugged into the USB-C port right next to the HDMI port, which is collecting dust. Can I use it? No, nothing works if that port is lit up.
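(Since it comes up in item 1: caffeinate is the stock macOS command-line tool for blocking sleep, and that step is nothing fancier than leaving something like the following running in a Terminal window. The exact flags here are illustrative rather than gospel; the point is just keeping the machine and the displays awake.)

    # prevent display sleep (-d), idle system sleep (-i), and system sleep on AC power (-s);
    # runs until you quit it with Ctrl-C
    caffeinate -d -i -s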

Please don’t @-me with your solution, I guarantee you whatever you’re thinking of I tried it, I read that article, I downloaded that app. This took me a year to determine by trial and error, like I was a Victorian scientist trying to measure the refraction of the æther, and I’m not changing anything now. It’s a laptop in name only, I haven’t closed the lid or moved it in months, and I’m not going to. God help me if I need to travel for work.

I’ve run some sketchy computers; I depended on the original OEM Windows 95 for over a year. I have never, in forty years, needed to deploy a Rube Goldberg–ass solution like this to keep my computer working right.

And everything is like this. I could put another thousand words here about things that worked great on the old rig—scratch that, that literally still work on the old rig—that just don’t function right on the new one. The hardware is so much better, but everything about using the computer is so much worse.

Screw the chat bubbles, get the DOJ working on why my nice monitors don’t work any more.

Dev Relations: D

Absolutely in the toilet, the worst I have ever seen it. See: just about everything above this. Long-time indie “for the love of the game” mac devs are just openly talking shit at this point. You know that Trust Thermocline we got all excited about as a concept a couple years ago? Yeah, we are well below that now.

Bluntly, the DOJ doesn’t move if all the developers working on the platform are happy and quiet.

I had an iOS-based project get scrapped in part because we weren’t willing to incur the risk of giving Apple total veto power over the product; that was five or six years ago now, and things have only decayed since then.

This is a D instead of an F because I’m quite certain it’s going to get worse before it gets better.

Environ/Social: ¯\_(ツ)_/¯

This category feels like one of those weird “no ethical consumption under capitalism” riddles. Grading on the curve of “silicon valley companies”, they’re doing great here. On the other hand, that bar is on the floor. Like, it’s great that they make it easy to recycle your old phones, but maybe just making it less of a problem to throw things out hasn’t really backed up to the fifth “why”, you know?

Potpourri: N/A

This isn’t a Six Colors category, but I’m not sure where to put the fact that I like my HomePod mini? It’s a great speaker!

Also, please start making a wifi router again, thanks.

What Now: ?

Originally, this all wrapped up with an observation that it’s great that the product design is firing on all cylinders and that services revenue is through the roof, but if they don’t figure out how to stop pissing off developers and various governments, things are going to get weird. But I just highlighted all that and hit delete, because we’re all the way through that particular looking glass now.

Back in the 90s, there was nothing much else going on, and Microsoft was doing some openly, blatantly illegal shit. Here? There’s a lot else going on, and Apple are mostly just kinda jerks?

I think that here in 2024, if the Attorney General of the United States is inspired to redirect a single brain cell away from figuring out how to stop a racist game show host from overthrowing the government and instead towards the color of the text message bubbles on his kid’s phone, that means that Apple is Well and Truly Fucked. I think the DOJ is gonna carve into them like a swarm of coconut crabs that just found a stranded aviator.

Maybe they shoulder-roll through this, dodge the big hits, settle for a mostly-toothless consent decree. You’d be hard-pressed, from the outside, to point at anything meaningfully different about Microsoft in 1999 vs 2002. But before they settled, they did a lot of stuff, put quite a few dents in the universe, to coin a phrase. Afterwards? Not a whole lot. Mostly, it kept them tied up so that they didn’t pay attention to what Google was doing. And we know how that went.

I’m hard-pressed to think of a modern case where antitrust action actually made things better for consumers. I mean, it’s great that Microsoft got slammed for folding IE into Windows, but that didn’t save Netscape, you know? And I was still writing CSS fixes for IE6 a decade later. Roughing up Apple over ebooks didn’t fix anything. I’m not sure mandating that I need to buy new charge cables was solving a real problem. And with the benefit of hindsight, I’m not sure breaking up Ma Bell did much beyond make the MCI guy a whole lot of money. AT&T reformed like T2, just without the regulations.

The problem here is that it’s the fear of enforcement that’s supposed to do the job, not the actual enforcement itself, but that gun won’t scare anyone if they don’t think you’ll ever fire it. (Recall, this is why the Deliverator carried a sword.) Instead, Apple’s particular brew of arrogance and myopic confidence called this all down on them.

Skimming the lawsuit, and the innumerable articles about the lawsuit, the things the DOJ complains about are about a 50/50 mix of “yeah, make them stop that right now”, and “no wait, I bought my iPhone for that on purpose!” The bit about “shapeshifting app store rules” is already an all-time classic, but man oh man do I not want the Feds to legislate iOS into Android, or macOS into Windows. There’s a very loud minority of people who would never buy something from Apple (or Microsoft) on principle, and they really think every computer-like device should be forced to work like Ubuntu or whatever, and that is not what I bought my iPhone for.

I’m pessimistic that this is going to result in any actually positive change, in case that wasn’t coming through clearly. All I want is for them to hold up their end of the deal they already offered. And make those upgrades cheaper. Quit trying to soak every possible cent out of every single transaction.

And let my monitors work right.

Gabriel L. Helman

BSG, Fifteen Years On

It’s been called to my attention that the last episode of the “new” Battlestar Galactica aired fifteen years ago yesterday?

My favorite part of that finale is that you can tell someone who’s never seen it that the whole show ends with a robot dance party, and even if they believe you, they will never in a million years guess how that happens.

And, literally putting the words “they have a plan” in big letters in the opening credits of every episode, while not ever bothering to work out what that plan was, that’s whatever the exact opposite of imposter syndrome is.

Not a great ending.

That first season, though, that was about as good as any season of TV not named Twin Peaks has ever been. It was on in the UK months before it even had an airdate in the US, and I kept hearing good things, so I—ahem—obtained copies. I watched it every week on a CRT computer monitor at 2 in the morning after everyone else was asleep, and I really couldn’t believe what I was seeing. They really did take that cheesy late-70s Star Wars knockoff and make something outstanding out of it. Mostly, what I remember is I didn’t have anyone to talk about it with, so I had to convince everyone I knew to go watch it once it finally landed on US TV.

It was never that good again. Sure, the end was bad, but so were the couple of years leading up to that end? The three other seasons had occasional flashes of brilliance, but that mostly drained out, replaced by escalating “what’s the craziest thing that could happen next?” so that by the time Starbuck was a ghost and Bob Dylan was a fundamental force of the universe there was no going back, and they finally landed on that aforementioned dance party. And this was extra weird because it not only started so good, but it seemed to have such a clear mission: namely, show those dorks over at Star Trek: Voyager how their show should have worked.

Some shows should just be about 20 episodes, you know?

Gabriel L. Helman

Book Lists Wednesday

Speaking of best of lists, doing the rounds this week we have:

The Great American Novels

We give the Atlantic a hard time in these parts, and usually for good reasons, but it’s a pretty good list! I think there’s some things missing, and there’s a certain set of obvious biases in play, but it’s hard to begrudge a “best American fiction” list that remembers Blume, LeGuin, and Jemisin, you know? Also, Miette’s mother is on there!

I think I’ve read 20 of these? I say think, because there are a few I own a copy of but don’t remember a single thing about (I’m looking at YOU, Absalom, Absalom!)

And, as long as we’re posting links to lists of books, I’ve had this open in a tab for the last month:

Pulitzer Prize for General Nonfiction - Wikipedia

I forget now why I ended up there, but I thought this was a pretty funny list, because I consider myself a pretty literate, well-read person, and I hadn’t even heard of most of these, much less read them. That said, the four on there I actually have read—Guns of August, Stilwell and the American Experience in China, Soul of a New Machine, and Into Thin Air—are four of the best books I’ve ever read, so maybe I should read a couple more of these?

Since the start of the Disaster of the Twenties I’ve pretty exclusively read trash, because I needed the distraction, and I didn’t have the spare mental bandwidth for anything complicated or thought-provoking. I can tell the disaster is at a low ebb at the moment, because I found myself looking at both of these lists thinking, maybe I’m in the mood for something a little chunkier.

Gabriel L. Helman

The Dam

Real blockbuster from David Roth on the Defector this morning, which you should go read: The Coffee Machine That Explained Vice Media

In a large and growing tranche of wildly varied lines of work, this is just what working is like—a series of discrete tasks of various social function that can be done well or less well, with more dignity or less, alongside people you care about or don't, all unfolding in the shadow of a poorly maintained dam.

It goes on like that until such time as the ominous weather upstairs finally breaks or one of the people working at the dam dynamites it out of boredom or curiosity or spite, at which point everyone and everything below is carried off in a cleansing flood.

[…]

That money grew the company in a way that naturally never enriched or empowered the people making the stuff the company sold, but also never went toward making the broader endeavor more likely to succeed in the long term.

Depending on how you count, I’ve had that dam detonated on me a couple of times now. He’s talking about media companies, but everything he describes applies to a lot more than just that. More than once I’ve watched a functional, successful, potentially sustainable outfit get dynamited because someone was afraid they weren’t going to cash out hard enough. And sure, once you realize that to a particular class of ghoul “business” is a synonym for “high-stakes gambling”, a lot of the decisions make more sense, at least on their own terms.

But what always got me, though, was this:

These are not nurturing types, but they are also not interested in anything—not creating things or even being entertained, and increasingly not even in commerce.

What drove me crazy was that these people didn’t use the money for anything. They all dressed badly, drove expensive but mediocre cars—mid-list Acuras or Ford F-250s—and didn’t seem to care about their families, didn’t seem to have any recognizable interests or hobbies. This wasn’t a case of “they had bad taste in music”, it was “they don’t listen to music at all.” What occasional flickers of interest there were—fancy bicycles or golf clubs or something—were always more about proving they could spend the money, not that they wanted whatever it was.

It’s one thing if the boss cashes out and drives up to lay everyone off in a Lamborghini, but it’s somehow more insulting when they drive up in the second-best Acura, you know?

I used to look at these people and wonder, what did you dream about when you were young? And now that you could be doing whatever that was, why aren’t you?

Gabriel L. Helman

Caves of Androzani at 40

As long as we’re talking about 40th anniversaries, this past Saturday marked 40 years since the broadcast of the last episode of “Caves of Androzani”, Peter Davison’s final story as Doctor Who.

One of the unique things about Doctor Who is the way it rolls its cast over on a pretty regular basis, including the actor that plays the title character. This isn’t totally unusual—Bond does the same thing—but what is unusual is that the show keeps the same continuity, in that the new actor is playing literally the same character, who just has a new body now.

The real-world reason for this is that Doctor Who is a hard show to make, and a harder show to be the lead of, and after about three seasons, everyone is ready to move on. The in-fiction reason is that when the Doctor is about to die they can “regenerate”, healing themselves but changing their body.

This results in a weird sub-genre of stories that only exist in Doctor Who—stories where the main character gets killed, but then the show keeps going. And the thing is, these basically never work. Doctor Who is a fairly light-weight family action-adventure show, where the main characters get into and out of life-threatening scrapes every time. “Regeneration Stories” tend to all fall into the same pattern, where something “really extra bad” is happening, and events conspire such that the Doctor needs to sacrifice themselves to save everyone else. And they’re always deeply unsatisfying, because it’s always the sort of problem that wouldn’t be that big a deal if the main actor wasn’t about to leave. There have been thirteen regular leads of the show at this point, and none of their last episodes have been anywhere near their best.

Except once.

In 1984, Doctor Who was a show in decline. No longer the creative or ratings juggernaut that it had been through most of the 1970s, it was wrapping up three years with Peter Davison as the Fifth Doctor that could most charitably be described as “fine”. Davison was one of the best actors to ever play the part, but with him in the lead the show could never quite figure out how to do better than about a B-.

For Davison’s last episode, the show brought back Robert Holmes, who had been the show’s dominant—and best—writer throughout the seventies, but hadn’t worked on the show since ’79. Holmes had written for every Doctor since the second, but had never written a last story, and seemed determined to make it work.

The result was extraordinary. While most previous examples had gone for huge, universe-spanning stakes, this was almost perversely small-fry. A tiny colony moon, where the forces of a corporation square off with a drug dealer who’s basically space Phantom of the Opera, with the army and a group of gun-runners caught in the middle. At one point, the Doctor describes the situation as “a pathetic little war”, and he’s right—by his standards, it barely registers.

That said, there are enough moving pieces that the Doctor never really gets a handle on what’s going on. Any single part would be a regular day at the office, but combined, they keep him off balance as things keep spiraling out of control. It’s a perfect example of the catalytic effect the Doctor has—just by showing up, things start to destabilize without him having to do anything.

What’s really brilliant about it, though, is that he actually gets killed right at the start. He and new companion Peri stumble into an alien bat nest, which lethally poisons them, even though it takes a while to kick in. Things keep happening to keep him from solving all this, and by the end he’s only managed to scare up a single dose of antidote, which he gives to his friend and then dies.

It's also remarkably better than everything around it—not just the best show Davison was in, but in genuine contention for best episode of the 26 seasons of the classic show. It’s better written, better directed, better acted than just about anything else the old show did.

It’s not flawless—the show’s reach far exceeds the grasp of the budget. As an example, there’s a “computer tablet” that’s blatantly a TV remote, and there’s a “magma beast” that’s anything but. That’s all true for everything the show was doing in the 80s, though—for once, it’s trying to do something good, instead of not having enough money to do something mediocre.

My favorite beat comes about 3/4 of the way through, when the Doctor has either a premonition of his own death, or starts to regenerate and chokes it back—it’s ambiguous. Something happens that the Doctor shakes off, and the show won’t do something that weird and unclear again until Peter Capaldi’s twelfth Doctor refused to regenerate in 2017.

It also has one of my favorite uses of the Tardis as a symbol; at the end, things have gone from bad to worse, to even worse than that, and the Doctor, dying, carries the unconscious body of his friend across the moonscape away from the exploding mud volcano (!!), and the appearance of the blue police box out of the mist has never been more welcome.

As a kid, it was everything I wanted out of the show—it was weird, and scary, and exciting. As a grown-up, I’m not inclined to argue.

Gabriel L. Helman

Nausicaä at 40

Hayao Miyazaki’s animated version of Nausicaä of the Valley of the Wind came out forty years ago this week!

Miyazaki is one of the rare artists where you could name any of his works as your favorite and not get any real pushback. It’s a corpus of work where “best” is meaningless, but “favorite” can sometimes be revealing. My kid’s favorite is Ponyo, so that’s the one I’ve now seen the most. When I retire, I want to go live on the island from Porco Rosso. Totoro might be the most delighted I’ve ever been while watching a movie for the first time. But Nausicaä of the Valley of the Wind is the only one I bought on blu-ray.

Nausicaä is the weird one, the one folks tend not to remember. It has all the key elements of a Miyazaki film—a strong woman protagonist, environmentalism, flying, villains that aren’t really villains, good-looking food—but it also has a character empty the gunpowder out of a shotgun shell to blow a hole in a giant dead insect exoskeleton. He never puts all those elements together quite like this again.

I can’t now remember when I saw it for the first time. It must have been late 80s or early 90s, which implies I saw the Warriors of the Wind cut, or maybe a subbed Japanese import? (Was there a subbed Japanese import?) I read the book—as much of it as existed—around the same time. I finally bought a copy of the whole thing my last year of college, in one of those great “I’m an adult now, and I can just go buy things” moments. And speaking of the book, this is one of the rare adaptations where it feels less like an “adaptation” than a “companion piece.” It’s the same author, using similar pieces, configured differently, providing a different take on the same material with the same conclusions.

So what is it about this movie that appeals to me so much? The book is one of my favorite books of all time, but that’s a borderline tautology. If I’m honest, it’s a tick more “action-adventure” than most other Ghibli movies, which is my jam, but more importantly, it’s action-adventure where fighting is always the wrong choice, which is extremely my jam (see also: Doctor Who.)

I love the way everything looks, the way you can’t tell whether most of the tech was built or grown. I love the way it’s a post-apocalyptic landscape that looks pretty comfortable to live in, actually. I love the sound her glider makes when the jet fires, I love the way Teto hides in the folds of her shirt. I love the way the prophecy turns out to be correct, but was garbled by the biases of the people who wrote it down. I love everything about the Sea of Corruption (sorry, “Toxic Jungle”), the poisonous fungus forest as a setting, the insects, the way the spores float in the air, the caves underneath, and then, finally, what it turns out the forest really is and why it’s there.

Bluntly, I love the way the movie isn’t as angry or depressing as the book, and it has something approaching a happy ending. I love how fun it all is, while still being extremely sincere. I love that it’s an action adventure story where the resolution centers around the fact that the main character isn’t willing to not help a hurt kid, even though that kid is a weird bug.

Sometimes a piece of art hits you at just the right time or place. You can do a bunch of hand waving and talk about characters or themes or whatever, but the actual answer to “why do you love that so much?” is “because there was a hole in my heart the exact shape of that thing, that I didn’t know was there until this clicked into place.”

Gabriel L. Helman

Cyber-Curriculum

I very much enjoyed Cory Doctorow’s riff today on why people keep building torment nexii: Pluralistic: The Coprophagic AI crisis (14 Mar 2024).

He hits on an interesting point, namely that for a long time the fact that people couldn’t tell the difference between “science fiction thought experiments” and “futuristic predictions” didn’t matter. But now we have a bunch of aging gen-X tech billionaires waving dog-eared copies of Neuromancer or Moon is a Harsh Mistress or something, and, well…

I was about to make a crack that it sorta feels like high school should spend some time asking students “so, what do you think is going on with those robots in Blade Runner?” or the like, but you couldn’t actually show Blade Runner in a high school. Too much topless murder. (Whether or not that should be the case is beside the point.)

I do think we should spend some of that literary analysis time in high school English talking about how science fiction with computers works, but what book do you go with? Is there a cyberpunk novel without weird sex stuff in it? I mean, weird by high school curriculum standards. Off the top of my head, thinking about books and movies, Neuromancer, Snow Crash, Johnny Mnemonic, and Strange Days all have content that wouldn’t get past the school board. The Matrix is probably borderline, but that’s got a whole different set of philosophical and technological concerns.

Goes and looks at his shelves for a minute

You could make Hitchhiker work. Something from later Gibson? I’m sure there’s a Bruce Sterling or Rudy Rucker novel I’m not thinking of. There’s a whole stack of Ursula LeGuin everyone should read in their teens, but I’m not sure those cover the same things I’m talking about here. I’m starting to see why this hasn’t happened.

(Also, Happy π day to everyone who uses American-style dates!)

Gabriel L. Helman

Jim Martini

I need everyone to quit what you’re doing and go read Jim Martini.

When she got to Jim Martini, he said, I don’t want a party.

What do you mean you don’t want a party, Leeanne said. Everyone wants a party.

Gabriel L. Helman

Unexpected, This Is

This blew up over the last week, but as user @heyitswindy says on the website formerly known as twitter:

“Around 2003 in Chile, when the original trilogy of Star Wars began airing on television there, they did this funny thing to avoid cutting to commercial breaks. They stitched the commercials into the films themselves. Here is one of them, with the English dub added in. https://t.co/wC7N2vPNvv"

The source article has more clips, but I’ll go ahead and embed them here as well:

I’m absolutely in love with the idea that old Ben has a cooler of crispy ones at the ready, but the one with the Emperor really takes the cake.

This went completely nuclear on the webs, and why wouldn’t it? How often does someone discover something new about Star Wars? According to the original thread, they also did this for other movies, including Gladiator and American Beauty(??!!) Hope we get to see those!

What really impresses me, though, is how much work they put into these? They built props! Costumes! Filmed these inserts! That would have been such a fun project to work on.

Gabriel L. Helman

Let Me Show You How We Did It For The Mouse

I frequently forget that Ryan Gosling got started as a Mouseketeer (complimentary), but every now and then he throws the throttle open and demonstrates why everyone from that class of the MMC went on to be super successful.

I think Barbie getting nearly shut out at the Oscars was garbage, but Gosling’s performance of “I’m Just Ken” almost made up for it. Look at all their faces! The way Margot Robbie can’t keep a straight face from moment one! Greta Gerwig beyond pumped! The singalong! You and all your friends get to put on a big show at the Oscars? That’s a bunch of theatre kids living their best life.

(And, I’m pretty sure this is also the first time one of the Doctors Who has ever been on stage at the Oscars?)

(Also, bonus points to John Cena for realizing that being willing to be a little silly is what makes a star into a superstar.)

Gabriel L. Helman

Implosions

Despite the fact that basically everyone likes movies, video games, and reading things on websites, every company that does one of those seems to continue to go out of business at an alarming rate?

For the sake of future readers, today I’m subtweeting Vice and Engadget both getting killed by private equity vampires in the same week, but also Coyote vs Acme, and all the video game layoffs, and Sports Illustrated becoming an AI slop shop and… I know “late stage capitalism” has been a meme for years now, and the unsustainable has been wrecking out for a while, but this really does feel like we’re coming to the end of the whole neoliberal project.

It seems like we’ve spent the whole last two decades hearing about how something valuable or well-liked went under because “their business model wasn’t viable”, but on the other hand, it sure doesn’t seem like anyone was trying to find a viable one?

Rusty Foster asks What Are We Dune 2 Journalism? while Josh Marshall asks over at TPM: Why Is Your News Site Going Out of Business?. Definitely click through for the graph on TPM’s ad revenue.

What I find really wild is that all these big implosions are happening at the same time as folks are figuring out how to make smaller, subscription-based coöps work.

Heck, just looking in my RSS reader alone, you have Defector, 404 Media, Aftermath, Rascal News, 1900HOTDOG, and a dozen other substacks or former substacks. Achewood has a Patreon!

It’s more possible than ever to actually build a (semi?) sustainable business out there on the web if you want to. Of course, all those sites combined employ fewer people than Sports Illustrated ever did. Because we’re talking less about “scrappy startups”, and more about “survivors of the disaster.”

I think those Defector-style coöps, and substacks, and patreons are less about people finding viable business models than they are the kind of organisms that survive a major plague or extinction event, and have evolved specifically around increasing their resistance to that threat. The only thing left as the private equity vultures turn everything else and each other into financial gray goo.

It’s tempting to see some deeper, sinister purpose in all this, but Instapot wasn’t threatening the global order, Sports Illustrated really wasn’t speaking truth to power, and Adam Smith’s invisible hand didn’t shutter everyone’s favorite toy store. Batgirl wasn’t going to start a socialist revolution.

But I don’t think the ghouls enervating everything we care about have any sort of viewpoint beyond “I bet we could loot that”. If they were creative enough to have some kind of super-villain plan, they’d be doing something else for a living.

I’ve increasingly taken to viewing private equity as the economic equivalent of Covid; a mindless disease ravaging the unfortunate, or the unlucky, or the insufficiently supported, one that we’ve failed as a society to put sufficient public health protections in place against.

Gabriel L. Helman

“Hanging Out”

For the most recent entry in asking if ghosts have civil rights, the Atlantic last month wonders: Why Americans Suddenly Stopped Hanging Out.

And it’s an almost perfect Atlantic article, in that it looks at a real trend, finds some really interesting research, and then utterly fails to ask any obvious follow-up questions.

It has all the usual howlers of the genre: it recognizes that something changed in the US somewhere around the late 70s or early 80s without ever wondering what that was, it recognizes that something else changed about 20 years ago without wondering what that was, it displays no curiosity whatsoever around the lack of “third places” and where, exactly, kids are supposed to actually go when they try to hang out. It’s got that thing where it has a chart of (something, anything) social over time, and you can perfectly pick out Reagan’s election and the ’08 recession, and not much else.

There’s lots of passive-voice sentences about how “Something’s changed in the past few decades,” coupled with an almost perverse refusal to look for a root cause, or connect any social or political actions to this. You can occasionally feel the panic around the edges as the author starts to suspect that maybe the problem might be “rich people” or “social structures”, so instead of talking to people he inspects a bunch of data about what people do, instead of why they do it. It’s the exact opposite of that F1 article; this has nothing in it that might cause the editor to pull it after publication.

In a revelation that will shock no one, the author instead decides that the reason for all this change must be “screens”, without actually checking to see what “the kids these days” are actually using those screens for. (Spoiler: they’re using them to hang out). Because, delightfully, the data the author is basing all this on tracks only in-person socializing, and leaves anything virtual off the table.

This is a great example of something I call “Alvin Toffler Syndrome”, where you correctly identify a really interesting trend, but are then unable to get past the bias that your teenage years were the peak of human civilization, and therefore anything different is bad. Future Shock.

I had three very strong reactions to this, in order:

First, I think that header image is accidentally more revealing than they thought. All those guys eating alone at the diner look like they have a gay son they cut off; maybe we live in an age where people have lower tolerance for spending time with assholes?

Second, I suspect the author is just slightly younger than I am, based on a few of the things he says, but also the list of things “kids should be doing” he cites from another expert:

“There’s very clearly been a striking decline in in-person socializing among teens and young adults, whether it’s going to parties, driving around in cars, going to the mall, or just about anything that has to do with getting together in person”.

Buddy, I was there, and “going to the mall, driving around in cars” sucked. Do you have any idea how much my friends and I would have rather hung out in a shared Minecraft server? Are you seriously telling me that eating a Cinnabon or drinking too much at a high school house party full of college kids home on the prowl was a better use of our time? Also: it’s not the 80s anymore, what malls?

(One of the funniest giveaways is that unlike these sorts of articles from a decade ago, “having sex” doesn’t get listed as one of the activities that teenagers aren’t doing anymore. Like everyone else between 30 and 50, the author grew up in a world where sex with a stranger can kill you, and so that’s slipped out of the domain of things “teenagers ought to be doing, like I was”.)

But mostly, though, I disagree with the fundamental premise. We might have stopped socializing the same ways, but we certainly didn’t stop. How do I know this? Because we’re currently entering year five of a pandemic that became uncontrollable because more Americans were afraid of the silence of their own homes than they were of dying.

Gabriel L. Helman

Even Further Behind The Velvet Curtain Than We Thought

Kate Wagner, mostly known around these parts for McMansion Hell, but who also does sports journalism, wrote an absolutely incredible piece for Road & Track on F1, which was published and then unpublished nearly instantly. Why yes, the Internet Archive does have a copy: Behind F1's Velvet Curtain. It’s the sort of thing where if you start quoting it, you end up reading the whole thing out loud, so I’ll just block quote the subhead:

If you wanted to turn someone into a socialist you could do it in about an hour by taking them for a spin around the paddock of a Formula 1 race. The kind of money I saw will haunt me forever.

It’s outstanding, and you should go read it.

But, so, how exactly does a piece like this get all the way to being published out on Al Gore’s Internet, and then spiked? The Last Good Website tries to get to the bottom of it: Road & Track EIC Tries To Explain Why He Deleted An Article About Formula 1 Power Dynamics.

Road & Track’s editor’s response to the Defector is one of the most brazen “there was no pressure because I never would have gotten this job if I waited until they called me to censor things they didn’t like” responses since, well, the Hugos, I guess?


Edited to add: Today in Tabs—The Road & Track Formula One Scandal Makes No Sense

Gabriel L. Helman

March Fifth. Fifth March.

Today is Tuesday, March 1465th, 2020, COVID Standard Time.

Three more weeks to flatten the curve!

What else even is there to say at this point? Covid is real. Long Covid is real. You don’t want either of them. Masks work. Vaccines work.

It didn’t have to be like this.

Every day, I mourn the futures we lost because our civilization wasn’t willing to put in the effort to try to protect everyone. And we might have even pulled it off, if the people who already had too much had been willing to make a little less money for a while. But instead, we're walking into year five of this thing.

Gabriel L. Helman

The Sky Above The Headset Was The Color Of Cyberpunk’s Dead Hand

Occasionally I poke my head into the burned-out wasteland where twitter used to be, and while doing so stumbled over this thread by Neal Stephenson from a couple years ago:

Neal Stephenson: "The assumption that the Metaverse is primarily an AR/VR thing isn't crazy. In my book it's all VR. And I worked for an AR company--one of several that are putting billions of dollars into building headsets. But..."

Neal Stephenson: "...I didn't see video games coming when I wrote Snow Crash. I thought that the killer app for computer graphics would be something more akin to TV. But then along came DOOM and generations of games in its wake. That's what made 3D graphics cheap enough to reach a mass audience."

Neal Stephenson: "Thanks to games, billions of people are now comfortable navigating 3D environments on flat 2D screens. The UIs that they've mastered (e.g. WASD + mouse) are not what most science fiction writers would have predicted. But that's how path dependency in tech works."

I had to go back and look it up, and yep: Snow Crash came out the year before Doom did. I’d absolutely have stuck this fact in Playthings For The Alone if I’d remembered, so instead I’m gonna “yes, and” my own post from last month.

One of the oft-remarked-on aspects of the 80s cyberpunk movement was that the majority of the authors weren’t “computer guys” beforehand; they were coming at computers from a literary/artist/musician worldview, which is part of why cyberpunk hit the way it did; it wasn’t the way computer people thought about computers—it was the street finding its own uses for things, to quote Gibson. But a less remarked-on aspect was that they also weren’t gamers. Not just computer games: they weren’t into board games or tabletop RPGs either.

Snow Crash is still an amazing book, but it was written at the last possible second where you could imagine a multi-user digital world and not treat “pretending to be an elf” as a primary use-case. Instead the Metaverse is sort of a mall? And what “games” there are aren’t really baked in, they’re things a bored kid would do at a mall in the 80s. It’s a wild piece of context drift from the world in which it was written.

In many ways, Neuromancer has aged better than Snow Crash, if for no other reason than that it’s clear that the part of The Matrix that Case is interested in is a tiny slice, and it’s easy to imagine Wintermute running several online game competitions off camera, whereas in Snow Crash it sure seems like The Metaverse is all there is; a stack of other big online systems next to it doesn’t jibe with the rest of the book.

But, all that makes Snow Crash really useful as a point of reference, because depending on who you talk to it’s either “the last cyberpunk novel”, or “the first post-cyberpunk novel”. Genre boundaries are tricky, especially when you’re talking about artistic movements within a genre, but there’s clearly a set of work that includes Neuromancer, Mirrorshades, Islands in the Net, and Snow Crash, but does not include Pattern Recognition, Shaping Things, or Cryptonomicon; the central aspect probably being “books about computers written by people who do not themselves use computers every day”. Once the authors in question all started writing their novels in Word and looking things up on the web, the whole tenor changed. As such, Snow Crash unexpectedly found itself as the final statement for a set of ideas, a particular mix of how near-future computers, commerce, and the economy might all work together—a vision with strong social predictive power, but unencumbered by the lived experience of actually using computers.

(As the old joke goes, if you’re under 50, you weren’t promised flying cars, you were promised a cyberpunk dystopia, and well, here we are, pick up your complimentary torment nexus at the front desk.)

The accidental predictive power of cyberpunk is a whole media thesis on its own, but it’s grimly amusing that in all the places where cyberpunk gets the future wrong, it’s usually because the author wasn’t being pessimistic enough. The Bridge Trilogy is pretty pessimistic, but there’s no indication that a couple million people died of a preventable disease because the immediate ROI on saving them wasn’t high enough. (And there are at least two diseases I could be talking about there.)

But for our purposes here, one of the places the genre overshot was this idea that you’d need a 3d display, like a headset, to interact with a 3d world. And this is where I think Stephenson’s thread above is interesting, because it turns out it really didn’t occur to him that 3d on a flat screen would be a thing; he assumed that any sort of 3d interface would require a head-mounted display. As he says, that got stomped the moment Doom came out. I first read Snow Crash in ’98 or so, and even then I was thinking none of this really needs a headset, this would all work fine on a decently-sized monitor.

And so we have two takes on the “future of 3d computing”: the literary tradition from the cyberpunk novels of the 80s, and then actual lived experience from people building software since then.

What I think is interesting about the Apple Cyber Goggles, in part, is that it feels like that earlier, literary take on how futuristic computers would work is re-emerging and directly competing with the last four decades of actual computing that have happened since Neuromancer came out.

In a lot of ways, Meta is doing the funniest and most interesting work here, as the former Oculus headsets are pretty much the cutting edge of “what actually works well with a headset”, while at the same time, Zuck’s “Metaverse” is blatantly an older millennial pointing at a dog-eared copy of Snow Crash saying “no, just build this” to a team of engineers desperately hoping the boss never searches the web for “second life”. They didn’t even change the name! And this makes a sort of sense: there are parts of Snow Crash that read less like fiction and more like Stephenson is writing a pitch deck.

I think this is the fundamental tension behind the reactions to Apple Vision Pro: we can now build the thing we were all imagining in 1984. The headset is designed by cyberpunk’s dead hand; after four decades of lived experience, is it still a good idea?

Gabriel L. Helman

Friday Linkblog, best-in-what-way-edition

Back in January, Rolling Stone posted The 150 Best Sci-Fi Movies of All Time. And I almost dropped a link to it back then, with a sort of “look at these jokers” comment, but ehhh, why. But I keep remembering it and laughing, because this list is deranged.

All “best of” lists have the same core structural problem, which is that they never say what they mean by “best”. Most likely to want to watch again? Cultural impact? Most financially successful? None of your business. It’s just “best”. So you get to infer what the grading criteria are, an exercise left to the reader.

And then this list has an extra problem, in that it also doesn’t define what it means by “science fiction”. This could be an entire media studies thesis, but briefly, and as far as movies are concerned, “science fiction” means roughly two things: first, “fiction” about “science”, usually set in the future and concerned with the effects some possible technology has on society, with lots of riffing on and manipulating big ideas; and second, movies that use science fiction props and iconography to tell a different story, usually as a subset of action-adventure movies. Or, to put all that another way: movies like 2001 and movies like Star Wars.

This list goes with “Option C: all of the above, but no super heroes” and throws everything into the mix. That’s not all that uncommon, and gives a solid way to handicap what approach the list is taking by comparing the relative placements of those two exemplars. Spoiler: 2001 is at №1, and Star Wars is at №9, which, okay, that’s a pretty wide spread, so this is going to be a good one.

And this loops back to “what do you mean by best”? From a recommendation perspective, I’m not sure what value a list has that includes Star Wars and doesn’t include Raiders of the Lost Ark, and ditto 2001 and, say, The Seventh Seal. But, okay, we’re doing a big-tent, anything-possibly-science-fiction list, one that also pulls from much deeper into the back catalog than you usually see in a list like this. There’s a bunch of really weird movies on here that you don’t usually see on best-of lists, but I was glad to see them! There are no bad movies on there, but there are certainly some spicy ones you need to be in the right headspace for.

These are all good things, to be clear! But the result is a deeply strange list.

The top 20 or so are the movies you expect, and the order feels almost custom designed to get the most links on twitter. But it’s the bottom half of the list that’s funny.

Is Repo Man better than Bill and Ted’s Excellent Adventure? Is Colossus: The Forbin Project better than Time Bandits? Are both of those better than The Fifth Element? Is Them better than Tron? Is Zardoz better than David Lynch’s Dune? I don’t know! (Especially that last one.) I think the answer to all of those is “no”, but I can imagine a guy who doesn’t think so? I’ve been cracking up at the mental image of a group of friends wanting to watch Jurassic Park, and there’s the one guy who says “you want to watch the good Crichton movie? Check this out!” and drops Westworld on them.

You can randomly scroll to almost anywhere and compare two movies, and go “yeah, that sounds about right”, and then go about ten places in either direction and compare that movie to those first two and go “wait, what?”

And look, we know how these kinds of lists get written: you get a group of people together, and they bring with them the movies they actually like, the movies they want people to think they like, and however hipster-contrarian they’re feeling, and that all gets mashed up into something that’ll drive “engagement”, and maybe generate some value for the shareholders.

But what I think is funny about these kinds of lists in general, and this list specifically, is imagining the implied author. Imagine this really was one guy’s personal ranking of science fiction movies. The list is still deranged, but in a delightful way, sort of a “hell yeah, man, you keep being you!”

It’s less a list of the best movies, and more your weirdest friend’s id out on display. I kind of love it! The more I scrolled, the more I kept thinking “I don’t think I agree with this guy about anything, but I’m pretty sure we’d be friends.”

Gabriel L. Helman

Time Zones Are Hard

In honor of leap day, my favorite story about working with computerized dates and times.

A few lifetimes ago, I was working on a team that was developing a wearable. It tracked various telemetry about the wearer, including steps. As you might imagine, there was an option to set a Step Goal for the day, and there was a reward the user got for hitting the goal.

Skipping a lot of details, we put together a prototype to do a limited alpha test, a couple hundred folks out in the real world walking around. For reasons that aren’t worth going into, and are probably still under NDA, we had to do this very quickly; on the software side we basically had to start from scratch and have a fully working stack in 2 or 3 months, for a feature set that was probably at minimum 6-9 months worth of work.

There were a couple of ways to slice what we meant by “day”, but we went with the most obvious one: midnight to midnight. Meaning that the user had until midnight to hit their goal, and then at 12:00 their steps for the day reset to 0.

Dates and Times are notoriously difficult for computers. Partly, this is because Dates and Times are legitimately complex. Look at a full date: “February 29th 2024, 11:00:00 am”. Every value there has a different base, a different set of legal values. Month lengths, 24- vs 12-hour times, leap years, leap seconds. It’s a big tangle of arbitrary rules. If you take a date and time and want to add 1000 minutes to it, the value of the result is “it depends”. This gets even worse when you add time zones, and the time zone’s angry sibling, daylight saving time. Now the result of that addition also depends on where you were when it happened. It’s gross!
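
To make the “it depends” concrete, here’s a minimal sketch using Python’s standard library (the zone and dates are just illustrative examples, nothing from the actual project): adding 1000 minutes across the US spring-forward night gives two defensible answers, depending on whether you do the arithmetic on the wall clock or on elapsed time.

```python
# A sketch of the "it depends": add 1000 minutes starting the night before the
# US spring-forward transition (March 10, 2024). Standard library only.
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

pacific = ZoneInfo("America/Los_Angeles")
start = datetime(2024, 3, 9, 23, 0, tzinfo=pacific)  # 11pm, the night before

# Wall-clock arithmetic: Python advances the local clock fields by 1000 minutes.
wall = start + timedelta(minutes=1000)

# Elapsed-time arithmetic: go through UTC so that 1000 real minutes pass.
elapsed = (start.astimezone(timezone.utc) + timedelta(minutes=1000)).astimezone(pacific)

print(wall)     # 2024-03-10 15:40:00-07:00
print(elapsed)  # 2024-03-10 16:40:00-07:00 -- an hour later, because 2am-3am never happened
```

Neither answer is wrong; they’re answering different questions, which is exactly the problem.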

But the other reason it’s hard to use dates and times in computers is that they look easy. Everyone does this every day! How hard can it be?? So developers, especially developers working on platforms or frameworks, tend to write new time handling systems from scratch. This is where I link to this internet classic: Falsehoods programmers believe about time.

The upshot of all that is that there’s no good standard way to represent or transmit time data between systems, the way there is with, say, floating point numbers, or even unicode multi-language strings. It’s a stubbornly unsolved problem. Java, for example, has three separate systems for representing dates and times built into the language, none of which solve the whole problem. They’re all terrible, but in different ways.

Which brings me back to my story. This was a prototype, built fast. We aggressively cut features; anything that wasn’t absolutely critical went by the wayside. One of the things we cut was Time Zone Support, and we chose to run the whole thing in Pacific Time. We were talking about a test that was going to run about three months, which didn’t cross a DST boundary, and 95% of the testers were on the west coast. There were a handful of folks on the east coast, but, okay, they could handle their “day” starting and ending at 3am. Not perfect, but it made things a whole lot simpler. They can handle a reset-to-zero at 3am, sure.
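
For flavor, here’s roughly the shape of that shortcut; a minimal sketch in Python (not our actual stack), with made-up names, assuming a “day” that runs from local midnight to local midnight. The hardcoded zone is the whole shortcut; the optional zone parameter is the kind of thing that later had to be threaded through everything.

```python
# Not the real product code; just a sketch of the "everything is Pacific Time" shortcut.
from datetime import datetime, time, timedelta
from zoneinfo import ZoneInfo

HARDCODED_ZONE = ZoneInfo("America/Los_Angeles")  # the shortcut

def step_day_window(now, user_zone=HARDCODED_ZONE):
    """Return (start, end) of the step-counting "day" containing the aware datetime now."""
    local_now = now.astimezone(user_zone)
    start = datetime.combine(local_now.date(), time.min, tzinfo=user_zone)
    return start, start + timedelta(days=1)

# A wearer in Madrid at 8:30am local time is still in "yesterday" as far as the
# hardcoded version is concerned; their steps reset at 9am local, mid-walk.
now = datetime(2024, 2, 29, 8, 30, tzinfo=ZoneInfo("Europe/Madrid"))
print(step_day_window(now))                             # window anchored to Pacific midnight
print(step_day_window(now, ZoneInfo("Europe/Madrid")))  # the per-user version, hacked in later
```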

We get ready to ship, to light the test run up.

“Great news!” someone says. “The CEO is really excited about this, he wants to be in the test cohort!”

Yeah, that’s great! There’s “executive sponsorship”, and then there’s “the CEO is wearing the device in meetings”. Have him come on down, we’ll get him set up with a unit.

“Just one thing,” gets casually mentioned days later, “this probably isn’t an issue, but he’s going to be taking them on his big publicized walking trip to Spain.”

Spain? Yeah, Spain. Turns out he’s doing this big charity walk thing with a whole bunch of other exec types across Spain, wants to use our gizmo, and at the end of the day show off that he’s hit his step count.

Midnight in California is 9am in Spain. This guy gets up early. Starts walking early. His steps are going to reset to zero every day somewhere around second breakfast.

Oh Shit.

I’m not sure we even said anything else in that meeting; we all just stood up without a word, acquired a concerning amount of Mountain Dew, and proceeded to spend the next week and change hacking time zone support into the whole stack: various servers, the database, and both the iOS and Android mobile apps.

It was the worst code I have ever written, and of course it was so much harder to hack in after the fact, in a sidecar, instead of building it in from day one. But the big boss got his step count reward at the end of the day every day, instead of just after breakfast.

From that point on, whenever something was described as hard, the immediate question was “well, is this just hard, or is it ’time zone’ hard?”

Gabriel L. Helman

Today in Star Wars Links

Time marches on, and it turns out Episode I is turning 25 this year. As previously mentioned, the prequels have aged interestingly, and Episode I the most. It’s not any better than it used to be, but the landscape around it has changed enough that it doesn’t look quite like it used to. The prequels are bad in a way that no other movies were before or have been since, and it’s easier to see that now than it was at the time. As such, I very much enjoyed Matt Zoller Seitz’s I Used to Hate The Phantom Menace, but I Didn’t Know How Good I Had It:

Watching “The Phantom Menace,” you knew you were watching a movie made by somebody in complete command of their craft, operating with absolute confidence, as well as the ability to make precisely the movie they wanted to make and overrule anyone who objected. […] But despite the absolute freedom with which it was devised, “The Phantom Menace” seemed lifeless somehow. A bricked phone. See it from across the room, you’d think that it was functional. Up close, a paperweight.

[…]

Like everything else that has ever been created, films are products of the age in which they were made, a factor that’s neither here nor there in terms of evaluating quality or importance. But I do think the prequels have a density and exactness that becomes more impressive the deeper we get into the current era of Hollywood, wherein it is not the director or producer or movie star who controls the production of a movie, or even an individual studio, but a global megacorporation, one that is increasingly concerned with branding than art.

My drafts folder is full of still-brewing Star Wars content, but for the moment I’ll say I largely vibe with this? Episode I was not a good movie, but whatever else you can say about it, it was the exact movie one guy wanted to make, and that’s an increasingly rare thing. There have been plenty of dramatically worse movies and shows in the years since, and they don’t even have the virtue of being one artist’s insane vision. I mean, jeeze, I’ll happily watch TPM without complaint before I watch The Falcon & The Winter Soldier or Iron Fist or Thor 2 again.

And, loosely related, I also very much enjoyed this interview with prequel-star Natalie Portman:

Natalie Portman on Striking the Balance Between Public and Private Lives | Vanity Fair

Especially this part:

The striking thing has been the decline of film as a primary form of entertainment. It feels much more niche now. If you ask someone my kids’ age about movie stars, they don’t know anyone compared to YouTube stars, or whatever.

There’s a liberation to it, in having your art not be a popular art. You can really explore what’s interesting to you. It becomes much more about passion than about commerce. And interesting, too, to beware of it becoming something elitist. I think all of these art forms, when they become less popularized, you have to start being like, okay, who are we making this for anymore? And then amazing, too, because there’s also been this democratization of creativity, where gatekeepers have been demoted and everyone can make things and incredible talents come up. And the accessibility is incredible. If you lived in a small town, you might not have been able to access great art cinema when I was growing up. Now it feels like if you’ve got an internet connection, you can get access to anything. It’s pretty wild that you also feel like at the same time, more people than ever might see your weird art film because of this extraordinary access. So it’s this two-sided coin.

I think this is the first time that I’ve seen someone admit that not only is 2019 never going to happen again, but that it’s a thing to embrace, not fear.
