You Can't Be a Utility If You Go Down
Every real utility has a backup plan. AI's backup plan is the humans it keeps promising to replace.
Howdy folks. Claude was out this morning.
Not “degraded.” Not “partial.” Out. The status page politely noted “elevated errors on Claude.ai, primarily affecting login” starting at 15:40 UTC — which is to say, roughly the moment I sat down with a fresh cup of coffee to actually do some work. The model didn’t fall over under load. Nothing exotic happened. The login page couldn’t hold. The basic can-you-prove-you’re-you layer, the thing every utility on earth has figured out, simply stopped working across claude.ai, the API, Claude Code, Claude Cowork, and Claude for Government. As I write this paragraph, the incident is still open. They have “identified the issue” and are “working to apply a fix.” You know how this story goes.
I stared at the screen, coffee going cold, suspended in that weird liminal space between empowered and helpless. And something clicked.
You know the feeling when the power goes out and you go “well, I guess I can read a book, but I definitely can’t work”? Same energy. When the internet goes down at the house — not the big Internet, the last-mile fibre-optic internet that’s apparently stitched together with chewing gum and hope — same energy. You stand up. You walk around. You make another cup of coffee. You notice, maybe for the first time in a while, what your life actually looks like without the thing.
We keep saying AI is becoming a utility. We compare it to electricity. We compare it to the internet. The comparison isn’t wrong — it captures something real: the pervasiveness, the ambient dependency, the way AI has quietly crept into the middle of every knowledge worker’s day until the idea of working without it feels quaint. Fine. Granted. AI is a utility.
But here’s the thing about actual utilities: they took about a century of brutal, embarrassing, lights-flickering-in-half-the-city failure before we dragged them, kicking and screaming, into something resembling reliability. The regulatory apparatus didn’t show up because the power companies asked politely. It showed up because the power kept going out, people kept getting hurt, and the social contract eventually said enough.
AI is trying to skip that phase.
AI wants to be called infrastructure before it has earned the name. AI wants the cultural position of a utility — the ambient dependency, the “we can’t function without this,” the justification for replacing human labor at scale — without accepting the operational obligations that come with it. You cannot have both. You cannot say “we won’t need to hire those humans anymore” and also “oh, and by the way, the login page sometimes just goes away for an hour on a Monday morning.”
Pick one.
The internet itself — the big I, the backbone, the actual plumbing — is fucking reliable. It really is. The last mile is flaky, the services you talk to over it are a weekly coin flip, but the core thing has been built, rebuilt, overbuilt, and then hardened by three decades of adversarial stress-testing. It earned the name. It survived its brutal adolescence. We forget how long that took.
Every real utility I depend on has a backup plan. The power goes out, there’s the generator. The internet flakes, there’s the phone as a hotspot. The water goes out, there’s the stockpile in the garage. The grocery store closes, there’s the pantry. The backup plan is the whole point of depending on something — you depend on it and you have a plan for when it’s not there, because things break, and grown-up systems account for that.
What’s the backup plan for AI?
I’ll tell you the answer nobody wants to say out loud: the humans we were supposedly about to stop needing. That’s the backup plan. That’s the only backup plan. The fallback for “the AI is down” is “well, a person could do this, probably slower, probably more expensively, but at least it would get done.” And that fallback only exists if we still have the people — trained, employed, paid, actively engaged, muscle memory intact. Not ghosts of a replaced workforce. Not retirees sitting at home. Actually-in-the-game humans who can catch the ball when the machine drops it.
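If you want the knot stated as code, it looks roughly like this — a minimal sketch, every name in it hypothetical, no vendor’s real API:

```python
# Hypothetical sketch of the "backup plan" as code: try the hosted model,
# and when it's down, the work lands in a human queue. The fallback only
# exists if there are still trained humans on the other end of that queue.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class FallbackRunner:
    ai_call: Callable[[str], str]               # the hosted model (may fail)
    human_queue: List[str] = field(default_factory=list)  # dropped work lands here

    def run(self, task: str) -> str:
        try:
            return self.ai_call(task)
        except Exception:
            # The machine dropped the ball; a person has to catch it.
            self.human_queue.append(task)
            return f"queued for a human: {task}"


def flaky_model(task: str) -> str:
    # Stand-in for this morning's outage.
    raise ConnectionError("elevated errors, primarily affecting login")


runner = FallbackRunner(ai_call=flaky_model)
print(runner.run("summarize the incident report"))
# Every task ends up in human_queue until the service comes back.
```

Delete the humans from that sketch and the `except` branch has nowhere to go. That’s the whole argument in twenty lines.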
Which means the cultural argument for AI replacing people at scale is in direct tension with the operational reality of AI being unreliable enough to need that exact replacement pool as its backup.
You see the knot.
I know what you’re going to say. I’ve seen the take a thousand times on LinkedIn this week: “Fuck depending on Anthropic. Fuck OpenAI. Fuck Google. I’ll just run everything locally on this blazing hardware I’ve got right here.” The rugged individualist of the GPU.
I wish it were true. A world where every knowledge worker ran their own models on their own metal would be a genuinely more interesting world, and the dependency problem would shrink considerably. But I have actually tried to do real work on a local model. This past weekend — not some ancient 2019 anecdote, this past weekend — I ran basic audio transcription on an Apple M3 Pro, which is about as current and about as well-regarded as consumer silicon gets right now. Not frontier reasoning. Not a 70-billion-parameter anything. Transcription. And my battery drained faster than the wall adapter could recharge it. Plugged in. Fan at full airshow-flyby. For transcription. On an M3 Pro.
That’s not a workable utility. That’s a science experiment on a workbench while your actual work waits.
Yes, local models are getting better. Yes, some specialized workloads genuinely run on-device today, and the set will expand. But the LinkedIn fantasy — every professional suddenly running frontier-grade AI off their MacBook and telling the cloud vendors to eat rocks — isn’t engineering. It’s a thermal joke. The hardware isn’t there. The power budget isn’t there. The model weights aren’t there. Scale the fantasy up to an actual knowledge-work day and your laptop becomes a space heater with delusions of grandeur.
The dependency problem doesn’t get resolved by refusing to depend. It gets resolved by the thing you depend on earning the name.
I am not anti-AI. I use it daily. I have built infrastructure on top of it — a whole personal cognitive scaffolding system that holds the threads of my work across contexts and bad-brain days. When it works, it is one of the most extraordinary tools I have ever touched. When it went down this morning, I felt it in my chest. That is information worth paying attention to — not because the feeling means the tool is bad, but because the feeling is a warning signal about how much weight I have quietly started letting a fragile system carry.
If you’re going to become a critical utility, you really can’t go down. That’s the deal. That’s the whole deal. And if you can’t actually honor that deal yet — which, today, right now, on the ground, is the honest situation — then the replacement narrative is a cheque nobody at the vendor level is in a position to cash.
One more thing, because irony is the last honest register we have.
I wrote this with Claude.
Well — through Claude. The thinking is mine, the experience is mine, the argument is mine, but the drafting pipeline ran on the exact tool that went down this morning and sent me into a small existential spiral about how much weight I’d quietly let it carry. Which is either a fatal contradiction or the whole point, depending on how you read it.
I read it as the whole point. You can critique a tool you depend on. Especially a tool you depend on — because you’re the one who actually knows where it breaks. The people with the sharpest critiques of electricity are the ones who lost power in an ice storm, not the ones who never plugged anything in. The people worth listening to about AI’s failure modes are the people who use it every day, got burned this morning, and sat down the moment the fix deployed to write about what happened.
This piece would not exist without Claude. This piece also would not exist without the outage. Both are true. The tool giveth, the tool taketh away, the tool cometh back online about an hour later, and the work continueth.
We should still probably have the humans, though.
Just saying.
Stay feral, folks.