The Rounding Error
AI Right Now Barely Matters
Spoiler alert: This post contains spoilers for Consider Phlebas by Iain M. Banks. If you haven’t read it, consider stopping here and picking it up. It’s a fantastic sci-fi novel and the first in the Culture series.
I just finished Consider Phlebas, the first Culture novel. In the epilogue, Banks does something cruel. After 500 pages of desperate battles, orbital destructions, and personal tragedy, he gives you statistics:
The Idiran-Culture War: 48 years. 851 billion casualties. 91 million ships destroyed.
Then: this represented approximately 0.01% of the stellar population.
The defining conflict of the novel, the thing Horza died for, the thing the Culture mobilized against, was a rounding error.
We’re Having the Wrong Argument
I keep seeing the same interviews. Sam Altman on what’s next. Dario Amodei on timelines. Jensen Huang on compute. The discourse obsesses over:
Will AGI arrive in 2 years or 10?
Which jobs get automated first?
Who wins the AI race?
Perhaps the people being interviewed are tired of these questions too.
The Actual Shift
Here’s what I notice building AI systems every day: the discourse is about capabilities. Can it code? Can it reason? Can it pass benchmarks?
We are collectively trying to test the boundaries of the jagged frontier, and the walls keep moving.
The actual shift is about location. Where does thinking happen?
Right now, intelligence lives in human heads, occasionally augmented by tools. We’re moving toward something different. Intelligence everywhere. In your devices, your infrastructure, your ambient environment.
This isn’t replacement. It’s relocation. And relocation changes everything without anyone noticing until it’s done.
What’s Actually Being Built
We’re not building artificial general intelligence. We’re building artificial general availability of intelligence.
The difference matters. AGI implies a thing, a successor, something that arrives. Availability implies infrastructure, plumbing, something that spreads until it’s invisible.
Nobody talks about “artificial general availability of electricity.” It’s just there. You plug things in.
That’s the trajectory. Not a dramatic arrival. A quiet diffusion until the question changes from “can AI do this?” to “why wouldn’t it?”
851 billion casualties. A rounding error. The infrastructure won.