Fred Brooks documented this mistake in 1975. The fuel changed. The pattern didn’t.
In the mid-1960s, Fred Brooks was managing IBM’s OS/360 project and watching it fall apart in slow motion. The response from management was the obvious one: add more engineers. It got worse. A decade later, in 1975, he wrote about it in The Mythical Man-Month, called the pattern Brooks’s Law, and put it in a way that’s hard to forget:
“Adding manpower to a late software project makes it later. Like dousing a fire with gasoline, this makes matters worse, much worse.”
The book became required reading in computer science programs around the world. It’s been quoted in thousands of articles, referenced in hundreds of academic papers, and cited in more retrospectives than anyone can count.
And then everyone ignored it anyway.
Fifty years later, the fire looks the same. The fuel is just different now.
The move we’re seeing across a lot of engineering teams right now isn’t “add people.” The market moved past that one. The move is the opposite: cut people, and trust AI to cover the difference.
And look, the logic isn’t crazy. The productivity gains from AI tools are real. GitHub Copilot, Cursor, Claude: these aren’t toys. Engineers who know how to use them well can move meaningfully faster on certain kinds of work. That part holds up.
But here’s the thing. The bottleneck was almost never the speed of code generation.
It was the clarity of decisions. Who owns this component? What happens when this integration breaks at 2am? Why was this architectural call made eight months ago and what else breaks if we change it? Those questions don’t get answered faster just because the implementation layer got faster. If anything, faster implementation means you hit the decision bottlenecks more often, not less.
When experienced engineers leave — layoffs, restructuring, whatever the reason — they take something with them that doesn’t show up in any handoff doc.
Context.
AI doesn’t have that. It has the code. And those two things — the code and the context behind it — are not the same thing, even when they look like they are.
What tends to happen next is pretty predictable, honestly. The remaining team moves fast on the obvious work. Anything that requires understanding decisions made before they had full visibility slows to a crawl. AI fills the gaps with code that looks right and sometimes isn’t. Technical debt accumulates faster than before. Not slower.
The fire gets bigger. Not smaller.
AI can generate code. It can’t generate context. And context is what keeps a system from becoming a fire.
There’s a question worth asking before cutting headcount or reorganizing around AI tools, and it’s not “what can AI replace?”
It’s “where does this project actually slow down?”
Brooks figured this out the hard way, running OS/360 at IBM. The lesson he took wasn’t “don’t add resources.” It was simpler than that: understand what the problem actually is before you try to solve it.
That’s still the lesson. Fifty years later and we’re still learning it.
The fuel changed. The fire didn’t.
And gasoline is gasoline, regardless of what it’s made of.