Software Engineering is still a Human Endeavor
2025 was the year of vibe coding, and I want to reflect on what that's meant to me and how it has affected the way I and the teams around me work.
I want to establish some baseline items for where I'm coming from:
- AI makes individual engineers faster. Whether it makes them better is debatable, but it undeniably reduces barriers to raw productivity.
- AI does not make teams faster. Developers quickly create islands of productivity—personal tool configurations, custom scripts, preferred models—and managing shared state across these islands is difficult.
- AI can one-shot trivial workflows and boring tasks. But in the troubleshooting phase, its eagerness to satisfy prompts leads to what I've started calling prompt rot: half-implemented side effects, dead code, abandoned approaches. Over time this degrades your codebase and erodes the productivity gains you got earlier in the process (a condensed sketch of what this residue looks like follows this list).
- Senior and staff engineers have never been more in demand for code review and architecture. AI can work within patterns and even establish new ones, but it struggles to see the bigger picture—the boundaries between systems, the reasons a pattern exists in the first place.
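To make "prompt rot" concrete, here is a hypothetical, condensed sketch of the kind of residue I mean. Every name in it is invented for illustration; the point is the shape of the leftovers, not the specific code.

```python
# Hypothetical example of prompt rot: invented names, condensed for illustration.
import json
import logging  # left over from an abandoned logging approach

logger = logging.getLogger(__name__)  # dead code: nothing ever logs through it


def load_user_prefs(path):
    """Load user preferences from a JSON file."""
    # The AI added a retry loop to "fix" a flaky test, then solved the real
    # problem elsewhere. The loop now pointlessly retries a local file read.
    for _ in range(3):
        try:
            with open(path) as f:
                return json.load(f)
        except FileNotFoundError:
            pass  # swallows the error the caller actually needs to see
    return {}


def load_user_prefs_v2(path, strict=False):
    """Half-implemented replacement that nothing calls; 'strict' is never honored."""
    with open(path) as f:
        data = json.load(f)
    # TODO: validate against schema (the schema was never written)
    return data
```

None of this is individually dramatic, which is exactly why it accumulates: each piece looks plausible enough to survive a quick skim.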
This tension between productivity and quality has been exhausting to navigate. But I want to set aside the dogmatic arguments and focus on a practical question: if AI is here and our workplaces are pushing it aggressively, how do we automate the right things and empower the humans in the loop?
The Infinite Tooling Crisis
Jake Nations from Netflix recently gave a talk called "The Infinite Software Crisis" that lays out the benefits and pitfalls of AI-assisted development. It's worth watching and I've found it approachable for less technical folks in our org.
But I think the problem runs deeper than his framing suggests. The infinite software crisis is also an infinite tooling crisis.
I'm watching this happen in real time. Every person on every team is being pushed by their AI tools to recreate basic functionality from scratch. You can chart a team's AI adoption by when the first Makefiles and scripts/ folders appear—solving problems that were likely already solved elsewhere in the org.
Every engineer becomes a silo. They build a strong attachment to their personal setup because it makes them specifically very productive. They're suddenly freed from frictions that bothered them. And before you know it, you have an archipelago of divergences from whatever your team's golden path used to be.
Why This Will Get Worse Before It Gets Better
The free lunch era of AI isn't over. Scaling the models and tweaking parameters have produced more gains than most people expected. Until we hit a clearly defined wall—until the massive improvements slow down—we should expect these human coordination problems to accelerate.
More capability means more individual productivity. More individual productivity means more code. More code means more prompt rot. And prompt rot compounds: the messier the codebase, the worse the AI's suggestions become, the more half-fixes accumulate.
What We Can Actually Do
AI-assisted code review tools exist, but in my experience they're close to useless. They engender a rubber-stamp mentality—if the AI approved it, it must be fine.
The real leverage is human. We need to empower the people in our orgs who are good at reading code to stay vigilant. That means:
- Questioning additions, not just bugs. When reviewing AI-assisted work, ask why something was added. Often the answer is that the AI hit an error and "fixed" it in a way that doesn't align with the original intent (a sketch of this pattern follows this list).
- Treating prompt rot as technical debt. Make it visible. Track it. Budget time to clean it up. If you don't, your codebase will drift toward incoherence.
- Defending the golden path. When someone's personal scripts duplicate existing team tooling, that's a conversation worth having—not to kill their productivity, but to understand what friction drove them away from the shared solution. Sometimes the answer is that the shared solution needs improvement.
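Here is a hypothetical sketch of the kind of addition worth questioning in review. All names are invented; what matters is the review question, not the code itself.

```python
# Hypothetical review example with invented names. The original intent was for
# sync_inventory() to fail loudly when the upstream client rejects a batch, so
# the job scheduler could retry the whole run. To silence a failing test, an AI
# "fix" added the broad except below, which quietly turns a hard failure into a
# silent partial sync.

import logging

logger = logging.getLogger(__name__)


def sync_inventory(client, batches):
    """Push each batch to the upstream client and return how many succeeded."""
    synced = 0
    for batch in batches:
        try:
            client.push(batch)
            synced += 1
        except Exception:  # <- the addition a reviewer should ask about
            logger.warning("skipping batch that failed to push")
            continue
    return synced
```

A linter won't flag this and the tests now pass; only a human asking "why is this here, and does it match what the function was supposed to do?" catches the drift in intent.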
The engineers who can read a diff and ask "wait, why is this here?" have never been more valuable. In 2026, that's the skill to cultivate.