Steve Yegge's recent essay "The AI Vampire" surfaces a counterintuitive problem in modern development. Using agentic AI tools feels productive and engaging, but it creates hidden costs that trigger burnout. The work moves fast. Code ships quickly. Then fatigue sets in.
Margaret Storey's parallel work on cognitive debt names the root issue. Cognitive debt accumulates when developers rush through tasks without fully understanding implications. It's the mental tax paid later when someone must untangle hastily made decisions, maintain poorly understood code, or debug systems built without proper reasoning.
The connection matters. AI-assisted programming amplifies this dynamic. Agentic systems generate solutions at machine speed. Developers accept outputs without deep comprehension, trading future clarity for immediate velocity. The code works initially. The velocity feels rewarding. But the cognitive debt compounds.
Yegge's vampire metaphor captures this precisely. The tool extracts value upfront while depositing costs that compound invisibly. A developer uses AI to generate a complex authentication flow in minutes rather than hours. The solution functions. The sprint closes on time. Six months later, when that system needs modification or debugging, the original developer can't fully explain its logic. Someone new must reverse-engineer reasoning they didn't author.
Storey's framework adds precision. Cognitive debt isn't about bad code alone. It's about knowledge loss, about systems built faster than understanding can spread. When developers consistently work at this velocity, burnout follows naturally. The mental exhaustion stems not from effort, but from constant context-switching between surface-level task completion and the underlying complexity nobody fully owns.
The industry hasn't integrated these insights yet. Managers measure velocity as story points completed. AI tools amplify that metric immediately. The cognitive debt accumulates silently until something breaks or a developer simply burns out from maintaining systems they never fully understood.
