Vibe Coding and Learning Debt
Vibing with AI might feel productive in the short term, but watch out for the long-term consequences of trading dopamine hits for deeper learning.
A few weeks ago, I experienced a case of vibe coding gone wrong.
I was building a voice transcript anonymizer in Python, riding the Cursor and Claude 3.7 Sonnet wave. The flow was smooth. The code was crushing. The stoke was real.
Things started feeling strange as the agent pulled down increasingly complex models from Hugging Face—a named entity recognition model and a similarity detection one. Suddenly, my models had models, and I was LARPing as a computational linguistics expert.
Then I ran my program.
The results were... perplexing. My transcripts weren't being anonymized correctly at all. Something was off, but all those fancy ML models should work, right?
I tried the vibe coder’s way, repeatedly barking, “FIX IT!” It didn’t fix it. I kept getting the same trash output.
Forget it, Donny, you’re out of your element!
I vibed so hard I hadn't noticed we'd drifted far beyond my understanding of the problem I was trying to solve. I became a tourist on a programming trip, not the guide.
Thankfully, I had the good habit of making frequent, small commits. I rolled back to a previous version and took a different approach. I broke out of vibe mode and gave more explicit directions to use simple regex patterns and a deterministic similarity algorithm. These were techniques I understood at least well enough to prompt small and successfully.
The more straightforward solution worked adequately. More importantly, I could read, explain, and modify the code.
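To make the contrast concrete, here's a minimal sketch of the kind of simpler approach I mean, using only Python's standard-library `re` and `difflib`. The function names, placeholder format, and similarity threshold are illustrative, not my actual implementation:

```python
import re
from difflib import SequenceMatcher


def anonymize(transcript: str, names: list[str]) -> str:
    """Replace each known name with a stable placeholder like PERSON_1."""
    aliases = {name: f"PERSON_{i + 1}" for i, name in enumerate(names)}
    for name, alias in aliases.items():
        # \b word boundaries keep "Ann" from matching inside "Annual".
        transcript = re.sub(rf"\b{re.escape(name)}\b", alias, transcript)
    return transcript


def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    """Deterministic fuzzy match: same inputs always give the same answer."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold
```

No models pulling down models, no LARPing: every line is something I can read, explain, and step through in a debugger. That property, not cleverness, is what made it maintainable.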
That moment of realization—when you notice you're getting way over your skis with AI—prompted me to write this article. The experience crystallized something I've been thinking about for a while now: the concept of learning debt.
Learning Debt
Learning debt is conceptually similar to technical debt.1 The accumulated costs of taking shortcuts in your learning journey will eventually need to be repaid, often with interest. And like tech debt, we can and should approach it as a tradeoff.
When you rely too heavily on AI to generate code without understanding the fundamentals, you borrow against your future competence. The code works today, but what happens when:
You need to debug something the AI created, but don't understand the underlying patterns?
Your requirements change, and you must shift your program’s behavior beyond what simple prompts can handle?
You face a novel problem that requires fundamental knowledge to solve?
That's when the debt collector comes calling. And trust me, it’s not if. It’s when.
Need-to-know and Domain Type
Let's take a concept from Domain-Driven Design to frame the issue. In DDD strategic design, we recognize three types of domains:2
Core domains are your competitive advantage, modeling the unique parts of your business
Supporting domains enable your core business but aren't a primary differentiator
Generic domains are standard stuff every business needs (like authentication)
Now think about your knowledge domains as a developer. What's core to you? What do you need a deep understanding of versus what can you outsource?
And the problem you’re solving, is it a core domain or a generic one? Knowing this informs the measure of ownership and control you might exert on the problem’s solution.
The trap many fall into is treating everything as generic—"I'll just ask AI for all of it"—without recognizing which domains should be core to their expertise and attention.
Signs You're Accumulating Learning Debt
Here are some warning signs that vibe coding might be creating learning debt:
You can't explain the code you're using. If someone asked you to walk through how your system works without looking at the code, could you do it?
You find yourself repeatedly asking AI for the same patterns. Instead of the knowledge becoming ingrained, you depend on external help for routine tasks. (Side note: speaking of patterns, I’m more than OK to outsource regex to the robots!)
Small changes require complete rewrites. When requirements shift slightly, you must regenerate entire components because you don't understand how to modify what exists. Intentionally disposable “fast fashion” architectures might be in our future, but we’re not there yet.
You feel anxiety when facing problems without AI assistance. That panicky feeling when you need to write code from scratch? That's your learning debt talking.
Balancing Vibes and Learning: It's All About Context
Agents are most potent when used to amplify understanding rather than outsource it.
This balance needs to be contextual, though. Kent Beck's 3X model3 provides a helpful framework for knowing your context and how much to vibe:
In the Explore phase, you're testing hypotheses and seeking validation. Here, vibe coding with AI can be incredibly valuable. You're trying to shorten the distance between experiment and feedback, looking for quick payoffs with minimal investment. Technical excellence isn't the goal yet. Learning whether your idea is good and worth pursuing, modifying, or rejecting is.
As you move into the Expand phase, where you're scaling something that works, your relationship with AI tooling (it’s not a person) should evolve too. This is where learning debt becomes dangerous. The foundations must be sound as you build on them, and your understanding needs to be deeper. Fortunately, I’m finding AI to be a good teacher for new tech, so it’s more a matter of changing your grasp of the tool than the tool itself.
By the time you reach the Extract phase, where you're optimizing a mature and likely complicated system, both feasibility and sustainability of the tech stack demand genuine expertise, not just vibes. Agents will obviously be more and more valuable here, particularly as their context window challenges get resolved. Once again, it’s about changing how you hold the tool.
Kent Beck's model maps neatly onto Brian Tod's path to product-market fit, which visualizes the journey from validating desirability to ensuring feasibility and viability.4 The Explore and Expand phases of 3X line up with the early and middle stretches of that curve.
In the early stages of a product idea, we should focus almost exclusively on desirability: Is this solving a real problem for users? But as we validate that and move toward expansion, feasibility becomes crucial: Can we actually build and scale this solution effectively?
The learning debt accumulated during pure vibe coding becomes a liability at this inflection point. What worked for quick experiments won't sustain growth when real users depend on your product. This is precisely when you need to start paying down that learning debt.
With this in mind, here's how to maintain the balance:
Use AI as an explainer and tutor. Ask for more than features. Ask for explanations of concepts, patterns, and trade-offs. Dipping into learning mode is a nice vibe unto itself.
Challenge yourself to modify AI-generated code. Don't just paste it in. Change it, break it, fix it, and understand every line.
Identify your core knowledge domains and product lifecycle stage. What areas do you need deep expertise in? Invest real learning time there, even if it's slower.
Practice retrieval, not just recognition. Test yourself by writing code without assistance to ensure you build lasting knowledge.
Match your AI dependence to your product phase. More AI-dependence in Explore, more understanding in Extract.
Many of these tips benefit from having a programming pair to keep you honest.
The Learning Payoff
The thing about learning debt is that it compounds just like financial debt. The longer you rely on shortcuts, the harder it becomes to build understanding later.
This is why we advise our clients to explicitly preserve time for learning in their workflow and when managing their queues (WIP, capacity).
Learning isn't separate from work. It's a fundamental part of work. Deep dives into understanding how and why code works are a crucial component of what Cal Newport calls "deep work."5 The ability to focus intensely on mastering complex concepts separates truly effective engineers from bricoleurs who cobble together solutions.
With both AI and developer productivity very much in the zeitgeist, it's tempting to eliminate learning time in favor of quick outputs in the name of efficiency. And yes, that's perfectly fine in some contexts, like pure exploration. But as your product matures, that investment in understanding should become non-negotiable.
Here's my challenge: for your next project, identify one core concept, fundamental to your work, and commit to truly understanding it, not just implementing it with AI assistance. A good test for this would be explaining what you’ve learned to a teammate. PSA: Don’t take humans out of your self-improvement loop!
While vibe coding feels good in the moment, truly understanding what you’re building feels so much sweeter for so much longer. Learning, now that's a vibe worth chasing.
Technical Debt: https://martinfowler.com/bliki/TechnicalDebt.html
Domain-Driven Design - Core, Supporting, and Generic Domains: https://blog.jonathanoliver.com/ddd-strategic-design-core-supporting-and-generic-subdomains/
Kent Beck's 3X (Explore, Expand, Extract) Model at YOW! 2019 (YouTube)
Brian Tod's Path to Product-Market Fit: https://briantod.medium.com/about-product-market-fit-what-ive-learned-about-the-goal-the-process-and-the-nuance-e7b317740f43
Cal Newport's Deep Work: https://www.calnewport.com/books/deep-work/