Agentic AI Makes Juniors More Productive, But Are They Actually Learning Anything?

I’ve been mentoring a junior engineer for the past 6 months, and I need to share something that’s been bothering me. Let me call him Jake (not his real name).

Jake is incredibly productive. He closes tickets fast, his PRs look clean, his features work. On paper, he’s crushing it. He’s using Claude, Cursor, GitHub Copilot - the full AI toolkit. And by the metrics we track, he’s completing tasks 50-60% faster than juniors I’ve mentored in previous years.

But last week, Jake’s feature broke in production. The fix should have taken 20 minutes - it was a straightforward null check that was missing. Instead, Jake spent 3 hours trying different AI-suggested fixes, none of which worked, because he didn’t understand the actual problem.
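The post doesn't show Jake's actual code, but a missing null check of this kind is usually a one-conditional fix. A hypothetical Python sketch (all names here are illustrative, not from the incident):

```python
def get_display_name(user: dict) -> str:
    """Return a user's display name, tolerating accounts with no profile."""
    profile = user.get("profile")  # may be None for brand-new accounts
    if profile is None:            # the "straightforward null check" that was missing
        return "Anonymous"
    return profile["display_name"]
```

Without the `if profile is None` guard, any account lacking a profile crashes the feature; with it, the fix really is a 20-minute change, provided you understand why `profile` can be `None` in the first place.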

When I sat with him to debug it, I asked him to explain how the authentication flow worked. He couldn’t. He’d implemented it 2 weeks ago, but he’d done it by describing what he wanted to an AI agent and iterating on the suggestions until the tests passed.

The feature worked perfectly. But Jake had no idea how.

The Productivity Paradox

Research from Code Conductor shows that AI tools help junior developers complete tasks 56% faster. That sounds amazing, right? We’re solving the “juniors are slow and need hand-holding” problem!

But here’s what the metrics don’t capture: Are they actually learning?

When I started as a junior engineer, I remember spending an entire day debugging a CSS layout issue. It was frustrating and felt unproductive. But by the end of that day, I understood the box model, specificity, and inheritance in a way I never forgot.

Jake’s CSS layout issues get fixed in 10 minutes by Copilot. He’s more productive. But he doesn’t know CSS. And the scary thing is: he doesn’t know that he doesn’t know.

The Calculator in Math Class Problem

This reminds me of the old debate about calculators in math class. The question wasn’t “can calculators do math?” - obviously they can. The question was “if students always use calculators, do they learn to think mathematically?”

We decided: calculators are fine AFTER you understand the concepts, but you need to learn long division by hand first.

But with AI coding tools, we’re handing juniors calculators on day one. And unlike math class, there’s no standardized “you must do 100 problems without AI” requirement.

A Real Example That Scared Me

Two weeks ago, Jake shipped a feature that had a subtle security vulnerability. It wasn’t obvious - the code looked fine, the tests passed, even the security scanner didn’t flag it. It was a race condition that only manifested under specific concurrent load patterns.
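The vulnerable code isn't shown in the post, but load-dependent races very often take a check-then-act (read-modify-write) shape. A hypothetical Python sketch of the pattern and the fix (the login-limiter scenario is my own illustration, not the actual feature):

```python
import threading

class LoginAttemptLimiter:
    """Tracks failed login attempts per account and flags lockout."""

    def __init__(self, limit: int = 5):
        self.limit = limit
        self.attempts: dict[str, int] = {}
        self.lock = threading.Lock()

    def record_failure_unsafe(self, account: str) -> bool:
        # Race: two concurrent requests can both read the same count and
        # both write count + 1, silently losing one failure. Under the
        # right load pattern, an attacker gets more tries than the limit.
        count = self.attempts.get(account, 0)  # read
        self.attempts[account] = count + 1     # write (not atomic with the read)
        return self.attempts[account] >= self.limit

    def record_failure(self, account: str) -> bool:
        # Fix: make the read-modify-write atomic with a lock.
        with self.lock:
            count = self.attempts.get(account, 0) + 1
            self.attempts[account] = count
            return count >= self.limit
```

The unsafe version passes every single-threaded test and looks fine to a scanner; you only catch it if "what happens when two requests hit this at once?" is already part of your mental checklist.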

In code review, I caught it. But here’s the thing: the AI that wrote the code didn’t catch it, and the junior who shipped it didn’t understand it enough to even look for it.

When I explained the vulnerability, Jake said “Oh, I wouldn’t have thought to check for that.” And that’s the problem - he’s not building the mental models that help you think “what could go wrong here?”

He’s building a mental model of “describe what you want, iterate until it works.”

The Numbers Are Concerning

I’ve been following the research on this, and the trends are troubling:

  • Entry-level dev jobs down 67% since 2022 (according to Hakia research)
  • 72% of tech leaders plan to reduce entry-level hiring (Code Conductor study)
  • Companies are thinking: “If AI can do junior-level work, why hire juniors?”

But here’s what they’re missing: Where do senior engineers come from?

Senior engineers aren’t people who were born knowing distributed systems and architectural patterns. They’re juniors who spent years making mistakes, debugging hard problems, and building mental models through experience.

If we eliminate the junior engineer pipeline because “AI can do that work,” where do our seniors come from in 5 years?

The Question I’m Wrestling With

I genuinely don’t know the answer to this:

How do we balance velocity with skill development?

On one hand, I want Jake to be productive and to leverage modern tools. Learning to work with AI is clearly going to be a critical skill.

On the other hand, I’m watching him build a career on a foundation of tools he doesn’t understand, solving problems he can’t debug, and writing code he can’t explain.

Is the answer to force “no-AI” time? To require juniors to implement features twice - once without AI to learn, once with AI for production? To accept that junior engineers are just slower, and that’s okay because they’re learning?

Or is this just the new normal, and I’m being old-fashioned? Maybe the next generation of engineers won’t need to understand how TCP works or how memory allocation happens - they’ll work at a higher abstraction layer, orchestrating AI agents that handle the implementation details.

What I’m Trying

For now, I’m experimenting with this approach:

  • Jake has to explain his PR to me before I review it - if he can’t explain it, he doesn’t understand it
  • We do “no-AI Fridays” where he implements small features without AI assistance
  • Every bug he writes, he has to debug himself before asking for help (including from AI)
  • We pair program on complex features so I can model the thinking process, not just the coding

But honestly, I’m making this up as I go. And I’m worried I’m handicapping him compared to other juniors who are just “shipping fast with AI.”

Has anyone else figured this out? How are you helping junior engineers actually learn in the age of AI code generation?

Because if we don’t solve this, we’re going to have a generation of engineers who can ship features but can’t understand them. And that terrifies me.

Alex, thank you for sharing this so openly. This is exactly the conversation we need to be having, and you’re not alone in wrestling with these questions.

I manage a team of 40+ engineers, and I’m seeing the same patterns you’re describing with Jake play out across multiple juniors. The “can ship but can’t explain” phenomenon is real, and it’s not just a junior problem anymore - I’m seeing some mid-level engineers exhibit similar gaps when they lean too heavily on AI.

“AI-Assisted Apprenticeship” Model

Here’s what we’ve been trying, and it’s showing some promise:

The core principle: Juniors should pair with AI AND a senior, not AI instead of a senior.

Practically, this means:

  • Junior uses AI to implement a feature
  • Junior must document: what they asked the AI, what the AI suggested, why they chose that approach
  • Senior reviews both the code AND the decision-making process
  • We explicitly discuss: what did the AI do well? What did it miss? What would you do differently now?

This shifts code review from “does this code work?” to “do you understand why this code works?”

The Code Review Evolution

We’ve changed our code review template specifically for AI-assisted work:

Required in PR description:

  • What did you ask the AI to do?
  • What parts did you write yourself vs AI-generated?
  • What did you modify from the AI’s suggestion and why?
  • What could go wrong with this implementation?

That last question is critical. If a junior can’t identify potential failure modes, they don’t understand the code deeply enough.

The Pipeline Problem

Your point about where senior engineers come from in 5 years hits hard. This is a CTO-level concern that I’ve been raising with leadership.

The data you cited - 72% of tech leaders reducing entry-level hiring - is catastrophically short-sighted.

In our financial services context, we’re doubling down on junior hiring specifically because we see this as a competitive advantage. Our thinking:

  • Companies that cut juniors will face a senior talent shortage in 3-5 years
  • Meanwhile, we’ll have a bench of engineers who grew up learning WITH AI but also learned fundamentals
  • Those engineers will have a skillset nobody else has: fluency in both AI-assisted development AND deep system understanding

But this requires investment. We’re not getting the immediate productivity gains that other companies are claiming.

Practical Frameworks We Use

1. The “Two-Pass” Method (similar to your idea)

  • Junior implements feature with AI assistance (fast, learns modern tools)
  • Junior implements a similar feature without AI in a learning environment (slower, learns fundamentals)
  • We compare: what was different? What did each approach teach you?

2. “Explain Before Merge” Rule

  • Before any PR can merge, junior must walk a senior through the code live
  • If they can’t explain a section, they have to go understand it (with or without AI)
  • We’re explicitly teaching: using AI is fine, shipping code you don’t understand is not

3. Structured Debugging Practice

  • When juniors encounter bugs, they must attempt debugging for 30 minutes WITHOUT AI
  • This builds the pattern-recognition and hypothesis-testing skills that AI can’t teach
  • After 30 minutes, they can use AI, but they have to document what they learned

The Cultural Challenge

The hardest part isn’t the process - it’s the culture.

Juniors see other companies where “ship fast with AI” is celebrated and deep understanding isn’t explicitly valued. They worry they’re falling behind if they spend time learning “unnecessary” fundamentals.

We’ve had to be very explicit: Understanding how systems work isn’t optional nostalgia. It’s the foundation for senior engineering judgment that AI can’t replicate.

When reviewing incident post-mortems, we make a point to highlight: “The person who diagnosed this had deep system knowledge that came from years of experience.” That’s the role model we want juniors to aspire to.

What We’re Still Figuring Out

Your question about whether we’re being “old-fashioned” is one I ask myself constantly.

Maybe the future really is engineers who work at pure abstraction layers, never touching implementation details. Maybe trying to teach TCP internals is like insisting that modern developers learn assembly language - technically interesting but practically irrelevant.

But I don’t think so. Because in my experience, the engineers who can solve novel problems, who can debug the truly weird issues, who can architect systems that scale - they all have deep foundational knowledge.

AI can make them more productive, but it can’t replace the judgment that comes from having debugged hundreds of production incidents.

My Ask to Leadership

One thing I’m pushing for internally: Treat junior engineering development as an investment, not a cost center.

Instead of measuring juniors purely on story points delivered, measure:

  • Growth in system understanding (can they explain architecture?)
  • Debugging proficiency (how quickly can they root-cause issues?)
  • Code review quality (do they catch AI mistakes in others’ PRs?)
  • Independence trajectory (do they need less help over time?)

This is harder to quantify than “tickets closed,” but it’s what actually matters for building senior engineers.

Alex, your approach with Jake sounds solid. The key thing is you’re paying attention and adjusting. That’s more than a lot of teams are doing.

Keep fighting the good fight. The industry needs mentors who care about this.

This hits SO close to home from a design perspective. I’ve seen the exact same pattern with junior designers using AI tools.

The Figma Auto-Layout Parallel

Alex, your CSS box model story reminded me of something I experienced with a junior designer on my team. Let’s call her Emma.

Emma can create beautiful, functional designs in Figma incredibly fast. She uses AI to generate design system components, auto-layout for responsive designs, and plugins for accessibility checks.

Her mockups look professional. They pass review. Developers can implement them.

But then we needed to translate our design system to a new platform that didn’t support Figma’s auto-layout. Emma was completely stuck. She’d never learned the underlying principles of CSS flexbox and grid that auto-layout was abstracting away.

She could create layouts, but she couldn’t explain HOW they worked or adapt them to new constraints.

Learning Through Constraints

Here’s something I learned from my startup failure (which, tbh, taught me more than any successful project):

The most valuable learning happens when you’re forced to solve a problem with limited tools.

When my startup was scrappy and we couldn’t afford fancy tools, I learned:

  • Pure CSS before using frameworks
  • Vanilla JavaScript before relying on libraries
  • Manual user testing before automated analytics

Those constraints forced me to understand the fundamentals. And now, even though I use all the fancy tools, I can debug issues and adapt when tools don’t work.

Jake sounds like he’s learning WITH every tool available from day one. He’s never had to build something the hard way.

What If We Designed “Learning Constraints”?

Luis’s “Two-Pass Method” is brilliant, but I want to push it even further:

What if we intentionally created “AI-free zones” for learning?

Not as punishment, but as structured learning exercises - like a musician practicing scales even though they’ll never perform scales in concert.

Here’s what this could look like:

Month 1-2: Fundamentals Sprint

  • New juniors spend their first 6 weeks building small projects WITHOUT AI
  • Simple CRUD app, basic authentication, straightforward API
  • Goal: Build mental models for how code connects to functionality
  • It’ll be slower and more frustrating, but that’s the point

Month 3-4: AI-Assisted Building

  • Now introduce AI tools
  • Junior can see the difference: “Oh, AI just did what took me 2 hours last month in 10 minutes”
  • But they UNDERSTAND what the AI is doing because they’ve done it manually

Month 5+: Hybrid Workflow

  • Junior decides when to use AI vs manual implementation
  • More importantly: they can debug AI-generated code because they’ve written similar code by hand

The Rotation System

Another idea: Rotating “constraint sprints”

  • One sprint: You can only use AI for boilerplate, not logic
  • Next sprint: No AI at all, only documentation and Stack Overflow
  • Next sprint: Full AI assistance encouraged

This builds muscle memory for multiple problem-solving approaches. If AI goes down or fails, you’re not helpless.

But Here’s the Hard Part…

The approach I’m describing is SLOWER. Emma would be less productive in the short term if I forced her to learn CSS grid without auto-layout.

Jake would close fewer tickets if Alex made him debug manually before using AI.

And in a world where everyone’s measuring velocity and story points, this looks like you’re handicapping your juniors.

I’ve had this exact argument with leadership: “Why is your junior moving slower than the junior on the other team who uses AI for everything?”

My answer: “Because I’m training her to be a senior designer in 3 years, not just a fast Figma operator right now.”

But I’ll be honest - sometimes I wonder if I’m wrong. Sometimes I worry I’m imposing constraints that are actually just nostalgia for “how I learned.”

The Startup Failure Lesson

But then I remember: My startup failed, and that failure taught me more than any success.

Why? Because when everything’s working and AI is handling the complexity, you don’t learn the hard lessons.

But when things break - when the AI suggestion doesn’t work, when the framework fails, when you hit an edge case nobody anticipated - THAT’S when you learn.

If we protect juniors from ever experiencing struggle, we’re protecting them from learning.

A Different Framing

Maybe instead of “AI-free time” (which sounds punitive), we call it:

“Building Your Debugging Instincts”
“Foundational Skills Workshop”
“Senior Engineer Bootcamp”

Same concept, but framed as investment in their growth, not restriction on their tools.

Luis is right that culture matters here. If juniors feel like learning fundamentals is busywork that’s slowing them down, they’ll resent it.

But if they see it as “this is how you level up to senior,” they’ll embrace it.

Question for Alex

You mentioned “no-AI Fridays” - how’s Jake responding to that? Is he seeing the value, or does he feel like you’re making him slower for no reason?

I’m curious because I’ve had mixed results with similar constraints in design. Some juniors get it immediately. Others think I’m stuck in the past.

Would love to hear how you’re framing this to make it feel like growth opportunity rather than punishment.

Alex, this is one of the most important conversations happening in engineering leadership right now, and I’m glad you’re having it openly.

Let me be direct: The “juniors with AI are super productive” narrative is missing half the story, and the half we’re missing will bite us in 3-5 years.

The Data Doesn’t Lie (But We’re Reading It Wrong)

You cited the statistics:

  • 67% drop in entry-level jobs since 2022
  • 72% of tech leaders reducing junior hiring
  • 56% faster task completion with AI tools

Here’s what those numbers tell me as a CTO: We’re optimizing for short-term productivity at the expense of long-term capability.

Companies are looking at AI-assisted juniors and thinking: “Great, we can hire fewer of them and get more output!”

What they’re not seeing:

  • The technical debt accumulating from code nobody fully understands
  • The missing generation of engineers who would have become seniors
  • The organizational knowledge that isn’t being built
  • The judgment gaps that will emerge when AI encounters novel problems

This is short-term thinking disguised as innovation.

Where Senior Engineers Really Come From

I’ve been in this industry for 25 years. I’ve watched junior engineers become seniors, and I can tell you: Senior engineering judgment doesn’t come from writing code. It comes from writing code that failed, debugging production disasters, making architectural mistakes, and building intuition through pattern recognition.

You can’t shortcut that with AI because AI provides answers, not experience.

When Jake debugged that production issue for 3 hours using AI suggestions that didn’t work - that WAS learning. Painful, inefficient learning, but learning nonetheless.

The problem isn’t that he used AI. The problem is he didn’t understand enough to evaluate whether the AI’s suggestions made sense.

The Strategic Workforce Question

Here’s what keeps me up at night from an organizational perspective:

Scenario: It’s 2028. Your infrastructure needs a major re-architecture because you’re scaling 10x.

Who on your team can do that work?

  • Not the juniors you hired in 2026 who only know how to prompt AI agents
  • Not the mid-levels you promoted quickly because they were “AI-productive” but lack depth
  • Not the seniors you couldn’t hire because there aren’t enough of them anymore

You’re stuck paying $600k to hire senior staff engineers from the tiny pool of people who actually learned engineering fundamentals before AI abstracted everything away.

This is a market failure waiting to happen.

What I’m Doing About It (And Why It’s Unpopular)

Our strategy:

1. Intentional junior hiring and development
We’re actually INCREASING entry-level hiring while competitors cut. Yes, it’s expensive. Yes, it’s slower. But in 3 years, we’ll have a talent advantage nobody else has.

2. Structured learning paths that include AI

  • Month 1-3: Core fundamentals with minimal AI (like Maya’s suggestion)
  • Month 4-6: AI-assisted development with mandatory explanation requirements
  • Month 7+: Full AI usage with senior oversight on complex work

3. Measuring the right things
We changed our junior engineer evaluation criteria:

  • ✗ Story points completed
  • ✗ PRs merged per week
  • ✓ Can explain architecture decisions
  • ✓ Independently debugs production issues
  • ✓ Catches problems in code review
  • ✓ Proposes technical improvements
  • ✓ Demonstrates growing system understanding

4. “Explain Before Merge” policy
Luis mentioned this - we’ve made it company-wide. Any PR that uses AI generation (which is most of them) requires the author to add an “AI Usage” section:

  • What did AI generate?
  • What did you modify and why?
  • What could go wrong with this approach?
  • How would you debug this if it failed?

If you can’t answer these questions, your PR doesn’t merge.

The Uncomfortable Truth

Here’s what I tell my board when they push back on our junior development investment:

“We can optimize for next quarter’s productivity, or we can invest in next year’s capability. We cannot do both.”

AI makes current engineers more productive. That’s real value.

But building the next generation of senior engineers requires patience, investment, and tolerance for slower short-term growth.

Most companies are choosing productivity. We’re choosing capability.

Luis Is Right About Culture

The cultural framing matters enormously.

At our company, we talk about:

  • “AI-native engineering” - not “with or without AI” but “understanding systems deeply WHILE leveraging AI”
  • “Building senior judgment” - not “learning the old way” but “developing skills that AI can’t replace”
  • “Competitive advantage through depth” - not “slowing down” but “building capabilities our competitors won’t have”

Juniors respond well to this framing because it’s aspirational, not nostalgic.

The Answer to Your Question

You asked: “Am I being old-fashioned, or is this the new normal?”

Neither. You’re being strategic.

The new normal is going to be a bifurcated market:

  • Commodity engineers: High productivity with AI, limited deep understanding, interchangeable
  • Strategic engineers: Deep system knowledge, can use AI effectively, can solve novel problems

The second group will be rare and highly valued. The first group will face intense competition and downward wage pressure.

Jake can still end up in the second group, but only if you keep doing what you’re doing: forcing him to build understanding even when it’s slower.

Practical Recommendation

Keep your “no-AI Fridays,” but reframe them:

“Senior Engineer Development Days”

Make them about:

  • Deliberate practice on skills AI can’t teach
  • Deep system understanding
  • Debugging hard problems without shortcuts
  • Building intuition through experience

Frame it as investment in Jake’s career ceiling, not restriction on his tools.

And honestly? Thank you for caring about this. A lot of engineering leaders are taking the easy path - maximize productivity now, worry about the consequences later.

You’re thinking long-term. That’s leadership.