AI-Powered Onboarding: From Weeks to Days Using Disco, Port.io, and Glean

One of the most expensive problems in engineering leadership is onboarding. We spend months recruiting the right person, then watch them struggle for weeks (sometimes months) before they’re truly productive. I’ve been experimenting with AI-powered onboarding tools over the past year, and the results have been significant enough to share.

The Numbers That Made Me Pay Attention

Research from DX analyzing six multinational enterprises found something striking: engineers using AI tools daily reached their 10th pull request in 49 days. Engineers without AI? 91 days. That’s nearly double the time to reach basic productivity.

Even more concerning: half of new hires who don’t use AI tools still haven’t reached 10 PRs after three months of work.

When you factor in fully-loaded engineering costs, that’s not just a productivity gap - it’s a substantial financial impact. Organizations report annual savings of around $21,000 per hire when implementing AI onboarding tools.

The Stack We’ve Built

After testing several combinations, here’s what’s working for our teams:

Disco for structured learning paths

  • Automatically generates 30/60/90 day plans customized to the role
  • Adapts content based on whether someone is frontend, backend, or full-stack
  • Tracks competency gaps and adjusts in real-time
  • $359/month starting point (14-day trial available)

Glean for knowledge discovery

  • AI-powered search across all our internal documentation
  • GPT integration means new hires can ask questions in natural language
  • Eliminates the “which wiki page has the deployment docs?” problem
  • Huge reduction in interruptions to senior engineers

Port.io for environment setup

  • Automated repository access, IDE configurations, CI/CD pipeline setup
  • What used to take 2-3 days of config now happens in hours
  • Handles GitHub permissions, Slack channels, tool integrations automatically
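To make the "environment setup" idea concrete, here's a minimal Python sketch of the role-based provisioning logic this class of tool automates. This is not Port.io's actual API or configuration format - the role profiles, repo names, and channel names are all made up for illustration.

```python
# Hypothetical sketch of role-based provisioning, like what Port.io automates.
# Role profiles, repos, and channel names below are illustrative, not real.
ROLE_PROFILES = {
    "backend": {
        "repos": ["api-service", "shared-libs"],
        "slack_channels": ["#eng-backend", "#deploys"],
        "ide_extensions": ["python", "docker"],
    },
    "frontend": {
        "repos": ["web-app", "design-system"],
        "slack_channels": ["#eng-frontend", "#deploys"],
        "ide_extensions": ["eslint", "prettier"],
    },
}

def provision_plan(username: str, role: str) -> list[str]:
    """Return the ordered list of provisioning steps for a new hire."""
    profile = ROLE_PROFILES[role]
    steps = [f"grant github read/write: {r} -> {username}" for r in profile["repos"]]
    steps += [f"invite {username} to {c}" for c in profile["slack_channels"]]
    steps += [f"install IDE extension: {e}" for e in profile["ide_extensions"]]
    steps.append(f"enable CI/CD access for {username}")
    return steps

for step in provision_plan("new_hire", "backend"):
    print(step)
```

The point of encoding this as data rather than a wiki page is that the plan is executable and auditable - the same profile that grants access on day one can revoke it at offboarding.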

What’s Actually Working

The biggest win has been the reduction in “interrupt-driven onboarding.” Previously, new engineers would spend their first weeks asking senior team members basic questions - where’s the documentation, how do I set up my environment, what’s the deployment process?

Now, Glean handles about 65% of those questions automatically. New hires can search our knowledge base and get contextual answers without pulling someone out of deep work. The pattern isn’t unique to engineering, either: one mid-sized company I spoke with cut HR onboarding calls by 65% after deploying a chatbot.

The automated environment setup is the second biggest win. Nothing kills momentum like spending your first day fighting with local dev environment configuration.

What’s Not Working (Yet)

Human connection still matters
AI can’t replicate the relationship-building that makes someone feel like part of the team. We’ve had to be intentional about pairing AI onboarding with buddy programs and regular 1:1s.

Context for ambiguous decisions
AI is great at answering “how do I deploy?” but struggles with “why did we choose this architecture?” The tribal knowledge that explains historical decisions still needs human transmission.

Over-reliance risk
We’ve seen some new hires become dependent on AI assistance for things they should be learning to do independently. There’s a balance between accelerating ramp-up and building genuine competency.

The ROI Conversation

For those building the business case, here are the numbers we track:

  • Time to first meaningful commit (target: under 2 weeks)
  • Time to 10th PR (target: under 60 days)
  • Reduction in senior engineer interruptions (measured via Slack analytics)
  • New hire satisfaction scores at 30/60/90 days
  • First-year retention rates

The $21K per-hire savings estimate comes from reduced senior engineer time spent on onboarding tasks plus faster time-to-productivity. Your mileage will vary based on your team’s fully-loaded costs.

Questions for the Community

I’m curious how others are approaching this:

  • What’s your AI onboarding stack?
  • How are you measuring onboarding effectiveness?
  • What human elements are you being intentional about preserving?
  • Any tools I should be looking at that I haven’t mentioned?

Engineering onboarding has been broken for decades. AI tools aren’t a silver bullet, but they’re the biggest step change I’ve seen in making new hires productive faster.


eng_director_luis

This resonates with my experience joining TechFlow last year. I was the new hire who benefited from an AI-enhanced onboarding setup, so I can share the perspective from the other side.

What Made the Biggest Difference

Honestly? Glean. I cannot overstate how much this changed my first few weeks.

At my previous job, onboarding meant asking someone on Slack where to find things, waiting for them to get back to me, then asking a follow-up question when the doc they pointed me to was outdated. Rinse and repeat dozens of times per day.

With Glean, I could just search “how do I deploy to staging” and get a contextual answer that pulled from our actual runbooks, Slack threads, and wiki pages. It felt like having a senior engineer available 24/7 who never got annoyed at basic questions.

The Automated Setup Piece

The environment automation was genuinely magical. I got an email on day one with a link, clicked through a few permission prompts, and within a couple of hours had:

  • All the repos cloned and configured
  • My IDE set up with the right extensions
  • GitHub permissions for the repos I needed
  • Slack added to the right channels
  • CI/CD access ready to go

At my last company, that would have been 2-3 days of following outdated wiki instructions and filing IT tickets for permissions. Here it was basically automatic.

What Still Needed Human Help

Two things the AI couldn’t help with:

First, the “why” questions. I could ask the AI “how does our auth service work?” and get a decent technical answer. But when I asked “why did we build our own auth instead of using Auth0?” it had no idea. Those historical context questions still needed a human who was there when the decision was made.

Second, understanding team dynamics and unwritten norms. Like which Slack channels actually get responses, who to escalate to for different issues, what the real deadline expectations are vs what’s in Jira. That stuff only came from my buddy and 1:1s with my manager.

My Recommendation for Teams Considering This

Don’t skip the human elements. The AI tools made me productive faster, but what made me feel like part of the team was my onboarding buddy and the intentional 1:1s scheduled during my first month.

If I had just been handed Glean and left alone, I would have been productive but probably would have felt isolated. The combination of AI efficiency plus human connection is the sweet spot.

Also: make sure your internal documentation is actually good before deploying Glean. AI-powered search on bad docs just surfaces bad information faster.


alex_dev

This is an area we’ve invested heavily in over the past year, and I want to share the strategic lens on why AI onboarding became a priority for us.

Why This Rose to the Top of Our Investment List

We’re scaling from 50 to 120 engineers over the next 18 months. At that growth rate, onboarding efficiency becomes a core constraint on how fast we can actually execute. Every week we shave off onboarding is a week of productivity we get back, multiplied across dozens of new hires.

But beyond the math, there’s a competitive angle. We compete for talent with companies that have better developer experience. If a new hire’s first two weeks are frustrating, that colors their entire perception of the company. Word gets around. We’ve seen AI-powered onboarding become a recruiting differentiator - candidates actually ask about it in interviews now.

The ROI Metrics We Track

Luis mentioned several good ones. I’d add:

  • Time to first code review given (not just received) - this indicates when someone starts contributing to team learning, not just personal productivity
  • Onboarding buddy time savings - we track how many hours buddies spend on onboarding questions month-over-month as we improve tooling
  • First-year attrition by onboarding cohort - early exits are expensive; we correlate onboarding experience with retention
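The buddy-time metric is the easiest of these to start tracking - it’s just hours logged per month with a percent change. A tiny sketch with made-up numbers:

```python
# Sketch: month-over-month buddy hours spent on onboarding questions,
# with percent change as tooling improves. The hour counts are illustrative.
buddy_hours = {"2024-01": 42, "2024-02": 35, "2024-03": 26, "2024-04": 21}

months = sorted(buddy_hours)
for prev, cur in zip(months, months[1:]):
    change = (buddy_hours[cur] - buddy_hours[prev]) / buddy_hours[prev] * 100
    print(f"{cur}: {buddy_hours[cur]}h ({change:+.0f}% vs {prev})")
```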

The $21K savings estimate is probably conservative for us given our fully-loaded engineering costs in Seattle. But ROI isn’t just about cost savings - it’s about competitive advantage in talent acquisition and retention.

The Build vs Buy Decision

We evaluated building custom onboarding automation versus buying tools like Disco and Glean. Our conclusion: buy.

The tooling in this space has matured faster than we expected. Building our own would have taken 6+ engineer-months and produced something inferior to what’s commercially available. That time is better spent on our core product.

That said, we did build custom integrations on top of these tools - connecting them to our specific internal systems and workflows. The platforms are flexible enough to accommodate this.

My Concern About Over-Automation

One thing I’m watching carefully: are we training engineers to be dependent on AI assistance, or to be genuinely capable?

There’s a difference between using AI to accelerate learning (good) and using AI to avoid learning (concerning). I’ve started asking in skip-levels whether new hires feel like they understand our systems, or whether they just know how to ask the AI about them.

I don’t have a clear answer yet, but it’s something to monitor. The goal is to compress the learning curve, not to replace learning entirely.

What’s Next for Us

We’re exploring agentic AI for onboarding - systems that don’t just answer questions but proactively guide new hires through tasks, check in on their progress, and escalate to humans when needed. The technology is maturing quickly.

If anyone has experience with more autonomous onboarding assistants, I’d be curious to hear what’s working.


cto_michelle

I want to add some nuance to the data being cited here, because the picture is more complicated than the numbers suggest.

The METR Study Says Something Different

There’s a randomized controlled trial from METR that found experienced developers using AI tools took 19% longer to complete tasks - AI actually made them slower, not faster.

What’s more interesting: developers in that study estimated AI would speed them up by 24%, and even after experiencing the slowdown, they still believed AI had sped them up by 20%. There’s a significant gap between perceived and actual productivity impact.

This doesn’t contradict Luis’s data, but it does suggest the effect is highly dependent on context. AI seems to help new hires ramping up on unfamiliar codebases (the 49 vs 91 days finding) while potentially hurting experienced developers working on familiar systems.

Questions About the Metrics

“Time to 10th PR” is an interesting proxy for productivity, but I’d want to understand:

  • What’s the quality distribution of those PRs? Are AI-assisted new hires submitting more, smaller PRs that require more review cycles?
  • Are we measuring time to merge, or time to submit? The review bottleneck might be shifting, not the productivity itself.
  • How does the 10th PR milestone correlate with actually understanding the codebase vs knowing how to prompt the AI effectively?

I’m not saying the metric is wrong, but proxy metrics can mislead if we’re not careful about what they’re actually measuring.

The $21K Savings Claim

I’d want to see the calculation methodology. Is this:

  • Reduced senior engineer time spent answering questions?
  • Faster time-to-productivity valued at fully-loaded cost?
  • Reduced turnover costs?

All of these are reasonable, but they compound assumptions. The actual ROI probably varies significantly by team, role complexity, and documentation quality.

Also worth noting: Disco costs $359/month ($4,300/year), Glean is enterprise pricing (probably $50K+ annually for a company), and Port.io has its own costs. The tooling investment isn’t trivial, and needs to be factored into the savings calculation.
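That cost-vs-savings question can at least be made explicit with back-of-envelope arithmetic. Using the $21K/hire figure cited in this thread, Disco’s listed price, and my own guesses for the other two (the Glean and Port.io numbers below are assumptions, not quoted prices):

```python
# Back-of-envelope breakeven: annual tooling cost vs claimed per-hire savings.
# Glean and Port.io figures are assumptions, not quoted prices.
disco_annual = 359 * 12      # $4,308/year, from the listed monthly price
glean_annual = 50_000        # assumed enterprise pricing
portio_annual = 12_000       # hypothetical placeholder
total_tooling = disco_annual + glean_annual + portio_annual

savings_per_hire = 21_000    # figure cited in the thread
breakeven_hires = -(-total_tooling // savings_per_hire)  # ceiling division
print(total_tooling, breakeven_hires)  # -> 66308 4
```

Under these assumptions you’d need roughly 4 hires a year to break even - which is exactly why the ROI story looks very different for a team hiring 2 engineers a year than for one hiring 30.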

What I’d Want to Measure

If I were designing the measurement system:

  1. A/B test onboarding cohorts - some with AI tools, some without, measure time to defined competency milestones
  2. Track AI reliance over time - do engineers gradually use AI less as they learn, or does usage stay constant (suggesting dependency)?
  3. Measure knowledge transfer quality - can engineers explain systems they learned with AI assistance? Or just navigate them?
  4. Long-term retention by onboarding method - 12-18 month view, not just first-year
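Step 1 doesn’t need anything fancy to start - a median comparison between cohorts on the same milestone is enough to see whether the 49-vs-91 pattern shows up in your own data. The day counts below are illustrative, not real measurements:

```python
# Sketch of step 1: compare time-to-milestone (days) across onboarding
# cohorts. The day counts are illustrative, not real data.
from statistics import median

days_to_milestone = {
    "ai_tools":    [41, 49, 55, 62, 47],
    "no_ai_tools": [78, 91, 85, 102, 88],
}

for cohort, days in days_to_milestone.items():
    print(f"{cohort}: median {median(days)} days (n={len(days)})")
```

With cohorts this small you’d report the spread alongside the median and resist reading much into any single month - the point is the trend across cohorts, not one comparison.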

Where I Land

I’m cautiously optimistic. The qualitative benefits (reduced interruptions, faster environment setup) are real. But I’d want better data before claiming specific ROI numbers.

The 49 vs 91 days finding is compelling for new hire onboarding specifically. I just think we should be careful about generalizing AI productivity benefits beyond that specific context.


data_rachel