Here’s an uncomfortable question: How do you know your onboarding is effective?
Most organizations answer:
- “People seem to figure it out”
- “We haven’t had complaints”
- “Our retention is okay”
That’s not measurement. That’s assumption.
## The Metrics That Matter

### Leading Indicators (Early Signals)

These tell you if onboarding is on track while it’s happening:

| Metric | Target | Red Flag |
|--------|--------|----------|
| Time to first commit | Day 3-5 | Day 8+ |
| Time to first PR merged | Week 1-2 | Week 3+ |
| Daily questions asked | 5-10 (week 1) | <3 or declining rapidly |
| Buddy meeting attendance | 100% | Missed sessions |
| Environment setup time | <1 day | >2 days |
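The red-flag column is mechanical enough to automate. Here’s a minimal sketch; the thresholds come from the table, but the function and the field names on the metrics dict are illustrative, not from a real system:

```python
# Checks one new hire's metrics against the leading-indicator red flags.
# Thresholds mirror the table above; field names on `m` are assumptions.
def leading_red_flags(m: dict) -> list[str]:
    flags = []
    if m["first_commit_day"] >= 8:
        flags.append("first commit on day 8+")
    if m["first_pr_week"] >= 3:
        flags.append("first PR not merged until week 3+")
    if m["daily_questions_week1"] < 3:
        flags.append("asking <3 questions/day in week 1")
    if m["buddy_meetings_missed"] > 0:
        flags.append("missed buddy meetings")
    if m["setup_days"] > 2:
        flags.append("environment setup took >2 days")
    return flags
```

Anything this returns is a prompt for a conversation, not a verdict.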
### Coincident Indicators (Progress Markers)

These track trajectory through the onboarding period:

| Metric | Target | Red Flag |
|--------|--------|----------|
| 30-day satisfaction score | 8+/10 | <6/10 |
| Manager confidence score | 7+/10 | <5/10 |
| First feature ownership | Week 2-3 | Week 5+ |
| Code review participation | Week 2 | Not reviewing by week 4 |
### Lagging Indicators (Outcomes)

These tell you if onboarding worked after it’s done:

| Metric | Target | Industry Average |
|--------|--------|------------------|
| 90-day retention | 95%+ | ~85% |
| Time to full productivity | 3-4 months | 6-9 months |
| 1-year retention | 90%+ | 77% |
| New hire NPS | 50+ | Unknown (most don’t measure) |
## The Feedback Loops
1. Day 7 Check-in (New Hire → Manager)
Questions:
- What’s been better than expected?
- What’s been harder than expected?
- What information did you need that you couldn’t find?
- How supported do you feel?
2. Day 30 Survey (New Hire → Onboarding Team)
Questions (1-10 scale):
- I felt prepared to do my job on day 1
- I understood what was expected of me
- My buddy was helpful
- My manager was available and supportive
- I would recommend this onboarding to future hires
3. Day 60 Assessment (Manager → Onboarding Team)
Questions:
- Is this person on track for their role?
- What would have made their ramp faster?
- Did our onboarding prepare them adequately?
4. Day 90 Exit Review (If Someone Leaves Early)
Questions:
- How did onboarding contribute to your decision to leave?
- What could we have done differently?
- At what point did you decide this wasn’t the right fit?
## Making It Actionable
Data without action is just trivia. Here’s what we do:
Weekly: Review leading indicators for current new hires. Intervene if red flags appear.
Monthly: Review 30-day survey results. Identify patterns and quick fixes.
Quarterly: Analyze lagging indicators. Make structural changes.
Annually: Full onboarding program review. Major investments and redesigns.
You can’t improve what you don’t measure. Start measuring.
Time to first commit is the most actionable health signal I’ve found.
Why this metric works:
- It’s objective: Either they committed code or they didn’t. No interpretation needed.
- It’s early: You get the signal in days, not months. Enough time to intervene.
- It reflects systemic health: A delayed first commit usually means upstream problems.
What a delayed first commit tells you:

| If the commit is delayed by… | Then investigate… |
|------------------------------|-------------------|
| Environment issues | Golden path broken, documentation outdated |
| Access problems | IT provisioning failed, pre-boarding incomplete |
| No appropriate task | Task queue empty, manager unprepared |
| Mentor unavailable | Buddy capacity issue, poor assignment |
| Overwhelming complexity | Codebase needs better entry points |
How we track it:
A GitHub webhook fires whenever anyone commits. If the committer is on our “new hire” list and this is their first commit:
- Slack notification to #onboarding-wins channel
- Timestamp recorded against their profile
- Dashboard updated
If day 5 passes without a first commit:
- Alert to manager + buddy
- Escalation to onboarding lead
- Required check-in within 24 hours
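A minimal sketch of that tracking and escalation logic, assuming an in-memory roster (`NEW_HIRES`) standing in for the real database, with the Slack and dashboard calls reduced to return values:

```python
from datetime import date

# Hypothetical roster; in production this lives in a database and the
# notifications go through the Slack API and the dashboard's backend.
NEW_HIRES = {"ana": {"start": date(2024, 3, 4), "first_commit": None}}

def handle_push_event(payload: dict, today: date):
    """Process a GitHub push webhook payload for first-commit tracking."""
    author = payload["pusher"]["name"]
    hire = NEW_HIRES.get(author)
    if hire is None or hire["first_commit"] is not None:
        return None  # not a new hire, or first commit already recorded
    hire["first_commit"] = today
    day_number = (today - hire["start"]).days + 1
    # In production: post to #onboarding-wins, update the dashboard.
    return f":tada: {author} made their first commit on day {day_number}"

def overdue_check(today: date) -> list[str]:
    """Nightly job: flag new hires past day 5 with no first commit."""
    return [name for name, h in NEW_HIRES.items()
            if h["first_commit"] is None and (today - h["start"]).days >= 5]
```

The overdue list is what drives the manager/buddy alert and the onboarding-lead escalation.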
The data we’ve collected:
Over 2 years, 80+ new hires:
| First Commit Timing | 1-Year Retention | Time to Full Productivity |
|---------------------|------------------|---------------------------|
| Day 1-3 | 94% | 2.8 months |
| Day 4-5 | 88% | 3.2 months |
| Day 6-10 | 71% | 4.5 months |
| Day 10+ | 52% | 6+ months |
The correlation is strong. Engineers who commit early tend to stay and ramp fast. Those who don’t are often struggling in ways we can address, if we catch it early.
First commit isn’t the only metric. But it’s the first signal that something might be wrong.
The new hire survey questions that actually matter, and what to do with the answers.
### Day 7 Survey (5 questions)

1. “On a scale of 1-10, how prepared did you feel to contribute on day 1?”
   - <7: Investigate pre-boarding gaps
   - Pattern of low scores: Systemic issue
2. “What information did you need in week 1 that was hard to find?”
   - Free text → feeds documentation improvements
   - Common answers become FAQ additions
3. “How many hours did you spend with your buddy this week?”
   - <3 hours: Flag to buddy’s manager
   - Pattern: Buddy program capacity issue
4. “Have you had a 1:1 with your manager?”
   - No: Immediate escalation
   - Every “no” is a manager accountability issue
5. “One thing that would make next week better:”
   - Actionable immediate feedback
   - Manager receives and responds within 48 hours
### Day 30 Survey (10 questions, scaled 1-10)
- I understand what success looks like in my role
- I have the tools and access I need to do my job
- My buddy has been helpful and available
- My manager has provided clear expectations
- I feel comfortable asking questions
- The team culture matches what I expected
- I understand how my work contributes to team goals
- I feel like a valued member of the team
- I would recommend joining this team to a friend
- Overall, how would you rate your onboarding experience?
What we do with scores:
| Average Score | Action |
|---------------|--------|
| 9-10 | Celebrate, understand what worked |
| 7-8 | On track, minor improvements |
| 5-6 | Intervention needed, manager + HR meeting |
| <5 | Critical, skip-level involvement, retention risk |
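This routing is simple enough to encode directly. A sketch with the bands and action labels taken from the table; the function name is illustrative:

```python
# Maps a 30-day survey average (1-10) to the action band above.
def day30_action(avg_score: float) -> str:
    if avg_score >= 9:
        return "celebrate: understand what worked"
    if avg_score >= 7:
        return "on track: minor improvements"
    if avg_score >= 5:
        return "intervention: schedule manager + HR meeting"
    return "critical: skip-level involvement, retention risk"
```

The point of encoding it is that no score gets filed without a named next step.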
The key insight:
Most companies survey and file. We survey and act. Every <7 score triggers a specific follow-up. Every common complaint becomes a process improvement.
The survey isn’t measurement for measurement’s sake. It’s an early warning system with built-in responses.
Connecting onboarding metrics to business outcomes is how you get sustained investment.
The metrics engineering tracks:
- Time to first commit
- Time to first feature
- New hire satisfaction scores
- 90-day retention
The metrics business cares about:
- Cost per hire
- Time to fill positions
- Team velocity
- Roadmap predictability
- Revenue impact
Building the bridge:
1. Translate ramp time to velocity impact
If average ramp time is 6 months and you improve it to 4 months:
- Each new hire contributes 2 additional months of productivity in year 1
- For a $150K engineer, that’s ~$25K in recovered value per hire
- For 40 hires/year, that’s $1M in velocity gained
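The arithmetic above, spelled out (this uses salary rather than fully loaded cost, which keeps the estimate conservative):

```python
ANNUAL_COST = 150_000      # salary only; loaded cost would push this higher
HIRES_PER_YEAR = 40
months_recovered = 6 - 4   # ramp improved from 6 months to 4

value_per_hire = ANNUAL_COST / 12 * months_recovered   # $25K per hire
velocity_gained = value_per_hire * HIRES_PER_YEAR      # $1M per year
```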
2. Translate retention to recruiting cost
If onboarding improves 90-day retention from 85% to 95%:
- 10% fewer early departures
- Each prevented departure saves ~$90K (recruiting + ramp replacement)
- For 40 hires/year, that’s 4 prevented departures = $360K saved
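Same exercise for retention, using the $90K per-departure estimate from above:

```python
HIRES_PER_YEAR = 40
COST_PER_DEPARTURE = 90_000        # recruiting + ramping a replacement
retention_gain = (95 - 85) / 100   # 90-day retention, 85% -> 95%

prevented_departures = HIRES_PER_YEAR * retention_gain    # 4 per year
savings = prevented_departures * COST_PER_DEPARTURE       # $360K per year
```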
3. Translate predictability to planning accuracy
When new hires ramp faster and stay longer:
- Q2 capacity planning is actually accurate
- Feature commitments can be made confidently
- Fewer “surprise” delays from ramp issues
This is harder to quantify but very real in executive discussions.
The dashboard I review monthly:
| Metric | Current | Target | Business Impact |
|--------|---------|--------|-----------------|
| Avg ramp time | 4.2 mo | 3.5 mo | +$625K velocity |
| 90-day retention | 93% | 95% | +$180K savings |
| New hire NPS | 62 | 70 | Recruiting advantage |
| Time to first feature | 3.1 wk | 2.5 wk | Faster roadmap execution |
When I present this to the CEO, it’s not “onboarding is important.” It’s “here’s how onboarding affects revenue and costs.”
That’s the language that gets budget.