The Leadership Development Crisis: How to Build Empathy at Engineering Scale

In the past 18 months, I’ve scaled our engineering organization from 50 to 120 people. That growth required promoting 8 new engineering managers. Three of them burned out within 6 months. Two are still struggling. The rest are thriving.

The difference? Not technical skills. Not years of experience. Not even domain knowledge.

The difference was whether they had systematic training in the human skills that actually matter for leadership.

The Traditional Approach (That Failed)

When we promoted our first cohort of managers, I did what most companies do:

Failed Approach #1: Generic Leadership Books

  • Gave them copies of “The Manager’s Path,” “Radical Candor,” “First, Break All the Rules”
  • Scheduled monthly book club discussions
  • Expected them to self-educate

Result: One person read the books. Zero behavior change. The cohort dismissed the whole exercise as “corporate BS that doesn’t understand engineering culture.”

Failed Approach #2: Let Them Figure It Out

  • Promoted based on technical excellence
  • Assumed leadership skills would develop naturally
  • Provided support “when they asked for it”

Result: 3 managers burned out from imposter syndrome and overwhelming responsibility. We lost one to another company. Two took stress leave. Team morale on those squads dropped 40%.

I realized we were treating leadership like an innate talent instead of a learnable skill. We wouldn’t do that with technical skills—we don’t promote someone to senior engineer and say “figure out distributed systems on your own.” But that’s exactly what we were doing with people skills.

The Insight: Treat People Skills Like Technical Skills

What if we developed leadership capabilities with the same rigor and respect as technical capabilities?

That question led to our Technical Leadership Ladder—a systematic framework for building empathy and people skills at scale.

The Technical Leadership Ladder Framework

Junior Manager (Level 1): Foundation Skills

  • Core competency: Effective 1-on-1s
  • Learning outcomes: Active listening, career development conversations, feedback delivery
  • Practice format: Role-playing difficult conversations, peer shadowing
  • Assessment: 360 feedback from direct reports on listening and support
  • Time investment: 8 hours training + 2 hours/week practice

Mid-Level Manager (Level 2): Conflict Resolution

  • Core competency: Navigate team disagreements and interpersonal conflicts
  • Learning outcomes: Mediation skills, de-escalation techniques, facilitating difficult conversations
  • Practice format: Conflict simulation scenarios, co-facilitation with experienced managers
  • Assessment: Team psychological safety scores, conflict resolution time metrics
  • Time investment: 12 hours training + 4 hours/quarter practice

Senior Manager (Level 3): Organizational Design

  • Core competency: Structure teams and processes for human thriving
  • Learning outcomes: Team topology, communication patterns, role definition, growth pathways
  • Practice format: Team design workshops, organizational system thinking exercises
  • Assessment: Team retention, engagement scores, cross-team collaboration metrics
  • Time investment: 16 hours training + monthly peer review sessions

Staff Manager (Level 4): Cultural Architecture

  • Core competency: Shape organizational culture and leadership norms
  • Learning outcomes: Culture diagnosis, intervention design, systemic change management
  • Practice format: Culture case studies, peer coaching on cultural challenges
  • Assessment: Org-wide engagement, leadership bench strength, culture survey trends
  • Time investment: 20 hours training + quarterly strategy sessions

The Implementation: Leadership as Craft

1. Peer Code Reviews… for Difficult Conversations

Just like we review each other’s code, managers review each other’s people challenges:

  • Present anonymized scenario (“engineer resisting feedback,” “team conflict over technical decision”)
  • Peers suggest approaches
  • Original manager shares what they tried and results
  • Group extracts learnings

This created a judgment-free space to admit struggles and learn from each other.

2. Shadowing + Feedback

New managers shadow experienced managers:

  • Sit in on performance reviews (with employee permission)
  • Observe conflict mediation sessions
  • Watch difficult feedback conversations
  • Debrief afterward: what did you notice? What would you have done differently?

Then experienced managers shadow new managers and provide feedback.

3. Leadership Snippets Library

We created reusable templates for common situations:

  • Delivering negative feedback
  • Career development conversation frameworks
  • Conflict mediation scripts
  • One-on-one agendas for different scenarios

Not to be followed robotically, but as starting points for people new to management.

4. Quarterly Leadership Retrospectives

Same format as sprint retros, but for leadership:

  • What leadership practices worked well?
  • What people challenges did we face?
  • What will we try differently next quarter?
  • What support do we need?

Documented and tracked like technical decisions.

The Results (And Why They Matter)

Manager Satisfaction:

  • Up 50% (measured via anonymous quarterly surveys)
  • Burnout risk scores dropped from high to moderate across new manager cohort
  • Zero manager attrition in past 9 months (vs. 3 in previous cohort)

Team Impact:

  • Promotion rate for underrepresented engineers doubled (from 12% to 24% of promotions)
  • Engagement scores up 18% on teams with trained managers
  • Time-to-productivity for new hires decreased 30%

Business Outcomes:

  • Project delivery predictability up 25%
  • Cross-team collaboration initiatives up 3x
  • Engineering retention improved from 82% to 91%

The Controversial Parts

Time Investment:
This program requires 8-20 hours of training per manager level, plus ongoing practice time. That’s expensive.

Counterpoint: The cost of management failure is higher. One burned-out manager affects 8-12 engineers. One poor people decision can cost $200K in attrition.

Mandatory Participation:
We made leadership development non-optional for anyone in a management role.

Some pushed back: “I didn’t become a manager to sit through empathy training.”

Response: “You became a manager to lead people. This is how you learn to do that well. It’s as mandatory as security training.”

Measuring Soft Skills:
We track metrics on listening quality, conflict resolution speed, psychological safety.

Criticism: “You can’t quantify empathy.”

Response: “You can quantify the outcomes of empathy: retention, engagement, team velocity, innovation rate.”

The Question That Drives This Work

How do we make leadership development as rigorous and respected as technical development?

In engineering culture, we celebrate technical growth. We have clear career ladders, skill assessments, peer learning, and continuous improvement.

But leadership development is often treated as optional, soft, hard-to-measure, and lower priority than technical work.

That needs to change. Because as AI handles more technical work, leadership—the uniquely human work of building trust, resolving conflict, inspiring teams, and developing people—becomes our core competency.

What I’m Still Figuring Out

  1. Scale: This works for 8 managers. Will it work for 20? 50?
  2. Customization: Different people need different development paths. How do we personalize at scale?
  3. Promotion criteria: Should people skills be REQUIRED for promotion to senior IC roles, or only for management?
  4. Business case: How do I defend leadership development budget when under pressure to cut costs?

Questions for This Community

  • What leadership development approaches have worked (or failed) in your orgs?
  • How do you balance “leadership is a craft that requires training” with “leadership should feel authentic, not scripted”?
  • What metrics do you use to assess leadership effectiveness beyond engagement surveys?
  • How do you prevent leadership training from becoming performative corporate theater?

Leadership is too important to leave to chance. We need systematic, rigorous, respectful approaches to developing the human skills that technology can’t replace.

What’s working for you?



Michelle, this framework is exactly what I needed. I’m scaling from 25 to 80 engineers and facing the exact same challenges. But I want to build on your Technical Leadership Ladder with something that’s worked incredibly well for us.

The Leadership PRD

When we promote someone to manager, we create a “Leadership Product Requirements Document” for their role—treating the management role like we’d treat launching a new product.

Here’s the template:

Leadership PRD Template

Role: Engineering Manager, Platform Team

Success Metrics:

  • Team engagement score: Target 85%+
  • Delivery predictability: ±1 week on quarterly commitments
  • Team growth: 2 promotions per year
  • Retention: <10% regrettable attrition
  • Cross-team collaboration: 3+ joint projects per quarter

Key Stakeholders:

  • Direct reports (8 engineers)
  • Peer managers (4)
  • VP Engineering (1)
  • Product partners (2)
  • Recruiting (ongoing)

Core Dependencies:

  • HR for compensation decisions
  • Finance for budget approvals
  • Senior leadership for strategic alignment
  • Peer managers for cross-team coordination

Quarterly Milestones:

  • Q1: Establish 1-on-1 rhythms, complete team assessment
  • Q2: Implement feedback culture, address one team dysfunction
  • Q3: Develop succession plan, drive one cross-team initiative
  • Q4: Performance review cycle, team roadmap for next year

Known Risks:

  • First-time manager, will need coaching support
  • Team has recent attrition, trust may be low
  • Technical skills gap in platform domain

This document does three things:

  1. Makes expectations concrete (no more “just be a good leader”)
  2. Identifies support needed (who will coach them? What resources?)
  3. Creates accountability (quarterly check-ins against the PRD)
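
That third point can even be partly mechanical: because the success metrics are numeric, the quarterly check-in can be scripted. A toy sketch, with field names and thresholds mirroring the example template above (all illustrative, not a real internal tool):

```python
from dataclasses import dataclass

# The template's success metrics as checkable data. Field names and
# thresholds mirror the example PRD above and are purely illustrative.
@dataclass
class LeadershipPRD:
    engagement_target: float = 0.85          # team engagement score
    max_regrettable_attrition: float = 0.10  # <10% regrettable attrition
    joint_projects_per_quarter: int = 3      # cross-team collaboration

    def quarterly_check(self, engagement: float, attrition: float,
                        joint_projects: int) -> list[str]:
        """Return the names of metrics currently off target."""
        misses = []
        if engagement < self.engagement_target:
            misses.append("engagement")
        if attrition >= self.max_regrettable_attrition:
            misses.append("attrition")
        if joint_projects < self.joint_projects_per_quarter:
            misses.append("collaboration")
        return misses
```

The check-in conversation then starts from the misses list, which keeps it anchored to the PRD rather than to vibes.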

Quarterly Leadership Retrospectives

Your idea of treating leadership with sprint-like rigor resonated deeply. We do quarterly leadership retrospectives with the same structure as technical retros:

What went well:

  • Successfully mediated conflict between senior engineers
  • Improved 1-on-1 quality based on feedback
  • Launched mentorship program

What was challenging:

  • Struggled with performance management for underperformer
  • Felt unprepared for compensation discussions
  • Didn’t allocate enough time for strategic thinking

What we’ll try next quarter:

  • Shadow VP Eng during difficult performance conversation
  • Complete compensation calibration training
  • Block 4 hours/week for strategic planning

Support needed:

  • Coaching on performance improvement plans
  • Peer manager for compensation benchmarking discussion
  • Executive sponsor for strategic initiatives

These retros surface patterns. When 3 managers struggle with performance management, that’s a training gap we need to address systematically.

The Progression Framework

I love your Technical Leadership Ladder. Here’s how we’ve mapped it to our career ladder:

IC Track              → Manager Track
Senior Engineer       → Junior Manager (L1): 1-on-1s, Feedback
Staff Engineer        → Mid Manager (L2): Conflict, Team Health
Senior Staff Engineer → Senior Manager (L3): Org Design, Strategy
Principal Engineer    → Director (L4): Culture, Multi-team Leadership

What’s critical: People skills are ALSO required for senior IC roles.

Staff+ engineers need conflict resolution, influence without authority, mentorship skills. They’re not managing people, but they’re absolutely leading.

So the same training applies, just adapted for IC context:

  • Staff ICs take L2 conflict resolution training
  • Senior Staff ICs participate in L3 organizational design workshops
  • Principal ICs are expected to contribute to L4 cultural architecture

This removes the stigma that “people skills are only for managers” and creates shared language across IC and management tracks.

The Template Library

Your “Leadership Snippets Library” is brilliant. We have something similar but organized by situation:

Performance Conversations:

  • Delivering negative feedback
  • Discussing promotion readiness
  • Performance improvement plan kickoff
  • Celebrating wins

Career Development:

  • Career goals exploration
  • Growth plan creation
  • Promotion packet preparation
  • Lateral move discussions

Team Dynamics:

  • Conflict mediation opening script
  • Retrospective facilitation guides
  • Team charter creation
  • Setting team norms

1-on-1 Structures:

  • Weekly check-in agenda
  • Monthly career conversation
  • Quarterly goal review
  • Annual reflection

The key is these are STARTING POINTS, not scripts. New managers customize them to their style, but having a template removes the “blank page anxiety” of “what do I even say?”

What We Measure

Beyond engagement surveys, we track:

Leading Indicators:

  • 1-on-1 completion rate (target: 95%+)
  • Time-to-feedback (how quickly managers address issues)
  • Career development plan coverage (% of reports with active plans)
  • Manager-direct report meeting quality scores

Lagging Indicators:

  • Promotion rate by manager (are they developing people?)
  • Retention by manager (are people staying?)
  • Engagement by manager (are teams thriving?)
  • Performance distribution by manager (are they differentiating?)

When we see outliers—a manager with 40% attrition or 100% “meets expectations” ratings—that’s a coaching opportunity, not a performance problem.

The Pushback Question

You asked: “How do you prevent leadership training from becoming performative corporate theater?”

This is my biggest fear too. Here’s what we do:

  1. Make it engineering-specific: No generic corporate consultants. All examples from actual engineering contexts.

  2. Peer-led learning: Senior managers facilitate, not HR or external trainers. It’s engineers teaching engineers.

  3. Real scenarios: We use anonymized real situations from our teams, not hypothetical case studies.

  4. Immediate application: Every training session ends with “what will you try this week?” and we follow up.

  5. Opt-in depth: Core training is mandatory, but advanced topics are opt-in. People self-select into what they need.

The Business Case

When defending budget, I use this framework:

Cost of leadership development: $3K per manager per year
Cost of leadership failure:

  • Burned out manager: $200K (replacement + team impact)
  • Regrettable attrition: $150K per engineer
  • Failed project due to team dysfunction: $500K+

Break-even: If this program prevents ONE manager burnout or TWO engineer departures, it pays for itself.

Last year, we invested $24K in leadership development for 8 managers.

We retained all managers and lost zero engineers to management-related issues.

ROI: Easily 10x, likely 20x.

What I’d Love Your Input On

How do you handle the managers who resist this training? The “I’m technical, not touchy-feely” crowd?

We had one manager who openly mocked the empathy training. His team had 50% attrition in 6 months. We had to move him back to IC.

But I wonder if we could have intervened earlier with a different approach. What’s worked for you?

Michelle, the Technical Leadership Ladder is a game-changer. I’m implementing a version of this immediately. But I want to add a dimension you touched on but didn’t fully develop: Leadership Pair Programming.

Leadership Pair Programming

We treat leadership development like we treat technical development. And one of the most effective ways engineers learn is pair programming.

So we do Leadership Pair Programming:

Structure:

  • New manager shadows experienced manager in real leadership situations
  • Experienced manager narrates their thinking (like “thinking aloud” in pair programming)
  • Afterward, they debrief: what did you notice? What was I considering? What would you have done?

Scenarios we’ve practiced:

  1. Performance Review Conversation

    • New manager observes (with employee consent)
    • Senior manager models: setting context, delivering feedback, listening, creating development plan
    • Debrief: Why did I start with that question? How did I handle defensiveness? What body language did you notice?
  2. Conflict Mediation

    • Two engineers in technical disagreement about architecture
    • Senior manager facilitates, new manager observes
    • Debrief: How did I stay neutral? When did I let them debate vs. intervene? How did I get to resolution?
  3. Difficult 1-on-1

    • Engineer expressing burnout or frustration
    • New manager sits in (with permission) or listens to a recording (anonymized)
    • Debrief: What signals of distress did you catch? How did I validate feelings while still addressing performance?
  4. Promotion Calibration

    • Manager peer group debating promotion decisions
    • Junior managers observe how senior managers advocate, negotiate, and reach consensus
    • Debrief: How do we balance advocacy for our reports with organizational standards?

The Leadership Snippets Library

Your idea sparked something for us. We created a leadership pattern library, organized like design patterns in software:

Pattern: Delivering Difficult Feedback

Context: Engineer is underperforming or exhibiting problematic behavior

Problem: Need to address issue without damaging relationship or morale

Solution:

1. State observation (specific, factual)
2. Describe impact (on team, project, or goals)
3. Ask for perspective ("What's your read on this?")
4. Collaborate on solution ("What support do you need?")
5. Agree on next steps (specific, measurable)
6. Schedule follow-up (accountability)

Example:
“I noticed you’ve missed 3 sprint commitments in the last 6 weeks. This is affecting the team’s ability to plan and our delivery commitments to Product. What’s going on from your perspective? … What support would help you meet commitments? … Let’s check in next week to see how it’s going.”

Anti-patterns:

  • Sandwich feedback (positive-negative-positive) → feels manipulative
  • Feedback in public → embarrassing
  • Vague feedback (“you need to communicate better”) → not actionable

We have 20+ patterns now. New managers reference them constantly.

The Controversial Idea: Leadership Linters

Here’s something I’m experimenting with (and I know it’s controversial):

What if we had “linters” for leadership communication?

In code, we use linters to catch problematic patterns:

  • Unused variables
  • Overly complex functions
  • Inconsistent formatting
  • Potential bugs

What if we had similar tools for leadership communication?

Example: AI analysis of a manager’s written feedback, flagging things like:

  • Vague language (“be more proactive” → suggest specific behaviors)
  • Blame language (“you always…” → suggest observation-based framing)
  • Missing context (feedback without impact statement)
  • Unbalanced feedback (all negative, no positive)
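
For the curious, the rule-based core of such a check fits in a few lines. This is a toy sketch, not what we actually piloted; the patterns and suggestions are illustrative assumptions:

```python
import re

# Illustrative lint rules; the patterns and suggested fixes here are a
# toy sketch, not the rules from our actual pilot.
RULES = [
    ("vague", r"\bbe more (proactive|professional|strategic)\b",
     "name a specific, observable behavior instead"),
    ("blame", r"\byou (always|never)\b",
     "reframe as a dated, factual observation"),
]

def lint_feedback(text: str) -> list[str]:
    """Return warnings for problematic patterns in written feedback."""
    warnings = [f"{name}: {fix}" for name, pattern, fix in RULES
                if re.search(pattern, text, re.IGNORECASE)]
    # Crude heuristic: feedback should state an impact somewhere.
    if not re.search(r"\b(impact|affect)\w*\b", text, re.IGNORECASE):
        warnings.append("missing_impact: state the effect on team or goals")
    return warnings

print(lint_feedback("You always miss deadlines. Be more proactive."))
```

The real value isn’t the rules themselves; it’s the pre-send pause they force before a manager hits enter.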

I piloted this with 3 managers. They found it helpful for catching blind spots in their writing before sending.

But it felt… weird. Like we’re reducing human interaction to rules.

Is this valuable automation? Or dehumanizing surveillance?

I genuinely don’t know.

The Cultural Challenge: Feedback on People Skills

The hardest part of leadership development isn’t the training. It’s the feedback culture.

Giving technical feedback is normalized:

  • “This function is too complex, let’s refactor”
  • “This variable name is unclear”
  • “This approach won’t scale”

Giving leadership feedback feels personal:

  • “Your 1-on-1s lack depth” → feels like “you’re not empathetic enough”
  • “You avoided conflict in that meeting” → feels like “you’re a coward”
  • “Your feedback was vague” → feels like “you’re a bad communicator”

How do we create the same dispassionate, improvement-focused culture for people skills that we have for technical skills?

What’s worked:

  1. Normalize imperfection: I share my own leadership failures openly. “I completely mishandled that 1-on-1 last week. Here’s what I learned.”

  2. Focus on impact, not character: “That approach led to confusion” vs. “You’re confusing”

  3. Frame as skill development: “1-on-1s are a skill. You’re at junior level. Here’s how to level up.”

  4. Use peer feedback, not just top-down: Managers give each other feedback in peer groups, not just from me.

But it’s still hard. People take feedback on communication more personally than feedback on code.

The Mentorship Model

Every new manager is paired with a senior manager mentor (not their direct manager):

Mentor responsibilities:

  • Monthly 1-on-1s to discuss leadership challenges
  • Available for real-time coaching (“I have a difficult conversation in 30 minutes, can we prep?”)
  • Reviews difficult communications before they’re sent
  • Shadows new manager and provides feedback

Time commitment: 2-3 hours/month per mentee

Compensation: We count this as 10% of the senior manager’s role (recognized in performance reviews and compensation)

This creates a leadership development culture where mentorship is recognized and rewarded, not treated as unpaid volunteer work.

Questions for You

Your framework is excellent. I have two implementation questions:

  1. Time allocation: How do you balance “time invested in leadership development” with “time spent actually leading”? New managers already feel overwhelmed. Adding 8-20 hours of training feels impossible.

  2. Resistance handling: What do you do with the engineer who was promoted to management for technical skills but actively resists developing people skills? Do you force it? Move them back to IC? Something else?

This is the most comprehensive leadership development framework I’ve seen. Thank you for sharing it.

okay this thread is blowing my mind because I’ve been thinking about leadership development through a design lens and it maps SO WELL to what you’re describing :exploding_head:

Leadership Is a Craft (Just Like Design)

In design, we don’t expect people to be naturally good at it. We teach it systematically:

  • Design principles
  • User research methods
  • Visual hierarchy
  • Prototyping and iteration
  • Critique and feedback

Why don’t we treat leadership the same way?

Your Technical Leadership Ladder is basically a design curriculum for people skills. I love it.

Leadership Critique Sessions

In design, we have critique sessions where we:

  • Present work-in-progress
  • Get feedback from peers
  • Discuss what’s working, what’s not
  • Iterate based on input

What if we did Leadership Critique Sessions for managers?

Format:

  • Manager presents a leadership scenario they’re working through (anonymized if needed)
  • “I’m struggling with an engineer who’s technically brilliant but dismissive in code reviews. I’ve tried X and Y. Here’s what happened…”
  • Peer managers give feedback:
    • “What if you tried Z?”
    • “I faced something similar, here’s what worked…”
    • “Have you considered the root cause might be…?”
  • Manager iterates on their approach
  • Follow-up next session: “Here’s what I tried, here’s what happened”

This is like design critique but for leadership challenges.

We could even use the same critique structure:

  • What’s the goal? (what outcome are you trying to achieve?)
  • What have you tried? (what approaches have you tested?)
  • What’s working? (what positive signals have you seen?)
  • What’s not working? (where are you stuck?)
  • What could you try next? (brainstorm alternatives)

Empathy Exercises from Design Thinking

In design, we do empathy exercises to understand users. What if we did the same for leadership?

Exercise: Walk in Your Report’s Shoes

Pick one of your direct reports. Spend a day experiencing their work:

  • Attend their meetings
  • Review their task queue
  • Observe their code reviews
  • Notice their blockers
  • Feel their frustrations

Then 1-on-1 with them and compare notes:

  • “Here’s what I noticed about your day…”
  • “What did I miss?”
  • “What would make your work easier?”

This builds REAL empathy, not theoretical empathy.

Exercise: User Testing Your 1-on-1s

In design, we user-test everything. Why not 1-on-1s?

After each 1-on-1, ask your report:

  • “Was this time valuable for you?” (1-10)
  • “What should we spend more time on?”
  • “What should we spend less time on?”
  • “What’s one thing I could do differently?”

This is basically user research for your leadership practice.

I started doing this 2 months ago. My 1-on-1 quality scores went from 6.5/10 to 8.7/10 in 6 weeks. Turns out my reports wanted more career conversations and fewer status updates (which they can send me on Slack).

The AI Flight Simulator Idea

Luis mentioned leadership linters. I want to take it further:

What if we used AI to simulate difficult leadership conversations?

Like a flight simulator for pilots, but for managers:

  • AI roleplays different employee personalities (defensive, burnt out, ambitious, disengaged)
  • Manager practices difficult conversations
  • AI provides feedback on approach
  • Manager iterates until they find effective strategies

This lets managers practice high-stakes conversations in low-stakes environments.
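
A sketch of what one practice turn might look like, with the model call left as an injected placeholder (`complete`), since no particular AI provider or API is assumed:

```python
# Sketch of one turn in a practice conversation. `complete` stands in
# for any text-generation call (prompt in, simulated reply out); no
# particular AI provider or API is assumed here.
PERSONAS = {
    "defensive": "Roleplay an engineer who deflects feedback onto tooling and process.",
    "burnt_out": "Roleplay an exhausted engineer who has quietly disengaged.",
}

def practice_turn(persona: str, history: list, manager_line: str, complete) -> str:
    """Append the manager's line and return the simulated employee's reply."""
    history.append(("manager", manager_line))
    transcript = "\n".join(f"{who}: {line}" for who, line in history)
    reply = complete(f"{PERSONAS[persona]}\n{transcript}\nemployee:")
    history.append(("employee", reply))
    return reply

# Example with a stub standing in for a real model:
history = []
reply = practice_turn("defensive", history,
                      "I noticed the last two sprints slipped. What happened?",
                      complete=lambda prompt: "The CI pipeline keeps breaking on me.")
```

The transcript accumulates across turns, so the simulated employee can stay in character for a whole conversation, and the debrief can replay it line by line.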

I know this sounds dystopian. But we use design tools to simulate user interactions before building features. Why not simulate people interactions before having difficult conversations?

Thoughts? Too weird?

Leadership as Portfolio Work

In design, we build portfolios to showcase our craft. What if managers did the same?

Leadership Portfolio:

  • Case studies of difficult situations handled
  • Before/after team metrics (engagement, velocity, retention)
  • Examples of feedback given and impact
  • Documentation of team culture initiatives
  • Peer recommendations and 360 feedback

This makes leadership development visible and valued the same way we value technical portfolios.

And when promoting managers, we review their leadership portfolio, not just their team’s output.

The Performative Theater Problem

You asked how to prevent leadership training from becoming corporate theater.

In design, we avoid this by:

  1. Working with real problems, not hypotheticals
  2. Shipping real solutions, not just mockups
  3. Measuring real impact, not just completion rates

Same should apply to leadership development:

  1. Real problems: Use actual challenges from your teams, not generic case studies
  2. Real solutions: Require managers to apply learnings and report back
  3. Real impact: Measure team outcomes (engagement, retention, delivery), not training completion

If a manager completed all the training but their team has 50% attrition, the training failed. Measure outcomes, not inputs.

The Question I Can’t Stop Thinking About

In design school, they teach us that constraints breed creativity.

What if we applied this to leadership development?

Instead of “here’s 20 hours of training, good luck,” what if we gave new managers specific constraints:

  • “Improve your 1-on-1 quality scores by 2 points in 30 days”
  • “Reduce team conflict resolution time from 2 weeks to 3 days”
  • “Increase engagement score from 65% to 75% this quarter”

And then gave them tools, mentorship, and peer support to hit those constraints.

This creates urgency (I need to learn this NOW) and focus (I need to learn THIS SPECIFIC THING) instead of generic “become a better leader.”

What I’d Love to See

A Leadership Design System for engineering organizations:

  • Reusable components (1-on-1 templates, feedback frameworks, conflict resolution patterns)
  • Design principles (psychological safety, radical candor, growth mindset)
  • Usage guidelines (when to use each component)
  • Accessibility considerations (how to adapt for neurodivergent team members, different cultures)
  • Version history (how our leadership practices evolve)

Just like a design system makes product development faster and more consistent, a leadership design system would make people development faster and more consistent.

Is anyone building this? Should we build this together?

This thread has me so inspired. Leadership development is one of the most important design problems we’re not treating as a design problem.

Michelle, this framework is excellent. But I want to challenge one assumption and add a product management lens that might strengthen the business case.

The Business Alignment Question

Your Technical Leadership Ladder focuses on internal metrics: engagement, retention, team health. These are important.

But when I defend leadership development budget to finance and executive teams, they care most about business outcomes.

So I’ve started tying leadership development directly to business metrics:

Product Velocity:

  • Features shipped per quarter
  • Time from idea to customer
  • Percentage of roadmap delivered on time

Product Quality:

  • Post-launch defect rate
  • Customer satisfaction scores
  • Support ticket volume

Innovation Rate:

  • New product proposals from engineering
  • Experiments run per quarter
  • Successful product pivots

Revenue Impact:

  • Product-driven revenue growth
  • Cost savings from efficiency improvements
  • Technical enablement of new revenue streams

Here’s what I found: Good people management directly impacts these metrics.

Example from my previous company:

Two engineering teams, similar size and scope:

  • Team A: Manager with strong people skills (trained in conflict resolution, 1-on-1s, feedback)
  • Team B: Manager with weak people skills (promoted for technical excellence, no training)

Business outcomes over 6 months:

Team A (good people management):

  • Shipped 23 features (90% of roadmap)
  • 2% post-launch defect rate
  • NPS: 72
  • 0% attrition

Team B (poor people management):

  • Shipped 14 features (55% of roadmap)
  • 8% post-launch defect rate
  • NPS: 48
  • 25% attrition

Business impact of good people management:

  • 64% more features delivered
  • 75% fewer defects
  • 50% higher customer satisfaction
  • $600K saved in attrition costs

When I presented this to finance, leadership development budget was approved in 10 minutes.

The ROI Template for Leadership Development

Here’s the business case template I use:

Investment:

  • Training program: $50K
  • Manager time (8 hrs/person × 10 managers × $150/hr): $12K
  • Ongoing coaching: $20K/year
  • Total: $82K

Return:

Productivity Gains:

  • 20% improvement in roadmap delivery (conservative estimate)
  • Value: 20% more features × average feature value of $100K = $200K/year

Quality Improvements:

  • 50% reduction in post-launch defects
  • Savings: 50% × eng time spent on bug fixes (200 hrs/year) × $150/hr = $15K/year

Retention Value:

  • Historical attrition with poor management: 20%
  • Projected attrition with good management: 10%
  • Engineers retained: 1 per year
  • Cost to replace: $200K (recruiting, onboarding, ramp time)
  • Savings: $200K/year

Innovation Unlocked:

  • Engineers spending less time on team dysfunction, more on innovation
  • 2 additional product experiments per quarter
  • Historical success rate: 25%
  • Value per successful experiment: $500K
  • Expected value: 2 experiments/quarter × 4 quarters × 25% × $500K = $1M/year

Total Annual Return: $1.415M

Investment: $82K

ROI: 17x in year 1

Obviously these numbers need to be customized for each org. But the framework of tying leadership development to business outcomes makes the investment defensible.
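
As a sanity check, the arithmetic reproduces directly from the template’s stated inputs; a quick sketch (every figure is the template’s own estimate, not a measurement):

```python
# Reproduce the ROI template's arithmetic from its stated inputs.
# Every figure here is the template's own estimate, to be customized per org.
investment = 50_000 + (8 * 10 * 150) + 20_000   # training + manager hours + coaching

returns = {
    "productivity": 200_000,                 # 20% more features at ~$100K each
    "quality":      0.50 * 200 * 150,        # half of 200 bug-fix hours at $150/hr
    "retention":    1 * 200_000,             # one engineer retained per year
    "innovation":   2 * 4 * 0.25 * 500_000,  # experiments x quarters x success rate x value
}

total = sum(returns.values())
print(f"investment ${investment:,}, return ${total:,.0f}, ROI {total / investment:.1f}x")
```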

The Measurement Framework

To make this credible, I track:

Leading Indicators (things managers control):

  • 1-on-1 completion rate
  • Feedback delivery frequency
  • Conflict resolution speed
  • Career development plan coverage

Lagging Indicators (business outcomes):

  • Roadmap delivery percentage
  • Defect rate
  • Customer satisfaction
  • Retention rate
  • Innovation proposals

Correlation Analysis:
Every quarter, I analyze correlation between leading and lagging indicators.

Example findings:

  • Managers with 95%+ 1-on-1 completion have 30% better roadmap delivery
  • Teams with <1 week conflict resolution time have 2x higher innovation rate
  • Managers who deliver quarterly feedback have 50% lower attrition

These correlations don’t prove causation, but they make the link between people practices and business outcomes concrete enough to defend the investment.
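
The analysis itself is simple. Here’s a sketch with made-up sample data (one value per manager; the `pearson` helper is just the textbook formula):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up quarterly sample: one value per manager.
one_on_one_completion = [0.98, 0.95, 0.80, 0.70, 0.92, 0.60]  # leading
roadmap_delivered     = [0.95, 0.90, 0.75, 0.65, 0.88, 0.55]  # lagging

r = pearson(one_on_one_completion, roadmap_delivered)
print(f"r = {r:.3f}")
```

With a handful of managers this is a spreadsheet-sized job; the discipline is in collecting the leading indicators consistently, not in the math.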

The Challenge: Long-Term vs Short-Term

Here’s my tension with your framework:

Costs are immediate and concentrated:

  • $82K upfront
  • 8-20 hours of manager time (opportunity cost)
  • Ongoing coaching investment

Benefits are long-term and distributed:

  • Retention value realized over 12-24 months
  • Productivity gains compound over time
  • Culture improvements take quarters to manifest

In quarterly-driven organizations, this is a hard sell.

My solution: Pilot with one team, measure rigorously, show quick wins.

  • Month 1: Training + baseline metrics
  • Month 2-3: Application + coaching
  • Month 4: Show early results (engagement up, conflict down)
  • Month 6: Show business results (delivery up, attrition down)
  • Month 12: Full ROI analysis

Use early wins from pilot to fund broader rollout.

The Question of Scale

You asked: “Will this work for 20? 50 managers?”

Product answer: It depends on your platform strategy.

Option 1: Cohort-based training

  • Batch managers into cohorts of 8-12
  • Run same training 4x/year
  • Scales to ~50 managers/year
  • Cost-effective but less personalized

Option 2: Train-the-trainer model

  • Train 5 senior managers as facilitators
  • They train other managers
  • Scales to 100+ managers
  • Requires strong facilitator selection

Option 3: Leadership platform

  • Build internal platform with on-demand training, peer learning, mentorship matching
  • Scales infinitely
  • High upfront cost, low marginal cost

I’m experimenting with Option 3: building a Leadership Development Platform with:

  • Self-paced training modules
  • Peer learning cohorts
  • Mentorship matching algorithm
  • Practice simulations (AI-powered)
  • Progress tracking and certification

Think of it as a product for internal leadership development.

The Controversial Take

Should people skills be required for senior IC roles?

My answer: Absolutely yes.

Staff+ engineers have massive influence. They shape technical direction, mentor junior engineers, represent engineering in cross-functional meetings.

If they lack empathy, conflict resolution skills, or communication abilities, they’re as dangerous as managers with those gaps—maybe more so because they have technical authority.

So I require:

  • Senior Engineers: L1 people skills (feedback, 1-on-1s with mentees)
  • Staff Engineers: L2 people skills (conflict resolution, cross-team collaboration)
  • Principal Engineers: L3 people skills (organizational influence, culture shaping)

This isn’t about making everyone a manager. It’s about recognizing that all leadership requires people skills, whether you have direct reports or not.

Final Thought: Leadership Development Is Product Development

We should approach leadership development the same way we approach product development:

  • Understand user needs (what do managers struggle with?)
  • Build solutions (training, tools, support)
  • Measure impact (business outcomes, not just engagement)
  • Iterate based on feedback
  • Scale what works

Your Technical Leadership Ladder is a great product. Now let’s build the platform, measure the impact, and scale it.

Count me in.