Controversial take: Psychological safety predicts DevEx success better than DORA metrics.
I know that sounds wild. DORA metrics are the gold standard—deployment frequency, lead time for changes, mean time to restore (MTTR), and change failure rate. They’re measurable, they’re concrete, they’ve been validated across thousands of organizations.
But after running a 6-month experiment across 8 engineering teams, I’m convinced we’re measuring the wrong thing.
The Experiment
Context: A mid-stage SaaS company where I was leading a cloud migration across 8 engineering teams (~50 engineers total).
Hypothesis: DevEx initiatives fail because engineers don’t feel safe voicing concerns early, while problems are still fixable.
What We Measured:
- Psychological Safety - Edmondson’s validated framework
- DevEx Satisfaction - Internal DXI scores (developer experience index)
- Platform Adoption Rates - Usage metrics across teams
- DORA Metrics - For comparison
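For context on the first instrument: Edmondson’s measure is a short Likert survey (her original scale has seven items, some negatively worded and reverse-scored before averaging). A minimal sketch of turning raw responses into a team-level score might look like the following — the item indices and answers are illustrative, not our actual survey:

```python
# Sketch: scoring a 7-item psychological-safety survey (Edmondson-style).
# Items listed in REVERSE_ITEMS are negatively worded, so a 1-7 Likert
# answer is flipped to (8 - answer) before averaging.
# All indices and data below are hypothetical examples.

REVERSE_ITEMS = {0, 2, 4}  # hypothetical positions of negatively worded items
SCALE_MAX = 7              # 7-point Likert scale

def score_response(answers):
    """Average one engineer's answers, flipping reverse-scored items."""
    adjusted = [
        (SCALE_MAX + 1 - a) if i in REVERSE_ITEMS else a
        for i, a in enumerate(answers)
    ]
    return sum(adjusted) / len(adjusted)

def team_score(responses):
    """Team psych-safety score = mean of the individual scores."""
    return sum(score_response(r) for r in responses) / len(responses)

if __name__ == "__main__":
    # Two made-up respondents, 7 answers each on a 1-7 scale.
    team = [
        [2, 6, 1, 7, 2, 6, 5],
        [3, 5, 2, 6, 3, 7, 6],
    ]
    print(round(team_score(team), 2))
```

Averaging per respondent first (rather than pooling all answers) keeps one very vocal engineer from dominating the team score.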
The Results (correlation coefficients):
- Psych Safety ↔ DevEx Satisfaction: 0.78
- Psych Safety ↔ Platform Adoption: 0.82
- DORA Metrics ↔ DevEx Satisfaction: 0.43
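These are plain Pearson correlations over per-team scores — and with only 8 teams the sample is small, so treat them as suggestive rather than definitive. For anyone who wants to run the same calculation, here is a minimal sketch (the team scores below are made up, not our data):

```python
# Sketch: Pearson correlation between two per-team metric series.
# The numbers below are illustrative, not the study's actual data.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

if __name__ == "__main__":
    # Hypothetical per-team scores for 8 teams.
    psych_safety = [3.1, 4.5, 5.2, 6.0, 4.8, 5.5, 3.8, 6.3]
    devex_sat = [42, 61, 70, 81, 66, 74, 50, 85]
    print(round(pearson(psych_safety, devex_sat), 2))
```

In practice you’d reach for `numpy.corrcoef` or Python’s `statistics.correlation`, but the formula is short enough to show inline.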
Let that sink in. Psychological safety correlated with DevEx outcomes nearly twice as strongly as the technical metrics did.
Why This Matters
High Psychological Safety Teams:
- Gave honest feedback early (“This deployment process is confusing”)
- Caught problems before they became crises
- Actively shaped platform improvements
- Became advocates and helped other teams
Low Psychological Safety Teams:
- Claimed everything was fine in meetings (it wasn’t)
- Suffered in silence until frustration boiled over
- Built workarounds instead of reporting issues
- Had 4x higher shadow IT usage
The low-psych-safety teams technically had decent DORA metrics. They were shipping. But they were miserable, and they were doing it around the platform, not with it.
The Research Backs This Up
ACM Queue’s study on developer productivity: “Human factors like psychological safety, team collaboration, and clear communication have substantial impact on effectiveness.”
DX research from 2026: 62% of developers cite non-technical factors (collaboration, communication, clarity of goals) as critical to productivity, vs. 51% citing technical factors.
We’re optimizing the wrong variables.
How to Build Psychological Safety
This isn’t soft-skills fluff. These are concrete practices:
1. Blameless Postmortems (Actually Blameless)
We ran 12 incident retrospectives. Not a single person was blamed. We focused on systems and processes. Trust built fast.
2. Leaders Admitting Mistakes Publicly
I started every town hall with “Here’s what I got wrong last quarter.” Senior engineers followed suit. Vulnerability became normal.
3. Rewarding Dissenting Voices
When someone said “I disagree with this direction,” I publicly thanked them and explored their reasoning. That story spread: It’s safe to push back here.
4. Anonymous Feedback + Visible Action
Quarterly anonymous surveys. Published themes and specific actions we’d take. People saw their feedback mattered.
5. “Strong Opinions, Weakly Held” Culture
Made it normal to propose ideas and change them when evidence emerged. Ego detachment.
The ROI
DevEx initiatives cost roughly the same whether psychological safety is high or low.
But success rates were 3x higher when psych safety was built first.
Time to full adoption:
- High psych safety: 4 months
- Low psych safety: 18 months
Same platform. Same tools. Different culture. Radically different outcomes.
The Question
Do you measure psychological safety in your organization? Should we?
Part of me wonders if we’ve gotten so focused on quantifiable metrics that we’ve ignored the human substrate they rest on. You can’t deploy quickly if your team is afraid to say “this broke something.”
Thoughts?