We Measured Psychological Safety and It Predicted DevEx Success Better Than Any Technical Metric

Controversial take: Psychological safety predicts DevEx success better than DORA metrics.

I know that sounds wild. DORA metrics are the gold standard—deployment frequency, lead time, MTTR, change failure rate. They’re measurable, they’re concrete, they’ve been validated across thousands of organizations.

But after running a 6-month experiment across 8 engineering teams, I’m convinced we’re measuring the wrong thing.

The Experiment

Context: Mid-stage SaaS company where I was leading a cloud migration, with 8 engineering teams (~50 engineers total).

Hypothesis: DevEx initiatives fail because engineers don’t feel safe voicing concerns early when problems are fixable.

What We Measured:

  1. Psychological Safety - Edmondson’s validated framework
  2. DevEx Satisfaction - Internal DXI scores (developer experience index)
  3. Platform Adoption Rates - Usage metrics across teams
  4. DORA Metrics - For comparison

The Results (correlation coefficients):

  • Psych Safety ↔ DevEx Satisfaction: 0.78
  • Psych Safety ↔ Platform Adoption: 0.82
  • DORA Metrics ↔ DevEx Satisfaction: 0.43

Let that sink in. Psychological safety’s correlation with DevEx satisfaction (0.78) was nearly twice as strong as the DORA metrics’ (0.43).
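For the curious, numbers like these come from a straightforward Pearson correlation over per-team averages. The data below is invented for illustration only (it is not the study’s data), but it shows the mechanics:

```python
# Sketch: Pearson correlation between per-team psych safety and DevEx
# satisfaction. All numbers are made-up illustration data.
import numpy as np

# Hypothetical per-team averages for 8 teams
psych_safety = [3.1, 4.2, 2.8, 4.5, 3.9, 2.5, 4.0, 3.4]  # Edmondson scale, 1-5
devex_sat    = [55, 78, 48, 85, 72, 41, 74, 60]           # internal DXI, 0-100

# np.corrcoef returns the 2x2 correlation matrix; [0, 1] is r
r = np.corrcoef(psych_safety, devex_sat)[0, 1]
print(round(r, 2))
```

One caveat worth keeping in mind: with only 8 teams, confidence intervals on r are wide, so point estimates like 0.78 vs. 0.43 are directional signals rather than precise measurements.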

Why This Matters

High Psychological Safety Teams:

  • Gave honest feedback early (“This deployment process is confusing”)
  • Caught problems before they became crises
  • Actively shaped platform improvements
  • Became advocates and helped other teams

Low Psychological Safety Teams:

  • Claimed everything was fine in meetings (it wasn’t)
  • Suffered in silence until frustration boiled over
  • Built workarounds instead of reporting issues
  • Had 4x higher shadow IT usage

The low-psych-safety teams had decent DORA metrics on paper. They were shipping. But they were miserable, and they were doing it around the platform, not with it.

The Research Backs This Up

ACM Queue’s study on developer productivity: “Human factors like psychological safety, team collaboration, and clear communication have substantial impact on effectiveness.”

DX research from 2026: 62% of developers cite non-technical factors (collaboration, communication, clarity of goals) as critical to productivity, vs. 51% citing technical factors.

We’re optimizing the wrong variables.

How to Build Psychological Safety

This isn’t soft skills fluff. It’s concrete practices:

1. Blameless Postmortems (Actually Blameless)
We ran 12 incident retrospectives. Not a single person was blamed. We focused on systems and processes. Trust built fast.

2. Leaders Admitting Mistakes Publicly
I started every town hall with “Here’s what I got wrong last quarter.” Senior engineers followed suit. Vulnerability became normal.

3. Rewarding Dissenting Voices
When someone said “I disagree with this direction,” I publicly thanked them and explored their reasoning. That story spread: It’s safe to push back here.

4. Anonymous Feedback + Visible Action
Quarterly anonymous surveys. Published themes and specific actions we’d take. People saw their feedback mattered.

5. “Strong Opinions, Weakly Held” Culture
Made it normal to propose ideas and change them when evidence emerged. Ego detachment.

The ROI

DevEx initiatives cost roughly the same whether psychological safety is high or low.

But success rates were 3x higher when psych safety was built first.

Time to full adoption:

  • High psych safety: 4 months
  • Low psych safety: 18 months

Same platform. Same tools. Different culture. Radically different outcomes.

The Question

Do you measure psychological safety in your organization? Should we?

Part of me wonders if we’ve gotten so focused on quantifiable metrics that we’ve ignored the human substrate they rest on. You can’t deploy quickly if your team is afraid to say “this broke something.”

Thoughts?

Michelle, this is my #1 focus right now and you’ve just validated everything I’ve been preaching.

I inherited a low-psych-safety culture at my current company. The pattern: Engineers said “yes” in meetings, then did absolutely nothing afterward. Took me 3 months to figure out why—they didn’t believe their input mattered, so why bother?

My Turnaround Approach

Started Every Meeting With: “What concerns do you have?”
Not “questions” or “thoughts.” CONCERNS. Made it explicit that I wanted to hear problems.

Publicly Thanked Problem-Raisers
When someone flagged an issue, I responded in Slack with “Thank you for raising this—exactly what I need to hear” with a checkmark emoji. Made that visible to everyone.

Made “I Was Wrong” a Regular Phrase
In standups, retros, planning meetings. When I got something wrong, I said it clearly. Senior team members started doing the same.

Created “Concerns Channel” in Slack
Visible to all of engineering leadership. Engineers could post concerns and see executives respond within 24 hours. Game-changer.

The Results

Initial flood: First month, 40+ concerns posted. Honestly scary—it revealed how much was broken.

Addressed top 5 immediately: Dropped other priorities to fix the most-raised issues. Showed we were serious.

Trust rebuilt over 6 months: Concerns became constructive instead of frustrated. People proposed solutions, not just complaints.

DevEx adoption went from crawl to sprint: Once engineers trusted we’d listen, they gave honest feedback on our platform. We fixed real issues. Adoption accelerated.

On Measurement

I use two tools:

Weekly Pulse Surveys: One question: “Did you feel comfortable voicing concerns this week?” 5-point scale. Track trend over time.

Quarterly Edmondson Assessments: Full psychological safety survey. More comprehensive, but the weekly pulse gives me leading indicators.
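Scoring an Edmondson-style assessment is simple but has one gotcha: the published scale mixes positively and negatively worded items, so the negative ones must be reverse-coded before averaging. A minimal sketch, with the reverse-coded item positions as an assumption to verify against the instrument you actually use:

```python
# Sketch: scoring a 7-item psych-safety survey in Edmondson's style.
# The reverse-coded positions below are an assumption; check them
# against the published instrument before relying on scores.
REVERSE_CODED = {0, 2, 4}  # negatively worded items, e.g. "mistakes are held against you"

def score_response(answers, scale_max=7):
    """answers: one respondent's 1..scale_max rating per item."""
    adjusted = [
        (scale_max + 1 - a) if i in REVERSE_CODED else a
        for i, a in enumerate(answers)
    ]
    return sum(adjusted) / len(adjusted)

# One respondent: agreeing with the negative items pulls the score down.
print(score_response([2, 6, 1, 7, 2, 6, 7]))
```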

I also track: “How often do you voice concerns?” as a metric. High psych safety should mean more concerns raised, not fewer. Silence is a red flag.
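The weekly pulse trend can be tracked with something as simple as a trailing average; the window size, alert rule, and data below are illustrative assumptions, not a prescription:

```python
# Sketch: trending the weekly one-question pulse score.
# Data and window size are illustrative.
from statistics import mean

weekly_scores = [3.2, 3.4, 3.3, 3.7, 3.9, 4.1]  # avg 1-5 "comfortable voicing concerns" per week

def rolling_mean(scores, window=3):
    """Smooth week-to-week noise with a trailing average."""
    return [mean(scores[max(0, i - window + 1):i + 1]) for i in range(len(scores))]

trend = rolling_mean(weekly_scores)
# Flag weeks where the smoothed score drops: falling comfort (or
# sudden silence) is the red flag, not a high count of concerns.
alerts = [i for i in range(1, len(trend)) if trend[i] < trend[i - 1]]
print(trend, alerts)
```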

The Scale Challenge

Your study was 8 teams (~50 engineers). I’m now at 80 engineers across 12 teams. Maintaining psychological safety at scale is way harder.

Some teams have it, others don’t. The culture fragments. I’m trying to standardize, but worried about losing the personal touch that builds trust.

Question for you: How do you maintain psychological safety across distributed/remote teams? The in-person vulnerability moments (admitting mistakes in a room together) don’t translate as well to Zoom.

Michelle, this explains why our DevEx reboot actually worked. We didn’t realize it at the time, but the “listening tour” was actually building psychological safety.

The Pivotal Moment:

In our first working group meeting, a senior engineer said bluntly: “This tool is terrible and I don’t understand why we’re using it.”

Old me might have gotten defensive. Instead, I said: “Tell me more—what specifically isn’t working?”

That story spread faster than anything else we did. People learned: It’s safe to be honest here.

Measurement Approach

We don’t have formal psych safety scores, but we track proxies:

Retrospectives: “Unsaid” vs “Said” Concerns
At the end of each retro, I ask: “What didn’t we talk about today?” If the answer is “nothing,” people are comfortable. If there’s a list, we have work to do.

Exit Interviews: Speaking Up
When engineers leave, we ask: “Did you feel you could voice concerns here?” Their answers are brutally honest because they’re already out the door.

Anonymous Quarterly Surveys
Including questions about comfort raising problems, disagreeing with decisions, admitting mistakes.

The Data Connection

Your correlation numbers are wild. We don’t have the data to prove it, but anecdotally, the teams with high psychological safety adopted our DevEx platform 3-4 months faster than teams where engineers were quiet in meetings.

It makes total sense: If you can’t say “this onboarding is confusing,” we can’t fix it. The tool stays confusing. Adoption stalls.

My Challenge

40+ engineers across 3 time zones. Some teams have strong psych safety, others don’t.

How do you standardize it? Can you mandate psychological safety? That feels like a paradox—“You will feel safe!” destroys safety.

I’m trying to lead by example, but my direct interaction with every engineer is limited at this scale.

Question: How do you ensure middle managers build psychological safety in their teams? Can you teach vulnerability?

Designer perspective: Psychological safety matters just as much for design work.

Previous company: Design reviews were brutal. Senior designers would tear apart work in front of the whole team. “This color choice is bad.” “This layout doesn’t work.” No explanation, just criticism.

Result? Designers stopped showing early work. They only shared near-final designs when it was too late to make major changes.

Outcome: Worse designs (no early feedback) and slower shipping (changes at the end are expensive).

Current team: We practice “Critique = Care”

  • Safe to show terrible first drafts
  • Feedback is specific and constructive
  • We critique the work, not the person
  • Everyone from junior to senior shows messy work

Result: Way better final outcomes because we catch problems early.

The Parallel to DevEx

If engineers can’t say “this platform sucks” when it’s in beta, it will suck longer. You’ll ship something that doesn’t work, then engineers will quietly avoid it.

Early honest feedback is a gift. But only psychologically safe teams give it.

On Measurement

My team health surveys include:

  • “I feel comfortable sharing work in progress”
  • “Feedback here makes my work better”
  • “I can disagree with design decisions”

We also track it in retros with a specific prompt: “What went unsaid this sprint?”

Vulnerability from Leaders First

Michelle, your point about leaders admitting mistakes publicly is critical. In design, I regularly share:

  • My startup failure story (company shut down, I learned hard lessons)
  • Design decisions I got wrong
  • Times I misunderstood user needs

Makes it safe for junior designers to fail. If the lead can mess up and still be respected, they can too.

Question: What do you do when leaders aren’t naturally vulnerable? Some executives see admitting mistakes as weakness. How do you coach that?