Last Tuesday, I sat in a board meeting where our CFO looked me straight in the eye and asked: “Michelle, can you just give me ONE number that tells me if engineering is productive?”
I wanted to laugh. Then cry. Then pull up the 47 tabs I have open comparing DORA vs SPACE vs DX Core 4.
Here’s what I said instead: “If I gave you one number, it would be wrong. And 66% of my team wouldn’t trust it anyway.”
The Trust Crisis Nobody’s Talking About
According to JetBrains’ 2025 State of Developer Ecosystem survey, 66% of developers don’t believe current metrics reflect their true contributions. That’s not a measurement problem—that’s a legitimacy crisis.
And it gets worse. 55% of developers reported that metrics were used punitively—to justify layoffs, enforce micromanagement, or penalize teams during performance reviews. When your measurement system becomes a weapon, developers learn to game it or ignore it entirely.
Meanwhile, 29.6% of platform teams measure nothing at all, making ROI conversations with executives impossible. We’re stuck between metric distrust and a metric vacuum.
Framework Fatigue: DORA, SPACE, DX Core 4… What’s the Difference?
If you’re like me, you’ve spent the past 6 months drowning in framework comparisons:
DORA (DevOps Research and Assessment): The incumbent. Four metrics—Deployment Frequency, Lead Time for Changes, Change Failure Rate, and Mean Time to Recovery. DORA is focused on CI/CD pipeline health, which matters, but it’s not the full picture. You can have stellar DORA metrics while shipping features nobody uses.
SPACE Framework: The academic darling. Five dimensions—Satisfaction, Performance, Activity, Communication & collaboration, Efficiency & flow. SPACE explicitly acknowledges “there is no one measure of productivity” and forces you to think multidimensionally. But try presenting that to a board that wants simplicity.
DX Core 4: The new unifier. Four dimensions—Speed, Effectiveness, Quality, Business Impact. Developed in collaboration with the authors of DORA, SPACE, and DevEx, DX Core 4 aims to be practical: it can be deployed in weeks (not months), uses readily available system metrics, and minimizes survey fatigue.
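For what it’s worth, the DORA side really is just arithmetic over two event logs. Here’s a minimal sketch, assuming simple deployment and incident records with illustrative field names (`deployed_at`, `caused_failure`, etc.)—not any particular tool’s schema, just the math behind the four numbers:

```python
from statistics import mean

def dora_metrics(deployments, incidents, window_days=30):
    """Minimal sketch of the four DORA metrics.

    deployments: [{"commit_at": datetime, "deployed_at": datetime, "caused_failure": bool}]
    incidents:   [{"started_at": datetime, "resolved_at": datetime}]
    Field names are illustrative assumptions, not a specific tool's schema.
    """
    if not deployments:
        return None

    # Deployment frequency: deploys per day over the reporting window
    frequency = len(deployments) / window_days

    # Lead time for changes: commit -> production, averaged in hours
    lead_time_hours = mean(
        (d["deployed_at"] - d["commit_at"]).total_seconds() / 3600
        for d in deployments
    )

    # Change failure rate: share of deploys that triggered a failure
    failure_rate = sum(d["caused_failure"] for d in deployments) / len(deployments)

    # Mean time to recovery: incident open -> resolved, averaged in hours
    mttr_hours = mean(
        (i["resolved_at"] - i["started_at"]).total_seconds() / 3600
        for i in incidents
    ) if incidents else 0.0

    return {
        "deploys_per_day": round(frequency, 2),
        "lead_time_hours": round(lead_time_hours, 1),
        "change_failure_rate": round(failure_rate, 2),
        "mttr_hours": round(mttr_hours, 1),
    }
```

The hard part was never computing these—it’s deciding what they mean once you have them.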
The Executive Dilemma: They Want Simplicity, Research Says “No Single Metric”
Here’s the tension: Every framework’s documentation explicitly warns against single-number productivity measurement. The SPACE framework’s core insight is that multidimensional measurement is non-negotiable.
But executives operate in a world of simple, comparable numbers:
- Sales has ARR and win rate
- Marketing has CAC and LTV
- Finance has margins and burn rate
- Engineering has… “it’s complicated”?
When your board member asks for “one number,” they’re not being stupid. They’re pattern-matching from every other business function they oversee. Engineering feels like a black box.
What We Actually Use (And Why It’s Still Imperfect)
At my SaaS company, we’ve landed on a hybrid approach:
- DORA metrics for pipeline health: Deployment frequency and lead time give us an operational baseline
- SPACE satisfaction surveys (quarterly): Developer sentiment is our early warning system for attrition and burnout
- Custom “business impact” metric: Features shipped weighted by actual revenue impact or user adoption (tracked in Amplitude)
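To make that last bullet concrete, here’s a rough sketch of one way a “features shipped, weighted by impact” score can be computed. The field names, the 70/30 revenue/adoption split, and the normalization are assumptions for illustration; in our case the adoption counts come from an Amplitude export, but the scoring itself is just a weighted sum:

```python
def business_impact_score(features, revenue_weight=0.7, adoption_weight=0.3):
    """Illustrative sketch: features shipped, weighted by revenue and adoption.

    features: [{"name": str, "attributed_arr": float, "adopting_accounts": int}]
    Field names and weights are assumptions, not a standard formula.
    """
    if not features:
        return 0.0

    # Normalize each signal against the period's maximum so revenue (dollars)
    # and adoption (account counts) are comparable before blending.
    max_arr = max(f["attributed_arr"] for f in features) or 1.0
    max_adoption = max(f["adopting_accounts"] for f in features) or 1

    score = 0.0
    for f in features:
        arr_component = f["attributed_arr"] / max_arr
        adoption_component = f["adopting_accounts"] / max_adoption
        # Platform work with no direct ARR still earns credit through adoption.
        score += revenue_weight * arr_component + adoption_weight * adoption_component

    return round(score, 2)
```

It’s crude, and the weights are a negotiation with our product leadership, not a law of nature—but it gives the conversation a shared artifact instead of a shrug.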
We present this as an “Engineering Health Dashboard” to leadership—three sections, not one number. It took 6 months of education, but now our CFO understands that asking for “one number” is like asking “what’s one number that captures your company’s financial health?” (Hint: it’s never just revenue or just profit.)
But I’ll be honest: It’s still imperfect. Platform team work doesn’t map cleanly to revenue. Incident prevention efforts show up as “no deploys this sprint” in DORA. Developer happiness surveys have lag time—by the time satisfaction drops, your best engineers are already interviewing elsewhere.
The Hard Truth: You Can’t Collapse Complexity, But You CAN Tell a Story
What I’ve learned: Modern best practice is combining DORA with SPACE, DX Core 4, and flow metrics to capture the full picture. But executives don’t read 30-page reports.
The breakthrough wasn’t simplifying the data. It was improving the narrative.
Now our monthly engineering review is 4 slides:
- Slide 1: Speed (DORA deployment frequency + lead time)
- Slide 2: Quality (DORA change failure rate + P0 incidents)
- Slide 3: Team health (SPACE satisfaction + retention rate)
- Slide 4: Impact (Revenue-enabled features + platform adoption)
Each slide includes 1-2 anecdotes. Numbers tell you what’s happening. Stories tell you why it matters.
My Question for This Community
How do you balance executive simplicity needs with measurement complexity?
- What framework(s) have you adopted and why?
- How do you handle the “just give me one number” request?
- For those with platform teams—how do you measure value when there’s no direct revenue attribution?
- Have you found ways to rebuild developer trust in metrics after years of punitive use?
I’m especially curious to hear from product and business leaders: What would help you understand engineering productivity without forcing oversimplification?
Because right now, we’re caught between executive pressure for simplicity and developer distrust of any measurement at all. And with AI coding tools reshaping how we work, 2026 is forcing a reckoning on what “productivity” even means.
What’s working for you?