I Sat In Our CFO's Quarterly Reviews for a Year. Here Are the Platform Metrics Finance Actually Tracks (Spoiler: Not DORA)

Last year, our CTO asked if I (as design systems lead) wanted to attend the quarterly business reviews with finance. I said yes, curious to see how our platform and design system investments were being evaluated.

What I discovered shocked me.

Engineering tracks DORA metrics religiously. Deployment frequency, MTTR, change failure rate, lead time for changes. We have dashboards, quarterly retrospectives, the whole thing.

Finance literally never asked about any of these metrics. Not once. Not in four quarterly reviews.

What Finance Actually Tracks

Here’s what came up in EVERY quarterly business review:

1. Revenue per Engineer

What it is: Total company revenue / total engineering headcount

Why finance cares: It’s the ultimate productivity metric from a finance lens. Are we getting more output per engineering dollar?

Pre-platform: $580K revenue per engineer
Post-platform: $720K revenue per engineer (24% improvement)

CFO’s question every quarter: “Why did this number move?” Platform’s job is to keep pushing this ratio higher as we scale.
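
As a back-of-the-envelope sketch, the ratio is easy to compute and track quarter over quarter - the revenue and headcount figures below are illustrative assumptions, not our actual financials:

```python
# Revenue per engineer: total revenue / engineering headcount.
# Revenue and headcount figures here are illustrative assumptions only.
def revenue_per_engineer(total_revenue: float, headcount: int) -> float:
    return total_revenue / headcount

pre = revenue_per_engineer(46_400_000, 80)    # hypothetical pre-platform year
post = revenue_per_engineer(86_400_000, 120)  # hypothetical post-platform year
print(f"${pre:,.0f} -> ${post:,.0f} ({post / pre - 1:.0%} improvement)")
```

The point of scripting even trivial math like this is that the CFO's "why did this number move?" question gets answered with the same inputs every quarter.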

2. Engineering Expense as % of Revenue

What it is: Total engineering costs / total revenue

Why finance cares: They want to know if engineering costs are scaling proportionally to revenue, or if we’re improving leverage.

The platform story: Platform should keep this ratio stable even as we grow. Without platform, this ratio climbs as coordination costs explode with scale.

Our data: Held steady at 22% even while scaling from 80 to 120 engineers - platform prevented coordination overhead from increasing the ratio.

3. Feature Delivery Velocity

What it is: Number of customer-facing features shipped per quarter

Why finance cares: NOT deployment frequency - they don’t care how many times we deploy. They care about features customers see and will pay for.

Pre-platform: 8 major features per quarter
Post-platform: 13 major features per quarter (62% increase)

CFO’s question: “Which features contributed to which revenue?” - they want attribution, not just velocity.

4. Customer-Impacting Incidents

What it is: Incidents that affect customers or trigger SLA credits

Why finance cares: Each incident costs money (SLA credits) and risks customer retention.

The disconnect: Engineering tracks ALL incidents. Finance only cares about incidents customers experience.

Our data: Customer-impacting incidents dropped from 12/quarter to 4/quarter. Each incident costs ~$100K in SLA credits. That’s $800K in quarterly savings finance actually tracks.

5. Time-to-Productivity for New Hires

What it is: How long before a new engineer ships their first production feature

Why finance cares: Faster ramp = faster return on hiring investment. In tight talent markets, onboarding efficiency matters.

Pre-platform: 8 weeks average
Post-platform: 3 weeks average (standardized everything = faster onboarding)

The finance lens: Each week of faster ramp × $3K weekly cost per engineer = $15K saved per hire. At 40 hires per year, that’s $600K in hiring efficiency gains.
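
This calculation is worth scripting so it updates with each hiring cycle - a minimal sketch, where the $3K fully-loaded weekly cost is an assumed figure you'd replace with your own:

```python
# Hiring-efficiency savings from faster ramp time.
# WEEKLY_COST is an assumed fully-loaded cost per engineer; adjust to your org.
WEEKLY_COST = 3_000
ramp_before, ramp_after = 8, 3   # weeks to first production feature
hires_per_year = 40

weeks_saved = ramp_before - ramp_after
savings_per_hire = weeks_saved * WEEKLY_COST
annual_savings = savings_per_hire * hires_per_year
print(f"${savings_per_hire:,} per hire, ${annual_savings:,} per year")
```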

The Disconnect

Engineering optimizes for DORA metrics. Finance measures outcomes DORA predicts but doesn’t directly track.

The translation needed:

  • Deployment frequency (DORA) → Feature delivery velocity (finance metric)
  • MTTR (DORA) → Customer-impacting incidents (finance metric)
  • Lead time (DORA) → Time-to-productivity (finance metric)

DORA metrics are valuable - they’re leading indicators. But finance doesn’t speak DORA. They speak revenue, costs, and customer impact.

The Surprise

The biggest surprise? Finance never questioned whether our platform investment was worth it.

Why? Because from day one, we reported in their language:

  • Revenue per engineer improving
  • Features shipped increasing
  • Incidents decreasing
  • Hiring efficiency improving

When you speak finance language, they don’t question platform value. They see it as business-critical infrastructure.

The Question

What metrics does your finance team actually track when evaluating platform engineering? Is there a disconnect between what engineering measures and what finance cares about?

How do you bridge that gap? Do you create translation layers, or do you just report in both languages?

Maya, this observation is incredibly valuable - and it reveals the core problem with how most platform teams communicate value.

The disconnect is solvable. We need translation frameworks that map DORA metrics to finance outcomes.

The Translation Framework

DORA metrics predict business outcomes. The problem is we don’t show the correlation explicitly. Here’s how we bridge the gap:

Deployment Frequency → Feature Delivery Velocity

The connection: Higher deployment frequency enables faster feature iteration and delivery.

How we show it:
“Our deployment frequency increased from 2×/week to 10×/week this quarter. This enabled us to ship 13 customer-facing features (up from 8 last quarter), contributing M in incremental revenue.”

Finance hears: More deployments → more features → more revenue

MTTR → Customer-Impacting Incidents

The connection: Faster MTTR means shorter incident duration and fewer customer-impacting events.

How we show it:
“We reduced MTTR from 90 minutes to 30 minutes. This prevented 3 incidents from escalating to customer-impacting severity, avoiding $300K in SLA credits and potential churn.”

Finance hears: Faster recovery → fewer customer incidents → cost avoidance

Change Failure Rate → Rework Costs

The connection: Lower failure rate means less time spent fixing broken deployments.

How we show it:
“Change failure rate dropped from 25% to 8%. That’s 40 fewer failed deployments this quarter. Each failure costs ~4 hours of engineering time to diagnose and fix. 40 × 4 hours × $150/hr = $24K in avoided rework costs.”

Finance hears: Fewer failures → less wasted effort → cost savings
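
The arithmetic in these translations generalizes into a small helper - a sketch, assuming a local diagnose-and-fix time and a loaded hourly rate you'd calibrate for your own org:

```python
# Avoided rework cost from a lower change failure rate.
# hours_per_failure and hourly_rate are assumed local figures.
def rework_cost_avoided(fewer_failures: int,
                        hours_per_failure: float = 4,
                        hourly_rate: float = 150) -> float:
    return fewer_failures * hours_per_failure * hourly_rate

print(rework_cost_avoided(40))  # 40 fewer failed deployments this quarter
```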

The Dual Scorecard Approach

We created two reporting layers:

Engineering Dashboard (internal):

  • DORA metrics
  • Platform adoption rates
  • Developer satisfaction scores
  • Technical health indicators

Finance Dashboard (quarterly business reviews):

  • Revenue per engineer
  • Feature delivery velocity
  • Customer-impacting incidents
  • Engineering expense as % of revenue
  • Hiring efficiency metrics

Same underlying reality. Different languages.

The Monthly Translation

In our monthly engineering all-hands, I present DORA metrics.

In our monthly exec updates to CFO, I translate:
“Deployment frequency improved 40% → enabled 12 new features (up from 8) → contributed M incremental revenue. MTTR reduced to 30 minutes → prevented 3 customer-impacting incidents → saved $300K in SLA credits.”

It’s the same work. Just translated to business impact.

The Key Insight

Finance isn’t wrong to ignore DORA metrics. DORA metrics aren’t business outcomes - they’re engineering process metrics.

Our job as technical leaders is to be bilingual: speak DORA with engineering, speak business outcomes with finance.

When you proactively translate DORA to business impact in every executive update, finance teams stop questioning platform value. They see platform engineering as business infrastructure that enables growth.

Maya, I’m so glad you got to observe those finance QBRs. That perspective - seeing what finance actually pays attention to - is invaluable for engineering leaders.

Your observation about the DORA disconnect is exactly right. But I’d add: it’s VP Eng’s responsibility to proactively translate, not wait for CFO to ask.

Why Proactive Translation Matters

When engineering leaders wait for finance to ask “what does deployment frequency mean for revenue?”, we’ve already failed. By then, finance perceives us as speaking a foreign language and asking them to learn it.

Instead: Speak finance language from the start.

My Translation Template

Every month, I send our CFO a one-page engineering update. It follows this format:

Business Impact This Month:

  • Revenue enabled: [features shipped, revenue attributed]
  • Costs avoided: [incidents prevented, SLA credits saved]
  • Capacity created: [productivity improvements, hiring efficiency]

Engineering Health Indicators (one line each):

  • Deployment frequency: 12× per week (↑40% vs last month) → enabled faster feature iteration
  • MTTR: 28 minutes (↓from 45 minutes) → reduced customer-impacting incident duration
  • Change failure rate: 6% (↓from 12%) → less time spent on rework

What This Means for Business Goals:

  • Q2 goal: Ship 15 customer-facing features → on track (shipped 4 in January, 5 in February)
  • Annual goal: Improve revenue per engineer 20% → tracking at 22% improvement YTD

Finance teams appreciate engineering leaders who translate unprompted. It builds trust and credibility.
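
If the one-pager is generated from your metrics pipeline rather than written by hand, a minimal sketch might look like this - the field names and sample values are hypothetical, not a real reporting API:

```python
# Minimal sketch: render the monthly CFO one-pager from metric inputs.
# All field names and sample values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class CfoUpdate:
    revenue_enabled: str
    costs_avoided: str
    capacity_created: str
    health_lines: list = field(default_factory=list)

    def render(self) -> str:
        out = [
            "Business Impact This Month:",
            f"  - Revenue enabled: {self.revenue_enabled}",
            f"  - Costs avoided: {self.costs_avoided}",
            f"  - Capacity created: {self.capacity_created}",
            "",
            "Engineering Health Indicators:",
        ]
        out += [f"  - {line}" for line in self.health_lines]
        return "\n".join(out)

update = CfoUpdate(
    revenue_enabled="5 features shipped, attributed by product/sales",
    costs_avoided="3 incidents prevented, SLA credits saved",
    capacity_created="ramp time down from 8 to 3 weeks",
    health_lines=["Deployment frequency: 12x/week (up 40%) -> faster feature iteration"],
)
print(update.render())
```

The design choice that matters is the ordering: business impact first, engineering health second, so the reader never has to translate for themselves.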

The Attribution Challenge

You mentioned: “CFO’s question: ‘Which features contributed to which revenue?’ - they want attribution, not just velocity.”

This is critical. Finance won’t accept “we shipped more features” without revenue attribution.

How we attribute:

  1. Product team tags features with revenue category: “Expansion revenue”, “New customer acquisition”, “Retention/churn prevention”
  2. Sales team confirms which features closed specific deals
  3. We track “features that wouldn’t have made roadmap without platform velocity” separately

Example: “Platform enabled us to ship enterprise SSO 6 weeks ahead of schedule. That feature closed 2 enterprise deals worth .2M ARR. Clear attribution: Platform → faster shipping → revenue.”

When you have clean attribution, finance stops questioning platform value.

The Bottom Line

Finance leaders don’t need to learn DORA metrics. Engineering leaders need to learn to translate DORA into business impact.

It’s not finance’s job to understand our craft. It’s our job to explain how our craft creates business value.

Do that proactively, consistently, in every executive update, and finance becomes your platform’s biggest advocate.

Maya, your observation about finance only caring about customer-impacting incidents (not all incidents) is a crucial distinction that many engineering teams miss.

I want to propose a structural solution to this disconnect: Shared OKRs between engineering and finance.

The Problem with Separate Metrics

When engineering has OKRs and finance has different OKRs, we optimize for different outcomes:

Engineering OKRs:

  • Improve deployment frequency by 40%
  • Reduce MTTR by 50%
  • Increase platform adoption to 90%

Finance OKRs:

  • Improve revenue per engineer by 15%
  • Hold engineering expense ratio at 22%
  • Reduce customer churn by 10%

These aren’t aligned. Engineering hits their OKRs but finance doesn’t see the connection to their goals.

The Shared OKR Experiment

Last year, we tried something different: Joint engineering-finance OKRs.

Q1 Shared Objective: “Improve engineering efficiency and business impact”

Shared Key Results:

  1. Improve revenue per engineer from $600K to $690K (15% increase)
  2. Reduce customer-impacting incidents from 12/quarter to 8/quarter
  3. Ship 12 customer-facing features (up from 8 previous quarter)

Both functions own these outcomes. Engineering optimizes DORA metrics to achieve them. Finance measures business impact.
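
Tracking joint progress can be as simple as a shared table plus a direction-agnostic progress calculation, so "incidents down" and "revenue up" both count toward target - a sketch with illustrative values:

```python
# Progress toward shared engineering-finance key results.
# All start/target/current values are illustrative.
shared_krs = {
    "revenue per engineer ($)": {"start": 600_000, "target": 690_000, "current": 655_000},
    "customer-impacting incidents/quarter": {"start": 12, "target": 8, "current": 9},
    "customer-facing features shipped": {"start": 8, "target": 12, "current": 10},
}

def progress(kr: dict) -> float:
    """Fraction of the distance from start to target already covered.
    Works whether the target is above or below the starting value."""
    return (kr["current"] - kr["start"]) / (kr["target"] - kr["start"])

for name, kr in shared_krs.items():
    print(f"{name}: {progress(kr):.0%} of the way to target")
```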

Why This Works

Forces alignment on what success means:
Engineering can’t just improve deployment frequency and call it done - we have to improve it enough to actually ship 12 features and improve revenue per engineer.

Finance can’t just demand “more revenue per engineer” without understanding that platform investment enables it.

Creates shared accountability:
When revenue per engineer doesn’t improve, engineering and finance jointly diagnose why. Is it platform limitations? Market conditions? Product-market fit? Both functions have skin in the game.

Eliminates translation friction:
When both functions report against the same OKRs, there’s no “engineering speaks DORA, finance speaks revenue” disconnect. Everyone reports against shared business outcomes.

The Quarterly Sync

We formalized quarterly meetings between engineering leadership and FP&A (financial planning & analysis):

  • Review progress against shared OKRs
  • Engineering explains how DORA improvements contributed to business outcomes
  • Finance explains how business conditions affected engineering impact
  • Jointly plan next quarter’s focus

This turned our CFO into a platform engineering advocate. She understands that platform improvements directly drive the shared OKRs she owns.

The Suggestion

If your organization has a DORA vs business metrics disconnect, propose shared OKRs to your CFO:

“Instead of engineering having DORA-based OKRs and finance having revenue-based OKRs, what if we created shared objectives we both own - like revenue per engineer, feature delivery velocity, and incident reduction?”

Most CFOs will respond positively. They want engineering aligned to business outcomes. Shared OKRs formalize that alignment.