Frameworks and metrics for internal developer experience — DORA, SPACE, DevEx dimensions, IDP pillars, and how to measure. For teams defining metrics or building an internal DX roadmap.
This page pulls together the frameworks, metrics, and research that underpin internal developer experience (internal DX) and internal DX audits. Use it when you want to go deeper than the Internal DX audit: when and why guide — for example when defining metrics, aligning with industry frameworks, or building an internal developer platform (IDP) roadmap.
Internal developer experience is the experience of your own developers (employees and contractors): environment, tools, processes, onboarding, friction, and how quickly they can ship. It's not external DX (your product as experienced by external developers), and it's not only "codebase health"; it's the velocity and experience of the team working in that codebase.
A useful nuance: internal DX is the way developers feel about their work and everything that impacts their day-to-day — including how their managers’ expectations are applied (clear goals, fair judgment, attainable standards). So it’s the conditions under which work happens, not just outputs (lines of code, deployments). Output metrics (story points, deployment frequency) capture what was delivered; they don’t explain why or whether the experience is sustainable. Poor internal DX shows up as slower delivery, burnout, turnover, and difficulty hiring even when raw “productivity” numbers look okay in the short term.
The DevEx framework (Abi Noda, Nicole Forsgren, Margaret-Anne Storey, Michaela Greiler, 2023) describes three dimensions that shape internal developer experience:

- **Feedback loops**: the speed and quality of responses to developers' actions (build times, test runs, code review turnaround, deploys).
- **Cognitive load**: the mental effort a task requires, driven by system complexity, poor documentation, and tool sprawl.
- **Flow state**: the ability to work with focus and energy, with minimal interruptions and context switches.
Principle: Combine developer perceptions (surveys, interviews) with system/workflow data (build times, review time, deploy frequency). Neither alone is enough.
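As a minimal sketch of this principle (all team names, field names, and data shapes below are hypothetical), a team might join per-team survey scores with pipeline telemetry before drawing any conclusion, so that neither signal is read in isolation:

```python
from statistics import mean

# Hypothetical inputs: perception data from a quarterly survey (1-5 scale)
# and system data pulled from CI/CD telemetry, keyed by team.
survey = {"payments": [4, 3, 5, 4], "search": [2, 3, 2, 3]}
telemetry = {
    "payments": {"p50_build_min": 6.0, "deploys_per_week": 14},
    "search": {"p50_build_min": 22.0, "deploys_per_week": 3},
}

def dx_snapshot(team):
    """Pair perception and system signals for one team; neither alone is enough."""
    return {
        "team": team,
        "satisfaction": round(mean(survey[team]), 2),
        **telemetry[team],
    }

for team in survey:
    print(dx_snapshot(team))
```

A low satisfaction score next to a slow median build (as for the hypothetical "search" team) points at a concrete friction source; either number alone would be ambiguous.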
DORA defines key delivery metrics that many teams use alongside DevEx:

- **Deployment frequency**: how often you deploy to production.
- **Lead time for changes**: how long it takes a commit to reach production.
- **Change failure rate**: the share of deployments that cause a failure in production.
- **Time to restore service (MTTR)**: how long it takes to recover from a production failure.
DORA identifies what to improve (delivery performance) but doesn't explain why, and it doesn't capture satisfaction or wellbeing. Used in isolation, its metrics can be gamed. It works best combined with perception data and the DevEx dimensions. Nicole Forsgren has noted that once you've identified what to improve using DORA, you can use SPACE to decide how to measure it.
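To make the delivery metrics concrete, here is a hedged sketch computing the four DORA metrics from a deployment log; the record shape (deploy time, oldest commit shipped, failure flag, recovery time) is invented for illustration:

```python
from datetime import datetime

# Hypothetical deployment log over a one-week window.
deploys = [
    {"at": datetime(2024, 5, 1), "committed": datetime(2024, 4, 29),
     "failed": False, "restore_hours": 0},
    {"at": datetime(2024, 5, 3), "committed": datetime(2024, 5, 2),
     "failed": True, "restore_hours": 2},
    {"at": datetime(2024, 5, 6), "committed": datetime(2024, 5, 5),
     "failed": False, "restore_hours": 0},
]

def dora_metrics(deploys, window_days=7):
    """Compute the four DORA metrics from a list of deploy records."""
    failures = [d for d in deploys if d["failed"]]
    lead_times = sorted((d["at"] - d["committed"]).days for d in deploys)
    return {
        "deploy_frequency_per_week": len(deploys) / (window_days / 7),
        "median_lead_time_days": lead_times[len(lead_times) // 2],
        "change_failure_rate": len(failures) / len(deploys),
        "mean_time_to_restore_hours":
            sum(d["restore_hours"] for d in failures) / max(len(failures), 1),
    }

print(dora_metrics(deploys))
```

In practice the input records would come from your CI/CD system or Git host rather than a hand-written list, but the arithmetic is this simple.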
The SPACE framework (Forsgren et al., ACM Queue 2021) defines five dimensions of developer productivity:

- **Satisfaction and well-being**: how fulfilled developers are with their work, team, tools, and culture.
- **Performance**: the outcome of work (quality, impact, reliability), not just raw output.
- **Activity**: counts of actions and outputs (commits, PRs, deploys), meaningful only in context.
- **Communication and collaboration**: how people and teams communicate and work together.
- **Efficiency and flow**: the ability to complete work with minimal delays and interruptions.
SPACE is broader than DORA and covers both human and system dimensions. The two are often used together: DORA for the delivery signal, SPACE for how to measure and improve it sustainably.
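One lightweight way to operationalize that pairing is a measurement plan mapping each SPACE dimension to at least one perception metric and one system metric. The specific metrics below are illustrative choices, not prescribed by the framework:

```python
# Illustrative measurement plan: each SPACE dimension paired with one
# perception metric (from surveys) and one system metric (from telemetry).
space_plan = {
    "Satisfaction and well-being": ("dev satisfaction survey", "retention rate"),
    "Performance": ("perceived impact survey", "change failure rate"),
    "Activity": ("self-reported focus time", "PRs merged per week"),
    "Communication and collaboration": ("review quality survey", "review turnaround"),
    "Efficiency and flow": ("flow-state survey", "p50 build time"),
}

for dimension, (perception, system) in space_plan.items():
    print(f"{dimension}: survey={perception}, system={system}")
```

The point of the structure is the pairing itself: every dimension gets both a human signal and a system signal, per the principle above.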
Port and similar IDP/DevEx vendors often frame metrics around four pillars, and developer surveys (e.g. Gartner's) report which factors developers say matter most for DevEx. The table below summarizes common measurement areas, example metrics, and benchmarks:
| Area | Examples | Benchmarks (sources in notes) |
|---|---|---|
| Onboarding | Time to first PR (TTFP), time to first “Hello, world!” (TTFHW), time to 10th PR | Many teams report >1 month for new hire’s first 3 meaningful PRs; a minority cite >3 months. Target: sub–2 hours to first run, days not weeks to first PR. |
| Feedback loops | Build time, test time, code review turnaround, deploy frequency, lead time | Varies by stack; goal is “fast enough that devs don’t context-switch away.” |
| Flow | Blocks of focus time, meeting load, unplanned work, on-call disruptiveness | ~23 min to re-enter flow after interruption (UC Irvine). |
| Satisfaction | Developer satisfaction score, dNPS, engagement, retention | Combine with system data to avoid gaming. |
| System | Deployment frequency, change failure rate, MTTR, cycle time, WIP | DORA tiers and internal baselines. |
| Tool sprawl | Number of tools, time lost to context-switching | Studies (e.g. Port) suggest many developers lose 6–15 hours/week; a large majority report tool sprawl as a cost. |
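As an example of tracking one onboarding benchmark from the table, this sketch computes time-to-first-PR (TTFP) per new hire; the start dates and PR dates are hypothetical stand-ins for data you would pull from your HR system and Git host:

```python
from datetime import date

# Hypothetical records: hire start dates and each dev's first merged PR.
start_dates = {"ana": date(2024, 3, 4), "bo": date(2024, 3, 11)}
first_pr = {"ana": date(2024, 3, 7), "bo": date(2024, 4, 18)}

def time_to_first_pr_days(dev):
    """Days from start date to first merged PR (the TTFP metric)."""
    return (first_pr[dev] - start_dates[dev]).days

for dev in start_dates:
    days = time_to_first_pr_days(dev)
    flag = "OK" if days <= 14 else "investigate onboarding friction"
    print(f"{dev}: {days} days -> {flag}")
```

The 14-day threshold here is an assumed target ("days, not weeks"); calibrate it against your own baseline rather than treating it as a standard.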
Research snapshots from vendor and academic studies inform the benchmarks above. Exact percentages and study years can be found in the cited vendors' reports and the underlying research; we summarize them here for practical use.
When an internal DX audit is done in an IDP-style assessment, it often scores maturity across a set of roughly ten pillars covering the areas above.
Not every audit uses all ten; some use a shorter set aligned to your context. The idea is to get a maturity view across the areas that affect developer experience and delivery.
Useful entry points if you want to go to the source: the DevEx paper ("DevEx: What Actually Drives Productivity", ACM Queue, 2023), the SPACE paper ("The SPACE of Developer Productivity", ACM Queue, 2021), and the annual DORA State of DevOps reports. Other research and vendors that inform internal DX (GetDX, Gartner, Cortex, GitHub, etc.) are widely cited in industry; you can find their latest reports via search or your analyst/vendor channels.
For when and why to get an internal DX audit and what it covers in practice, see Internal DX audit: when and why. For external DX (developer product for adopters), see External DX audit.
If this guide resonated with your situation, let's talk. We offer a free 30-minute discovery call — no pitch, just honest advice on your specific project.