
The Operations Maturity Model Explained

11 min · APFX Team

Most ops teams think they are further along than they are. We walk in expecting stage 4. We usually find stage 2 with a dashboard bolted on top. A dashboard does not make a company measured. Acting on what it shows does. That is the confusion we see most often: having metrics versus using them.

The operations maturity model gives teams a shared language for that gap. It sorts operational capability into five stages, from reactive firefighting to predictive, AI-augmented work. Each stage has a distinct profile of traits across process, data, automation, decisions, people, and tools. Getting honest about where you actually sit is the first step. You cannot skip stages without paying for it later.

What is an operations maturity model?

An operations maturity model is a five-stage framework that classifies how an organization runs its work, from reactive (no documented process, no metrics) to predictive (AI-augmented, self-healing). Each stage defines the operating traits a company shows across process, data, automation, decisions, people, and tools.

The model has roots in three older frameworks. The Capability Maturity Model Integration (CMMI), developed at Carnegie Mellon University's Software Engineering Institute in 2002 and now stewarded by the CMMI Institute and ISACA, introduced the five-level pattern that most modern variants copy. APQC's Process Classification Framework, first released in 1992 and updated through version 7.4 in 2024, adds a taxonomy of over 1,000 business processes that organizations score individually. Roche's operational excellence program, profiled by McKinsey in 2022, shows how a large biopharma applies a maturity model across hundreds of production sites. Ron Carucci's 2023 Harvard Business Review work on high-performing ops teams and Gartner's ongoing operations research both use closely mapped five-stage frameworks.

In practice, the model is a diagnostic. It tells you what stage you are in, what stage you want to reach, and which capabilities you have to build to get there. It is not a report card. It is a map.

The five stages of operations maturity


Stage 1: Reactive

Reactive operations run on tribal knowledge and urgency. When a customer complains, someone scrambles. When a report is due, someone stays late to build it by hand. There is no written process. No metric anyone tracks consistently. Nobody who can answer "how long does this take us on average?" because the data was never captured.

Companies in stage 1 look busy. Team members know how to do their jobs because they learned from whoever had the job before. Losing a senior ops person is a genuine crisis, because the process walks out with them. McKinsey's 2023 survey of operations leaders found 34% of mid-market companies still operate this way in at least one core function, most often customer service or finance close.

Stage 2: Documented

Documented operations have written processes but still run reactively. Someone, usually after a painful incident, wrote down the SOP. There is a wiki, or a Notion doc, or a shared drive full of PDFs. The documents exist.

Writing a process down is not the same as following it. Teams at stage 2 treat SOPs as reference material, not as a system. When something goes wrong, people default to what they already know. Documents go stale within months because nobody owns keeping them current. APQC's 2023 process maturity research found 62% of organizations at stage 2 reported their process documentation was "partially accurate" or worse, meaning the written process and the real process had drifted apart.

Stage 3: Measured

Measured operations track KPIs and run on dashboards. Cycle time, first-pass yield, utilization rate, backlog age, SLA attainment. The dashboards exist. The weekly ops review exists. People talk about numbers in meetings.

Stage 3 is where most $30M to $500M growth companies live. It is also where the most dangerous self-deception takes root. Having metrics is not the same as being driven by them. A team that reviews a dashboard on Monday and goes back to gut-feel decisions on Tuesday is measured in name only. Gartner's 2024 analytics adoption study found only 38% of organizations with deployed BI dashboards reported "high" rates of decision-making based on dashboard data.

Stage 4: Optimized

Optimized operations close the loop between data and action. Teams at stage 4 have short decision cycles, automation on anything repetitive, and clean instrumentation across the processes that matter most. When a metric moves, someone investigates within hours or days, not quarters. Process changes ship in weeks. The team measures whether the change actually worked.

The real difference from stage 3 is instrumentation depth plus decision speed. This is more than dashboards. There is alerting tied to thresholds, automated data flow between core systems, and an ops cadence that actually produces changes. Harvard Business Review's 2023 profile by Ron Carucci of eight high-performing operations teams found the common signal of stage 4 maturity was "the ratio of decisions made per dashboard view," with best-in-class teams producing roughly one decision for every three data reviews.

Stage 5: Predictive

Predictive operations are AI-augmented and partly self-healing. Forecasting models predict capacity needs two to six weeks ahead. Anomaly detection surfaces drift before a threshold alert would fire. Workflows route and escalate themselves. Humans handle exceptions, governance, and strategic calls.

Roche's biopharma operational excellence program, documented by McKinsey in 2022, is the clearest public example. The program spans more than 30 production sites, uses predictive maintenance models to forecast equipment failures, and routes interventions through automated workflows with human approval at defined checkpoints. Reported result: a 20 to 30% reduction in unplanned downtime within 18 months. Few mid-market companies reach stage 5 across every function. Many reach it in one or two domains (demand forecasting, claims processing, support routing) while sitting at stage 3 or 4 elsewhere.

Stage-by-stage traits

The traits table below shows what each stage looks like across six dimensions. Use it to diagnose where specific functions sit. Most companies are not at one stage across the whole business.

| Dimension | Stage 1: Reactive | Stage 2: Documented | Stage 3: Measured | Stage 4: Optimized | Stage 5: Predictive |
|---|---|---|---|---|---|
| Process | Tribal knowledge, no SOPs | Written SOPs, stale in months | SOPs tied to KPIs | Version-controlled, reviewed quarterly | Self-updating based on outcome data |
| Data | None captured systematically | Captured but not centralized | Dashboards, batch refresh | Real-time instrumentation | Real-time plus predictive models |
| Automation | None beyond email templates | Basic macros, spreadsheet logic | Scheduled reports, minor integrations | Workflow automation on repetitive work | Agentic routing, self-healing flows |
| Decisions | Gut feel, hierarchy-driven | Gut feel plus post-hoc reports | Dashboard-informed, slow | Data-driven, fast cycle | Model-suggested, human-approved |
| People | Heroes and firefighters | Process owners named but passive | KPI owners in every function | Cross-functional ops squads | Ops engineers plus AI governance roles |
| Tools | Email, spreadsheets, shared drives | Wiki or Notion, basic CRM or ERP | BI tool, ops dashboards | Integrated stack, iPaaS, monitoring | ML platforms, forecasting, agents |

How do you diagnose your current stage?

You diagnose your current stage by running three questions against each core function: can you see the work, can you measure it, and do you act on what you see? The first question you answer no to caps your stage: no visibility means stage 1, no measurement means stage 2 at best, no action means stage 3 at best.

Start with the "see the work" test. Pick a core workflow (customer onboarding, invoice processing, sales pipeline handoff). Ask how many steps it has, who owns each step, and how long each step takes on average. If the team cannot answer without guessing, you are at stage 1 for that workflow. If they can answer from memory but not from a document, you are at stage 1 with a stage 2 aspiration. If they can answer from a document, check when the document was last updated. Documents older than 90 days on active processes signal stage 2, not stage 3.

Next, test "can you measure it." Check whether the KPIs for that workflow are actually being tracked. Not whether they were defined in a planning doc, but whether numbers flow into a dashboard or report on a regular cadence. No numbers means you are stuck below stage 3. Numbers that exist but nobody reviews on a schedule puts you in the gray zone between 2 and 3.

Finally, test "do you act on what you see." Look at the last 30 days of KPI movement. How many decisions were made because a metric moved? If the answer is zero or one, you are at stage 3 no matter how polished the dashboard looks. Stage 4 teams produce changes as output. The dashboard triggers work, not discussion.
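The three tests above chain into a simple decision procedure. As a minimal sketch, here is one way to encode it; every field name is illustrative, not from any real tool, and the thresholds (90-day doc staleness, one decision in 30 days) are the ones used in this article:

```python
from datetime import date, timedelta

def diagnose_stage(workflow, today=None):
    """Apply the three-question diagnostic to one workflow.

    `workflow` is a hypothetical dict of what the team can actually
    show you, e.g. documented, doc_last_updated, kpis_tracked,
    reviewed_on_schedule, decisions_last_30_days.
    """
    today = today or date.today()
    # Q1: can you see the work? No written process means stage 1.
    if not workflow.get("documented"):
        return 1
    # A doc untouched for 90+ days on an active process signals stage 2.
    if today - workflow.get("doc_last_updated", date.min) > timedelta(days=90):
        return 2
    # Q2: can you measure it? Numbers must flow on a regular cadence
    # and actually get reviewed, or you stay below stage 3.
    if not (workflow.get("kpis_tracked") and workflow.get("reviewed_on_schedule")):
        return 2
    # Q3: do you act on what you see? Zero or one metric-triggered
    # decisions in 30 days means measured in name only.
    if workflow.get("decisions_last_30_days", 0) <= 1:
        return 3
    return 4
```

Run it per workflow, not per company; as the article notes, most businesses sit at different stages in different functions.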

The stage 3 trap

Most mid-market ops teams are at stage 3, think they are at stage 4, and are surprised when an audit shows the dashboard driving almost no decisions. Having instrumentation is not the same as running on it. Stage 4 is defined by decision throughput, not dashboard count.
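Decision throughput is measurable if you log both sides of the ratio. A minimal sketch, assuming you count dashboard reviews and metric-triggered changes over the same window (both counters are hypothetical inputs, not outputs of any real BI tool):

```python
def decision_throughput(dashboard_views: int, metric_triggered_changes: int) -> float:
    """Decisions shipped per dashboard view over the same time window."""
    if dashboard_views == 0:
        return 0.0
    return metric_triggered_changes / dashboard_views

# A team with 40 dashboard reviews and 12 metric-triggered changes in a
# month scores 0.3, near the best-in-class mark of roughly one decision
# per three reviews cited in the HBR profile.
```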

Can you skip a maturity level?

You cannot skip from stage 2 to stage 4 without the measurement layer of stage 3, and the attempts we have seen usually collapse back within a year. Each stage builds a capability the next stage depends on. Documentation feeds measurement. Measurement feeds optimization. Optimization feeds prediction. Remove a stage and the one above it has nothing to stand on.

The most common stage-jumping mistake is buying stage 4 tools while operating at stage 2. A company installs a BI platform, a workflow automation tool, and an iPaaS, expecting the tools to pull the organization forward. What happens instead: the tools expose how inconsistent the underlying process is. Dashboards show garbage numbers because data entry upstream is uneven. Automation breaks because the process has branches nobody documented. The team spends six months fighting the tools instead of using them. CMMI Institute data from 2023 showed that organizations attempting to skip a CMMI level had a 71% rollback rate within 24 months, meaning they reverted to the prior level after failing to sustain the jump.

The second common mistake is jumping from stage 3 to stage 5 by adding AI models without building stage 4 fundamentals. Predictive models need clean, real-time data. If your measurement layer is batch refresh on a two-day lag, your forecasts will lag with it. If your process lacks instrumentation at the step level, your anomaly detection has nothing to detect. Roche's program worked because production sites were already at stage 4 operationally before predictive models were layered on. The model did not substitute for maturity. It extended it.

How long does it take to move up a stage?

Moving up a single maturity stage usually takes 9 to 18 months for a mid-market company actively working on the transition. Moving up two stages takes 24 to 36 months and usually requires an outside partner or a dedicated internal transformation lead. Reactive to predictive across the whole business is a five-year project for most companies, and most never finish it across every function.

The benchmark data has a few anchors. APQC's 2023 Process Maturity Benchmarks, drawn from over 800 member organizations, reported an average of 14 months to move from stage 2 to stage 3 in a single function, and 22 months for stage 3 to stage 4. CMMI Institute appraisal data from 2022 showed a median of 18 months between successful appraisals at each level. Gartner's 2024 operations leader survey found 68% of respondents at stage 4 reported their transition from stage 3 took longer than planned, with the most common cause cited as "change management and adoption," not tooling or data availability.

The factor that predicts speed most reliably is leadership behavior. Teams with a senior leader who publicly uses dashboards in decisions, and who holds people accountable for acting on metrics, move roughly twice as fast as teams where leaders treat metrics as reporting artifacts. That is a culture variable, not a tooling one, and it shows up in the time-to-maturity data across every framework we have reviewed.

How does this connect to operations benchmarking?

The maturity model is one of three lenses in operations benchmarking. The other two are quantitative benchmarks (cycle time, utilization rate, cost per transaction by industry) and team design benchmarks (ratios, roles, decision rights). The maturity model tells you your stage. The quantitative benchmarks tell you how you compare to peers at that stage. The team design benchmarks tell you whether your org structure can support it.

A stage 3 company with stage 4 cycle times is probably running too hot and will burn out. A stage 4 company with stage 2 team design will regress to stage 3 within a year. For the other two lenses, see operations benchmarks for $30M to $500M companies and what best-in-class operations teams look like.

Key takeaways

The operations maturity model is diagnostic, not aspirational. Its value comes from telling you honestly where you are, which is almost always a stage below where you assume. For most growth companies, the real work sits in two places: getting the measurement layer of stage 3 to actually drive decisions, and building the instrumentation and decision-speed discipline of stage 4.

Skipping stages does not work. Tooling without the underlying capability regresses within 24 months roughly 71% of the time, per CMMI data. Time to maturity is 9 to 18 months per stage when the work is resourced, and the biggest accelerator is leadership behavior, not budget. Run the three-question diagnostic on your core workflows. If the answers make you wince, the diagnostic is working.

For a structured diagnostic, see how to run an operations audit in 5 days. For instrumentation, see how to track process health across your organization. For the full pillar, see what is operations intelligence.

Next step

Ready to go AI-native?

Schedule 30 minutes with our team. We’ll explore where AI can drive the most value in your business.
