Ran Wei / MBSE Series / Module 11

Module 11: Building an MBSE Practice

Organisational adoption — how to introduce MBSE into a team or enterprise, overcome resistance, and measure success.

Prerequisite: MBSE Module 10

1. The Adoption Challenge

By now you understand the technical foundations of MBSE — the languages, methods, tools, frameworks, and domain applications. But knowing what MBSE is and knowing how to make an organisation adopt it are two very different things. Many MBSE initiatives fail not because the technology is wrong, but because the human and organisational dimensions were neglected.

MBSE adoption is as much an organisational change as it is a technical one. You are asking people to change how they think, how they work, and how they collaborate. That is hard. It touches on identity, competence, workflow habits, and power structures. Understanding this is the first step toward a successful adoption.

ANALOGY

Think about the transition from paper maps to GPS navigation. GPS is technically superior in almost every way — it recalculates routes in real time, shows traffic, and never gets folded the wrong way. But when GPS first appeared, many experienced drivers resisted it. They "knew the roads." They did not trust a screen. They worried about what would happen if the battery died. Some felt that relying on GPS made them less skilled. The transition required not just buying a device, but building trust, learning new habits, and accepting that the old way — however familiar — had real limitations. MBSE adoption follows the same pattern. The technology is ready; the challenge is getting people to embrace it.

The human side of change

Change management research consistently shows that technology adoption follows a predictable emotional arc. People move from awareness ("I've heard of MBSE") through understanding ("I see why it matters") to acceptance ("I'm willing to try") and finally commitment ("I can't imagine going back"). Most failed adoptions stall somewhere between understanding and acceptance — people intellectually agree that MBSE is better, but they are not willing to endure the discomfort of changing their daily practices.

Successful change requires addressing three things simultaneously:

Motivation: people must want to change, or at least see a compelling reason to try.

Capability: people must be given the training and practice time to build the new skills.

Reinforcement: management must visibly support, reward, and sustain the new way of working.

If any one of these is missing, adoption will fail. A brilliantly motivated team without training will flounder. A well-trained team without management reinforcement will drift back to documents within six months.

2. MBSE Maturity Models

Before you can plan an adoption journey, you need to know where you are starting from. MBSE maturity models provide a structured way to assess an organisation's current capability and chart a path forward. The most widely referenced is the INCOSE MBSE maturity assessment, but the general concept applies regardless of the specific framework you use.

Maturity models typically define five levels, each building on the one before:

Level 1 (Initial / Ad Hoc)
  Description: No systematic modelling. Engineering is entirely document-centric.
  Characteristics: No modelling tools or standards; knowledge lives in documents and people's heads; no formal traceability.

Level 2 (Managed / Exploratory)
  Description: Individual teams or engineers experiment with modelling on selected projects.
  Characteristics: One or two pilot projects; modelling is voluntary; models are not authoritative artefacts; limited tool deployment.

Level 3 (Defined / Standardised)
  Description: The organisation has defined MBSE processes, a chosen language and tool, and training programmes.
  Characteristics: Modelling method documented; tool licences available; models used in design reviews; basic traceability in place.

Level 4 (Quantitatively Managed)
  Description: MBSE is the standard way of working. Metrics are collected and used to improve the practice.
  Characteristics: Models are the authoritative source of truth; automated consistency checking; metrics tracked (defect rates, review efficiency); model libraries and reuse.

Level 5 (Optimising / Digital Thread)
  Description: Fully integrated digital thread from requirements through design, manufacturing, and operations.
  Characteristics: Models drive downstream processes (simulation, code generation, test automation); continuous improvement based on data; cross-programme model reuse; AI-assisted modelling.

Table 1 — MBSE maturity levels

SELF-ASSESSMENT

Before reading further, pause and honestly assess your organisation. Which level are you at today? Most organisations that think they are at Level 3 are actually at Level 2 — they have tools and some training, but models are not yet authoritative artefacts used in formal reviews. Being honest about your starting point is essential for planning a realistic adoption roadmap. The INCOSE MBSE Maturity Assessment provides a structured questionnaire to help with this evaluation.
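The "you are at the highest level whose practices are fully in place" logic of a maturity assessment can be sketched in a few lines of code. This is an illustrative sketch only: the questions below are placeholders, not the official INCOSE questionnaire, and real assessments weigh many more dimensions.

```python
# Hypothetical maturity self-assessment: an organisation sits at the highest
# level for which all of that level's practices (and every lower level's)
# are in place. The practice statements below are illustrative assumptions.

PRACTICES_BY_LEVEL = {
    2: ["At least one pilot project uses modelling"],
    3: ["A modelling method is documented",
        "Models are used in formal design reviews"],
    4: ["Models are the authoritative source of truth",
        "MBSE metrics are collected and reviewed"],
    5: ["Models drive downstream processes (simulation, code generation)"],
}

def assess_maturity(answers):
    """Return the maturity level (1-5) implied by yes/no answers."""
    level = 1  # Level 1 (Initial / Ad Hoc) requires no practices in place
    for lvl in sorted(PRACTICES_BY_LEVEL):
        if all(answers.get(q, False) for q in PRACTICES_BY_LEVEL[lvl]):
            level = lvl
        else:
            break  # a gap at this level caps the assessment here
    return level

# The common "we think we are Level 3" case from the text: tools and a
# documented method exist, but models are not yet formal review artefacts.
answers = {
    "At least one pilot project uses modelling": True,
    "A modelling method is documented": True,
    "Models are used in formal design reviews": False,
}
print(assess_maturity(answers))  # 2
```

The early `break` is the important design choice: partial progress at a higher level does not count until every lower level is consolidated, which mirrors the "target the next level up" advice below.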

Why maturity matters

Trying to jump from Level 1 to Level 5 in a single leap is the most common strategic mistake in MBSE adoption. Each level builds capabilities and organisational muscle that the next level depends on. An organisation at Level 1 that invests in an enterprise modelling tool (a Level 4 activity) will waste money — the tool will sit unused because no one knows how to model, and there is no method to guide them.

The right approach is to target the next level up, achieve it, stabilise, and then plan the next step. For most organisations, reaching Level 3 within two to three years is an ambitious but achievable goal.

3. Pilot Projects and Incremental Adoption

The single most effective strategy for MBSE adoption is to start small. Rather than attempting to transform the entire organisation at once, begin with one team, one project, and one viewpoint. This is sometimes called the "lighthouse project" approach — pick a visible project that will shine a light on what MBSE can do, guiding others to follow.

The lighthouse project

A lighthouse project serves multiple purposes. It demonstrates concrete value to sceptics. It builds practical experience within the team. It produces reusable artefacts (model templates, method guidelines, training materials) that can be shared with other teams. And it provides real data on the costs and benefits of MBSE, which is essential for securing continued investment.

The key is choosing the right project. Not every project is suitable as a pilot.

PILOT SELECTION CRITERIA

A good MBSE pilot project should have the following characteristics:

Right size: Large enough to demonstrate value, small enough to be manageable. A six-month project with a team of five to ten people is ideal. Too small and the overhead of setting up MBSE will dominate; too large and the risk of failure is too high.

Right complexity: The project should involve enough cross-domain interaction (hardware, software, interfaces) to benefit from model-based traceability. A purely software project or a purely mechanical project will not showcase MBSE's strengths.

Supportive team: The team must include at least a few people who are enthusiastic about MBSE. Forcing an entirely reluctant team to pilot a new approach is a recipe for failure. You need early adopters who will champion the effort.

Measurable outcomes: Define success criteria before you start. What metrics will you track? How will you compare the pilot's performance to previous projects? Without measurable outcomes, the pilot becomes an anecdote rather than evidence.

Visible stakeholders: Choose a project that matters to senior leadership. A successful pilot that no one important sees is a wasted opportunity.
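When several candidate projects compete to be the pilot, the criteria above can be turned into a simple screening score. The criterion names, the 0-to-5 scores, the equal weights, and the disqualification floor below are all illustrative assumptions, not part of any published method.

```python
# Hypothetical pilot screening: rate each candidate 0-5 against the five
# selection criteria, disqualify anything that fails badly on a single
# criterion, then rank the rest by weighted total.

CRITERIA = ["size", "complexity", "team_support", "measurability", "visibility"]

def pilot_score(scores, weights=None):
    """Weighted suitability score; 0.0 disqualifies the candidate."""
    weights = weights or {c: 1.0 for c in CRITERIA}
    # A project that fails badly on any one criterion (e.g. no cross-domain
    # complexity) is a poor pilot regardless of its other strengths.
    if min(scores[c] for c in CRITERIA) < 2:
        return 0.0
    return sum(scores[c] * weights[c] for c in CRITERIA)

candidates = {
    "avionics-subsystem": {"size": 4, "complexity": 5, "team_support": 4,
                           "measurability": 3, "visibility": 4},
    "internal-web-tool":  {"size": 3, "complexity": 1, "team_support": 5,
                           "measurability": 4, "visibility": 2},
}
ranked = sorted(candidates, key=lambda n: pilot_score(candidates[n]), reverse=True)
print(ranked)  # ['avionics-subsystem', 'internal-web-tool']
```

Here the purely software-flavoured project is screened out by its low complexity score, matching the advice that a project without cross-domain interaction will not showcase MBSE's strengths.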

Incremental expansion

After the first pilot succeeds, resist the temptation to immediately mandate MBSE across the organisation. Instead, expand incrementally:

  1. Second pilot: Apply lessons learned to a second project, ideally in a different domain or division. This tests whether the approach transfers beyond the original team.
  2. Community of practice: Bring together people from both pilots to share experiences, build common templates, and develop internal training materials.
  3. Broader rollout: Gradually expand to more projects, supported by the community of practice. Each new project benefits from the accumulated knowledge.
  4. Standardisation: Once a critical mass of projects is using MBSE, formalise the approach as an organisational standard.

This incremental approach typically takes two to four years to reach organisation-wide adoption, but the results are far more durable than a top-down mandate.

4. Common Barriers and How to Overcome Them

Every MBSE adoption will encounter barriers. Anticipating them and having mitigation strategies ready is far better than being surprised. The table below summarises the most common barriers, their root causes, and proven mitigation strategies.

Resistance to change
  Root cause: Fear of the unknown; loss of established expertise; comfort with current practices.
  Mitigation: Start with volunteers and early adopters; demonstrate quick wins; provide psychological safety to experiment and fail; celebrate early successes publicly.

Lack of training
  Root cause: Engineers are expected to learn MBSE on their own; training budget is insufficient.
  Mitigation: Invest in structured training programmes (formal courses plus mentoring); allocate dedicated learning time; provide sandbox environments for practice.

Tool costs
  Root cause: Commercial MBSE tools require significant licensing investment; infrastructure costs for servers and administration.
  Mitigation: Start with evaluation licences or open-source tools (e.g. Eclipse Papyrus, Capella); build a business case based on pilot results; consider cloud-based licensing models.

Lack of management buy-in
  Root cause: Leadership does not understand MBSE value; no visible ROI data; competing priorities.
  Mitigation: Present MBSE as a risk-reduction strategy, not a cost; use pilot data to build the business case; connect MBSE outcomes to strategic goals (time-to-market, quality, compliance).

"We've always done it this way"
  Root cause: Deep-rooted organisational culture; success with the current approach makes change feel unnecessary.
  Mitigation: Acknowledge past success while highlighting new challenges (increasing complexity, shorter timelines, regulatory pressure); frame MBSE as evolution, not revolution.

Unclear ROI
  Root cause: Benefits of MBSE are long-term and systemic; hard to attribute directly to modelling.
  Mitigation: Define measurable metrics before starting; track leading indicators (review duration, defect discovery timing); benchmark against comparable non-MBSE projects.

Table 2 — Common MBSE adoption barriers and mitigations

MOST COMMON FAILURE MODE

The single most common failure mode in MBSE adoption is mandating tools without investing in method and training. An organisation buys expensive tool licences, installs them on every engineer's machine, and declares "we are now doing MBSE." Six months later, the tools are unused, the licences are wasted, and the organisation concludes that "MBSE doesn't work." In reality, MBSE was never tried — only tool procurement was. Without a defined method, adequate training, and a supportive culture, tools alone achieve nothing.

Addressing resistance constructively

Resistance is not the enemy; it is a signal. When an experienced engineer says "this won't work," they are often raising legitimate concerns based on years of practical experience. The most effective response is not to dismiss their concerns, but to engage with them: take each objection seriously, and treat it as a requirement the adoption plan must satisfy.

5. Training and Competency Development

MBSE competency is not a single skill — it is a bundle of interrelated skills that must be developed together. An engineer who knows SysML syntax but cannot think in terms of systems architecture will produce syntactically correct but semantically useless models. Training programmes must address all four skill dimensions.

The four skill dimensions

Language: fluency in the chosen modelling language (e.g. SysML), including its semantics, not just its notation.

Method: knowing which models to build, in what order, and to what level of detail.

Tool: practical proficiency with the organisation's modelling tool and its collaboration features.

Systems thinking: the ability to reason about architecture, interfaces, and emergent behaviour, which is the skill that makes models semantically useful.

Training pathways

Effective training uses a blended approach that combines multiple learning modalities:

Formal courses: structured instruction in language, method, and tool fundamentals.

Hands-on practice: sandbox environments where engineers can experiment and fail safely.

Mentoring: experienced modellers coaching newcomers on real project work.

The MBSE champion

Every successful MBSE adoption has at least one MBSE champion — a person who evangelises the approach, supports struggling colleagues, troubleshoots problems, and keeps the momentum going when enthusiasm wanes. The champion does not have to be the most senior person in the room, but they need three qualities: deep MBSE competency, credibility with their peers, and relentless enthusiasm.

NOTE

The MBSE champion role is often informal and unpaid — which is a mistake. Organisations that formally recognise and resource this role (dedicated time, training budget, access to leadership) see significantly faster adoption. Consider creating a formal role such as "MBSE Practice Lead" or "Modelling Coach" to give the champion the authority and resources they need.

6. Measuring MBSE Success

If you cannot measure it, you cannot improve it — and you cannot justify continued investment in it. Measuring the success of an MBSE adoption is essential, but it must be done carefully. The wrong metrics can be worse than no metrics at all.

Meaningful metrics

Good MBSE metrics connect modelling activities to business outcomes. They answer the question: "Is MBSE making our engineering better?" The following table presents a set of proven metrics organised by what they measure.

Quality
  Defect reduction: number of defects found after design reviews vs. the historical baseline (target: decrease).
  Requirements coverage: percentage of requirements traced to design elements and test cases (target: increase toward 100%).

Efficiency
  Design review duration: time spent in design reviews per review cycle (target: decrease).
  Time-to-market: overall project duration from concept to delivery (target: decrease).

Cost
  Rework reduction: effort spent on rework as a percentage of total effort (target: decrease).
  Change impact analysis time: time needed to assess the impact of a requirements change (target: decrease).

Adoption
  Model usage rate: percentage of teams actively using the model as their primary engineering artefact (target: increase).
  Training completion: percentage of target engineers who have completed MBSE training (target: increase toward 100%).

Table 3 — MBSE success metrics
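The requirements coverage metric from the table is mechanical enough to sketch directly: a requirement counts as covered only when it is traced to at least one design element and at least one test case. The data structures below are illustrative placeholders, not the format of any particular tool.

```python
# Minimal sketch of the "requirements coverage" metric: the percentage of
# requirements with both design traceability and test traceability.
# Requirement/block/test-case identifiers are invented for illustration.

def requirements_coverage(requirements, design_trace, test_trace):
    """Percentage of requirements traced to design AND test artefacts."""
    covered = {r for r in requirements
               if design_trace.get(r) and test_trace.get(r)}
    return 100.0 * len(covered) / len(requirements) if requirements else 0.0

reqs = {"REQ-1", "REQ-2", "REQ-3", "REQ-4"}
design = {"REQ-1": {"BLK-Pump"}, "REQ-2": {"BLK-Valve"}, "REQ-3": {"BLK-Ctrl"}}
tests = {"REQ-1": {"TC-10"}, "REQ-2": {"TC-11"}}
print(requirements_coverage(reqs, design, tests))  # 50.0
```

Here REQ-3 is designed but untested and REQ-4 is untraced entirely, so coverage is 50% even though three of four requirements have some trace: the metric deliberately rewards complete chains, not partial ones.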

Avoiding vanity metrics

A vanity metric is one that looks impressive on a slide but does not actually indicate success. The most common vanity metrics in MBSE are counts of diagrams created, counts of model elements, and the number of tool licences deployed: all three measure activity, not outcomes.
The antidote to vanity metrics is to always ask: "So what?" If someone reports that the team created 500 diagrams last quarter, ask: "Did that reduce defects? Did it improve design review efficiency? Did it help the team make better decisions?" If the answer is unclear, the metric is not meaningful.

Connecting to business value

Ultimately, MBSE must justify itself in business terms. Engineering leaders and executives do not care about modelling for its own sake; they care about delivering products faster, cheaper, and with higher quality. When presenting MBSE metrics, always connect them to business outcomes: fewer late defects means lower rework cost, faster change impact analysis means shorter time-to-market, and complete requirements traceability means easier regulatory compliance.

Building this evidence base takes time. The pilot project provides the first data points; each subsequent project adds more. Over two to three years, the cumulative evidence becomes compelling.
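A business case built from pilot data often starts as nothing more than a first-order ROI calculation. The sketch below, with entirely placeholder figures, shows the shape of that argument; a real case would also include licence renewals, ramp-up losses, and multi-year benefit streams.

```python
# Hedged first-order business case from pilot data. All figures are
# placeholders to be replaced with your own pilot measurements.

def simple_roi(rework_hours_saved, hourly_rate, tool_cost, training_cost):
    """Return ROI as a ratio: (benefit - cost) / cost."""
    benefit = rework_hours_saved * hourly_rate
    cost = tool_cost + training_cost
    return (benefit - cost) / cost

# e.g. a pilot that avoided 1,200 hours of rework at 100/hour, against
# 60,000 in licences and 30,000 in training:
print(round(simple_roi(1200, 100, 60_000, 30_000), 2))  # 0.33
```

Even a modest positive ratio like this is persuasive when it comes from measured pilot data rather than vendor projections, which is why the pilot's success criteria must be defined before the project starts.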

Up Next

Module 12 — The Future of MBSE — Digital threads, AI-assisted modelling, SysML v2 ecosystem, and the convergence of MBSE with DevOps.