Module 11: Building an MBSE Practice
Organisational adoption — how to introduce MBSE into a team or enterprise, overcome resistance, and measure success.
Prerequisite: MBSE Module 10
The Adoption Challenge
By now you understand the technical foundations of MBSE — the languages, methods, tools, frameworks, and domain applications. But knowing what MBSE is and knowing how to make an organisation adopt it are two very different things. Many MBSE initiatives fail not because the technology is wrong, but because the human and organisational dimensions were neglected.
MBSE adoption is as much an organisational change as it is a technical one. You are asking people to change how they think, how they work, and how they collaborate. That is hard. It touches on identity, competence, workflow habits, and power structures. Understanding this is the first step toward a successful adoption.
Think about the transition from paper maps to GPS navigation. GPS is technically superior in almost every way — it recalculates routes in real time, shows traffic, and never gets folded the wrong way. But when GPS first appeared, many experienced drivers resisted it. They "knew the roads." They did not trust a screen. They worried about what would happen if the battery died. Some felt that relying on GPS made them less skilled. The transition required not just buying a device, but building trust, learning new habits, and accepting that the old way — however familiar — had real limitations. MBSE adoption follows the same pattern. The technology is ready; the challenge is getting people to embrace it.
The human side of change
Change management research consistently shows that technology adoption follows a predictable emotional arc. People move from awareness ("I've heard of MBSE") through understanding ("I see why it matters") to acceptance ("I'm willing to try") and finally commitment ("I can't imagine going back"). Most failed adoptions stall somewhere between understanding and acceptance — people intellectually agree that MBSE is better, but they are not willing to endure the discomfort of changing their daily practices.
Successful change requires addressing three things simultaneously:
- Motivation — why should I change? What is in it for me personally, not just for the organisation?
- Ability — can I actually do this? Do I have the training, tools, and support I need?
- Reinforcement — will the organisation reward this new behaviour, or will it quietly revert to the old way?
If any one of these is missing, adoption will fail. A highly motivated team without training will flounder. A well-trained team without management reinforcement will drift back to documents within six months.
MBSE Maturity Models
Before you can plan an adoption journey, you need to know where you are starting from. MBSE maturity models provide a structured way to assess an organisation's current capability and chart a path forward. The most widely referenced is the INCOSE MBSE Maturity Assessment, but the general concept applies regardless of the specific framework you use.
Maturity models typically define five levels, each building on the one before:
| Level | Name | Description | Characteristics |
|---|---|---|---|
| 1 | Initial / Ad Hoc | No systematic modelling. Engineering is entirely document-centric. | No modelling tools or standards; knowledge lives in documents and people's heads; no formal traceability. |
| 2 | Managed / Exploratory | Individual teams or engineers experiment with modelling on selected projects. | One or two pilot projects; modelling is voluntary; models are not authoritative artefacts; limited tool deployment. |
| 3 | Defined / Standardised | The organisation has defined MBSE processes, a chosen language and tool, and training programmes. | Modelling method documented; tool licences available; models used in design reviews; basic traceability in place. |
| 4 | Quantitatively Managed | MBSE is the standard way of working. Metrics are collected and used to improve the practice. | Models are the authoritative source of truth; automated consistency checking; metrics tracked (defect rates, review efficiency); model libraries and reuse. |
| 5 | Optimising / Digital Thread | Fully integrated digital thread from requirements through design, manufacturing, and operations. | Models drive downstream processes (simulation, code generation, test automation); continuous improvement based on data; cross-programme model reuse; AI-assisted modelling. |
Before reading further, pause and honestly assess your organisation. Which level are you at today? Most organisations that think they are at Level 3 are actually at Level 2 — they have tools and some training, but models are not yet authoritative artefacts used in formal reviews. Being honest about your starting point is essential for planning a realistic adoption roadmap. The INCOSE MBSE Maturity Assessment provides a structured questionnaire to help with this evaluation.
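The "levels build on each other" logic behind such an assessment can be made concrete with a small sketch. This is a hypothetical scoring rule, not the official INCOSE instrument: the questions, their level mapping, and the rule that a level only counts when every lower level also holds are all illustrative assumptions.

```python
# Hypothetical self-assessment sketch. Each question is mapped to the maturity
# level it indicates; these questions and mappings are illustrative assumptions.
QUESTIONS = {
    "At least one pilot project uses modelling": 2,
    "A documented modelling method exists": 3,
    "Models are used in formal design reviews": 3,
    "Basic traceability is in place": 3,
    "Metrics are collected and used to improve the practice": 4,
    "Models drive downstream automation (simulation, code generation)": 5,
}

def assess(answers):
    """Estimate a maturity level (1-5).

    A level counts only when every practice at that level holds AND all
    lower levels were already achieved -- levels build on each other.
    """
    level = 1
    for target in (2, 3, 4, 5):
        practices = [q for q, lvl in QUESTIONS.items() if lvl == target]
        if all(answers.get(q, False) for q in practices):
            level = target
        else:
            break
    return level

# An organisation with only a pilot project lands at Level 2,
# even if it also happens to collect metrics:
print(assess({
    "At least one pilot project uses modelling": True,
    "Metrics are collected and used to improve the practice": True,
}))  # prints 2
```

The gating rule captures the point made below: isolated Level 4 practices do not make a Level 4 organisation if the Level 3 foundations are missing.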
Why maturity matters
Trying to jump from Level 1 to Level 5 in a single leap is the most common strategic mistake in MBSE adoption. Each level builds capabilities and organisational muscle that the next level depends on. An organisation at Level 1 that invests in an enterprise modelling tool (a Level 4 activity) will waste money — the tool will sit unused because no one knows how to model, and there is no method to guide them.
The right approach is to target the next level up, achieve it, stabilise, and then plan the next step. For most organisations, reaching Level 3 within two to three years is an ambitious but achievable goal.
Pilot Projects and Incremental Adoption
The single most effective strategy for MBSE adoption is to start small. Rather than attempting to transform the entire organisation at once, begin with one team, one project, and one viewpoint. This is sometimes called the "lighthouse project" approach — pick a visible project that will shine a light on what MBSE can do, guiding others to follow.
The lighthouse project
A lighthouse project serves multiple purposes. It demonstrates concrete value to sceptics. It builds practical experience within the team. It produces reusable artefacts (model templates, method guidelines, training materials) that can be shared with other teams. And it provides real data on the costs and benefits of MBSE, which is essential for securing continued investment.
The key is choosing the right project. Not every project is suitable as a pilot.
A good MBSE pilot project should have the following characteristics:
- Right size: Large enough to demonstrate value, small enough to be manageable. A six-month project with a team of five to ten people is ideal. Too small and the overhead of setting up MBSE will dominate; too large and the risk of failure is too high.
- Right complexity: The project should involve enough cross-domain interaction (hardware, software, interfaces) to benefit from model-based traceability. A purely software project or a purely mechanical project will not showcase MBSE's strengths.
- Supportive team: The team must include at least a few people who are enthusiastic about MBSE. Forcing an entirely reluctant team to pilot a new approach is a recipe for failure. You need early adopters who will champion the effort.
- Measurable outcomes: Define success criteria before you start. What metrics will you track? How will you compare the pilot's performance to previous projects? Without measurable outcomes, the pilot becomes an anecdote rather than evidence.
- Visible stakeholders: Choose a project that matters to senior leadership. A successful pilot that no one important sees is a wasted opportunity.
Incremental expansion
After the first pilot succeeds, resist the temptation to immediately mandate MBSE across the organisation. Instead, expand incrementally:
- Second pilot: Apply lessons learned to a second project, ideally in a different domain or division. This tests whether the approach transfers beyond the original team.
- Community of practice: Bring together people from both pilots to share experiences, build common templates, and develop internal training materials.
- Broader rollout: Gradually expand to more projects, supported by the community of practice. Each new project benefits from the accumulated knowledge.
- Standardisation: Once a critical mass of projects is using MBSE, formalise the approach as an organisational standard.
This incremental approach typically takes two to four years to reach organisation-wide adoption, but the results are far more durable than a top-down mandate.
Common Barriers and How to Overcome Them
Every MBSE adoption will encounter barriers. Anticipating them and having mitigation strategies ready is far better than being surprised. The table below summarises the most common barriers, their root causes, and proven mitigation strategies.
| Barrier | Root Cause | Mitigation Strategy |
|---|---|---|
| Resistance to change | Fear of the unknown; loss of established expertise; comfort with current practices | Start with volunteers and early adopters; demonstrate quick wins; provide psychological safety to experiment and fail; celebrate early successes publicly |
| Lack of training | Engineers are expected to learn MBSE on their own; training budget is insufficient | Invest in structured training programmes (formal courses + mentoring); allocate dedicated learning time; provide sandbox environments for practice |
| Tool costs | Commercial MBSE tools require significant licensing investment; infrastructure costs for servers and administration | Start with evaluation licences or open-source tools (e.g. Eclipse Papyrus, Capella); build a business case based on pilot results; consider cloud-based licensing models |
| Lack of management buy-in | Leadership does not understand MBSE value; no visible ROI data; competing priorities | Present MBSE as a risk-reduction strategy, not a cost; use pilot data to build the business case; connect MBSE outcomes to strategic goals (time-to-market, quality, compliance) |
| "We've always done it this way" | Deep-rooted organisational culture; success with current approach makes change feel unnecessary | Acknowledge past success while highlighting new challenges (increasing complexity, shorter timelines, regulatory pressure); frame MBSE as evolution, not revolution |
| Unclear ROI | Benefits of MBSE are long-term and systemic; hard to attribute directly to modelling | Define measurable metrics before starting; track leading indicators (review duration, defect discovery timing); benchmark against comparable non-MBSE projects |
The single most common failure mode in MBSE adoption is mandating tools without investing in method and training. An organisation buys expensive tool licences, installs them on every engineer's machine, and declares "we are now doing MBSE." Six months later, the tools are unused, the licences are wasted, and the organisation concludes that "MBSE doesn't work." In reality, MBSE was never tried — only tool procurement was. Without a defined method, adequate training, and a supportive culture, tools alone achieve nothing.
Addressing resistance constructively
Resistance is not the enemy — it is a signal. When an experienced engineer says "this won't work," they are often raising legitimate concerns based on years of practical experience. The most effective response is not to dismiss their concerns, but to engage with them:
- Listen first. Understand why they are resistant. Is it fear of obsolescence? Past experience with failed initiatives? Genuine technical concerns?
- Involve them. Invite sceptics to participate in the pilot — not as passive observers, but as active contributors whose expertise is valued.
- Show, don't tell. Demonstrations are more persuasive than presentations. Show a working model solving a real problem the engineer cares about.
- Be patient. Some people will come around quickly; others will take a year or more. That is normal and acceptable.
Training and Competency Development
MBSE competency is not a single skill — it is a bundle of interrelated skills that must be developed together. An engineer who knows SysML syntax but cannot think in terms of systems architecture will produce syntactically correct but semantically useless models. Training programmes must address all four skill dimensions.
The four skill dimensions
- Systems thinking: The ability to see a system as an integrated whole, understand emergent behaviour, and reason about cross-cutting concerns. This is the hardest skill to teach because it is a way of thinking, not a body of knowledge.
- Modelling language proficiency: Fluency in SysML (or whatever modelling language the organisation has chosen). This includes understanding the abstract syntax (what the language elements mean) and the concrete syntax (how to write and read models).
- Method knowledge: Understanding the modelling method — which views to create, in what order, what questions each view answers, and how views relate to each other. Without method knowledge, engineers create models but do not know what to model or why.
- Tool proficiency: Practical ability to use the modelling tool effectively. This includes not just basic operations (creating elements, drawing diagrams) but also advanced features (model queries, impact analysis, report generation).
Training pathways
Effective training uses a blended approach that combines multiple learning modalities:
- Formal courses: Structured classroom or online courses covering the modelling language and method. Typically two to five days for foundational competency. Organisations like INCOSE and tool vendors offer certified training.
- Mentoring: Pairing less experienced modellers with experienced ones on real projects. This is the most effective way to transfer tacit knowledge — the kind of practical wisdom that cannot be captured in a textbook.
- Community of practice: A regular forum (weekly or biweekly) where practitioners share challenges, solutions, and best practices. Communities of practice build collective knowledge and create a support network.
- Learning by doing: Provide sandbox projects where engineers can experiment without fear of breaking a real model. Hands-on practice is essential — modelling is a craft that improves with repetition.
The MBSE champion
Every successful MBSE adoption has at least one MBSE champion — a person who evangelises the approach, supports struggling colleagues, troubleshoots problems, and keeps the momentum going when enthusiasm wanes. The champion does not have to be the most senior person in the room, but they need three qualities: deep MBSE competency, credibility with their peers, and relentless enthusiasm.
The MBSE champion role is often informal and unpaid — which is a mistake. Organisations that formally recognise and resource this role (dedicated time, training budget, access to leadership) see significantly faster adoption. Consider creating a formal role such as "MBSE Practice Lead" or "Modelling Coach" to give the champion the authority and resources they need.
Measuring MBSE Success
If you cannot measure it, you cannot improve it — and you cannot justify continued investment in it. Measuring the success of an MBSE adoption is essential, but it must be done carefully. The wrong metrics can be worse than no metrics at all.
Meaningful metrics
Good MBSE metrics connect modelling activities to business outcomes. They answer the question: "Is MBSE making our engineering better?" The following table presents a set of proven metrics organised by what they measure.
| Category | Metric | What It Measures | Target Direction |
|---|---|---|---|
| Quality | Defect reduction | Number of defects found after design reviews vs. historical baseline | Decrease |
| Quality | Requirements coverage | Percentage of requirements traced to design elements and test cases | Increase toward 100% |
| Efficiency | Design review duration | Time spent in design reviews per review cycle | Decrease |
| Efficiency | Time-to-market | Overall project duration from concept to delivery | Decrease |
| Cost | Rework reduction | Effort spent on rework as a percentage of total effort | Decrease |
| Cost | Change impact analysis time | Time needed to assess the impact of a requirements change | Decrease |
| Adoption | Model usage rate | Percentage of teams actively using the model as their primary engineering artefact | Increase |
| Adoption | Training completion | Percentage of target engineers who have completed MBSE training | Increase toward 100% |
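To make one of these concrete: requirements coverage can be computed mechanically from exported traceability links. The data shapes below are assumptions for illustration; real tools export traces in their own formats (ReqIF, CSV, or tool APIs), and the requirement and element names are invented.

```python
# Sketch of the requirements-coverage metric from a hypothetical trace export.
# Requirement IDs, element names, and link kinds are illustrative assumptions.
requirements = ["REQ-001", "REQ-002", "REQ-003", "REQ-004"]

# (requirement, target element, link kind) as a trace export might list them
traces = [
    ("REQ-001", "BLK-PowerSupply", "satisfy"),
    ("REQ-001", "TC-017", "verify"),
    ("REQ-002", "BLK-Controller", "satisfy"),
    ("REQ-003", "TC-042", "verify"),
]

def coverage(requirements, traces, kind):
    """Percentage of requirements with at least one trace link of the given kind."""
    covered = {req for req, _target, k in traces if k == kind}
    return 100.0 * len(covered & set(requirements)) / len(requirements)

print(f"design coverage: {coverage(requirements, traces, 'satisfy'):.0f}%")  # 50%
print(f"test coverage:   {coverage(requirements, traces, 'verify'):.0f}%")   # 50%
```

Because the metric is derived from the model itself, it can be recomputed automatically on every model change rather than audited by hand, which is exactly the kind of leading indicator the table recommends tracking.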
Avoiding vanity metrics
A vanity metric is one that looks impressive on a slide but does not actually indicate success. The most common vanity metrics in MBSE are:
- "Number of diagrams created" — more diagrams does not mean better engineering. A team could create hundreds of meaningless diagrams. What matters is whether the model supports better decision-making.
- "Number of model elements" — a large model is not inherently better than a small one. A bloated model with redundant or unused elements is actually worse.
- "Number of tool licences deployed" — deploying licences is an input, not an outcome. What matters is whether people are actually using the tools to create value.
The antidote to vanity metrics is to always ask: "So what?" If someone reports that the team created 500 diagrams last quarter, ask: "Did that reduce defects? Did it improve design review efficiency? Did it help the team make better decisions?" If the answer is unclear, the metric is not meaningful.
Connecting to business value
Ultimately, MBSE must justify itself in business terms. Engineering leaders and executives do not care about modelling for its own sake — they care about delivering products faster, cheaper, and with higher quality. When presenting MBSE metrics, always connect them to business outcomes:
- Defect reduction translates to lower warranty costs and fewer field failures.
- Faster design reviews translate to shorter schedules and lower labour costs.
- Better requirements coverage translates to reduced compliance risk.
- Lower rework translates directly to cost savings — rework typically consumes 20–40% of engineering effort in document-centric projects.
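The rework figure lends itself to a back-of-envelope business case. The budget and the assumed reduction below are illustrative placeholders; only the 20-40% rework range comes from the text above.

```python
# Back-of-envelope rework savings. The budget and the assumed MBSE-driven
# reduction are hypothetical inputs; 30% is the mid-point of the 20-40%
# rework range quoted for document-centric projects.
engineering_budget = 2_000_000  # annual engineering labour cost (currency units)
rework_fraction = 0.30          # mid-point of the 20-40% range
mbse_reduction = 0.25           # assumed relative reduction in rework

rework_cost = engineering_budget * rework_fraction
annual_saving = rework_cost * mbse_reduction

print(f"rework cost:   {rework_cost:,.0f}")    # 600,000
print(f"annual saving: {annual_saving:,.0f}")  # 150,000
```

Even a conservative reduction assumption produces a saving large enough to fund training and tool licences, which is the shape of argument leadership responds to.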
Building this evidence base takes time. The pilot project provides the first data points; each subsequent project adds more. Over two to three years, the cumulative evidence becomes compelling.