By Whitney Coggeshall
Most conversations about skills-based assessment assume you’re starting something entirely new. But what if you’re not? What if you’re working with an established program that already carries decades of credibility, loyal stakeholders, and a well-earned reputation for rigor?
For many large-scale programs, introducing skills isn’t about starting over. It’s about carefully and strategically modernizing what already works while protecting the brand equity you’ve built. That’s where things get daunting. Established programs often have deeply embedded processes, multiple governance layers, and highly invested stakeholders who expect clear answers long before you have them. Building on the themes of my earlier articles about the complexity of skills and the need for intentional design, this piece shifts to a different challenge: how to bring skills into long-standing programs without starting over.
The key is not to get bogged down in the details too early. Instead, focus first on building the skeleton: a clear vision of what the end state should look like. Once you have that skeleton, you can start putting meat on the bones: defining what skills mean in your context, how they might fit within your program, and what success would look like. Only then does it make sense to outline a roadmap, think through how to phase things in, and start tackling the detailed questions that inevitably come up.
It’s tempting to jump straight into designing Phase I or to spend months developing a detailed skills taxonomy (which you will need eventually). But visioning should start well before that work is finalized. As Stephen Covey wisely said, “Begin with the end in mind.” Before you can phase anything in, you need a stable way of thinking about the end state. A simple but powerful way to structure your thinking is to ask how skills will ultimately live inside your program. In practice, three end-state models serve as that framework: the adjacent layer, the integrated pillar, and the hybrid model. These provide a structured way to explore possible futures without getting lost in the details too early.
Three models for integrating skills into existing assessment programs
When it comes to weaving skills into an existing assessment program, there are three broad models to consider. Each represents a different way of balancing innovation, tradition, and brand risk.
You can think of these as three possible end states:
- The adjacent layer model – where a separate, skills-focused component sits alongside your existing program.
- The integrated pillar model – where skills become a core part of the program itself, on equal footing with other established domains.
- The hybrid model – where elements of both approaches are intentionally combined into one cohesive structure.
These models aren’t about phases or sequencing. They’re about the destination. Each one offers a different path to the same goal: a more modern, relevant assessment that reflects what learners and employers value most.
The right choice depends on what you’re trying to achieve and how your program is positioned today. Are you looking to test new skill-based approaches with minimal disruption? Are you ready to fully embed skills into your existing framework? Or do you need something in between that allows you to evolve over time while maintaining stability?
Getting clear on which end-state model fits your program best gives you something solid to aim for. It turns abstract goals like “become more skills-based” into a concrete design vision, something you can plan for, communicate, and build toward.
The adjacent layer model
What it is
A skills-focused offering that sits beside your existing program rather than inside it. This could be a separate credential, a digital badge, or a performance module designed to showcase applied capability without altering the core exam.
Situations where this model works well
- You are introducing only a small number of new or high-stakes skills
- The skills represent essential hurdles candidates must clear to earn the credential, or a non-compensatory scoring model is otherwise warranted (see the sketch after this list)
- You want to modernize without disrupting the core exam structure
- Stakeholders are uncertain or divided about how skills should be incorporated into the main program
- You want a controlled space to test new task types, scoring approaches, or technologies
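To make the non-compensatory distinction concrete, here is a minimal sketch in Python, with invented component names, weights, and cut scores: a compensatory model lets strength in one area offset weakness in another, while a non-compensatory model treats each component as a separate hurdle that must be cleared on its own.

```python
# Minimal sketch contrasting compensatory and non-compensatory pass
# decisions. All component names, weights, and cut scores are
# illustrative, not drawn from any real program.

scores = {"knowledge_exam": 82, "skills_task": 58}
weights = {"knowledge_exam": 0.7, "skills_task": 0.3}

# Compensatory: a single composite cut score, so a strong knowledge
# score can offset a weak skills score.
composite = sum(weights[c] * scores[c] for c in scores)
passes_compensatory = composite >= 70          # 74.8 >= 70 -> True

# Non-compensatory: every component is its own hurdle, and the
# candidate must clear each cut score independently.
cut_scores = {"knowledge_exam": 70, "skills_task": 65}
passes_non_compensatory = all(
    scores[c] >= cut_scores[c] for c in cut_scores
)                                              # 58 < 65 -> False

print(passes_compensatory, passes_non_compensatory)  # True False
```

When a skill must function as a true hurdle, housing it in an adjacent layer lets you enforce that rule without reworking how the core exam is scored.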
Why it works
It allows innovation while protecting the stability and reputation of the core program. Because the adjacent layer stands on its own, it is easier to pilot and iterate: you can test new task types, gather evidence, and learn what works before making larger structural commitments.
Example
Aviation pilots complete written knowledge exams but must also pass a mandatory flight simulator test. The simulator is adjacent to the knowledge test, but it is a critical hurdle required for licensure.
What to consider
- Candidates must complete the main exam plus this additional requirement, which increases time and effort
- The adjacent layer must feel purposefully connected to the main program, or it risks confusing stakeholders
- This model supports gradual evolution rather than full transformation
The integrated pillar model
What it is
Skills become an essential part of the program structure. They are embedded directly into the existing assessment framework. This can happen in two ways:
- Skills are woven into current topics on the exam
- New topics are introduced specifically to accommodate skills that align with the program’s mission and structure
In this model, the exam reflects not only what candidates know, but how they apply that knowledge through real-world performance.
Situations where this model works well
- The skills you want to measure map naturally onto existing content areas
- You want to fully transform the program to reflect modern professional practice
- Stakeholders understand and support a visible shift in what the program represents
- You have the operational and psychometric capacity for deeper redesign
Why it works
It signals that skills are not an add-on but central to what the credential stands for. Skills stand on equal footing with knowledge domains, which creates coherence and strengthens the program’s purpose.
Example
If ethics is already a topic on the exam and ethical reasoning is a desired skill to assess, integrating performance-based scenarios directly into that topic creates a natural fit.
What to consider
- This model represents a transformative end state because it alters the structure and experience of the program as a whole
- It requires significant redesign across content, scoring, candidate preparation, and governance
- It demands strong change management and clear communication so stakeholders understand how this approach strengthens the program
- Integrated skills must align with the identity and purpose of the credential, or they risk feeling forced
The hybrid model
What it is
A cohesive structure that intentionally combines both approaches. Some skills are integrated directly into existing topics on the exam, while others live in a separate adjacent layer. Together, these components form a unified program design.
Situations where this model works well
- Some skills map naturally to existing topics, while others do not
- You have skills that vary widely in complexity, stakes, or delivery feasibility
- You want to protect stable, enduring skills in the core exam while giving emerging skills space to evolve in an adjacent component
- You want to create a more modern learner journey without overloading the core exam
- You value flexibility and want to keep an adjacent layer for piloting new skills as the skills you need to measure evolve over time
Why it works
It offers flexibility and balance. The hybrid model allows you to place skills in the format where they make the most sense. Skills that align tightly with existing domains can be integrated there. Skills that require more authentic, applied performance can be assessed in a separate, adjacent layer. Done well, the hybrid structure feels intentional, unified, and modern.
Examples
- A technology certification integrates debugging skills into existing domain topics but adds a capstone-style project to assess system-level design and implementation.
- A data science certification integrates scenario-based modeling skills into existing exam topics but includes an adjacent communication task to evaluate how effectively candidates translate technical results for nontechnical stakeholders.
What to consider
- The hybrid model must feel unified. If the integrated and adjacent components operate in silos, candidates may struggle to understand the overall structure
- It requires thoughtful communication about how each piece contributes to the total assessment of skill
- It works best when each component plays a distinct and intentional role within the credential
- It represents a balanced end state: more modern than purely adjacent, more flexible than fully integrated
Choosing the right model
Once you understand the three possible end states, the real work is deciding which one matches the future you want for your program. Instead of starting with constraints or operational concerns, start with a few strategic questions. These questions reveal which model aligns best with your goals.
- What identity do you want this credential to have five years from now? If you want a bold transformation, the integrated pillar model is the clearest signal. If you want to preserve the core identity while expanding relevance, the adjacent layer or hybrid model may be a better fit.
- How naturally do the skills align with your existing topics? If the skills map cleanly to domains you already assess, integration strengthens coherence. If they cut across topics or require very different formats, an adjacent layer or hybrid structure gives you more room to design effectively.
- How variable are the skills you want to measure? If some skills are deeply tied to specific content and others require applied performance, the hybrid model offers the best structural match.
- What message do you want to send to the market? Integration communicates that skills are essential to readiness. An adjacent layer signals that you are expanding the program in a controlled way. A hybrid model communicates both stability and modernization.
The best model is the one that reflects your long-term vision for the credential, not just what is easiest to implement in the short term. Once the end state is clear, the phasing becomes much easier to solve.
The path ahead
Modernizing a long-standing assessment program is complex, but it does not have to be overwhelming. The real work is choosing the destination, not perfecting every detail up front. Having a clear model to anchor to also gives your teams something stable to align around, which reduces the noise and uncertainty that often accompany modernization efforts. Once you know what you want the program to look like, choices about structure, sequencing, and implementation start to fall into place. Start with the end in mind, choose the model that fits your vision, and let that clarity guide the rest.