As part of our BALDDIE instructional design method, you’ve identified your business goal and documented performance outcomes for what the workforce needs to do to meet that goal. Now it’s time to identify the learning outcomes that will help people improve their performance.
A Quick Refresher: What’s the BALDDIE model?
The BALDDIE instructional design model is a modified version of ADDIE that puts more emphasis on the analysis stage and draws in concepts from Cathy Moore's Action Mapping model and LEO Learning's Chain of Evidence method.
4 Rules for Judging Good Learning Outcomes
I titled this post “Learning Outcomes for Experts” because I recognize that many readers already have a lot of experience in writing learning outcomes. Still, at the risk of telling you what you already know, I’d like to share four rules for judging good learning outcomes.
The learning outcome must be tied to the performance and business goals you’ve identified.
The learning outcome should be something that can be assessed.
The learning outcome should be something the learners need to learn (i.e. not something they already know or can do).
The learning outcome may relate to either skills or knowledge as appropriate to the performance goal it supports.
Let’s look at each of these rules in more detail.
Tie outcomes to performance and business goals.
After taking time to research and establish good business and performance goals, it can be easy to fall at the last hurdle and throw in a couple of loosely related learning outcomes representing content you really want to cover.
We even fell foul of this ourselves in designing the learning outcomes for this blog series. You may recall our blog post about Cathy Moore’s action mapping model, in which we identified two learning outcomes that didn’t directly support our stated performance goal. That performance goal was:
Design a learning program, experience, or resource with a clear plan for impact on business performance.
And the two learning outcomes that failed to support it were:
Outline the problems of designing learning without a clear plan for impact on business performance.
Explain some common models of instructional design and how they relate to designing effective learning.
While these outcomes might be interesting topics, neither directly supports the performance goal, so neither is a good learning outcome. Action mapping teaches us to include only what is strictly necessary when it comes to both performance and learning outcomes.
Can you assess the learning outcome?
To evaluate the effectiveness of the training, you’ll need to have a way of assessing learners’ knowledge and skills in relation to each of the learning outcomes. This assessment is important because it tells you if:
the learners acquired the skills/knowledge from the learning program or already had them, and
there is a relationship between people with the skills/knowledge and those who achieve the performance goals.
That said, not everything is measurable. So if a learning outcome legitimately supports a performance outcome but is not possible (or too expensive) to assess, it’s still better to include it than exclude it.
Assessments should be designed as more than simple pass/fail tests. They should distinguish:
learners who are exceptionally good from those who are satisfactory, and
those who are in real need of improvement from those who only just miss the grade.
An assessment that everybody passes with 100% is significantly less useful than one with a lower pass mark that is challenging enough to stretch learners. A more robust assessment enables you to segment learners when evaluating the impact of the skill/knowledge on job performance.
Why does learner segmentation matter?
Segmenting learners gives you the opportunity to compare and contrast whether or not achievement of learning outcomes impacted performance outcomes and/or business goals.
For example: the learners in Group A exceeded expectations for the learning outcomes, and their eventual performance outcome was X. The learners in Group B had room for improvement in their learning outcomes, and their eventual performance was Y. Comparing the two groups in this way makes evaluation much easier.
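If you track assessment scores and a performance metric per learner, the comparison above takes only a few lines. This is a minimal sketch with entirely hypothetical data, field names, and pass mark; it simply splits learners into two segments by assessment score and compares each segment's average performance:

```python
# Hypothetical learner records: assessment score (learning outcome)
# and a job-performance metric (performance outcome). Illustrative only.
learners = [
    {"name": "A1", "assessment": 95, "performance": 88},
    {"name": "A2", "assessment": 90, "performance": 84},
    {"name": "B1", "assessment": 62, "performance": 70},
    {"name": "B2", "assessment": 58, "performance": 65},
]

PASS_MARK = 80  # hypothetical threshold separating Group A from Group B

# Segment learners by whether they met the pass mark.
group_a = [l for l in learners if l["assessment"] >= PASS_MARK]
group_b = [l for l in learners if l["assessment"] < PASS_MARK]

def avg_performance(group):
    """Average the performance metric across a segment of learners."""
    return sum(l["performance"] for l in group) / len(group)

print(avg_performance(group_a))  # 86.0
print(avg_performance(group_b))  # 67.5
```

A gap between the two averages (here, 86.0 vs 67.5 in the made-up data) is the kind of signal that suggests achieving the learning outcomes is related to achieving the performance outcome; identical averages would suggest the opposite.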
Been there, done that.
It seems obvious, but learning outcomes should cover the knowledge and skills learners need to obtain, not what they already know. This is important so as not to waste learners’ time. Furthermore, if learners already have that knowledge and skill and the business goal has not already been met, it’s a good indication that this particular learning outcome won’t lead to the goal being met.
In cases where only some learners may have the skills and knowledge, consider:
including a pre-assessment that allows learners to test out, or
launching a sample assessment to gauge levels of knowledge and skills to identify weak areas as part of your analysis before designing the learning program.
And what about skills or knowledge?
Not all learning is about simply acquiring information. In fact, learning often is about practicing and developing skills. So when designing learning outcomes to support a performance outcome, think both about what information people need and what skills they need to develop.
Skills-based learning outcomes should be phrased to outline what people need to be able to do. Learners need to be assessed with tests that measure the skill itself, not with multiple-choice questions quizzing learners’ knowledge about the skill. That’s why learners often need to be taught using techniques such as hands-on practice, repetition, and feedback, which in turn should influence how you design the program.
Up Next: How to design experiences and content that effect change
In our next post, we'll explain how you can respond to naysayers who might challenge moving to a different approach to instructional design.
And the learning measurement survey says...
There's a disconnect between L&D and the wider business. Find out more in our fourth annual survey report, produced with LEO Learning, which provides an evolving picture of L&D’s relationship with measurement and business impact, including real-world examples and extended commentary.