As we covered in our post on the ADDIE model, it’s important to stay aligned with the business goals and outcomes as you develop and deliver learning. And one way to ensure you stay aligned is by mapping how each step of the design process fits into the bigger picture of the organization. In this post, we’ll introduce you to the basics of a relatively new technique in the instructional designer’s toolkit called action mapping, including examples of where learning evaluation can go awry.
What is action mapping?
Action mapping is an instructional design method that’s meant to streamline and simplify the design process. One of the first steps in this technique is analyzing the problem at hand and defining an appropriate solution. As a result, training is designed with a purpose and goal from the start, and you can identify exactly what information is required to achieve that goal.
Everything Supports the Business Goal
For example, the action mapping process might look like this:
Start with a business goal and identify what actions people need to take to reach that goal. (We’ll cover defining good business goals and identifying actions later in this series.)
Then, for each defined action, design activities that learners can practice in order to improve on those actions.
Finally, identify the minimum information that’s required for learners to complete these activities. This information—and only this information—can then be provided to the learner (e.g. a job aid learners can regularly use in their everyday work).
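The three steps above can be sketched as a simple data structure. This is purely an illustrative model, not part of Cathy Moore’s method—the class names, fields, and pizzeria example are all hypothetical—but it shows how everything hangs off a single business goal, and how the “minimum information” layer rolls up naturally into a job aid.

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    """A practice activity, plus the minimum information needed to complete it."""
    description: str
    minimum_info: list[str] = field(default_factory=list)

@dataclass
class Action:
    """An observable, on-the-job behavior that supports the business goal."""
    behavior: str
    activities: list[Activity] = field(default_factory=list)

@dataclass
class ActionMap:
    """Everything in the map hangs off one measurable business goal."""
    business_goal: str
    actions: list[Action] = field(default_factory=list)

    def job_aid(self) -> list[str]:
        """Consolidate the minimum required information into one reusable list."""
        return [info
                for action in self.actions
                for activity in action.activities
                for info in activity.minimum_info]

# Hypothetical example: a pizzeria's training map
pizza_map = ActionMap(
    business_goal="Cut dough waste by 20% this quarter",
    actions=[Action(
        behavior="Stretch dough to spec without tearing",
        activities=[Activity(
            description="Practice stretching scrap dough to 12 inches",
            minimum_info=["Rest dough 20 minutes before stretching"],
        )],
    )],
)
print(pizza_map.job_aid())  # → ['Rest dough 20 minutes before stretching']
```

Note that nothing in the structure stores “nice to know” background—by design, there’s nowhere to put the history of pizza.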
Cathy Moore developed the action mapping method in 2008. And because she is the creator of and expert in this technique, we recommend visiting her website—which features oodles of resources about action mapping.
What are the benefits of action mapping?
In addition to simplifying the design process, action mapping has a number of benefits, including:
The focus is on alignment with business goals, not simply delivering information.
It forces you to have a clear plan for achieving a specific goal.
It leads to more engaging training because it actually prepares people for real-world tasks.
Thanks to practice activities, learners can apply the knowledge gained—not just recite information.
Because you’ve identified the minimum requirements, you can consolidate information into a reusable, accessible job aid.
A Word of Caution: Action Mapping ≠ Mind Mapping
Cathy Moore identifies several “Corrections to some common misunderstandings” on her website. And I especially want to emphasize the last point she makes:
Action mapping is not a mind map exercise.
The aim of action mapping is to stick with the essential goals, activities, and information needed to meet the business goal. From there, you must justify anything else you add to ensure the resulting solution is still efficient and effective. Simply put, don’t include irrelevant or inconsequential information. A classic example is including information about “the history of X” in your training; you don’t need to know who Gennaro Lombardi is to make a good pizza.
This doesn’t mean your action mapping diagram will necessarily be simple. You may genuinely have many required actions, but the goal is to keep the map as simple as possible.
That’s not to say mind mapping and spitballing ideas can’t be part of the process. In fact, generating lots of ideas may be a helpful step—but only if it’s followed by prioritizing and only keeping the most relevant actions, activities, and information.
Action Mapping & ADDIE: A Powerful Sidekick
Since we covered the ADDIE model in our last post, you may be wondering where and how action mapping actually fits into that process. We’re not suggesting that the two methods are dependent on one another for success, but instead that they share components that can be beneficial if used together. For instance, action mapping has elements that would occur during ADDIE’s Analysis stage (i.e. business goals and job behaviors), while other elements would fall under the Design stage (i.e. activities and information).
This serves as a reminder that ADDIE doesn’t have to be a strict process where you finish one step before moving on to the next. There may be overlap, and it’s okay if some design tasks happen during the analysis phase, or if analysis continues as you work out the design.
And you don’t have to create the action map in one session. It’s a good idea to review and validate each layer of the action map with stakeholders before putting too much effort into pinning down the next layer. Let’s take a look at how using action mapping with ADDIE helps eliminate issues with content dumping and (quantifiable) evaluation.
Action Mapping vs. Content Dumping
Action mapping is designed as an antidote to the problem of content dumping—an instructional design approach that involves collecting comprehensive information about a subject and then turning that information into training and a quiz. (From an ADDIE standpoint, content dumping covers little to no part of the analysis stage and jumps straight into the design stage.)
What’s the problem with content dumping?
Like cooking pizza in a microwave, content dumping is quick and easy but delivers terrible results. That’s because content dumping doesn’t consider business impact or performance improvement in the training’s design. Instead, you’re left guessing about what performance improvement or business gains, if any, resulted once the training is complete.
In an earlier blog post, Why Is Learning Evaluation So Hard?, we explained that when you don’t have clearly defined performance and business goals, it’s difficult to measure any kind of success or impact on the organization. And because the content hasn’t been designed to help people do their jobs better, the resulting training is often irrelevant, boring, and ineffective.
Evaluate Your Learning Design [CHECKLIST]
Remember when we said you should evaluate at each stage of the instructional design process? Well, Cathy Moore created a downloadable checklist to help you evaluate your action-mapped learning design!
Evaluation in practice—Let’s try it out together.
Just for fun, let’s apply the second criterion of Cathy’s checklist to the learning outcomes we identified for this blog series.
Is this blog series an information dump?
Cathy’s checklist states that “information dump objectives” tend to use verbs such as understand, identify, explain, or define, whereas “action-oriented objectives” tend to use verbs relating to actual job tasks, such as sell, lead, encrypt, schedule, or design.
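As a rough illustration, that verb heuristic could be sketched as a small helper. The two verb lists below come from the checklist verbs quoted above (plus “outline,” which the checklist also flags); the function name and everything else is my own invention—and, as discussed below, a verb match is only a hint, not a verdict.

```python
# Verb lists are illustrative, not exhaustive; they echo the checklist's examples.
KNOWLEDGE_VERBS = {"understand", "identify", "explain", "define", "outline"}
ACTION_VERBS = {"sell", "lead", "encrypt", "schedule", "design"}

def classify_objective(objective: str) -> str:
    """Flag a learning objective based on its leading verb.

    Returns 'knowledge' for likely information-dump objectives,
    'action' for job-task verbs, and 'unknown' otherwise.
    This is only a heuristic: the context and meaning of the
    objective matter more than the verb itself.
    """
    first_word = objective.strip().lower().split()[0]
    if first_word in KNOWLEDGE_VERBS:
        return "knowledge"
    if first_word in ACTION_VERBS:
        return "action"
    return "unknown"

print(classify_objective("Explain the ADDIE model"))         # → knowledge
print(classify_objective("Schedule quarterly evaluations"))  # → action
```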
While most of our objectives meet Cathy’s criteria, it’s interesting to note that the first two objectives use the verbs “outline” and “explain.” The checklist categorizes these as knowledge objectives that may not be informed by the required actions.
This is absolutely true. And while the other learning objectives were written starting from the performance goal, those first two were added to represent content that I wanted to include at the start of the blog series. The checklist has caught me out! Including this introductory content might be good practice for a marketing blog series, but it’s sloppy learning design, and those first two learning objectives can be considered examples of what not to do in your own learning design.
Is this blog series measurable?
The last objective—identify measurable metrics for each stage of your evidence chain—also uses a verb on the list of knowledge objectives. But in this case, the actual job task is to identify metrics, which means it really is an action objective.
This is a good reminder that, while verbs can be useful indicators of good and bad learning objectives, the context and meaning of the objectives are what really matter. You can’t fix bad objectives by changing the verbs. You have to revisit the business goals and identify what actions people really need to be able to do to meet those goals.
Up Next: Training Needs Analysis
In our next blog post, we'll look at another approach to analysis: training needs analysis. In the meantime, sign up for Watershed Insights so you don't miss out!
And the learning measurement survey says...
There's a disconnect between L&D and the wider business. Find out more and read our fourth annual survey report with LEO Learning that provides an evolving picture of L&D’s relationship with measurement and business impact—including real-world examples and extended commentary.