In our previous blog post, we explained the challenges associated with learning evaluation. Simply put, when training isn't properly designed with specific goals in mind, it's nearly impossible to actually evaluate effectiveness or impact on overall organizational goals.
In this post we’ll explore the five stages of the ADDIE model of instructional design—analysis, design, development, implementation, and evaluation—and how this process can help or hurt your learning evaluation methods.
What is the ADDIE instructional design process?
The ADDIE model of instructional design is a five-step process for developing iterative learning and training activities. It is as ubiquitous to learning design as Kirkpatrick is to learning evaluation.
Originally developed in 1975 by Florida State University for the U.S. Army, ADDIE remains the default instructional design process for many organizations. The model was originally conceived as a very detailed process; these days, however, most instructional designers will only be familiar with its five main stages: analysis, design, development, implementation, and evaluation.
All five phases are equally important, but in practice most of the effort goes into the Design and Development stages, while the Analysis, Implementation, and Evaluation phases receive less attention.
This could be caused by the time constraints that push L&D toward rapid development, or by our tendency to start (and complete) tangible work while skimming over the critical, yet unseen, elements.
What are the 5 steps of the ADDIE design model?
1. Analysis
The analysis stage was originally conceived to analyze the job tasks associated with the training. Now, it’s often seen as an analysis of the learners and learning requirements rather than performance analysis.
For example, you might look at the content availability on a certain topic, consider delivery options for the training, and assess project timelines.
Extra Insight: This stage often doesn’t include analysis of the business and performance goals attached to the training, or questions around what learners need to do differently or better to achieve those goals. And typically, very little time is allowed for the analysis phase.
(Our BALDDIE design model, which we’ll cover in a later post, splits analysis into three stages to help give this aspect more attention.)
2–3. Design & Development
Designing and developing learning content, resources, and experiences are complex topics that could warrant their own dedicated blog posts. So, for space and time’s sake, we’ve provided brief explanations of each stage.
The training design stage focuses on both the design of the learning experience and the materials needed to support it. Good design builds on the results of the analysis.
Once the learning has been designed, the development stage focuses on creating and developing those materials and experiences.
Extra Insight: As mentioned earlier, the design and development stages are generally well understood and well executed, so there are no extra considerations here.
4. Implementation
The implementation stage consists of the execution and delivery of the designed content. You can’t just throw your eLearning content into the LMS and hope for the best. Otherwise, you risk wasting all that time, energy, and effort you invested in designing that content.
That’s why you need to consider implementation best practices. This includes how learners will discover content, ease of access to that content, and ongoing maintenance to ensure the content is still functioning and relevant.
Extra Insight: Proper implementation is becoming more valued, especially as the importance of social learning becomes more widely recognized and learners’ expectations rise to match the digital experiences in their personal lives.
And this is a good thing, because even if you design and develop the best eLearning course in the world, it won’t have an impact if it’s implemented poorly. In other words, implementation is an important step that should not be ignored.
5. Evaluation
The Evaluation stage is used to assess the quality and effectiveness of the instructional design process.
Many people assume that because the “E” in ADDIE comes at the end of the acronym, evaluation happens at the end of the process. In fact, evaluation has always been intended to be part of every stage.
The analysis should be evaluated.
The design should be evaluated.
The development should be evaluated.
The implementation should be evaluated.
Extra Insight: Rather than saving evaluation until the end of your process, evaluate at every stage so that issues can be addressed early. Otherwise, you may end up implementing the training before you realize it was built on a foundation of errors that could have been prevented.
ADDIE also includes both internal and external evaluation so you can take input and direction from those:
- familiar with the project who understand your decisions, and
- with fresh eyes who can challenge your assumptions.
Make a plan and map it out.
If followed closely, ADDIE is an ideal framework for designing measurable learning programs—specifically because it was designed to be an outcome-based approach.
With training evaluation happening at each phase, you are more likely to stay aligned with the business goals and outcomes defined during the Analysis phase. And one way to ensure a well-crafted analysis is to map how everything fits together.
Up Next: Action mapping & instructional design
In our next post, we’ll look at how to apply Action Mapping techniques to draw a map from the business goal through actions and activities to the information required for the learner. Action mapping is also a great technique to use within the Analysis stage of ADDIE.
Want a head start? Read more about action mapping on Cathy Moore’s website.
About the author
As a co-author of xAPI, Andrew has been instrumental in revolutionizing the way we approach data-driven learning design. With his extensive background in instructional design and development, he’s an expert in crafting engaging learning experiences and a master at building robust learning platforms in both corporate and academic environments. Andrew’s journey began with a simple belief: learning should be meaningful, measurable, and, most importantly, enjoyable. This belief has led him to work with some of the industry’s most innovative organizations and thought leaders, helping them unlock the true potential of their learning strategies. Andrew has also shared his insights at conferences and workshops across the globe, empowering others to harness the power of data in their own learning initiatives.