ADDIE: A 5-Step Process for Effective Training & Learning Evaluation


In our previous blog post, we explained the challenges associated with learning evaluation. Simply put, when training isn't properly designed with specific goals in mind, it's nearly impossible to actually evaluate effectiveness or impact on overall organizational goals. In this post we’ll explore the five stages of the ADDIE model of instructional design—analysis, design, development, implementation, and evaluation—and how this process can help or hurt your learning evaluation methods.

What is the ADDIE model?

The ADDIE model of instructional design is a five-step process for developing iterative learning and training activities. ADDIE is as ubiquitous in learning design as Kirkpatrick is in learning evaluation.

Originally developed in 1975 by Florida State University for the U.S. Army, ADDIE remains the default instructional design process for many organizations. The model was originally created as a very detailed process; these days, however, most instructional designers are only familiar with its five main stages:

  1. Analysis
  2. Design
  3. Development
  4. Implementation
  5. Evaluation

All five phases are equally important, but in practice most effort goes into the Design and Development stages, while the Analysis, Implementation, and Evaluation phases are often done less well. This could be caused by the time pressure on L&D for rapid development, or by our tendency to start (and complete) tangible work while skimming over the critical, yet unseen, elements.

Five Steps of ADDIE Explained

1. Analysis


The analysis stage was originally conceived to analyze the job tasks associated with the training. Now, it’s often seen as an analysis of the learners and learning requirements rather than performance analysis.

For example, you might look at the content availability on a certain topic, consider delivery options for the training, and assess project timelines.

Extra Insight: This stage often doesn’t include analysis of either the business and performance goals attached to the training, or questions around what learners need to do differently or better to achieve those goals. And typically, very little time is allowed for the analysis phase. (Our BALDDIE design model, which we’ll cover in a later post, splits analysis into three stages to help give this aspect more attention.)

2–3. Design & Development

Designing and developing learning content, resources, and experiences are complex topics that could warrant their own dedicated blog posts. So, for space and time’s sake, we’ve provided brief explanations of each stage.

The design stage focuses on both the design of the learning experience and materials needed to support the experience. And good design builds on the results of the analysis.

Once the learning has been designed, the development stage focuses on creating and developing those materials and experiences.

Extra Insight: As mentioned earlier, the design and development stages are generally well understood and well executed, so we have no additional considerations here.

4. Implementation

The implementation stage consists of the execution and delivery of the designed content. You can't just throw your e-learning content into the LMS and hope for the best. If you do, you risk wasting all the time, energy, and effort you invested in designing that content.


That’s why you need to consider implementation best practices. This includes how learners will discover content, ease of access to that content, and ongoing maintenance to ensure the content is still functioning and relevant.

Extra Insight: Proper implementation is becoming more valued, especially as the importance of social learning is more widely recognized and learners' expectations rise to match the digital experiences in their personal lives. This is a good thing, because even if you design and develop the best e-learning course in the world, it won't have an impact if it's implemented poorly. In other words, implementation is an important step that should not be ignored.

5. Evaluation

The evaluation stage is used to assess the quality and effectiveness of the entire instructional design process. Many people assume that because the "E" in ADDIE comes at the end of the acronym, evaluation happens at the end of the process. In fact, evaluation has always been intended to be part of every stage.


The analysis should be evaluated.

The design should be evaluated.

The development should be evaluated.

The implementation should be evaluated.

Extra Insight: Rather than saving evaluation until the end of your process, evaluate at every stage so issues can be addressed early. Otherwise, you may end up implementing the training before realizing it was built on a foundation of errors that could have been prevented.

ADDIE also includes both internal and external evaluation, so you can take input and direction from those:

  • familiar with the project who understand your decisions, and
  • with fresh eyes who can challenge your assumptions.

Make a plan and map it out.

If followed closely, ADDIE is an ideal framework for designing measurable learning programs—specifically because it was designed to be an outcome-based approach. With evaluation happening at each phase, you are more likely to stay aligned with the business goals/outcomes defined during the Analysis phase. And one way to ensure a well-crafted analysis is to map how everything fits together.

Up Next: Action Mapping & Instructional Design

In our next post, we’ll look at how to apply Action Mapping techniques to draw a map from the business goal through actions and activities to the information required for the learner. Action mapping is also a great technique to use within the Analysis stage of ADDIE. Want a head start? Read more about action mapping on Cathy Moore’s website.


And the learning measurement survey says...

There's a disconnect between L&D and the wider business. Find out more in our fourth annual survey report with LEO Learning, which provides an evolving picture of L&D's relationship with measurement and business impact, including real-world examples and extended commentary.

Read the 2020 Measuring Business Impact of Learning Research Report

Andrew Downes

About The Author

As one of the authors of xAPI, Andrew Downes has years of expertise in data-driven learning design. With a background in instructional design and development, he’s well versed in creating learning experiences and platforms in corporate and academic environments.