Learning Program Analytics

    

Now that you know how to use analytics resulting from both learning experiences and learners themselves, it's time to explore learning program analytics—which seek to understand how an overall learning program is performing.

Focus on L&D's Strategic Impact

What are learning program analytics?

Typically, a learning program encompasses many learners and many learning experiences, although it could easily contain just a few of either. Learning program analytics focus on the overall effectiveness of the program and, particularly, its impact on the business. (These questions often require measuring business data from activities outside of the learning function.)

These analytics answer questions about learning's strategic impact on the business, such as:

  • Do learners behave differently after completing training?
  • Has organizational performance improved because of learning?
  • Has this method of learning saved the company money?
  • Which learning methodology is most effective for improving organizational performance?

Apply Learning Program Analytics

Here's one way you can apply all four levels of complexity in the learning program analytics category. In this example, we'll focus on a company's customer service onboarding program, where the goal is to maximize customer satisfaction.

1) Learning Program Measurement

Track the assessment scores and question results from the final assessment in new hire training. Track the customer satisfaction scores that each agent receives.
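
To make this concrete, here is a minimal sketch (in Python with pandas) of how that tracked data might be laid out. The column names, scales, and values are illustrative assumptions, not a prescribed schema.

    # Illustrative layout for the tracked measurement data (hypothetical columns).
    import pandas as pd

    records = pd.DataFrame({
        "agent_id":         ["a01", "a02", "a03", "a04"],
        "cohort":           ["2023-01", "2023-01", "2023-02", "2023-02"],
        "assessment_score": [92, 78, 85, 64],        # final assessment, 0-100
        "q1_correct":       [1, 1, 1, 0],            # per-question results (1 = correct)
        "q2_correct":       [1, 0, 1, 1],
        "q3_correct":       [1, 1, 0, 0],
        "avg_csat":         [4.6, 3.9, 4.2, 3.1],    # average customer satisfaction, 1-5
    })

    print(records.head())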

2) Learning Program Evaluation

Graph assessment scores across training cohorts. Is there an even distribution? If not, why? Are more and less experienced candidates evenly distributed across cohorts? Are some cohort instructors more effective than others?
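
One way to sketch that comparison, continuing with the hypothetical records data from the measurement step above, is a group-by summary and a box plot per cohort. The grouping column and plot choice are assumptions for illustration only.

    # Compare assessment score distributions across training cohorts.
    import matplotlib.pyplot as plt

    summary = records.groupby("cohort")["assessment_score"].describe()
    print(summary[["count", "mean", "std", "min", "max"]])

    # One box per cohort makes uneven distributions easy to spot.
    records.boxplot(column="assessment_score", by="cohort")
    plt.title("Final assessment scores by training cohort")
    plt.suptitle("")
    plt.ylabel("Assessment score")
    plt.show()

If one cohort's box sits noticeably lower or spreads much wider than the others, that's a cue to dig into cohort makeup or instructor differences.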

3) Advanced Learning Program Evaluation

Run the assessment scores and average customer satisfaction scores of each agent through a correlation engine. Is there a strong positive correlation between assessment score and customer satisfaction score? If not, is it possible that the assessment really isn’t effective at distinguishing more competent agents from less competent agents?
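
A "correlation engine" here can be as simple as a Pearson correlation with a p-value. The sketch below continues with the hypothetical records data and is only meant to show the shape of the check, not a production analysis.

    # Correlate final assessment scores with average customer satisfaction.
    from scipy.stats import pearsonr

    r, p_value = pearsonr(records["assessment_score"], records["avg_csat"])
    print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")

    # A weak or near-zero correlation would suggest the assessment isn't
    # separating more competent agents from less competent ones.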

4) Predictive Learning Program Analytics

Can we find any statistical evidence to indicate that individual question results are good predictors of strong or weak customer service agents? Can we use these predictions to fast-track the strong performers for promotion or schedule mentoring for the weak performers?
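
As a rough sketch of that idea, you could fit a simple classifier that predicts "strong performer" from individual question results. Everything below is an assumption for illustration: the CSAT threshold for "strong," the question columns, and the model choice. A real analysis would need far more data and validation on a held-out set.

    # Do individual question results predict strong customer service agents?
    from sklearn.linear_model import LogisticRegression

    X = records[["q1_correct", "q2_correct", "q3_correct"]]
    y = (records["avg_csat"] >= 4.0).astype(int)   # 1 = "strong performer" (assumed threshold)

    model = LogisticRegression().fit(X, y)
    for question, coef in zip(X.columns, model.coef_[0]):
        print(f"{question}: coefficient {coef:+.2f}")

    # Questions with consistently large positive coefficients (on a larger,
    # validated dataset) would be candidate early signals of strong performers.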

Up Next: Learning Analytics Platforms

Now that we've covered the basics of learning analytics—including the Watershed Method for analyzing learning—we'll explain what a learning analytics platform is and explore how it differs from a learning management system (LMS) or business intelligence (BI) tool. Don't want to miss out? Subscribe to Watershed Insights to receive L&D industry updates, helpful advice, and more!


Getting Started Is Easy

As you can see, learning program analytics open a world of insights and data-driven decision-making. And the possibilities for what you can measure and evaluate about your learning programs are nearly endless. Remember, getting started is easier than you think. Even just a few data points can yield powerful results. Use the following guide to help you get started right now!



Mike Rustici

About The Author

As an innovative software developer turned entrepreneur, Mike Rustici has been defining the eLearning industry for nearly 20 years. After co-founding Rustici Software in 2002, Mike helped guide the first draft of xAPI and invented the concept of a Learning Record Store (LRS). In 2013, he delivered on the promise of xAPI with the creation of Watershed, the flagship LRS that bridges the gap between training and performance.