More Than Completions: The Business Case for Learning Program Analytics

Learning program analytics helps you answer a wide range of questions by reporting on learner interactions across an entire learning program. It can tell you who has completed training, whether there are glitches in a new platform, and where learners drop off.

This post explores the case for learning program analytics in general, and subsequent posts will break down and explain these analytics by type (e.g. learning game, ecosystem, compliance reporting, platform launch, crisis learning, and academic learning).

So far in our Building a Business Case for Learning Analytics series, we’ve looked at types of analytics that focus on either the learner or the learning content. For the next part of this series, we’ll explore the business case for learning program analytics, which looks at both the learner and a learning program.

What Is Learning Program Analytics?

Learning program analytics looks at both the learner and a learning program to gain insights into a program’s overall effectiveness and impact on the business. For example, learning program analytics for a program showing managers how to give good feedback might look at:

  • The progress of managers through the course, including which managers have completed the program
  • Results from any program assessments
  • The extent to which managers apply the learning to give better feedback

A learning program is a collection of learning experiences that a cohort of learners is hoped, expected, or required to complete. This includes everything from formal compliance training to informal learning games.

A program may be a collection of similar activities, such as a series of courses followed by assessments. Or it might take a blended learning approach and offer a mix of eLearning, videos, instructor-led training, games, work tasks, and other activities while spanning multiple platforms.

These collections of learning activities are typically brought together as a program to address a particular learning objective. For example, a program might focus on a specific:

  • competency or group of related competencies,
  • set of compliance requirements, or
  • change management goal.

How Can I Use Learning Program Analytics?

Learning program analytics can help you answer questions such as:

  • How many and what proportion of enrolled people have completed the program?
  • How far are people progressing through the program?
  • What is people’s performance in the program like (e.g. assessment scores)?
  • What problems are people running into with the program, and where?
  • Which parts of the program are most used or most effective?
  • How do different groups or individuals compare in terms of progress and scores?

When a program runs over a specific time period (e.g. a learning program scheduled before launching a new product), real-time data monitoring can be especially beneficial to highlight any issues with the program or groups of learners who are struggling or aren’t engaging with the content. Real-time monitoring requires automatic data collection that's regularly updated and a learning analytics platform that can display up-to-date reports on the latest data.
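As a minimal sketch of this kind of monitoring, the snippet below aggregates hypothetical completion records by team and flags teams whose completion rate falls below a threshold. The record layout, team names, and 50% threshold are illustrative assumptions, not Watershed's actual data model.

```python
from collections import defaultdict

# Hypothetical completion records; in practice these would come from your
# LMS or learning record store rather than an in-memory list.
records = [
    {"learner": "ana",   "team": "sales",   "completed": True},
    {"learner": "ben",   "team": "sales",   "completed": True},
    {"learner": "carla", "team": "support", "completed": False},
    {"learner": "dev",   "team": "support", "completed": False},
    {"learner": "emma",  "team": "support", "completed": True},
]

def completion_by_team(records):
    """Return each team's completion rate as a fraction between 0 and 1."""
    done, total = defaultdict(int), defaultdict(int)
    for r in records:
        total[r["team"]] += 1
        done[r["team"]] += int(r["completed"])
    return {team: done[team] / total[team] for team in total}

def flag_struggling(rates, threshold=0.5):
    """List teams whose completion rate is below the threshold."""
    return [team for team, rate in rates.items() if rate < threshold]

rates = completion_by_team(records)
print(flag_struggling(rates))  # flags the 'support' team (1 of 3 complete)
```

Run on fresh data at regular intervals, a report like this surfaces struggling or disengaged groups while there is still time to intervene.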

This example Watershed dashboard shows some of the reports that can monitor learners’ progress in undertaking a learning program.

What Does Learning Program Analytics Look Like in Practice?

For Caterpillar’s Global Dealer Learning (GDL) team, measuring performance following a learning program is an essential step of their Career Development Process Wheel (see the following image).

Watershed reports and dashboards help them evaluate their training programs and verify that learners achieve required competencies. The GDL team uses this information to inform their next set of goals for future iterations of the career development cycle.

How Can Learning Program Analytics Save Time and Money?

Nuance allows learners to take pre-assessments to test out of training that covers areas or topics they might already know. So if a learner proves they are competent through the pre-test, they don’t have to spend time covering content they already know. The time savings can equate to large cost savings when you have thousands of learners.

But manually tracking individual completions and results was time consuming and complex. So Nuance implemented Watershed to automate this process, giving them up-to-date reports on training completion as needed. By automating the reporting, Nuance was able to realize the cost savings associated with pre-assessments.
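To make the test-out idea concrete, here is a small sketch of that kind of logic. The 80% pass mark, module names, and hours are illustrative assumptions, not Nuance's or Watershed's actual configuration.

```python
# Hypothetical pre-assessment test-out logic: learners skip any module
# they score at or above the pass mark on, and the hours skipped are a
# rough proxy for the time (and cost) saved.

PASS_MARK = 0.8  # assumed pass mark, not a real Nuance setting

def modules_to_take(pretest_scores, required_modules):
    """Return modules the learner must still complete (not tested out of)."""
    return [m for m in required_modules
            if pretest_scores.get(m, 0.0) < PASS_MARK]

def hours_saved(pretest_scores, required_modules, hours_per_module):
    """Estimate training hours saved by the modules a learner tested out of."""
    remaining = set(modules_to_take(pretest_scores, required_modules))
    return sum(hours_per_module[m]
               for m in required_modules if m not in remaining)

scores = {"security": 0.92, "privacy": 0.61}
required = ["security", "privacy", "ethics"]
hours = {"security": 2.0, "privacy": 1.5, "ethics": 1.0}

print(modules_to_take(scores, required))     # ['privacy', 'ethics']
print(hours_saved(scores, required, hours))  # 2.0
```

Multiplied across thousands of learners, even a couple of hours saved per person adds up quickly, which is why automated tracking of pre-assessment results matters.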

How Can Watershed Help Support Analytics for Learning Programs?

Watershed is designed to facilitate reporting on learning programs. Specifically, the Program Report brings together many of the visualizations that organizations typically want to use to track their L&D programs. This includes:

  • reporting on what percentage of the cohort has completed each program milestone, and
  • drill-downs to explore different parts of the organization or program aspects in more detail.
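The first of those visualizations boils down to a simple aggregation. The sketch below computes the percentage of a cohort that has completed each program milestone; the data layout and milestone names are hypothetical, and Watershed's actual Program Report is driven by live learning data rather than in-memory dicts.

```python
# Hypothetical cohort progress data: learner -> milestones completed.
cohort = ["ana", "ben", "carla", "dev"]
progress = {
    "ana":   {"kickoff", "module-1", "assessment"},
    "ben":   {"kickoff", "module-1"},
    "carla": {"kickoff"},
    "dev":   set(),
}
milestones = ["kickoff", "module-1", "assessment"]

def milestone_completion(cohort, progress, milestones):
    """Percentage of the cohort that has completed each milestone."""
    return {
        m: 100.0 * sum(m in progress.get(p, set()) for p in cohort) / len(cohort)
        for m in milestones
    }

print(milestone_completion(cohort, progress, milestones))
# {'kickoff': 75.0, 'module-1': 50.0, 'assessment': 25.0}
```

A drill-down is the same calculation restricted to one department or team, which is why the report can move smoothly from a whole-cohort view to a single group.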

You can supplement these ready-made program reports with configurable line charts, bar charts, leaderboards, and other visualizations tailored to your programs and reporting needs. And because we set up automated data links with all your learning platforms, you can report on programs that include elements from multiple platforms, instructor-led training, or real-world simulations—all in one place and with the latest data available on demand.

We’ve found that learning program analytics reports are among the most viewed reports by our clients. In particular, managers find these reports helpful for monitoring the progress of their people. And Watershed data permissions mean you can configure a program report (or whole dashboard) once, so when individual managers log in, they see only the data about their people. This feature makes Watershed a powerful tool for empowering operational managers with accessible, on-demand data.

Making the Case: Why the Business Needs Learning Program Analytics

Monitoring learning program progress is vital to ensure everything runs smoothly and learners complete training as expected.

For example, your reporting might tell you a particular department or team is not engaging with the program. You can then take appropriate action to follow up with the responsible manager.

Alternatively, you might find that a particular part of the program has low completion rates. On investigation, you learn there is a technical issue with accessing that part of the program, and you can then take steps to fix the problem.

Successful L&D programs are essential. After all, you’ve invested a lot of time and money into developing and launching them. That investment is wasted if the program sees low completion rates or people can’t access all of it due to a technical glitch. Measuring the program’s impact is also important to confirm it delivers the intended business results.

How Can You Convince Stakeholders of the Value?

It’s best to include learning analytics as part of your business case for the entire program from the beginning rather than making a separate case for analytics after the program is in place. That way, you’re thinking about measuring impact right from the start of the program development process, and you have the investment in place.

Where possible, make the case for including Watershed in several programs under consideration at the same time. The more programs that use Watershed for reporting, the more value you will get from Watershed.

That's not to say that adding program analytics to an existing training program can't have value, especially if it's ongoing. Analytics for existing programs will help inform the design and development of future programs and inform the maintenance and support of the existing program itself.

The business case for program analytics in Watershed centers around the business case for the learning program itself. Watershed helps you realize the value of that learning program—ensuring the program is effective and that the investment in learning is not wasted.

Understand your stakeholders and how they will benefit from learning program analytics.

Meet your stakeholders.

Stakeholders | Pain Points | Benefits
C-Suite (CLO, CEO, CFO) | The C-suite wants to ensure the program addresses its intended need. | Learning program analytics help evaluate the success of the program.
Learning Leaders | Learning leaders want to ensure a new program is effective. | Learning program analytics help evaluate a program’s effectiveness.
Instructional Designers | Instructional designers want to make informed improvements to ensure a program’s effectiveness. | Instructional designers can use learning program analytics to monitor and improve the program in real time.
Compliance | The compliance team wants to know if the program is effective and if learners complete it. | The team can use learning program analytics to monitor the completion and impact of compliance programs.
Line Managers | Line managers want to understand their people’s progress through learning programs. | Managers can use learning program analytics to monitor team members’ progress through essential learning programs.
Learners | Learners want to keep track of their progress through a learning program. | Learners can review reports tracking their progress through a program to know what’s complete and what’s remaining.

Next Course: What Are Learning Game Analytics and Why Do You Need Them?

Gamification is a fun, informal, and sometimes competitive approach to learning, and learning games tend to be more creative than a formal eLearning course or assessment.

As such, a game’s activities may not lend themselves to your typical reporting metrics. Additionally, you may need to support competitive elements with reports, such as leaderboards, which are shared with learners.

Learning game analytics is about implementing metrics, reports, and dashboards that provide insights to monitor and evaluate a game's unique elements and players. In the next post, we outline the business case for learning game analytics and explain why reports tailored to the specifics of the game are so important.
