5 Training Measurement Trends from Learning Technologies 2024

The talks at Learning Technologies are always a good chance to see what the L&D and training community is hyped about. And while virtually all sessions included “Artificial Intelligence” somewhere in the title, it was the learning measurement-focused sessions that caught our attention.

One running theme, using data to measure business impact rather than Return on Investment (ROI) alone, reflects the need for L&D to prove its value to the wider business. Zsolt Olah (Data & Analytics expert at Intel, formerly Amazon) summed this up neatly: “ROI is still not enough, as you are still a cost center.”

Here are our 5 takeaways from the data, analytics, and measurement sessions we attended:

  1. High-performing L&D teams don’t do it on their own
  2. Shift to a “Business Value” mindset
  3. You don’t have to measure everything
  4. Prove and improve
  5. Outlier data is king

High-performing L&D teams don’t do it on their own

Engaging your business stakeholders to understand business goals is essential for baking measurement into your learning design. This approach helps assess impact and build a chain of evidence that links your training programs to business KPI shifts. Attempting to understand a program’s impact only after completion rarely works.

Laura Overton’s session on Business-Aligned Learning delivered a neat motto for business alignment: “We, not me.” Before embarking on a program, L&D and the business need to jointly answer the question: what does “good” look like?

Engaging business leaders to develop a shared understanding of learning outcomes is not easy. A common sentiment was that L&D is too easily pushed into delivering outputs without sufficient consultation on purpose. One speaker said their L&D team’s policy was to de-prioritize such requests, playing hardball to drive upfront engagement and enable thoughtful learning program design.

The team at IHG Hotels & Resorts ran an inspirational measurement session packed with practical stories of effectively measuring learning’s impact. Their framework for establishing a program’s core objectives was metric-focused (a rough sketch of the measurement step follows the list):

What metrics do we want a general manager to influence?

  • What are the metrics?
  • How can training influence them?
  • How is the shift measured?
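To make “How is the shift measured?” concrete, here’s a minimal sketch in Python (with pandas). It compares before/after KPI shifts for trained versus untrained general managers. The data, column names, and the simple difference-in-differences comparison are illustrative assumptions on our part, not IHG’s actual method.

```python
import pandas as pd

# Hypothetical KPI snapshots for general managers (illustrative data only)
kpis = pd.DataFrame({
    "gm_id":      [1, 2, 3, 4, 5, 6],
    "trained":    [True, True, True, False, False, False],
    "kpi_before": [72.0, 68.5, 75.2, 71.0, 69.8, 74.1],  # e.g., guest satisfaction
    "kpi_after":  [78.4, 74.0, 79.9, 71.6, 70.2, 73.8],
})

# The "shift" is each manager's before/after change in the metric
kpis["shift"] = kpis["kpi_after"] - kpis["kpi_before"]

# Compare the average shift for trained vs. untrained managers; the
# untrained group acts as a rough baseline for what would have happened anyway
by_group = kpis.groupby("trained")["shift"].mean()
effect = by_group.loc[True] - by_group.loc[False]
print(f"Estimated training effect: {effect:+.1f} points")
```

In practice you’d want larger groups and to control for confounders (region, seasonality, hotel size), but the principle is the same: define the metric, capture it before and after, and compare against a baseline.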

Shift to a “Business Value” mindset

Building on the above point, Laura Overton encouraged a mindset shift to engage with the business. L&D’s traditional approach to discussing program success is too inward-looking, centering on Learning Value (e.g., activity, efficiency, engagement, usefulness).

If you can shift your mindset to appreciate the bigger goals that focus on Business Value (e.g., performance, talent, culture), then as an L&D expert, you open your thinking to how learning program effectiveness is perceived outside of your own department.

A balance of both approaches is something we see “Strategic Partner” L&D teams successfully adopt, as reflected in our most recent Measuring The Business Impact of Learning Report. The most advanced Learning Analytics approaches use a blend of L&D-focused metrics (e.g., learner satisfaction scores, L&D productivity) and business-focused metrics (targeted business KPI shifts).

You don’t have to measure everything

An audience question on the thorny topic of how to make compulsory compliance training both engaging and linked to business KPIs led us to the next takeaway…

You don’t have to measure everything.

It’s true! Measuring training impact is hard work. To do it effectively, L&D needs access to data (i.e., learner, business, and, likely, HRIS data) and engaged business leaders, plus the tech, time, and skill sets to analyze it all. So applying a blanket approach to understanding the impact of every training program is unrealistic.

This sentiment was reflected in the discussions with both Alice Thompson (Marks & Spencer) and Fran Butler and Tayn Pavelic (IHG Hotels & Resorts) in their session on learning impact.

Doubling down on critical projects and applying measurement maps to core tracks enables you to focus and deliver with intent. If a project sponsored by the C-Suite successfully demonstrates impact, you can open the door to further investment in analytics on other tracks.

You can reverse engineer this approach too, and start with the business challenge: XYZ Function is performing worse than its peers, but why? What data can we explore to find out? From a learning data perspective, outlier content can provide a similarly targeted approach (more on this below).

Prove and improve

This is an age-old adage that we discussed in depth in our webinar with Bonnie Beresford (catch the recording: The State of Learning Analytics: Views from 1,000 L&D Professionals).

“From prove to improve” covers the concept of starting small and proving value where you can. It's about getting started, understanding the current state of play, and using the data you have to link impact with learning performance. In short, can you prove learning impact?

“Improve” is where the fun starts. That’s when you start analyzing your data in meaningful ways to improve your learning output. Whether you focus on improving content design, the learner experience, or driving more effective system usage, this stage is all about using the data you have to build from where you are.

Did you know there are more than 20 separate use cases where learning analytics can drive value for different areas of the business? Even better, we have an interactive tool that helps you navigate to the pressing issues relevant to you.

“Improve” places you at the more progressive end of Watershed’s Learning Analytics Maturity model. In short, measure to improve (i.e., design, delivery, and experience).

Outlier data is king

As mentioned, a blanket approach to L&D measurement can feel unwieldy and difficult to derive value from. This is where outlier data can help focus your efforts.

The beauty of this approach is that you can apply it to both business and learning data. For instance, a Watershed client used regional business data to identify spikes in repeat repairs following a new product launch. They reviewed and revised their training approach to address that specific issue, and the results showed the new training program correlated with a reduction in repeat repairs (i.e., the issue was being fixed correctly on the first repair).

Additionally, IHG mentioned during their presentation that they could identify training needs and best practices by exploring hotel KPI data (looking at hotels with either particularly good or bad guest satisfaction scores). In other words, outlier data is fantastic for sparking curiosity.
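As a hypothetical sketch of what that outlier-first step might look like in Python: flag the hotels whose guest satisfaction scores sit unusually far from the mean, then dig into what their training data has in common. The data and the z-score threshold here are illustrative assumptions, not IHG’s actual analysis.

```python
import pandas as pd

# Illustrative guest satisfaction scores per hotel
scores = pd.DataFrame({
    "hotel":        ["A", "B", "C", "D", "E", "F", "G", "H"],
    "satisfaction": [8.1, 7.9, 8.3, 6.1, 8.0, 9.4, 7.8, 8.2],
})

# Standardize scores so "how far from the pack" is comparable
mean = scores["satisfaction"].mean()
std = scores["satisfaction"].std()
scores["z"] = (scores["satisfaction"] - mean) / std

# |z| > 1.5 flags both ends: high outliers suggest best practices worth
# sharing; low outliers suggest where training may be needed
outliers = scores[scores["z"].abs() > 1.5]
print(outliers)
```

The point isn’t statistical rigor; it’s a cheap, repeatable way to decide where to look first.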

The same applies the other way round: you can start from the learning data and focus on training effectiveness first. For example, our friends at Caterpillar used outlier data from their Kaltura platform to understand and improve their learning video content.

To sum up…

Despite the talks being dominated by AI, it was great to see that learning impact is still firmly on L&D’s agenda. The themes above are conversations and challenges we hear firsthand in our everyday discussions with clients and partners.

And we’ll leave you with one side note. If you’re going to make decisions based on data, it’s handy to know what you’re dealing with. L&D data integrity gives confidence and helps prevent “garbage in, garbage out” scenarios that can lead you down the wrong path.
