Fast Lane to Learning Measurement

Webinar Summary

Executive pressure to prove learning's impact has doubled since 2016.

In September 2016, Watershed partnered with LEO to survey hundreds of L&D professionals worldwide about learning measurement within their organizations. We repeated that survey at the close of 2017, and it raised some warning signs for the future of L&D.

Andrew Downes and Piers Lea hosted this 30-minute crash course webinar, where they recapped survey results and showed how to track and measure learning within organizations and tell stories that resonate with the C-suite. Specifically, Andrew and Piers:

  • recapped the most pressing challenges reported by survey respondents;
  • covered levels of measurement sophistication—including content usage, performance improvement, and organizational impact;
  • explored examples of how to get started with your existing systems; and
  • provided free resources so you can get some results before asking for a budget increase. 

Recording


Attendee Questions

How might video length optimization impact business outcomes (ROI of training)?

Andrew: Great question! The example we reviewed in the webinar explored how video length affected how likely people were to watch a video in the first place, and how likely they were to complete it once they started watching.

That sounds fairly far removed from ROI (and it is), but think about it in terms of the chain of evidence we looked at—from the learning experience through to the business impact:

  • Meeting the business outcome is dependent on people doing something different/better.
  • That change in action is dependent on people learning.
  • That learning is dependent on people watching the video that’s designed to teach them.

So, if you collect data showing that most people aren’t watching the video in the first place, then you know the video is not having an impact on the business outcome; how could it? In this scenario, you can use data about all your videos to determine which ones are most watched, then apply what you learn to increase how many people watch your videos in the future.
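To make that concrete, here’s a minimal sketch in Python of the kind of analysis involved, run over xAPI statements already exported from an LRS. The verb IRIs, the field access, and the idea of counting unique viewers per video are illustrative assumptions; your video player may send different verbs, so check what’s actually in your data.

```python
from collections import defaultdict

# Illustrative verb IRIs -- many video players follow the xAPI video profile,
# but check which verbs your player actually sends.
PLAYED = "https://w3id.org/xapi/video/verbs/played"
COMPLETED = "http://adlnet.gov/expapi/verbs/completed"

def video_watch_rates(statements):
    """Per video activity, count unique viewers who started it and who finished it.

    `statements` is a list of xAPI statements (plain dicts parsed from JSON)
    already exported from your LRS.
    """
    started = defaultdict(set)    # video IRI -> actors who pressed play
    finished = defaultdict(set)   # video IRI -> actors who completed it

    for stmt in statements:
        verb = stmt["verb"]["id"]
        video = stmt["object"]["id"]
        # mbox is only one of several ways xAPI can identify an actor
        actor = stmt["actor"].get("mbox") or stmt["actor"].get("name", "unknown")

        if verb == PLAYED:
            started[video].add(actor)
        elif verb == COMPLETED:
            finished[video].add(actor)

    # Completion rate = unique completers / unique starters, per video
    return {
        video: len(finished[video]) / len(viewers)
        for video, viewers in started.items()
    }
```

Comparing those rates against each video’s length is then a short step toward answering the original question of which videos are worth trimming or reworking.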

It feels a little worse [than pushing a boulder uphill].

Andrew: Getting started with xAPI can be really straightforward, especially if you’re using a tool that has a good implementation of xAPI—such as one of our certified data sources. [I’ve been working with the person who made this comment, and they’ve been struggling with an application that’s not on our certified data list. It has been an uphill battle (and I think we both deserve a medal), but I’m confident we’ll reach a solution soon!]
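For anyone wondering what “getting started” can look like at the data level, here’s a minimal sketch of sending a single xAPI statement to an LRS over HTTP in Python. The endpoint, credentials, activity ID, and actor details are placeholders; in practice a certified data source or authoring tool generates statements like this for you.

```python
import requests

# Placeholder endpoint and credentials -- substitute the details of your own
# LRS (Watershed or any other conformant LRS).
LRS_ENDPOINT = "https://lrs.example.com/xapi"
LRS_KEY = "your-key"
LRS_SECRET = "your-secret"

# A minimal xAPI statement: "<actor> completed <activity>"
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/onboarding-101",
        "definition": {"name": {"en-US": "Onboarding 101"}},
    },
}

response = requests.post(
    f"{LRS_ENDPOINT}/statements",
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=(LRS_KEY, LRS_SECRET),
)
response.raise_for_status()  # on success, the LRS returns the new statement ID(s)
```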

It rocks to be on the bleeding edge. We are happy to be part of it. The dream is becoming reality.

Andrew: It’s great that people are getting excited about xAPI. It’s new for a lot of organizations, but five years since the launch of xAPI 1.0.0, I’d challenge the “bleeding edge” label. The dream is definitely becoming reality, and we’re seeing more and more enterprise-wide deployments of xAPI as it moves from the cutting edge to the mainstream.

What authoring system is Verizon using to put in the xAPI?

Andrew: Verizon is using authoring tools, but the particular e-learning course they started with was custom built by a content vendor. If you’re looking for an authoring tool with a good xAPI implementation, take a look at our certified data sources list.

What is preventing L&D leaders from making measurement a priority?

Andrew: Competing priorities are the highest-ranked obstacle to investing in a learning measurement strategy. One reason is that many L&D practitioners are busy pumping out learning content but don’t have time to fully understand whether that content is actually making any difference. The irony is that if we take the time to step back and collect data to improve the effectiveness of our learning provision, we can actually be more effective with less content.

What Piers said during the webinar about a mindset change is also really important. People just aren’t used to doing measurement, and it requires a significant change in mindset and ways of working. L&D leaders need to be bold and make that change both in their departments and in the expectations of the business.

Piers: “Competing priorities” is the biggest reason shown in our research across the last two years. However, we’ve been conducting workshops at the Royal Institution in London during the last year and have discovered there are other reasons. In particular, “knowing where to start” and lacking key skills seem to be big issues. The great thing is that the supply market can now help overcome these, and that is how our group response is set up. There are also plenty of great case studies to learn from now. It really is just a question of getting started. Two years ago, I don’t think the supplier market was ready to step in and help with a modern data-driven approach. Now, we are ready to go!

I dare say #4 [of the 5 steps to get started with learning analytics] is most critical. Data and analytics for L&D starts with curiosity.

Andrew: Step 4, Explore & Analyze, is definitely an important one, and the Watershed clients I think of as squeezing the most value out of Watershed are those who take exploring the data to the next level: coming up with innovative reports, visualizations, and insights, and pushing the product to its limits in ways we’d never even thought of. Some of the people we work with have developed reputations among their colleagues for their obsession with data, and rumors have started about how they wake up in the middle of the night to configure one more bar chart or tinker with measures. It can get quite addictive once you start to find actionable insights that lead to real improvements.

Since we’re ranking the five steps (you started it!), I’d like to award Step 2, Get to Know Your Data, the title of most overlooked step. When connecting a new data source (especially for new xAPI implementations and CSV imports) and setting up reports for the first time, it’s really important to make sure all the data is accurate and showing what you think it’s showing, so you don’t run into surprises down the line.
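A lightweight way to do that check is to pull a recent sample of statements from the newly connected source and summarize what’s actually arriving. Here’s a rough sketch in Python; the endpoint and credentials are placeholders, and the specific checks (which verbs, which activities, whether result data is present) depend on what you expect the source to send.

```python
from collections import Counter

import requests

# Placeholder endpoint and credentials -- swap in your own LRS details.
LRS_ENDPOINT = "https://lrs.example.com/xapi"
AUTH = ("your-key", "your-secret")

# Pull a recent sample of statements from the newly connected source
resp = requests.get(
    f"{LRS_ENDPOINT}/statements",
    params={"limit": 50},
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=AUTH,
)
resp.raise_for_status()
statements = resp.json()["statements"]

# Quick sanity checks: which verbs and activities are actually arriving,
# and do any statements lack the result data you plan to report on?
print(Counter(s["verb"]["id"] for s in statements))
print(Counter(s["object"]["id"] for s in statements))
print("statements missing a result:", sum(1 for s in statements if "result" not in s))
```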

General Comments:

Great point! There's a huge difference between asking, "What was learning's impact?" and asking, "Did learning impact people's performance and behavior in a way that moved the business goal in the way we planned?"

Yes! What capability do people need to impact their performance in a way that helps them achieve business goals, and how does learning and development influence skills and behaviors in a way that allows them to do so?
