5 Ways the Learning Analytics Landscape Is Evolving

    

After combining the data from annual learning measurement surveys with L&D practitioner stories shared during Insights Summits, we’ve noticed a growing shift in the way both L&D professionals and the organization as a whole view learning analytics. That is, L&D doesn’t just want to measure learning's impact; the broader organization is actually prioritizing it. In this post, we’ll discuss the meaning behind these trends and how L&D departments can continue to develop their measurement practices.

Measuring Learning's Impact on the Business

During the past three years, Watershed and LEO Learning have conducted an annual measurement survey to consolidate the individual voices of thousands of market participants—including instructional designers, chief learning officers, and learning technology providers from all over the world.

We now have enough data to start spotting meaningful trends in how the opinions of L&D departments are changing over time. It can be difficult to change institutionalized beliefs about whether an activity—such as learning analytics—is worth the time, but changing this mindset is becoming more attainable, as you'll see below. 


Read the in-depth third annual measurement survey report to dig deeper into the charts, statistics, and general takeaways.


1) Desire + Belief = Action

I want to measure the impact of learning.

During the last three years, we’ve gone from 3 out of 4 respondents expressing some level of wanting to measure the business impact of learning programs (those who agree or strongly agree) to 19 out of 20 today.

It's possible to show learning's impact.
Further, respondents feel more strongly than ever before that it's possible to demonstrate learning’s impact.

People want to demonstrate the impact of learning, and they believe they can. This change is due in part to the innovative projects that organizations such as Visa, Verizon, PwC, and Applied have undertaken and shared with the world via conferences, xAPI cohorts, and case studies.

These projects show a wide audience of learning professionals what’s possible in a modern, data-connected learning ecosystem that’s guided by a learning analytics strategy. But this change also is driven by the increasing number of vendors in our space offering high-quality, native xAPI implementations—or at least accessible data exports that can be ingested as xAPI data.
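As a concrete illustration of that second driver, a vendor's data export can often be mapped into xAPI with very little code. The sketch below converts one row of a hypothetical LMS export into a minimal xAPI statement (actor, verb, object), following the statement structure defined in the xAPI specification. All field names in the example row are made up for illustration, not taken from any specific vendor.

```python
# Hypothetical sketch: mapping one row of a generic LMS data export
# into a minimal xAPI statement (actor / verb / object).
# Column names in `row` are illustrative assumptions.

def row_to_xapi_statement(row):
    """Build a minimal xAPI statement dict from an export row."""
    return {
        "actor": {
            "mbox": f"mailto:{row['email']}",
            "name": row["learner_name"],
        },
        "verb": {
            # "completed" is one of the commonly used ADL verbs
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-US": "completed"},
        },
        "object": {
            "id": row["course_url"],
            "definition": {"name": {"en-US": row["course_title"]}},
        },
    }

# Example row from the hypothetical export
row = {
    "email": "pat@example.com",
    "learner_name": "Pat Example",
    "course_url": "https://example.com/courses/onboarding-101",
    "course_title": "Onboarding 101",
}
statement = row_to_xapi_statement(row)
```

Once export rows are in this shape, they can be posted to any conformant Learning Record Store.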




In any case, what’s really important about these survey results is what they predict.

If you have the desire to do something and you believe it’s possible, there’s not much standing in the way of taking action.

These results reflect the current reality in which interest within the market is converting to action—whether that's experiments, pilots, or all-out enterprise deployments. Of course, in an organizational context where you have many players and stakeholders, it may not be enough for you alone to have the desire and belief. In this case, demand from the organization is the other key ingredient.

2) Executive expectations continue to rise.

I feel executive pressure to measure learning's impact.

The share of respondents feeling this pressure has jumped from around a third in the initial results to more than two-thirds today. That’s a shift to a solid majority, and it’s the fastest-moving trend across the entire survey.

Learning professionals are no longer the only audience paying attention to the changing tides of learning analytics. The executives—who are responsible for directing major initiatives and changes to keep their organizations alive and relevant—are taking notice as well.

As so many of us know, the attention of senior executives is often driven by need. And the growing demand suggested by these survey results indicates an increasing organizational awareness of the need for effective development of people in the workplace.

But how will the success of learning programs actually be measured?

3) There are different ways to measure learning’s success.

How is learning success evaluated?

Though the results are more subtle here, there's a shift away from non-evaluation and ROI-based evaluation toward evaluation based on “organizational impact.”

Historically, organizations have attempted to directly measure the financial return of learning programs, an approach that can obscure the important cause-and-effect relationships driving that return. But we're seeing a healthy shift toward a more direct measure of learning in the form of its impact on operational metrics and internal KPIs, which should ultimately translate to the bottom line.

This shift in attitude agrees with what our own clients and other interested parties are telling us. They ultimately want to connect data from learning activities with the operational performance metrics of their learners.

For example, connecting these dots is a crucial part of understanding not only whether the recently launched video platform in your enterprise is getting engagement, but also if it’s actually making a difference in customer satisfaction scores or sales results. Being able to connect these dots and quantify the impact of new initiatives or changes in approach is a powerful way to drive further investment in learning.

So what’s getting in the way of measuring the success of learning?

4) Actionable challenges create possibilities.

Challenges to measuring learning's impact.
The perception of difficulty, cost to value, and lack of priority for measuring the impact of learning are decreasing.
 

There’s a fascinating shift away from institutional challenges to operational challenges.

In the initial survey, respondents cited obstacles such as:

  • It’s too hard,
  • It’s too costly,
  • No demand, and
  • Competing priorities.

Now, they’re reporting their main obstacles are:

  • Don’t know where to start,
  • No access to data, and
  • Other (which we can assume to be more miscellaneous, detailed, operational challenges).

As the possibilities of learning analytics become clearer and demand increases, the perception of difficulty, cost to value, and lack of priority are all decreasing. And they’re slowly being replaced with the sort of challenges people will identify when they are ready to take action, such as how to get started or how to start collecting data. These are much more tractable challenges!

5) Marry data with storytelling to initiate change.

Changing institutionalized beliefs about whether an activity is worth the time is difficult, but using data backed with a narrative that aligns with the business goals can help. To change how learning analytics is viewed and used in your organization, start first within your team, and build out from there.

Need some inspiration? Stay tuned for our next post, as David Rosenfeld shares how he used data storytelling to gradually advance the measurement practices of his L&D colleagues.


Don’t want to wait?

Check out our eBook, 5 Steps to Getting Started with Learning Analytics, to further develop your learning measurement practice.


David Ells

About The Author

David Ells has played key roles in the development of SCORM Cloud, the xAPI spec, and more. As our CEO, he leads our team and loves turning ideas into reality.