Learning analytics supports learning and development in many ways. So how do organizations use learning analytics today? And how do they want to apply these analytics in an ideal world?
This blog post explores findings from two L&D surveys to gauge how learning analytics is currently being used, and the metrics that sit behind measurement efforts. And we consider the perpetual learning analytics dilemma—does your organization measure the type of learning outcomes that demonstrate business impact?
The state of learning analytics: our approach and methodology
This blog series explores the State of Learning Analytics, combining the results from two recent surveys, which provide insights from more than 1,000 L&D professionals on applying learning analytics in their organizations.
- Measuring the Business Impact of Learning (2023). Since 2016, Watershed has partnered with LEO Learning to conduct the annual Measuring the Business Impact of Learning (MBIL) survey.
- Adopting Learning Analytics: Closing the C-Suite/L&D Language Gap. Watershed teamed up with Chief Learning Officer magazine to survey its readers on topics relating to L&D and learning analytics.
How organizations measure L&D success vs. how L&D measures their success
When we ran these surveys, we were curious to see exactly what organizations measure. And with so much data now available across the learning ecosystem, we can broadly split measurements into three main areas:
- L&D productivity (net new content production)
- Learner experience (productivity and satisfaction, such as number of courses completed and happy sheets)
- Business impact
L&D productivity and learner experience have been entrenched as corporate L&D’s measures of success for many years. The highest-performing learning organizations build on these measures by correlating learning outcomes with tangible, day-to-day business impact.
It’s important to recognize that a blend of all three core groupings is needed to build a rounded view of how L&D is performing as a function, and of the impact it is actually having on the organization as a whole.
But do the measures commonly progress as far as business impact? Our findings suggest not. While 99% of those surveyed want to measure the business impact of learning programs, the majority also rated themselves on the lower end of the learning analytics maturity model (either at basic measurement or data analysis).
Survey results paint a mixed picture of how organizations measure L&D; however, there is a clear tendency toward learning experience analytics, which centers on content and learning experiences rather than on learners themselves and improvements in their performance.
For instance, the MBIL survey asked respondents how their success was measured, and the answers were:
- 16% - We're Not Evaluated
- 14% - Content Utilization
- 27% - Learner Satisfaction
- 11% - Return On Investment
- 19% - Organizational Impact
- 14% - Improvements in Job Performance
This shows us that 41% of organizations measure L&D departments using metrics relating to the learning experience—such as content utilization (14%) and learner satisfaction (27%).
Digging into this further, we also asked how metrics were set, factoring in a blend of metric types being used:
- 18% - L&D consults with the wider business so success can be correlated with business impact (e.g. increase in sales)
- 31% - L&D sets the metrics (KPIs focus on program effectiveness, such as completion rates and learner satisfaction)
- 51% - A blend of both
When we looked at the breakdown of the data in even more detail, we saw that more advanced organizations (those considered to be ‘strategic partners’) were almost three times more likely to use business impact metrics than those considered a ‘shared service’.
For more context on the traits that define both strategic partners and shared services, check page 9 of our Measuring the Business Impact of Learning in 2023 report.
What is L&D evaluating with learning analytics?
The CLO survey results show that L&D departments use learning analytics to evaluate learners, content, and programs. Specifically:
- 66% use them to evaluate and improve learner experiences.
- 59% use them for compliance reporting.
- 50% use them for learning content analytics.
L&D teams were less likely to use analytics relating to performance:
- 44% use them for performance and learning.
- 39% use them for correlating learning performance with business KPIs.
- 28% for correlating learning performance with skills.
And finally, teams were least likely to use learning analytics for broader reporting and management:
- 38% use them for organizational reporting.
- 35% use them for learning programs.
- 34% use them for individual/team management.
Compliance: mandatory reporting remains a business essential
Compliance reporting was the second most-selected option for using learning analytics, with 59% of survey respondents listing it in their top 5 uses for learning analytics.
However, when asked how they would like to use learning analytics in an ideal world, only 38% included compliance—it was the second least-selected option.
This was the case even though they could have selected every applicable option. In fact, compliance reporting was the only type of learning analytics where the percentage of organizations selecting that option decreased from current to ideal usage.
Does this imply that organizations want to move away from compliance reporting with their learning analytics? A more nuanced reading is that organizations simply have a stronger desire to build learning analytics capability in other areas.
Compliance training often isn’t the most interesting and engaging L&D offering. And especially as some organizations seek to move from a sole focus on compliance training to a learner-centric, performance-focused approach, it can be tempting to think compliance reporting is no longer needed.
But compliance reporting is not going away, and stakeholders—such as regulators and managers—still need to know that people have completed the required training. The good news is that the same modern learning analytics platforms that you might implement to measure performance can also help you report on compliance more efficiently.
Learning analytics platforms can combine compliance data with HR information to create compliance dashboards that automatically update. As a result, managers are empowered to keep track of their teams, while senior leaders and stakeholders have an overall view of organizational compliance.
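As a loose sketch of how such a dashboard view might be assembled, here is a minimal Python example using pandas. The employee IDs, teams, course name, and field names are all illustrative assumptions, not data from the surveys; a real implementation would pull these records from your LMS/LRS and HR system.

```python
import pandas as pd

# Hypothetical HR roster (illustrative data only)
hr = pd.DataFrame({
    "employee_id": [1, 2, 3, 4],
    "team": ["Sales", "Sales", "Support", "Support"],
})

# Hypothetical compliance-training completion records
completions = pd.DataFrame({
    "employee_id": [1, 3],
    "course": ["Data Privacy", "Data Privacy"],
    "completed": [True, True],
})

# Left-join on the roster so employees with no completion
# record still appear, counted as incomplete
status = hr.merge(completions, on="employee_id", how="left")
status["completed"] = status["completed"].fillna(False).astype(bool)

# Per-team compliance rate for a manager-facing dashboard view
dashboard = status.groupby("team")["completed"].mean().mul(100).round(1)
print(dashboard.to_dict())  # {'Sales': 50.0, 'Support': 50.0}
```

Rerunning this aggregation on a schedule (or on each new completion event) is what lets the dashboard "automatically update" as described above.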
We want to measure performance, but we don’t
Less than half of the respondents said they use learning analytics for performance reporting.
When asked about their ideal-world analytics application, however, performance-related analytics were significantly more popular, accounting for three of the five most common responses.
When we compared current uses of learning analytics to ideal ones:
- “Performance and learning” rose from 44% of respondents to 73% (+29 percentage points)
- “Correlate learning performance with business KPIs” rose from 39% of respondents to 73% (+34 points)
- “Correlate learning performance with skills” rose from 28% of respondents to 64% (+36 points)
To put it another way, approximately a third of respondents are not currently using learning analytics to measure performance but would like to.
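For clarity, those gaps are percentage-point differences between ideal and current usage, not relative growth. A few lines of Python, using the survey figures quoted above, make the arithmetic explicit:

```python
# Survey figures quoted above: % of respondents, current vs. ideal usage
current = {
    "performance and learning": 44,
    "correlate learning performance with business KPIs": 39,
    "correlate learning performance with skills": 28,
}
ideal = {
    "performance and learning": 73,
    "correlate learning performance with business KPIs": 73,
    "correlate learning performance with skills": 64,
}

# Difference in percentage points (not relative percent change)
deltas = {use: ideal[use] - current[use] for use in current}
print(deltas)
```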
If you want to measure learning’s impact on job performance and business KPIs, you must design learning programs with specific performance improvements and business impacts in mind. If you don’t develop a program to deliver a particular change in performance and practice, it becomes difficult to measure the impact of the training.
You also need to start collecting data about job performance alongside L&D data in a learning analytics platform. Otherwise, you won’t have any concrete evidence to show what’s working (and what’s not).
Learning analytics: an industry in motion
Learning analytics are critical for helping organizations measure L&D’s impact on business outcomes. After all, if organizations take the time to invest in learning initiatives, shouldn’t they want to ensure those initiatives are worth the investment? This echoes the CLO report’s finding that the C-suite wants a shift toward tangible reporting on impact.
And while we’ve come a long way from when learning analytics was a rare term, we’re still faced with challenges—such as budget, competing priorities, and lack of stakeholder buy-in—when it comes to measuring learning’s impact.
We hope this blog series inspires your learning analytics journey and helps you measure your organization’s success.
About the author
As Watershed’s CEO, David Ells takes great pride in leading a dynamic team and turning innovative ideas into reality. His journey within the company began as the Director of Technology, where he played a pivotal role in Watershed’s inception, leading the initial build of the product. His passion for technology and development found its roots at Watershed’s sister company, Rustici Software, where he joined in 2008 as a developer. During his tenure there, he contributed significantly to the creation of SCORM Cloud, a groundbreaking product in the e-learning industry, and led the development of the world’s first learning record store powered by xAPI. And now, as CEO, he’s committed to driving us toward success and delivering exceptional solutions. David’s unique blend of technical expertise and visionary leadership makes him an invaluable asset to the Watershed team.