Analytics are all the rage these days, powering some truly remarkable advances across industries. And while standing under a sign that says “learning analytics” is a great way to get attention in the learning community, what exactly is learning analytics, and what does it mean for corporate learning programs?
In this blog series, we’ll shed light on the exciting—and sometimes confusing—topic of analyzing learning. We’ll outline a model for talking about the different types of learning analytics and describe the various levels of sophistication in analytics.
Learning Analytics Defined
Even though it's a popular topic, there doesn’t seem to be a standard definition or explanation of the term. For instance, one commonly used definition is:
the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs
That’s not a bad place to start, but there’s still a lot of disagreement and confusion about what the term actually means. Complicating things further, most of the existing literature defines the term in an academic context that has subtle but significant differences from how it’s used in a corporate context.
Let's get down to business
In the corporate context, the purpose of learning is to develop people to perform better in their jobs, to prepare them for the next steps in their careers, and to ensure the long-term success of organizations. “Learners” are often better described as “employees,” or sometimes even “prospective employees.”
Learning programs exist to benefit organizational performance, which, in turn, impacts business outcomes. And improving business outcomes is a key priority for most business leaders (i.e., the people who sign your paycheck). Analytics about learners, programs, and experiences link learning with individual, team, and organizational performance.
We suggest that the definition of corporate learning analytics is something more like:
the measurement, collection, analysis, and reporting of data about learners, learning experiences, and learning programs, for purposes of understanding and optimizing learning and its impact on an organization’s performance
Up Next: What does it really mean?
Over the coming weeks, we’ll explore the world of learning analytics and discuss topics such as applications, predictive and prescriptive analytics, and learning experience analytics, to name a few. Be sure to sign up to have the latest installments delivered straight to your inbox.
Many thanks to our friends who donated their time to review this series and provide some great feedback. The opinions expressed in this series are those of Watershed and do not necessarily represent the opinions of any reviewer. No endorsement of Watershed or this series is implied.
- Charles Jennings, Co-founder at 70:20:10 institute
- John R. Mattox, II, Ph.D., Managing Consultant at Metrics That Matter™
- A.D. Detrick, President & Founder at MetriVerse Learning Solutions
- Michael Rochelle, Chief Strategy Officer at Brandon Hall Group
- Dani Johnson, VP, Organizational Learning Research at Bersin by Deloitte
- Lori Niles-Hofmann, Senior Learning Strategist
- Megan Bowe, Vice President of the Board at Data Interoperability Standards Consortium
- Piers Lea, Chief Strategy Officer at LEO and Learning Technologies Group plc
About the author
As an innovative software developer turned entrepreneur, Mike Rustici has been defining the eLearning industry for nearly 20 years. After co-founding Rustici Software in 2002, Mike helped guide the first draft of xAPI and invented the concept of a Learning Record Store (LRS). In 2013, he delivered on the promise of xAPI with the creation of Watershed, the flagship LRS that bridges the gap between training and performance.