Learning Analytics Categories [UPDATED]

We’ve introduced our Learning Analytics Research Study and explained how learning categories are split into dimensions. And now it’s time to explore results from our study, starting with an overview of these categories, including two new categories we discovered.

What are the categories of learning analytics?

As you may recall, our Learning Analytics Triangle comprises the following categories:

1) Learner Analytics

These seek to understand more about a specific person or group of people engaged in activities where learning is one of the outputs. They answer questions about usage patterns and performance for specific learners, such as who is training the most or whether everyone in a group has completed compliance training.

2) Learning Experience Analytics

These seek to understand more about a specific learning activity. This category often answers questions about usage patterns for a specific activity, such as what are the most-searched resources or topics or how learners navigate the experience.

3) Learning Program Analytics

These seek to understand how overall learning programs (which typically encompass many learners and many learning experiences) are performing, and they answer questions about learning's strategic impact on the business.

But, after digging through our research, we realized that not all our uncovered data and reports fit perfectly into one of our original categories. So, we created two new categories—or bonus categories, if you will. (NOTE: We’re unlikely to add these bonus categories to the Learning Analytics Triangle since they’re not strictly tied to learning analytics.)

4) Business Performance

As we conducted our research, we found a few reports that didn’t fit into any of our original learning analytics categories. That’s because these reports didn’t actually contain learning analytics. Rather, these turned out to be instances of non-learning analytics being mixed in with learning analytics.

For instance, several organizations that were tracking KPI data over time or by business division were bringing that same KPI data into Watershed so they could compare it with learning data—but they were also reporting on KPI data without the learning data.

As a result, we classified these reports under the dimension “KPI-Related Items” and placed them in a new category called “Business Performance” (see the following example).

5) Depends on the Data

We also found a few dimensions that didn’t neatly fit into one of our three main categories. So, we created another category called “Depends on the Data” that covers the following dimensions:

  • None. Some reports didn't have any dimension at all, perhaps displaying only a single number (e.g. the total number of active learners for all time, or the average NPS score across all courses). These single-number reports can relate to any of the categories, so a dimension of "none" doesn't fit into just one.
  • Classification & Situation. “Classification” and “situation” dimensions can sit under any category since it’s possible to have classifications and/or situations of learners, learning experiences, etc.

What are dimensions?

We've split each category into several dimensions, which show how data is organized, or what you're comparing in a report. For example, in a leaderboard showing top learners, the dimension would be people. In a line report showing completions by month, the dimension would be month.
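In practice, a dimension works like a group-by key. Here's a minimal sketch (using made-up records and field names, not Watershed's actual data model) showing how the same data produces different reports depending on which dimension you group by:

```python
from collections import defaultdict

# Hypothetical report data: each record is one completion event.
records = [
    {"learner": "Ana", "month": "2024-01"},
    {"learner": "Ben", "month": "2024-01"},
    {"learner": "Ana", "month": "2024-02"},
]

def count_by(records, dimension):
    """Group records by the given dimension and count them,
    as a line chart or leaderboard report would."""
    counts = defaultdict(int)
    for record in records:
        counts[record[dimension]] += 1
    return dict(counts)

# Same data, two reports: the dimension determines what you compare.
print(count_by(records, "month"))    # → {'2024-01': 2, '2024-02': 1}
print(count_by(records, "learner"))  # → {'Ana': 2, 'Ben': 1}
```

Grouping by "month" gives a completions-over-time report; grouping by "learner" gives a leaderboard.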

So, how popular are these categories in practice?

Looking at our data of real-world learning analytics, we found that even though organizations in our study created nearly double the number of Learning Experience reports compared with Learning Program or Learner reports, Learning Program reports had significantly higher average view counts per report. (This still holds, though to a lesser extent, if we exclude the organization with the most report views, which also mainly uses Learning Program reports.)
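The comparison above is between average views per report, not total views. A quick sketch with invented view counts (the numbers below are illustrative, not from our study) shows how a category with fewer reports can still lead on this metric:

```python
# Hypothetical view counts per report, chosen to illustrate the pattern:
# more Learning Experience reports exist, but Learning Program reports
# each get heavier use.
report_views = {
    "Learning Experience": [10, 5, 8, 12, 6, 9],  # many reports, modest views
    "Learning Program":    [40, 55, 35],          # fewer reports, many views
    "Learner":             [7, 4, 6],
}

def average_views(views_by_category):
    """Average view count per report within each category."""
    return {
        category: sum(views) / len(views)
        for category, views in views_by_category.items()
    }

for category, avg in average_views(report_views).items():
    print(f"{category}: {avg:.1f} views per report")
```

With these invented numbers, Learning Program averages about 43 views per report versus about 8 for Learning Experience, even though Learning Experience has twice as many reports.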

What does this mean?

This information provides some helpful insights:

  • Reports within the Learning Program category, which look at a group of learners within the context of a particular learning program, tend to be used most frequently. That doesn’t necessarily mean these reports are more useful or impactful in terms of resulting actions, but it’s certainly a good indication that Learning Program reporting is worthwhile.
  • Seasonal changes, such as business financial cycles, can affect Learning Program report usage because these reports are most relevant while programs are running, especially around deadlines (e.g. completing program milestones). In fact, if we look at data over time, Learning Experience reports have more views than Learning Program reports in the second half of the year, while Learning Program report views spike around the end of the financial year in April.

Actionable Insights

Here are three actions you can take away from this blog post:

  1. Learning Program reports tend to get the most attention, so make these reports available to your organization’s managers!
  2. Reports looking at the use and effectiveness of learning experiences are also popular. So, make these reports available to people responsible for sourcing content and providing experiences.
  3. Learner reports tend to be used less than the other categories, so they may not be the best place to start your learning analytics journey.

Up Next: Learner Categories

Next week, we’ll dive deeper into the dimensions within the learner categories and explore how organizations report on the data about their learners. Sign up for Watershed Insights so you don’t miss out!


