Welcome to our fifth installment exploring results from our Learning Analytics Research Study and what they mean in practice. This week, we’re diving into the dimensions under the Learning Program category. (We’ll pick up the Learning Experience category next week.)
Learning Program Dimensions
Learning Program analytics means looking at a group of learners on a particular program of learning. A common question is whether people have completed required program elements, but this category can also include questions about a learning program’s effectiveness in improving job performance or driving business KPIs.
We identified four dimensions under the learning program category:
Attempt: comparing different attempts at a course or other program element
Status: comparing people with different statuses for a program (e.g. those whose status is “passed” against those whose status is “failed”)
Interaction: data organized by individual learner interactions, often displayed in interaction streams (e.g. being hired, completing a program of learning, or simply interacting with an e-learning course element)
Program: data about the program presented together, which is often displayed in Watershed’s program report (e.g. completed learner milestones, amount of time learners took to finish, or program completion percentages of various groups or individuals)
In practice, most of the reports we observed under the Learning Program category were program reports, making up nearly 87% of the view share. Most of the remaining views in this category (nearly 12%) went to interaction reports and leaderboards organized by interaction.
Views by report type (Learning Program category only)
So, how popular is the learning program category in practice?
Looking at our real-world learning analytics data, we found that even though organizations in our study created nearly double the number of Learning Experience reports compared to Learning Program or Learner reports, Learning Program reports had significantly higher average view counts per report. (This remains true, though to a lesser extent, if we exclude the organization with the most report views, which also mainly uses Learning Program reports.)
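To make this comparison concrete, here’s a minimal sketch of the underlying arithmetic. The numbers below are made up for illustration and are not the study’s actual figures; they just mirror the pattern described above (more Learning Experience reports, but higher views per report for Learning Program):

```python
# Hypothetical tallies by category (illustrative only; not the study's data).
report_counts = {"Learning Experience": 200, "Learning Program": 110, "Learner": 105}
view_counts = {"Learning Experience": 4000, "Learning Program": 6600, "Learner": 2100}

# Average views per report for each category.
avg_views = {
    category: view_counts[category] / report_counts[category]
    for category in report_counts
}

# Print categories from most to least viewed per report.
for category, avg in sorted(avg_views.items(), key=lambda item: -item[1]):
    print(f"{category}: {avg:.1f} views per report")
```

With these sample numbers, Learning Experience has the most reports, yet Learning Program comes out well ahead on average views per report, which is the shape of the result described above.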
What does this mean?
This information provides some helpful insights:
- Reports within the Learning Program category, which look at a group of learners within the context of a particular learning program, tend to be used most frequently. That doesn’t necessarily mean these reports are more useful or impactful in terms of resulting actions, but it’s certainly a good indication that Learning Program reporting is worthwhile.
- Seasonal changes, such as business financial cycles, can affect Learning Program report usage because these reports are most relevant while programs are running, especially around deadlines (e.g. for a product training program, you might need to complete the training before the product launches). In fact, looking at our data over time, Learning Experience reports received more views than Learning Program reports in the second half of the year, while Learning Program report views spiked around the end of the financial year in April.
Here are two actions you can take away from this blog post:
- Learning Program reports tend to get the most attention, so make these reports available to your organization’s managers!
- When reporting on learning program data, favor specialized learning program reports (such as Watershed’s program report) over more generic visualizations (such as pie or bar charts).
Up Next: The Learning Experience Category
Next week, we’ll start to explore the different dimensions within the Learning Experience category. We identified a lot more dimensions under this category, so stay up to date and sign up for Watershed Insights for the latest blog posts.
What types of learning analytics are you using?
Download this checklist to start tracking the analytics you already have in place and where you’d like to go next.