Welcome to our fifth installment exploring results from our Learning Analytics Research Study and what they mean in practice. This week, we’re diving into the dimensions under the learning program categories. (We’ll pick up the learning experience category next week.)
Learning Program Dimensions
Learning Program analytics means looking at a group of learners in a particular learning or training program. A common question is whether people have completed the required program elements, but this category can also include questions about a learning program’s effectiveness in improving job performance or driving business KPIs.
We identified four dimensions under the learning program category:
- Attempt: comparing different attempts at a course or other program element
- Status: comparing people with different statuses for a program (e.g., those whose status is “passed” against those whose status is “failed”)
- Interaction: data organized by individual learner interaction that’s often displayed in interaction streams (e.g., being hired, completing a program of learning, or simply interacting with an eLearning course element)
- Program: data about the program presented together, which is often displayed in Watershed’s program report (e.g., completed learner milestones, amount of time learners took to finish, or program completion percentages of various groups or individuals)
In practice, most of the reports we observed under the learning program category were program reports, making up nearly 87% of the view share.
Most of the remaining reports in this category (nearly 12%) were interaction reports and leaderboards organized by interaction.
So, how popular is the learning program category in practice?
Looking at our data of real-world learning analytics, we found that—even though organizations in our study created nearly double the amount of learning experience reports compared to learning program or learner reports—learning program reports had significantly higher average view counts per report.
(And this is still true, but to a lesser extent, if we exclude the organization with the most report views, which also mainly uses learning program reports.)
What does this mean?
This information provides some helpful insights:
- Reports within the learning program category, which look at a group of learners within the context of a particular learning program, tend to be used most frequently.
That doesn’t necessarily mean these reports are more valuable or impactful in terms of resulting actions, but it’s certainly a good indication that learning program reporting is worthwhile.
- Seasonal changes, such as business financial cycles, can affect learning program report usage because these reports are most relevant while the programs are running—especially around deadlines. For instance, you might need to complete training before a product launch.
In fact, if we look at data over time, learning experience reports have more views than learning program reports in the second half of the year, while learning program report views spike around the end of the financial year in April.
Here are two actions you can take away from this blog post:
- Learning program reports tend to get the most attention, so make these reports available to your organization’s managers!
- Reporting on data about learning programs lends itself to specialized learning program reports (such as Watershed’s program report) rather than more generic visualizations (such as pie charts or bar charts).
Up Next: The Learning Experience Category
Next week, we’ll explore the different dimensions within the learning experience category. We identified many more dimensions under this category, so stay up to date and sign up for Watershed Insights for the latest blog posts.