What interesting discoveries will you find when you start digging into your learning data? Maybe you'll get a clearer picture of what’s really happening with learning programs. Or, maybe you'll find answers to questions you didn’t even know you should be asking.
Uncover surprising L&D insights.
When evaluating learning data, you might have specific questions in mind, such as:
- Did people learn from their learning experiences?
- Did they apply that learning in the workplace?
- Which learning materials are being used the most?
- How active are different groups of learners?
While exploring these questions, though, you might happen upon unexpected insights that reveal opportunities or challenges you didn't even know existed. Take, for instance, the following examples based on real events.
Example 1: Gaming the System
A company has a learning program in which learners can test out of certain learning activities. Learners get one attempt to pass a pretest and don’t have to complete the associated learning activity if they pass the test. Those who fail have to complete the learning activity and then take a posttest—which doesn’t have a target pass mark and can be retaken by learners as many times as they like.
The company uses heatmap reports to visualize data about these tests and activities. The reports on these tests display learners’ first scores for the pretest and posttest, their first passing scores, and their overall highest scores. Because learners are supposed to start with the pretest, their first passing scores should be either on the pretest or their first attempts at the posttest (if they failed the pretest).
While reviewing the test result data, the company discovered that some learners' first passing scores came from the posttest, even though they had passed the pretest on their first attempt. This wasn't a data error; some learners were starting with the posttest, retaking it until they had worked out the answers, and then taking the pretest. As a result, these learners passed the pretest, which allowed them to skip the learning activity. In other words, they were cheating!
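The detection logic amounts to checking which test each learner touched first. Here is a minimal sketch of that check; the record shape (`learner`, `test`, `timestamp`) is a hypothetical simplification, and real LMS or xAPI data would first need to be mapped into it:

```python
def flag_posttest_first(attempts):
    """Flag learners whose first recorded attempt was the posttest.

    attempts: list of dicts with keys 'learner', 'test' ('pretest' or
    'posttest'), and a sortable 'timestamp'. (Hypothetical record shape.)
    Returns the set of learners who started with the posttest.
    """
    flagged = set()
    first_seen = {}
    # Walk attempts in chronological order and remember each learner's
    # first test; anyone whose first test is the posttest is flagged.
    for a in sorted(attempts, key=lambda a: a["timestamp"]):
        if a["learner"] not in first_seen:
            first_seen[a["learner"]] = a["test"]
            if a["test"] == "posttest":
                flagged.add(a["learner"])
    return flagged
```

A learner who took the posttest at time 1 and the pretest at time 2 would be flagged, while a learner who started with the pretest would not.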
In this example, the heatmap reporting feature wasn’t designed to catch learners gaming the system, but having the ability to clearly view data revealed what was actually happening. Now, the company can:
- decide if this “cheating” method is an acceptable way for people to learn the material while monitoring learners who take this approach, or
- change its LMS to lock down the posttest until after the pretest has been taken.
Example 2: Searching for Nothing
A company wanted to compile a report of what learners were searching for on its portal. I ran the report and was surprised to see that the top result was nothing, nada, zip: an empty search box. In fact, it had more than 10 times as many hits as the next result. I checked the underlying data and, sure enough, discovered that the majority of searches were for nothing at all.
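Surfacing this kind of result from raw query logs can be as simple as a frequency count. The sketch below uses an invented list of queries for illustration; in practice the queries would come from the portal's search logs or analytics:

```python
from collections import Counter

# Hypothetical query log pulled from the portal's search analytics.
queries = ["onboarding", "", "", "compliance", "", "safety", ""]

# Tally each query string; empty submissions count like any other term.
counts = Counter(q.strip() for q in queries)
top_term, top_hits = counts.most_common(1)[0]
# In this invented sample, the empty string is the most common query.
```

Sorting the full `counts` table, rather than just taking the top entry, would reproduce the report described above.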
After sharing this report with the company, we traced the result to a page on its portal that recommended next steps to learners. The only way to reach this feature was to leave the search box empty and then click the search button. Until that point, the company hadn't known whether the feature was being used, so they were delighted to find that so many people were, in fact, using it.
What Can We Learn?
Sometimes learners behave in ways we didn’t imagine, and sometimes our data gives us unexpected insights that can be invaluable. By collecting, exploring, and visualizing our learning and performance data, we can get a clearer picture of what’s really happening. We can answer our primary set of questions and maybe even be surprised by answers to the questions we didn’t even think to ask.
About the author
As one of the authors of xAPI, Andrew Downes has years of expertise in data-driven learning design. With a background in instructional design and development, he’s well versed in creating learning experiences and platforms in corporate and academic environments.