During the summer of 2018, we conducted a Learning Analytics Research Study to investigate how our clients are actually implementing learning analytics. The research aimed to place organizations on the learning analytics triangle as a measure of maturity and to paint a picture of what learning analytics actually looks like in 2018.* In this blog series, we’ll explore the results from our study and how real people are doing real learning analytics.
Does this sound familiar?
About two years ago, Mike Rustici introduced a blog series exploring the question What Does "Learning Analytics" Actually Mean?, where he explained several types of these analytics and revealed the Watershed Method™ of Learning Analytics—including a learning analytics triangle.
This triangle serves as both:
a way of classifying learning analytics, and
a maturity model, in which organizations start at the center of the triangle and move around and outward as their learning analytics practice matures.
Hey, stop! This is required reading.
Because this blog series builds on the foundation of the triangle, be sure you’ve read our first learning analytics blog series before you go any further.
112 Ways to Slice the Learning Analytics Cat
We like to mix metaphors at Watershed and even have several posters around the office displaying these unique phrases—such as it's like shooting a fish in a pond, more than enough moving pieces in this puzzle, and one of our personal favorites, there’s more than one way to slice a cat.
And after conducting our recent study, we’ve found there are quite a few ways to slice said cat when it comes to learning analytics.
As we explore the different ways in which organizations are using these analytics, keep in mind that many have just started to scratch the surface of what is possible. These are real examples of learning analytics that real organizations are doing today, not just our ideas of what might theoretically be possible.
Here’s how we conducted our research:
We started by identifying 5,713 Watershed reports in use by our clients on our production servers. Then, we reviewed each of these reports and tagged them with the three categories and four levels of complexity from the original triangle. But we wanted to go further, so we also:
split each category into several dimensions and each complexity into analysis types, and
tagged reports by format type and by the type of learning experience being reported on.
The result was 112 different tags, or what I like to call 112 ways to slice the learning analytics cat. And as organizations mature, we expect to see even more varied approaches to these analytics. (For example, predictive and prescriptive analytics may look slim right now, but that's because most organizations haven't made it there yet.)
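To make the multi-facet tagging described above concrete, here’s a minimal sketch in Python. The facet names and values below are entirely hypothetical illustrations, not Watershed’s actual taxonomy; the point is simply that each report gets a tag along several independent facets, and the distinct values across all facets add up to the overall tag count.

```python
# Hypothetical sketch of multi-facet report tagging.
# Facet names and values are illustrative only, not the real taxonomy.
from itertools import chain

report_tags = {
    "report-001": {
        "category": "Learner",            # one of the triangle's three categories
        "complexity": "Measurement",      # one of the four complexity levels
        "dimension": "Engagement",        # sub-dimension within the category
        "analysis_type": "Trend",         # analysis type within the complexity
        "format": "Bar chart",            # report format
        "experience": "eLearning course", # type of learning experience reported on
    },
}

# Count how many distinct tag values appear across all tagged reports.
distinct_tags = set(chain.from_iterable(t.values() for t in report_tags.values()))
print(len(distinct_tags))  # 6 distinct tags for this single example report
```

With thousands of reports tagged this way, the same counting step over the full taxonomy is what yields a figure like 112 distinct tags.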
Having collected this data, we of course wanted to analyze it to gain and share insights into how reports of different types and categorizations are used. We also wanted to use our findings to inspire other organizations and L&D professionals. So, here’s what you can expect in this blog series as we:
Explore each of these areas in more detail—starting with categories and dimensions and then moving into categories and complexities.
Share insights each week as we look at different parts of the learning analytics model, starting with insights on overall report usage.
Help you find inspiration by sharing real-world examples of how people are doing these analytics today.
Up Next: Learning Analytics Analytics
Join us next week as we share some interesting insights on overall report usage. You don’t want to miss out, so sign up for Watershed Insights to have the next blog post delivered right to your inbox.
*Disclaimer: All report data for our research study is based on usage of Watershed reports and has been kept anonymous. We also used Watershed to collect and analyze report data.
Want to see your learning analytics in action?
We've created a checklist to help you track the analytics you already have in place and where you’d like to go next.