Learning Analytics: Complexities & Analysis Types


The last few posts in our Learning Analytics Research series have looked at the three categories of learning analytics (i.e. learner, program, and experience) and their respective dimensions. Now, we'll spend the next several posts exploring the four levels of complexity, starting with a quick overview of each level.

What are the four levels of complexity?

Watershed Learning Analytics Complexities

1) Measurement

Analytics start with measurement, or the simple act of tracking things and recording values to tell us what happened. Measurement doesn’t require complicated math or statistics, but you must start by gathering data. Otherwise, it’s impossible to do any analytics.

Measurement asks: What’s happening?

2) Data Evaluation

Once you've captured the data, it's time to start evaluating it and assessing whether the data means something good or bad. At this level, we're applying high-school math (means, medians, modes, and basic statistics) to aggregate the data and establish benchmarks. In current practice, most analytics fall into the basic data evaluation category, and that's OK. There's tremendous value here, and opportunities for some huge wins.

Data Evaluation asks: Is what’s happening good or bad?
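
To make this concrete, here's a minimal sketch of data evaluation using Python's standard library. The assessment scores and the benchmark value are purely hypothetical illustration data, not figures from the research study.

```python
# Data evaluation sketch: aggregate raw scores and compare them to a
# benchmark. All values below are hypothetical.
from statistics import mean, median, mode

scores = [72, 85, 85, 90, 68, 77, 85]  # hypothetical assessment scores
benchmark = 75                          # hypothetical target score

avg = mean(scores)
print(f"mean={avg:.1f}, median={median(scores)}, mode={mode(scores)}")
print("above benchmark" if avg >= benchmark else "below benchmark")
```

Even this tiny aggregation answers the level-two question: the group's average sits above or below the benchmark, so we can say whether what's happening is good or bad.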

3) Advanced Evaluation

Exciting things start to happen as we get into advanced evaluation and apply college-level math. Here, we’re looking at things such as correlations and regression analysis. We’re applying statistical techniques to understand, not only what happened, but also why it happened. Advanced evaluation creates theories about causation, allowing us to focus on what works best and scrap ineffective learning.

Advanced Evaluation asks: Why is this happening?

4) Predictive & Prescriptive Analytics

The most sophisticated levels of analytics are predictive and prescriptive analytics, which require graduate-level math and often rely on AI or machine learning powered by big data sets. Predictive analytics say, “based on what’s happened in the past, here’s what is most likely to happen next.” Prescriptive analytics take that a step further and say, “based on what’s most likely to happen next, here’s the action we should take to optimize the outcome.”

Ultimately, when we get here, we rely on highly intelligent recommendation engines that deliver just the right learning, at just the right moment, in just the right way to significantly improve performance. As an industry, we’re not there yet, but we can get there if we start measuring and work our way up.

Predictive Analysis asks: What will happen if…?
Prescriptive Analysis asks: How can I make this happen?

How are organizations progressing through the levels in practice?

Starting with learning analytics means starting at the first level and building on that foundation. It’s not possible to jump straight into predictive and prescriptive analytics. So, how are the organizations we looked at progressing?

How far has L&D as an industry come?

Well, in terms of both the number of reports and the number of report views, the overwhelming majority of reports we looked at in our research study were classified as:

  • Data Evaluation (90% of report views),
  • Measurement (8%),
  • Advanced Evaluation (2%), and
  • Predictive & Prescriptive (0.04%).
Learning Analytics by Complexity

That's not to say that Advanced Evaluation, or even Predictive and Prescriptive Analytics, isn't happening in a more manual way—but for the reports on display on dashboards, Data Evaluation is by far the most prevalent. That's not surprising given where we are in terms of the maturity of learning analytics in L&D. For many organizations, just getting data together is a complex process. And, while technologies (such as xAPI) can make this process easier, these technologies are not yet universally available.


NOTE: Watershed clients, whose reporting we reviewed for this study, generally have their learning data gathered as part of their Watershed implementation. So, they’re already a little further ahead. And, since they have data in Watershed, it’s relatively easy to start comparing data to see what’s good and what’s bad for Data Evaluation.


Advanced Evaluation and Predictive and Prescriptive Analytics, on the other hand, require more sophistication, and many organizations that have gathered their data are only just starting to move into these areas. Sometimes, this means connecting additional data sources to bring in job performance and business KPI metrics that can be compared with learning data.

Even as Learning Analytics adoption matures, we wouldn’t necessarily expect to see as many reports on dashboards that we would categorize as Advanced Evaluation or Predictive and Prescriptive Analytics. While these levels of complexity are important for the L&D team, they might be a level of complexity too far for operational managers checking on the progress of their people.

Measurement Analysis Types

As the first level of complexity, measurement is by definition less complex than the other levels. For that reason, we’re not going to devote a whole blog post to measurement, but do want to quickly mention the types of measurement analysis our research study observed.

  • Data Check: Displaying the incoming data to make sure it’s right
  • Data Display: Displaying a piece of data exactly as it has been imported
  • Population Count: How many people are in an imported group?
  • Participant List: A list of people who did a thing
  • List of Resources: A list of resources available to learners
  • Interaction: An interaction stream of data showing what’s happening
  • Last Interaction Analysis: What was the most recent thing that happened?
  • Daily Metrics View: What are our KPI numbers right now?
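
A couple of these measurement analysis types can be sketched in a few lines. The example below derives a Participant List and a Population Count from a hypothetical stream of xAPI-like activity records; the record fields and values are illustrative, not the actual xAPI statement format.

```python
# Measurement sketch: Participant List and Population Count derived
# from a hypothetical stream of simplified activity records.
records = [
    {"actor": "alice", "verb": "completed", "object": "course-1"},
    {"actor": "bob",   "verb": "launched",  "object": "course-1"},
    {"actor": "alice", "verb": "completed", "object": "course-2"},
]

# Participant List: people who did a thing (here, completed something).
participants = sorted({r["actor"] for r in records if r["verb"] == "completed"})
print("completed:", participants)

# Population Count: how many people appear in the imported data?
population = len({r["actor"] for r in records})
print("population:", population)
```

Note that neither result needs any statistics at all; measurement is simply recording and counting what happened.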

Actionable Insights

If your organization is still at the level of reporting on what's happening and whether it's good or bad, you're on par with a lot of other organizations. (Although there might be room to improve the quality and scope of your reporting at this level, or to automate some of your data aggregation and reporting.)

To take the next step, think about how you can move on to Advanced Evaluation and explore why things are happening. Read some of our previous Insights blog series around Learning Evaluation and Learning Analytics for suggestions on how to do that.


Up Next: The Data Evaluation Complexity

Next week, we dive deeper into the Data Evaluation complexity and the 16 analysis types under that complexity. Be sure to subscribe to our newsletter so you don’t miss our next post!


What types of learning analytics are you using?

Download this checklist to start tracking the analytics you already have in place and where you’d like to go next.

Learning Analytics Checklist

Andrew Downes

About The Author

As one of the authors of xAPI, Andrew Downes has years of expertise in data-driven learning design. And, with a background in instructional design and development, he’s well versed in creating learning experiences and platforms in both corporate and academic environments.