Learning Analytics: Data Evaluation Complexity


Last week, we saw that Data Evaluation is far and away the most common complexity level of Learning Analytics we observed in our research study. Within this complexity, we found 16 types of analysis represented, which we’ve grouped under three headings and will explore in this blog post.

1) Frequency and completion

Frequency and completion analysis types look at how often things happen and whether things have happened. This includes:

  • Attendance Analysis: How many people showed up? Are sessions and training resources well utilized?

  • Certification Analysis: How many people are certified? Have people obtained their required certifications?

  • Completion Analysis: How many people completed a course, video, quiz, etc.? Have people completed what’s required?

  • Utilization Analysis: How often is a learning resource used? What are the most-used resources?

  • Engagement Analysis: What level of engagement does a resource have (e.g. comments, likes, shares)? What’s engaged with the most?

  • Frequency Analysis: How often does something happen? This is a catchall for anything that doesn’t fall into one of the analysis types covered above.

For example, the following report shows a frequency analysis of mistakes made during a mock code blue simulation.

Frequency Learning Analysis

This report shows that three steps of the Mock Code Blue see significantly more mistakes than the others. This is a good indication that more training may be needed in these areas.
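As a rough illustration of how a tally like this could be produced from raw event data, here’s a minimal Python sketch. The step names, record format, and values are illustrative assumptions, not data from the report above.

```python
from collections import Counter

# Hypothetical mistake events: one record per observed mistake,
# tagged with the simulation step where it occurred.
mistake_events = [
    {"learner": "a1", "step": "Call for help"},
    {"learner": "b2", "step": "Start compressions"},
    {"learner": "a1", "step": "Start compressions"},
    {"learner": "c3", "step": "Attach defibrillator"},
    # ... more events pulled from your LRS or simulation log
]

# Tally mistakes per step to see where errors cluster.
mistakes_per_step = Counter(event["step"] for event in mistake_events)

for step, count in mistakes_per_step.most_common():
    print(f"{step}: {count} mistake(s)")
```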

2) Outcome

Outcome analysis looks at the outcome of things, such as how well people perform or how a learning experience is rated. This includes:

  • Before-and-After Analysis: What changed before and after an event (e.g. assessment scores, job performance, etc.)?

  • Learner Ratings Analysis: How did learners rate a learning experience?

  • Confidence Analysis: How confident are people in their answers?

  • Time Spent Analysis: How much time is spent learning?

  • Progress Analysis: How far are people through an event or experience?

  • Question Analysis: How well are people answering questions?

  • Cost Analysis: What's the cost to provide the event or experience?

  • KPI Analysis: Where do KPI values stand?

The following confidence analysis example shows a comparison of confidence against correctness.

Confidence Analysis

This kind of report could be used to identify people above the line who are more correct than they are confident, and people below the line who are more confident than they are correct. Confident but wrong people can be dangerous because they will act on their incorrect knowledge.
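To make that concrete, here’s a minimal Python sketch that compares each learner’s average confidence with their average correctness and flags those who may be overconfident. The record format and example values are hypothetical, not taken from the chart above.

```python
from collections import defaultdict

# Hypothetical question responses: one record per answered question,
# with whether it was correct and the learner's self-reported confidence (0-1).
responses = [
    {"learner": "a1", "correct": True,  "confidence": 0.9},
    {"learner": "a1", "correct": False, "confidence": 0.8},
    {"learner": "b2", "correct": True,  "confidence": 0.4},
    {"learner": "b2", "correct": True,  "confidence": 0.5},
    # ... more records from assessments
]

totals = defaultdict(lambda: {"correct": 0.0, "confidence": 0.0, "n": 0})
for r in responses:
    t = totals[r["learner"]]
    t["correct"] += 1.0 if r["correct"] else 0.0
    t["confidence"] += r["confidence"]
    t["n"] += 1

# Flag learners whose average confidence exceeds their average correctness
# (the "below the line" group described above).
for learner, t in totals.items():
    correctness = t["correct"] / t["n"]
    confidence = t["confidence"] / t["n"]
    if confidence > correctness:
        print(f"{learner}: confidence {confidence:.2f} > correctness {correctness:.2f}")
```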

3) Others

A couple of analysis types that don’t fit in the first two collections are:

  • Resource Analysis: Analysis of resource metadata (e.g. How long are pieces of content?)

  • Search Analysis: What are people looking for? How often is the search feature being used?

For example, the following report shows use of search by device over time:

Search Analysis

The majority of search is happening on desktop, and there was a significant increase in both desktop and mobile search in the spring of 2018. This may be a useful prompt to dig deeper into why activity increased so dramatically, or to proactively evaluate whether existing content is accessible and useful on mobile devices.
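If you wanted to build a view like this yourself, a minimal sketch might aggregate search events by month and device, along these lines (the event records and field names here are assumptions for illustration):

```python
from collections import Counter
from datetime import date

# Hypothetical search events: a timestamp and the device type recorded
# by the platform (e.g. pulled from xAPI statements or web analytics).
search_events = [
    {"date": date(2018, 3, 14), "device": "desktop"},
    {"date": date(2018, 4, 2),  "device": "mobile"},
    {"date": date(2018, 4, 9),  "device": "desktop"},
    # ... more events
]

# Count searches per (month, device) to chart search usage over time by device.
per_month_device = Counter(
    (event["date"].strftime("%Y-%m"), event["device"]) for event in search_events
)

for (month, device), count in sorted(per_month_device.items()):
    print(f"{month} {device}: {count} searches")
```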


What does Data Evaluation look like in practice?

While there’s a wide variety of analysis types, our research study revealed that the majority of report views within the Data Evaluation complexity relate to completion analysis or utilization analysis. (This is broadly similar to the picture for categories, where most views are of program and experience reports.)

A lot of the completion analysis reports relate to program analytics, with some relating to experience analytics. And many of the utilization analysis reports relate to experience analytics, with some relating to learner analytics.

Learning Data Evaluation Analysis

As we discussed in our last post, organizations are generally looking to get the basics down first before moving on to more complex analytics around impact and effectiveness. These basics mean reporting that people have completed the things they’re supposed to have completed, and reporting on the usage of learning resources and platforms. In the breadth of different analysis types, we do see some organizations moving beyond these basics, but that’s not yet the norm.

Actionable Insights

Even within Data Evaluation, there’s a lot you can do beyond completion and utilization, but few organizations are doing so. Think about how you can gather more data about engagement and other areas of analysis for richer insights. Use the worksheet below as a starting point.

Up Next: The ‘Advanced Evaluation’ Complexity

In our next learning analytics blog post, we'll explore some examples of the next level of complexity: Advanced Evaluation and its five analysis types.


What types of learning analytics are you using?

Download this checklist to start tracking the analytics you already have in place and where you’d like to go next.

Learning Analytics Checklist

Andrew Downes

About The Author

As one of the authors of xAPI, Andrew Downes has years of expertise in data-driven learning design. And, with a background in instructional design and development, he’s well versed in creating learning experiences and platforms in both corporate and academic environments.