Learning Analytics Dimensions: Learning Experience Analysis

As we completed our Learning Analytics Research Study, we found many reports that fell into the Learning Experience category. Not only were there more reports, but they were also more varied, leading us to create nearly twice as many dimensions under this category as under the Learner and Learning Program categories. In this post, we'll explore these learning experience dimensions in practice.

What are learning experience analytics?

Learning Experience Types

Learning experience analytics focus on the learning experiences, platforms, and content themselves. They answer questions such as:

  • What content is most popular?
  • How can I make my videos more engaging?
  • Why are average scores for one question so much higher than for others?
  • What topics need additional content?
  • What time of day is my learning platform the busiest?

These analytics are most likely to be of interest to the L&D team and those who are responsible for creating, sourcing, and providing these learning experiences.

Conversely, learning experience analytics are less likely to be useful to people managers and those more interested in how particular groups of people (or the organization as a whole) are performing.


What caused so many varied Learning Experience reports?

L&D team members tend to be more advanced Watershed users and have time to really dig into and explore their data. So, it’s not surprising that we see a wide variety of reports when it comes to Learning Experience analytics.


A Quantum Leap into Learning Experience Dimensions

To help simplify these findings, we’ve grouped them into a few buckets. Let’s tackle them one at a time.

1) Asset and experience dimensions

These are dimensions where reports compare the individual assets a learner interacts with or experiences, including:

  • Resource: comparing resources (e.g. e-learning content, videos, or documents)
  • Experience: comparing different experiences (e.g. events or classes)
  • Question: comparing questions of an assessment or survey
  • Session: comparing different sessions of a class or other experience
  • Section: comparing parts of a document or other resource

For example, this report compares two survey questions:

Survey Response Comparison

Both questions show high percentages of "always" and "sometimes" responses, but the second question shows comparatively more "rarely" and "never" responses. So, this may be a better focus area for future training initiatives.
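
If you have raw response data rather than a ready-made report, this kind of comparison boils down to tallying answers per question. Here's a minimal Python sketch; the records and field names are invented stand-ins, not an actual Watershed or xAPI export format:

    from collections import Counter, defaultdict

    # Invented response records; the field names are simplified
    # stand-ins, not a real export format.
    responses = [
        {"question": "Q1", "answer": "always"},
        {"question": "Q1", "answer": "sometimes"},
        {"question": "Q1", "answer": "always"},
        {"question": "Q2", "answer": "rarely"},
        {"question": "Q2", "answer": "never"},
        {"question": "Q2", "answer": "always"},
    ]

    # Group answers by question, then tally each answer option.
    tallies = defaultdict(Counter)
    for r in responses:
        tallies[r["question"]][r["answer"]] += 1

    # Show each question's distribution as percentages, side by side.
    for question, counts in sorted(tallies.items()):
        total = sum(counts.values())
        breakdown = ", ".join(
            f"{answer}: {count / total:.0%}"
            for answer, count in counts.most_common()
        )
        print(f"{question}: {breakdown}")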

2) Asset and experience grouping dimensions

Other dimensions compare groupings or collections of assets or experiences, including:

  • Content Type: comparing different types of content (e.g. e-learning vs. video)
  • Content Provider: comparing different sources of the content (e.g. vendors)
  • Data Source: comparing the applications that sent the data
  • Duration: comparing different lengths of content or experiences
  • Version: comparing different versions of a piece of content

For example, this report compares the usage of learning content by duration:

Learning Experience Duration
This report shows people are more likely to consume content with a duration of 5, 10, or 15 minutes. However, this may be due to the overall availability of content at those lengths, which affects what people are able to watch.
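
Under the hood, a report like this is a group-and-count over duration bands. The sketch below (with invented data and arbitrary band edges) also normalizes by how many titles exist in each band, which helps separate genuine preference from sheer availability of short content:

    from collections import Counter, defaultdict

    # Invented play records: (content id, duration in minutes).
    plays = [
        ("quick-tip", 3), ("intro-video", 5), ("intro-video", 5),
        ("deep-dive", 15), ("deep-dive", 15), ("compliance-module", 45),
    ]

    def band(minutes):
        # Map a duration to a coarse band; the edges are arbitrary here.
        for high in (5, 10, 15, 30, 60):
            if minutes <= high:
                return f"<= {high} min"
        return "> 60 min"

    play_counts = Counter(band(m) for _, m in plays)
    titles_per_band = defaultdict(set)
    for content, m in plays:
        titles_per_band[band(m)].add(content)

    # Dividing plays by the number of titles in each band separates
    # preference for short content from its greater availability.
    for b, n in play_counts.most_common():
        titles = len(titles_per_band[b])
        print(f"{b}: {n} plays across {titles} titles "
              f"({n / titles:.1f} plays/title)")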

3) How the experience or asset was accessed

Some dimensions compare the way in which the experience or asset was accessed, including:

  • Browser: comparing internet browsers (e.g., Google Chrome, Internet Explorer, etc.)
  • Device: comparing different devices (e.g. mobile phone, desktop, or tablet)
  • Workflow: comparing navigation flow to the content (e.g. search, share, homepage link, etc.)

For example, this report compares usage between an iPad and iPhone:

How was the learning experience accessed?
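
Device and browser information typically comes from the user-agent string captured when the learner accesses the content. As a rough illustration only (real reporting tools use proper user-agent-parsing libraries), a classifier might look like this; the sample strings are truncated and invented:

    from collections import Counter

    # Truncated, invented user-agent strings.
    hits = [
        "Mozilla/5.0 (iPad; CPU OS 15_0 like Mac OS X) ...",
        "Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X) ...",
        "Mozilla/5.0 (iPhone; CPU iPhone OS 14_8 like Mac OS X) ...",
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    ]

    def device(user_agent):
        # Naive keyword matching, just to illustrate the dimension.
        if "iPad" in user_agent:
            return "iPad"
        if "iPhone" in user_agent:
            return "iPhone"
        return "desktop"

    print(Counter(device(ua) for ua in hits))
    # Counter({'iPhone': 2, 'iPad': 1, 'desktop': 1})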

4) Outcome dimensions

Some dimensions compare various outcomes of the learning experience, including:

  • Mistake: comparing the number of times different errors occur
  • Response: comparing different responses to a question
  • Score: comparing scores or ranges of scores

For example, the following report compares responses to a question to highlight common knowledge gaps:

Learner Response Comparison
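
The score dimension from this group works in much the same way: bucket raw scores into ranges, then compare the ranges. A quick sketch with invented scores on a 0-100 scale:

    from collections import Counter

    # Invented assessment scores on a 0-100 scale.
    scores = [49, 55, 62, 67, 73, 78, 81, 88, 90, 94]

    def score_band(score, width=10):
        # Round each score down to the start of its band.
        low = (score // width) * width
        return f"{low}-{low + width - 1}"

    bands = Counter(score_band(s) for s in scores)
    for band in sorted(bands):
        print(f"{band}: {'#' * bands[band]}")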

5) Time period dimensions

Some dimensions compare different time periods, whether that’s looking at data across a specific date range or comparing recurring time periods (e.g. day of the week or hour of the day).

For example, this report compares LMS activity by day of the week:

LMS Activity Comparison
Aside from enrollments, LMS usage peaks on Tuesday and tails off toward the end of the week. This suggests Tuesday may be a good day to enroll people in new programs.
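
Behind a report like this is a simple aggregation of activity timestamps by weekday. A minimal sketch, using ISO 8601 timestamps as found in xAPI statements (the sample data is invented):

    from collections import Counter
    from datetime import datetime

    # Invented activity timestamps in ISO 8601 format.
    timestamps = [
        "2021-03-01T09:15:00Z",  # Monday
        "2021-03-02T10:30:00Z",  # Tuesday
        "2021-03-02T14:05:00Z",  # Tuesday
        "2021-03-05T16:45:00Z",  # Friday
    ]

    def weekday(ts):
        # fromisoformat doesn't accept a trailing 'Z' before Python 3.11,
        # so swap it for an explicit UTC offset first.
        return datetime.fromisoformat(ts.replace("Z", "+00:00")).strftime("%A")

    counts = Counter(weekday(ts) for ts in timestamps)
    for day in ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"]:
        print(f"{day}: {counts.get(day, 0)}")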

6) The remaining (action and search) dimensions

We observed a verb dimension that compares the different actions taken by the learner, as shown in the following heatmap:

What are learners doing?

We also observed a search term dimension that compares different search terms:

Compare Search Terms
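
Both of these dimensions reduce to counting fields across statements. A sketch with invented statements that loosely follow xAPI's verb/object shape:

    from collections import Counter

    # Invented statements, loosely shaped like xAPI verb/object pairs.
    statements = [
        {"verb": "experienced", "object": "intro-video"},
        {"verb": "completed", "object": "intro-video"},
        {"verb": "searched", "object": "onboarding"},
        {"verb": "searched", "object": "compliance"},
        {"verb": "searched", "object": "onboarding"},
    ]

    # Tally verbs overall, and the terms used with the "searched" verb.
    verb_counts = Counter(s["verb"] for s in statements)
    search_terms = Counter(
        s["object"] for s in statements if s["verb"] == "searched"
    )

    print("Verbs:", dict(verb_counts))
    print("Top search terms:", search_terms.most_common(3))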

How are learning experience analytics used?

It’s interesting to note that the organizations in our study that primarily conduct learning experience analytics are largely a different set from those that mainly use learner and learning program analytics. That’s to be expected given the learning analytics triangle as a maturity model, and it’s why we recommend starting with one category of analytics before expanding to others. It’s also a sign that learning experience analytics are more likely to appeal to learning and development professionals, while learner and learning program analytics are more targeted at people managers.

Looking at the kinds of learning experience analytics being used, reports mostly compare individual learning experiences (51% of reports and 58% of report views) and time periods (31% of reports and 24% of views).

Reports organized by search term are also popular (7% of reports and 10% of views), which is notable because not all of the clients in our study have platforms with search capability.

Learning Experience Dimensions

Types of Learning Experience Dimensions

Actionable Insights!

We’ve seen that reports comparing individual experiences are used most often. This can be great for identifying the most popular learning content, but what do you do with that data?

Once you’ve identified your best content, consider doing further analysis to understand why it’s the best. This might involve further quantitative analysis, looking at performance by factors such as duration or workflow, as we’ve seen above. You should also consider qualitative research, perhaps following Brinkerhoff’s Success Case Method.

Up Next: Complexities and analysis types

Next week, we move from categories and dimensions to complexities and analysis types, exploring the kinds of questions people are asking as they analyze their data.

Andrew Downes

About The Author

As one of the authors of xAPI, Andrew Downes has years of expertise in data-driven learning design. With a background in instructional design and development, he’s well versed in creating learning experiences and platforms in corporate and academic environments.