Learning Experience Analytics


Now that you know more about learning analytics and complexity, let's cover learning experience analytics—which help you understand more about particular learning experiences or a set of learning experiences.

Fine-Tune Learning Programs

Learning experience analytics often answer questions about usage patterns for specific activities—such as e-learning courses, social learning boards, mobile apps, MOOCs, informal information access, or even classroom-based courses.

For instance, learning experience analytics might answer:

  • How much/often is a learning experience being used?
  • Is there “scrap learning” that can be removed?
  • When is the experience being used and for how long?
  • How do learners interact with the experience?
  • What resources or topics do learners search for most?
  • What are the details of learners’ interactions?
  • How do learners navigate the experience?
  • In what contexts do people access learning resources?
  • What workplace experiences do employees find to be meaningful learning experiences?

These analytics also incorporate content analytics and user experience analytics, helping organizations fine-tune learning offerings so that learners receive the best possible experience and specific learning activities are as effective as possible.

Apply Learning Experience Analytics

Remember, the term learning analytics has two dimensions: complexity, or the sophistication of the analytics, and category, which identifies the specific area or type of learning data that’s being analyzed.

Say, for example, you want to understand more about an e-learning assessment. Here's how you could apply all four levels of complexity in the learning experience analytics category:

1) Learning Experience Measurement

Record each learner's response to a multiple-choice question, the result of that response (i.e., whether it was correct or incorrect), and the learner's overall score on the assessment.
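At the measurement level, you're simply capturing raw data points. Here's a minimal sketch of what that capture might look like—the field names and scoring rule are illustrative, not taken from any particular LRS or assessment platform:

```python
from dataclasses import dataclass

# Hypothetical record of one learner's response to one question.
@dataclass
class QuestionResult:
    learner_id: str
    question_id: str
    response: str   # the answer choice the learner selected
    correct: bool   # whether that response was correct

def assessment_score(results):
    """Overall score: fraction of questions answered correctly."""
    return sum(r.correct for r in results) / len(results)

# One learner's recorded responses.
results = [
    QuestionResult("learner-1", "q1", "B", True),
    QuestionResult("learner-1", "q2", "C", False),
    QuestionResult("learner-1", "q3", "A", True),
]
print(assessment_score(results))  # 2 of 3 questions correct
```

In practice this data would flow into a Learning Record Store rather than an in-memory list, but the shape of the record—who, which question, which response, correct or not—stays the same.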

2) Learning Experience Evaluation

Present a chart showing the distribution of selected answers for a given question. Is there an incorrect answer that's selected more frequently than expected? Could that indicate the question or answer choices are poorly worded or confusing?
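Building that answer distribution needs nothing more than a frequency count. This sketch uses hypothetical response data for a single question to surface a distractor that's chosen suspiciously often:

```python
from collections import Counter

# Hypothetical log: which choice each learner selected for one question.
responses = ["A", "B", "B", "D", "B", "A", "B", "D", "D", "D"]
correct_answer = "B"

distribution = Counter(responses)
for choice in sorted(distribution):
    flag = " (correct)" if choice == correct_answer else ""
    print(f"{choice}: {distribution[choice]}{flag}")
```

Here the distractor "D" is picked as often as the correct answer—exactly the pattern that should prompt a second look at how the question and answer choices are worded.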

3) Advanced Learning Experience Evaluation

Run question result data and the assessment score data through a correlation engine. Are there any question results that are inversely correlated with the assessment score? In other words, are the learners who received the highest scores more likely to answer certain questions incorrectly? If so, does that mean these learners have incorrect information, or are the questions confusing or misleading?
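A "correlation engine" here can be as simple as Pearson's r computed between per-question correctness and overall scores. This sketch (with hypothetical data) flags a question whose results run against the assessment score:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: each position is one learner.
q_correct = [1, 0, 1, 0, 0, 1]        # 1 = answered this question correctly
scores    = [55, 90, 60, 85, 95, 50]  # overall assessment scores

r = pearson(q_correct, scores)
print(f"correlation: {r:.2f}")
```

A strongly negative r means high scorers tend to miss this question—the inverse correlation described above, and a flag that the question may be misleading rather than the learners misinformed.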

4) Predictive Experience Analytics

Run question result data and assessment score data through a correlation engine. Are there questions that show a very high positive correlation with assessment scores? In other words, learners who get the question wrong almost always fail the assessment, and learners who get it right almost always pass. If you can predict the overall assessment result after only a few questions, can you alter that particular learning experience to provide a more efficient and engaging learning path?
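One way to sketch that predictive step: treat a strongly correlated question as an early pass/fail signal, then check how often the early prediction matches the actual outcome. All data below is hypothetical:

```python
PASS_MARK = 70

# Hypothetical history: (result on the predictive question, final score).
history = [(1, 88), (1, 92), (1, 75), (0, 55),
           (0, 62), (1, 81), (0, 48), (1, 65)]

def predict_pass(question_correct):
    """Toy predictor: pass/fail inferred from one highly correlated question."""
    return question_correct == 1

# How often does the early prediction match the real result?
hits = sum(predict_pass(q) == (score >= PASS_MARK) for q, score in history)
accuracy = hits / len(history)
print(f"early-prediction accuracy: {accuracy:.0%}")
```

A real system would weigh several questions (e.g., with logistic regression) rather than one, but the validation step is the same: measure how well early responses predict the final result before using them to reroute learners.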

Up Next: Learner Analytics

Next, we'll discuss learner analytics, which help you understand more about a specific person or group of people engaged in learning-related activities. Subscribe to our e-newsletter to have the latest installments of our learning analytics series sent right to your inbox.


Getting Started Is Easy

Now that you understand the basics of analyzing learning experiences, it's time to start applying them in your own learning program. And it's easier than you might think. In fact, there's a lot you can do with simple metrics and the data you have right now. We've also created a guide to help you get started.


About The Author

As an innovative software developer turned entrepreneur, Mike Rustici has been defining the eLearning industry for nearly 20 years. After co-founding Rustici Software in 2002, Mike helped guide the first draft of the Tin Can API (xAPI) and invented the concept of a Learning Record Store (LRS), revolutionizing the Learning and Development world. In 2013, he delivered on the promise of Tin Can with the creation of Watershed, the flagship LRS that bridges the gap between training and performance.

When Rustici Software was acquired by Learning Technologies Group (LTG) in 2016, Mike became the CEO of Watershed, where he continues to be an expert in the area of eLearning conformance as well as Learning and Development analytics. He’s also presented on a variety of topics, ranging from disruptive technology and performance improvement to company culture and business innovation.