Now that you know more about learning analytics and complexity, let's cover learning experience analytics—which help you understand more about particular learning experiences or a set of learning experiences.
Fine-Tune Learning Programs
Learning experience analytics often answer questions about usage patterns for specific activities—such as e-learning courses, social learning boards, mobile apps, MOOCs, informal information access, or even classroom-based courses.
For instance, learning experience analytics might answer:
- How much/often is a learning experience being used?
- Is there “scrap learning” that can be removed?
- When is the experience being used and for how long?
- How do learners interact with the experience?
- What resources or topics do learners search for most?
- What are the details of learners’ interactions?
- How do learners navigate the experience?
- In what contexts do people access learning resources?
- What workplace experiences do employees find to be meaningful learning experiences?
These analytics also incorporate content analytics and user experience analytics, helping organizations fine-tune learning offerings so learners receive the best possible experiences. That, in turn, maximizes the effectiveness of specific learning activities.
Apply Learning Experience Analytics
Remember, the term learning analytics has two dimensions: complexity, or the sophistication of the analytics, and category, which identifies the specific area or type of learning data that’s being analyzed.
Say, for example, you want to understand more about an e-learning assessment. Here's how you could apply all four levels of complexity in the learning experience analytics category:
1) Learning Experience Measurement
Record each learner's response to a multiple choice question, the result of the question (i.e., whether that response was correct or incorrect), and the overall score the learner received on the assessment.
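At this level, the analytics are just careful record-keeping. A minimal sketch of what that record might look like follows; the field names and in-memory store are illustrative assumptions, not a formal xAPI schema:

```python
# Illustrative sketch of learning experience measurement: record each
# response, whether it was correct, and compute an overall score.
# Field names and storage are assumptions for this example only.

def record_response(store, learner_id, question_id, selected, correct_answer):
    """Append one learner's response, noting whether it was correct."""
    store.append({
        "learner": learner_id,
        "question": question_id,
        "selected": selected,
        "correct": selected == correct_answer,
    })

def assessment_score(store, learner_id):
    """Overall score: fraction of recorded questions answered correctly."""
    rows = [r for r in store if r["learner"] == learner_id]
    return sum(r["correct"] for r in rows) / len(rows)
```

In practice these records would live in a Learning Record Store rather than a Python list, but the shape of the data is the same: who answered, what they chose, and whether they were right.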
2) Learning Experience Evaluation
Present a chart showing the distribution of selected answers for a given question. Is there an incorrect answer that's selected more frequently than expected? Could that indicate the question or answer choices are poorly worded or confusing?
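The tally behind such a chart is simple to produce. A sketch, assuming response records shaped like the measurement example above:

```python
from collections import Counter

# Sketch: tally which answer options learners chose for one question,
# to spot a distractor picked more often than expected.
# The record layout is an assumption for illustration.

def answer_distribution(responses, question_id):
    """Return a Counter mapping each selected option to its frequency."""
    return Counter(
        r["selected"] for r in responses if r["question"] == question_id
    )
```

Feeding the resulting counts into any charting tool gives the distribution described above; an unexpectedly popular wrong answer stands out immediately.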
3) Advanced Learning Experience Evaluation
Run question result data and the assessment score data through a correlation engine. Are there any question results that are inversely correlated with the assessment score? In other words, are the learners who received the highest scores more likely to answer certain questions incorrectly? If so, does that mean these learners have incorrect information, or are the questions confusing or misleading?
4) Predictive Experience Analytics
Run question result data and assessment score data through a correlation engine. Are there questions with a very high positive correlation to assessment scores? In other words, when a learner answers that question incorrectly, they almost certainly fail the assessment; when they answer it correctly, they almost certainly pass. If you can predict the overall assessment result after only a few questions, can you alter that particular learning experience to provide a more efficient and engaging learning path?
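In its simplest form, that prediction is a rule keyed to a high-signal question. A toy sketch of the idea, with the threshold logic and data shape as stated assumptions, not a production model:

```python
# Toy sketch of predictive experience analytics: if one question is
# known (from prior correlation analysis) to track the overall result,
# use it as an early pass/fail signal. Purely illustrative.

def early_prediction(answers_so_far, signal_question):
    """Predict the assessment outcome from the questions seen so far.

    answers_so_far: dict of question_id -> bool (answered correctly?)
    signal_question: the question id found to correlate with the result.
    Returns "pass", "fail", or None if the signal hasn't been seen yet.
    """
    if signal_question not in answers_so_far:
        return None  # not enough evidence yet
    return "pass" if answers_so_far[signal_question] else "fail"
```

A real system would weigh several questions and report a probability rather than a verdict, but even this crude rule shows how a prediction made mid-assessment could trigger a shorter or more supportive learning path.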
Up Next: Learner Analytics
Next, we'll discuss learner analytics, which help you understand more about a specific person or group of people engaged in learning-related activities. Subscribe to our e-newsletter to have the latest installments of our learning analytics series sent right to your inbox.
About the author
As an innovative software developer turned entrepreneur, Mike Rustici has been defining the eLearning industry for nearly 20 years. After co-founding Rustici Software in 2002, Mike helped guide the first draft of xAPI and invented the concept of a Learning Record Store (LRS). In 2013, he delivered on the promise of xAPI with the creation of Watershed, the flagship LRS that bridges the gap between training and performance.