Big Data Challenge: How to Measure & Analyze L&D Impact [Part 1]



During the webinar, attendees were encouraged to ask questions. Below are the questions and answers.

A Level 3 or 4 evaluation, while rare, does go beyond the smile sheet and is not subjective. How does using xAPI improve on delivering a good Level 3 evaluation done at 30, 60, and 90 days?

M. Rustici: xAPI doesn’t improve the actual evaluation. It just makes collecting and analyzing the data a lot easier. xAPI and a Learning Analytics Platform remove a lot of the friction from the process, making the analysis real time and simple to perform. As we discussed on the webinar, when friction is removed we can do things more frequently, so they become more impactful. That is what we think is happening to learning analytics—the friction that once posed challenges is slowly disappearing.

Although the xAPI capturing/storing capability is great, we run the risk of just collecting data vs. actionable data. I would have liked to hear more about how to think about formal/informal learning in terms of collecting actionable data. Too much data is not a good thing. What matters is understanding what to measure and why it is important to measure. Are we going to talk about that or are we just going to talk about the technology that allows us to measure everything?

M. Rochelle: We can definitely discuss this point.

M. Rustici: xAPI is a tool, and just like any other tool, it is up to the user to apply it appropriately. Deciding what/how much data to collect is one of the first challenges any implementer will face. There are two competing approaches that people take:

The first approach is to start with the question they want to ask, and then collect only the bits of data that will serve to inform the answer to that question.

The second approach is to gather a wide set of data that might be interesting and then explore that data through visualizations to find interesting trends or outliers.

Both approaches are valid and useful. What is “actionable” in my mind depends on the extent of questions and analysis that will be done. Trivial items can be actionable if one is willing to do detailed analysis and make fine-grained optimizations. But it’s important to be realistic when starting out. Focus on the things you will do, not the things you can do.

Can you give a clear example of what data xAPI can track that SCORM or an LMS can't? I've never been clear on that point.

M. Rochelle: xAPI captures all forms of informal learning, while SCORM does not.

Is it safe to say xAPI tracks a learner's options (for example, they watched a TEDtalk on their own), while an LMS only tracks courses and what happens with that course?

M. Rochelle: Exactly!!

M. Rustici: SCORM is limited to capturing data about e-learning courses via an LMS, and the course must be completed in a web browser while connected to the internet. But how much of what you’ve learned during your lifetime has come from that modality? Not much, right? xAPI, by contrast, captures data from a much broader set of learning activities, such as videos, games, MOOCs, simulations, ILT, books, conferences, mentorships, etc.

xAPI also can track data at a significantly more detailed level than SCORM. While SCORM typically is used to track four items—completion, success (pass/fail), score, and total time—xAPI allows you to take a much deeper dive into all the individual interactions a learner has with a learning experience. For instance, xAPI can record that a learner attempted the first scenario of a simulation four times before succeeding alongside each of the incorrect choices made during each of those four attempts.
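To make that concrete, here is a minimal sketch in Python of how such a statement could be built. Every learner name, activity ID, and extension URI below is invented for illustration; only the “attempted” verb ID comes from ADL’s standard vocabulary.

```python
import json

def attempt_statement(learner_email, attempt_number, success, wrong_choices):
    """Build an xAPI statement for one attempt at a simulation scenario.

    All names, activity IDs, and extension URIs here are invented for
    illustration; a real deployment would define its own.
    """
    return {
        "actor": {"mbox": f"mailto:{learner_email}", "name": "Example Learner"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/attempted",
            "display": {"en-US": "attempted"},
        },
        "object": {
            "id": "http://example.com/simulations/safety/scenario-1",
            "definition": {"name": {"en-US": "Safety Simulation, Scenario 1"}},
        },
        "result": {
            "success": success,
            # Incorrect choices captured as an extension (illustrative URI)
            "extensions": {
                "http://example.com/xapi/extensions/incorrect-choices": wrong_choices
            },
        },
        "context": {
            "extensions": {
                "http://example.com/xapi/extensions/attempt-number": attempt_number
            }
        },
    }

# The fourth attempt finally succeeds, with no wrong choices this time
stmt = attempt_statement("learner@example.com", 4, True, [])
print(json.dumps(stmt, indent=2))
```

One statement per attempt, each carrying its own result and context, is what lets an analytics tool later reconstruct “four attempts before success, and here is what went wrong each time.”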

xAPI also can capture data about behaviors and performance not related to learning. This capability allows us to understand whether the training we provide actually succeeds in changing behavior and improving performance.

I understand the focus is point-of-need training can be tracked with xAPI. But if someone switches over to Watershed, for example, where do we host our formal eLearning, like videos and activities, we've created? Can those be hosted in Watershed?

M. Rustici: Watershed doesn’t host or deliver any content. There are a lot of options available to you, though.

With xAPI, it doesn’t matter from where courses are launched. They can be hosted on a portal or any other website, or they could live in a mobile app or on a device.

LMSs are still good options for pushing eLearning to learners. And because that need isn’t going away, LMSs will be around for a long time to serve that role. Many LMSs also publish their completion data to an LRS, such as Watershed. There also are other tools that provide simpler, learner-centric content delivery experiences. Learning Experience Platforms—such as Degreed, EdCast, or Pathgather—are interesting choices. Single-purpose tools, such as SCORM Cloud or xapiapps, provide some great delivery options as well. In summary, there is a vibrant ecosystem of next-generation learning tools emerging. And this ecosystem decouples all of the functionality that used to be tied up in monolithic LMSs.

Are there any examples of experiential and/or formal curriculums or activities that could use xAPI monitoring?

M. Rochelle: Yes, xAPI can be used in all forms of experiential and informal learning modalities, including gaming, simulations, social learning, OJT, etc.

Do you agree that you can use xAPI to detect the learner's "mental model" around a concept?

M. Rochelle: Yes, couldn't agree more.

M. Rustici: Absolutely. In an advanced deployment, you could collect enough detailed data to get a better understanding of how a learner is approaching a problem or situation.

How can we capture that some users saw a YouTube video about something? Can xAPI be “running” in the browser? In practical terms, how do you track learning outside your LMS with xAPI? Does it require that a video or PDF posted somewhere is wrapped in an xAPI wrapper? I know that there is a self-reporting method where a button on a browser can capture and transmit info about the website, PDF, or video viewed and is sent back to the LMS transcript. Can you describe how an informal learning event would be captured by xAPI back to an LRS?

M. Rustici: There are many ways xAPI can be used to capture learning experiences. At a high level, they can be divided into “active” and “passive” strategies.

Active capture requires the learner (or another party, such as an instructor or observer) to proactively take an action to record an event. In the YouTube example, that might manifest itself as a bookmarklet in the browser that says “I Learned This.” When a user clicks that bookmarklet, an xAPI statement would be sent to the LRS.

Passive capture involves a system automatically sending xAPI statements about its usage. For instance, an internal portal or social learning website (e.g., Sharepoint) might record data about the resources that are being accessed, what is being searched for, who people are seeking out as experts, etc. In a passive scenario, there might be an internal video service (e.g., Kaltura) that reports xAPI statements about video usage.
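Either way, what reaches the LRS is the same thing: a statement POSTed to the LRS’s statements resource. The sketch below builds an “I Learned This”-style statement and shows how it could be sent; the endpoint URL, credentials, and video URL are placeholders, not real services.

```python
import base64
import json
import urllib.request

def send_statement(endpoint, username, password, statement):
    """POST a single xAPI statement to an LRS's /statements resource.

    The endpoint and credentials are placeholders; any xAPI-conformant
    LRS exposes this same resource with Basic auth and a version header.
    """
    req = urllib.request.Request(
        endpoint.rstrip("/") + "/statements",
        data=json.dumps(statement).encode("utf-8"),
        method="POST",
    )
    creds = base64.b64encode(f"{username}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {creds}")
    req.add_header("Content-Type", "application/json")
    req.add_header("X-Experience-API-Version", "1.0.3")
    with urllib.request.urlopen(req) as resp:
        # The LRS responds with the IDs it assigned to the statements
        return json.loads(resp.read().decode("utf-8"))

# "I Learned This" bookmarklet-style statement for a YouTube video
statement = {
    "actor": {"mbox": "mailto:learner@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced",
             "display": {"en-US": "experienced"}},
    "object": {"id": "https://www.youtube.com/watch?v=example"},
}
# send_statement("https://lrs.example.com/xapi", "key", "secret", statement)
```

An active tool (bookmarklet) and a passive one (a portal or video service) differ only in who triggers this call, not in the payload they send.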

Watching how people are learning can be a lagging indicator—people searching for the answer because they can't find it through their company's learning choices. I think you are saying that this data will help L&D professionals understand what they need to do differently in order to better meet the needs of their learners. Correct?

M. Rochelle: Absolutely!

Curious if you find that analytics are tightly tied to the type of learning content (or domain, or something else). Can any of this be generalized or are analytics content-type-specific (or domain-specific, etc.). If analytics are specific to the type of content, should xAPI profiles (like cmi5) help users determine things they may want to look at later after first tracking data?

M. Rochelle: Content style and delivery are exceptionally important; they determine how effective learning is with an audience and how well that audience retains it.

M. Rustici: Yes, absolutely. The xAPI community is working to develop profiles for different types of learning content, but more profiles are needed. If you have an interest in the area, please jump in and lend a hand!

Do you know if there are xAPI cloud tools that are FedRAMP approved?

M. Rustici: Not to my knowledge yet.

Do you mean an LRS will capture training external to the organization? It's not limited like an LMS?

M. Rochelle: Not limited like an LMS, right.

M. Rustici: Correct, an LRS captures training from wherever it happens and can even capture data about learners and activities without having prior knowledge of them. This was one of the core requirements gathered as part of the original Project Tin Can effort. xAPI allows you to record statements about learning experiences from anywhere.

Do you suggest and have experience with using learner personas as way to define the data to collect and report on for learners?

M. Rochelle: Yes, personas are great as long as there is clear stratification of learner audiences and their learning preferences.

M. Rustici: Tailoring learning experiences to create personalized learning is the ultimate end goal, and using personas is a great approach to start that journey. (Personas are just one of many concepts we can borrow from marketers.)

How is xAPI related to marketing analytics specs and protocol? Marketing collects and tracks customer experiences to see connections to behavior. Any links? Is xAPI spec similar? Is there opportunity to leverage marketing tools too?

M. Rustici: YES! Learning has so much to gain from marketing. I think learning is where marketing was 20 years ago. Back then, marketing was a creative discipline that put out a lot of content and hoped it worked. Today, it is a data-driven, precisely targeted discipline. I think learning is embarking on that same journey.

Some particularly relevant concepts include: A/B testing, behavior tracking, funnels, calls-to-action, churn rates, click-through rates, conversion paths, context, drip marketing, dynamic content, engagement rates, marketing automation, native advertising, personas, retargeting, and segmentation.

Does Watershed offer xAPI technology AND Learning Record Stores?

M. Rochelle: Rustici Software and Watershed are companies that work together to provide both.

M. Rustici: xAPI is an open specification that anybody is free to use and adopt. There are a lot of open source and free tools to help you understand the spec and start using it from a technical perspective.

In an ideal world, though, xAPI isn’t something that a learning organization needs to “do,” rather it is something that is part of all the tools that organization is already using. Like we talked about on the webinar, while there are many available mainstream tools that are xAPI conformant, we still have a long way to go.

Watershed offers a SaaS-based LRS of the Learning Analytics Platform variety. We license subscriptions to this service. We’re also able to help you and your vendors to understand and implement xAPI.

Does Watershed help with data visualization?

M. Rochelle: Yes

M. Rustici: Yes, that is the core value proposition that Watershed offers. Watershed provides an easy-to-use tool to visualize the complex and unstructured set of xAPI data. Think of it as a business intelligence tool made specifically for L&D professionals.

Could you speak to analytics and data collection for organizations that have many learners spread across a large number of different business units/groups? Or even more diverse with enterprise learning held for clients?

M. Rochelle: Yes, extended enterprise learning is challenging from an analytics standpoint using SCORM only. With xAPI, you can capture learning data anywhere someone is learning as long as they are learning on a platform connected to xAPI.

M. Rustici: xAPI greatly enhances your ability to gather data from disparate parts of a large organization. One of the core problems many large organizations face is the large number of diverse systems that are used to deliver learning. xAPI robustly supports the synchronization of learning data from many systems. Data that would normally be stored across eight different LMSs and other systems can easily be centralized in an LRS using xAPI.

Analytics only get more powerful as the volume of data increases.

Does xAPI communicate with other APIs? My organization captures analytics on usage of our products and other metrics. I'm wondering if we can't compare the data captured from training (xAPI) to usage of our products (my company's Analytics API).

M. Rochelle: Through an LRS you can.

M. Rustici: xAPI itself does not communicate with other APIs, but its underlying data format makes it easy to capture data from most systems. A robust LRS allows you to import these other data formats and map them to xAPI for analysis.
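As an illustrative sketch of what such a mapping involves (the incoming event shape, verb ID, and URIs below are all invented), converting a product-usage record into an xAPI statement could look like this; in practice, a robust LRS provides this kind of import and mapping for you.

```python
def usage_event_to_xapi(event):
    """Map a hypothetical product-usage log record to an xAPI statement.

    The event shape and all IDs are invented for illustration; the point
    is that most tabular or log data maps naturally onto actor/verb/object.
    """
    return {
        "actor": {"mbox": f"mailto:{event['user_email']}"},
        "verb": {"id": "http://example.com/xapi/verbs/used",
                 "display": {"en-US": "used"}},
        "object": {
            "id": f"http://example.com/products/{event['feature']}",
            "definition": {"name": {"en-US": event["feature"]}},
        },
        "timestamp": event["when"],
    }

raw = {"user_email": "learner@example.com",
       "feature": "report-builder",
       "when": "2018-05-01T14:30:00Z"}
stmt = usage_event_to_xapi(raw)
```

Once product-usage data sits in the LRS alongside training data, comparing the two becomes a reporting question rather than an integration project.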

How are organizations ensuring captured learning is relevant learning?

M. Rustici: I think starting to ask that question is the whole point of learning analytics. In addition to “relevant,” we might also ask if it is “effective,” “efficient,” or a whole host of other questions.

How can I migrate my xAPI content to SCORM?

M. Rustici: I think you might have asked the question backwards. There wouldn’t be much incentive to migrate xAPI content to SCORM. That would be trying to put something from the big box into the small box.

Migrating from SCORM to xAPI is very possible. There are tools such as SCORM Cloud that allow you to play SCORM content and then send the data to an LRS via xAPI. Most major authoring tools allow you to export eLearning as either SCORM or xAPI simply by changing a configuration option.

You might be asking, “How can I use xAPI in my LMS that only supports SCORM?” The answer to that question is that you can’t. You’d need to set up an LRS to capture the xAPI data. A robust LRS should be capable of summarizing that data into a format the LMS can understand and pushing that data into that system.

How can you encourage companies or organisations to allow you to put this data-gathering technology 'web' into their elearning modules if you are the provider and want to improve your own learning experiences in the future?

M. Rochelle: Yes. You can help companies understand that there can’t be blind spots when measuring learning, and that the only way to remove those blind spots is to build a learning technology ecosystem that fosters and enables learning wherever it happens, and to encourage people to learn within that ecosystem without restrictions.

M. Rustici: Standards adoption is really a “chicken or the egg” situation. The only way around it is to just start using the technology. As a provider, create cool things that are outside of the small SCORM box. Show them to your clients, but let them know they need to adopt xAPI for these things to work. There won’t be a huge market on day one, but fortune favors the bold early adopters as the market shifts. We’re at a good point for companies to start making interesting new tools, and the market seems very receptive and ready.

Consider partnering with an LRS provider to enhance both of your go-to-market plans. Watershed maintains a list of Certified Data Sources that we often refer to customers looking to build a next-generation learning ecosystem.

Early adopters incur higher risk. Most large companies want to minimize the risk on large investments. How is this being addressed? Do we have specific examples of early adopters that are large companies who have definitively witnessed value that meets ROI expectations?

M. Rochelle: We can certainly discuss the ROI and Risk Mitigation models for adoption.

M. Rustici: In my experience, large companies are on all parts of the early adopter/laggard curve. Big companies are often very conservative, but they can also be very bold and eager to experiment with new technologies. We find that large companies often want to pilot something before making large investments, and many of them look to the success of other early adopters. We are excited to see many of those early pilots shifting into larger implementations at companies that are willing to publicly share their results.

So xAPI captures [learners’] behaviour outside of the learning module itself...if they're online?

M. Rochelle: Yes

M. Rustici: Yes, it can. It doesn’t automatically capture everything that is happening in the world, but it is capable of capturing things other than formal learning experiences.

If [learners] left the learning and accessed Facebook, would you be able to see that? Are there privacy issues with xAPI?

M. Rustici: xAPI doesn't magically monitor all web traffic. It only tracks data from systems that enable it. Theoretically, a corporate IT department could monitor all traffic on its network and use xAPI to send that data to an LRS. That would absolutely create privacy concerns, but those are the same concerns that would arise when the IT department started monitoring traffic in general.

Does this relate to 'desire paths,' particularly in gamified learning? Can xAPI help develop products by noting these 'technological desire paths,' and translate them into what learners really want and need from an effective learning experience?

M. Rustici: Absolutely. One of the quickest ways to get ROI out of an analytics project is to just start looking at learners’ actual behaviors. What resources are they using? What are they not using? What are they searching for? What are they searching for and not finding?

I'm not sure SCORM is a problem or that xAPI is the solution. I think the challenge is with determining what data will be important/meaningful. I can store anything to a database. The question is what data should be collected?

M. Rochelle: Great question, and we will be addressing it directly.

M. Rustici: SCORM is a limiter. xAPI is an enabler, but it’s certainly not the entire solution. Learning analytics is a mindset and process. xAPI makes that process easier by removing the friction from data collection and analysis.

Yes, you can store anything in a database and write queries to extract the data, but do you actually do that? Usually, the answer is “no, it’s too much work.” Or, for someone who is doing learning analytics, they have somebody who spends several days every month cobbling data together in Excel just to get rudimentary metrics.

xAPI and a Learning Analytics Platform take what was once a time-intensive, manual chore and turn it into a real-time, effortless task. Looking through elegant visualizations to see how all of your hard work is being used actually becomes enjoyable (at least for geeks like me).

Performance will be measured by the employee's manager and peers: things that cannot be measured by L&D, but that need to be measured back in the business. Metrics that indicate improved performance are typically captured during performance reviews. So if performance reviews improve, L&D may have contributed to that improvement.

M. Rochelle: Yes, but performance reviews can't happen only once a year. Learning, and its link to behavioral and performance change, must be measured regularly so you can course-correct learning as you go.

Yes, many organizations are transitioning to constant feedback to help course correct throughout the year.

M. Rochelle: Exactly, but not enough and not fast enough.

Learning resources need to be connected to those feedback systems to help contextualize the learning materials associated with the needs of each individual.

M. Rochelle: Yes, learning is not measured properly without learning assessments.

M. Rustici: That’s the idea, yes. Performance is about more than just the individual’s performance review, though. Overall organizational performance is just as important, if not more important. Organizational performance can also be easier to measure, as it relies on metrics that are often already assembled, can be frequently accessed, and have a large enough sample size to be statistically significant.

It's a little more XML-like: you can track anything. An example would be John just "read" the "article" titled "...." or John just "viewed" the "video" titled "...." for 5 minutes.

M. Rochelle: Yes, you're right!

M. Rustici: Yes. xAPI happens to use a JSON data format, not XML, but you’ve got the right idea.
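For example, a “John read the article” statement is just a small JSON document. Here it is loaded in Python; the actor, verb URI, and article ID are all invented for illustration (“read” is not part of ADL’s standard verb vocabulary, so a real deployment would agree on a verb ID).

```python
import json

# A minimal "John read the article" statement; all IDs are illustrative
statement = json.loads("""
{
  "actor": {"name": "John", "mbox": "mailto:john@example.com"},
  "verb": {"id": "http://example.com/xapi/verbs/read",
           "display": {"en-US": "read"}},
  "object": {"id": "http://example.com/articles/42",
             "definition": {"name": {"en-US": "Example Article"}}}
}
""")
```

The actor/verb/object triple is the core of every statement; duration (“for 5 minutes”) would go in an optional `result` property alongside it.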

All this does is track "activity." The challenge begins before and after you've collected all this data. First, what do you track that will be meaningful in the future? Second, now that I have all this data how do I make sense of it? Measuring videos, articles, and courses will show me what I already collect. It doesn't show any related impact to the business.

M. Rochelle: This is where you need a learning measurement model.

M. Rustici: You can also capture activities that relate to performing a job, not just learning activities. That is where the ability to show impact on the business comes in. You trained a set of people to do something: did they actually do it? If so, what were the results?

If we already have an LMS, could we get more granular items in an LRS to capture other types of "certification" criteria? If so, how could we automate?

M. Rochelle: Yes

M. Rustici: Absolutely. Getting at data that is already in an LMS might require some cooperation from the LMS vendor, but xAPI is exactly what enables a lot more granularity and the ability to make decisions based on that data. Automation would come from your LRS provider. A robust LRS will allow you to define sets of criteria for achieving accomplishments.
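As a toy illustration of what “defining sets of criteria” means (the checking logic and every ID below are invented; an LRS like Watershed configures this rather than requiring custom code), certification criteria can be thought of as required verb/activity pairs checked against the learner’s statement stream:

```python
def meets_criteria(statements, required_verbs_by_activity):
    """Check whether a learner's statements satisfy certification criteria.

    required_verbs_by_activity maps activity IDs to the verb ID that must
    have been recorded against that activity. Illustrative logic only.
    """
    seen = {(s["object"]["id"], s["verb"]["id"]) for s in statements}
    return all((activity, verb) in seen
               for activity, verb in required_verbs_by_activity.items())

criteria = {
    "http://example.com/courses/safety-101":
        "http://adlnet.gov/expapi/verbs/passed",
    "http://example.com/observations/forklift":
        "http://adlnet.gov/expapi/verbs/completed",
}
statements = [
    {"object": {"id": "http://example.com/courses/safety-101"},
     "verb": {"id": "http://adlnet.gov/expapi/verbs/passed"}},
]
print(meets_criteria(statements, criteria))  # False: observation still unmet
```

Automation is then just re-running this check whenever a new statement arrives and awarding the certification once it passes.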

So if we were to capture all of this additional learning data, is there any system that automatically analyzes the data for you to flag anomalies in the data—like you mentioned more no-shows on Thursdays? Or is it a manual analysis of the data—a person having to sift through tons of data in order to try to see outliers in the data?

M. Rochelle: An LRS has analytics capabilities.

M. Rustici: That is going to depend on the capabilities of your LRS. In Watershed, we have a mix of capabilities. We have preconfigured visualizations that allow you to easily click around, view your data, and quickly spot outliers. We also provide the ability to easily create custom visualizations to spot trends. We offer a formal correlation report that will statistically search for correlations amongst variables and identify outliers that do not conform to the trend.

What are the key leading indicators that will have an impact on personal and organizational performance improvements?

M. Rustici: That’s a whole other webinar. So much of that depends on the learning program, the people involved, and the performance improvements you are seeking.

What if you have no LMS or LRS, and you just use the web, email, etc.? I'm trying to get started. I know you need at least an LRS.

M. Rochelle: Yes!

M. Rustici: Even if you don’t have an LRS or xAPI, I’d encourage you to start using whatever data you have in whatever way you can.
