
xAPI Spotlight 2:  Modern Ecosystem


SPEAKER
Mike Miller, Caterpillar
Division Manager, Global Dealer Learning


THINK LARGE, TRADITIONAL ORGS CAN'T USE xAPI?

See how Caterpillar is building a modern learning ecosystem with xAPI to train, retain, and develop the future workforce.

This recorded broadcast consisted of a 20-minute xAPI case study by Mike Miller, followed by 20 minutes of Q&A from attendees.  

 

Q&A with Mike Miller from Caterpillar

Here are the questions attendees asked Mike Miller from Caterpillar during his live broadcast on April 16. Below are his answers. 

  1. Was it a major transition from the traditional or previous ecosystem to the current system? If so, was it evolutionary? And how long did it take?
  2. How were you able to capture the experiences from all the tools? Did you have to build connectors for all of them if they didn't have xAPI compatibility?
  3. The prescriptive bit in a user’s calendar (prep for your sales call with X training), is that current state or something for the future? If current state, can you share more about whose xAPI data powers that, what tools are used, etc.?
  4. When creating HTML/HTML5 content, are you using a particular tool to develop this content?
  5. How are you recording the xAPI items of time viewed, rank, and more with videos?
  6. Do you have specific learning analyst positions? Did you partner with IT to write the xAPI code?
  7. Are you recording ILT via xAPI? How?
  8. How can you track content that is job aid quality in xAPI?
  9. Are you recording the self-assessment idea for the learners in xAPI? Do they have access to that information?
  10. Could xAPI potentially work with email systems, like Outlook?
  11. How can we ensure data being shared via xAPI through multiple sources isn't a violation of data privacy laws? (especially in Europe)


1a. Was it a major transition from the traditional or previous ecosystem to the current system? If so, was it evolutionary? And how long did it take?
1b. Did you use a pilot group when you first rolled out your digitalization strategy/ecosystem?
1c. Did you employ any mitigation strategies when you were building/rolling out this ecosystem?

It is definitely a journey, and not an overnight one. In fact, Caterpillar stepped back in 2013 and 2014 to complete all the research on what we wanted to do. Then, during 2015 and 2016, we started putting some of the components together. And now, in 2018, we really have a lot of the components in place and are starting to fine-tune it. So, you can see it's been a journey over time.

We've taken a slower approach than some companies might. Others may turn it all on simultaneously, but we wanted to understand the tools we were considering, so we really ran with proofs-of-concept. Our goal was to bring a tool online, make sure it could actually do what it said it could, and run it for a 90- to 180-day cycle to see if we got the results we needed. Once we got the results we needed from a tool, we moved into a purchasing decision.

We evaluated two to three tools at a time, picked the best of the best, and then deployed that tool. But we didn't do this for all tools at the same time. Instead, we compared specific product types together. For example, we evaluated all video platforms at one time and pegged what we thought was the best of the best. Then, we went to authoring tools, and so on.

Previously, we’d been able to put the vendor list together and talk to other companies about why they’ve used certain tools. But, in building [a modern learning ecosystem] since there wasn’t a roadmap from others, it took us a little bit of work to look at the landscape, evaluate all the different tools, and pick best-in-class tools to help us get where we wanted to go.

We did the same thing with Watershed. We looked at BI, Tableau, and Watershed. But Watershed does more for us than those other tools. It's not just about visualization of data and making decisions. Watershed has the ability to actually move data around the ecosystem for us as well—the Learning Record Store is keeping check on all the systems.

That way, when I jump into one tool versus another tool—regardless of the entry point—I should still have the same information available to me. And Watershed enables us to do that data flow, which is something the other visualization tools cannot do. And that’s what we were looking for because, if you're going to have lots of views of data, then you're going to have lots of tools. And you have to think about how data moves amongst these tools—xAPI has really unlocked that for us.
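For readers new to xAPI, the mechanics behind that data flow are straightforward: each tool sends JSON "statements" (actor, verb, object) to the Learning Record Store over HTTP. The sketch below is not Caterpillar's or Watershed's actual code (the endpoint URL and credentials are placeholders), but it shows the basic shape of posting a statement to any conformant LRS.

```python
# A minimal sketch of how a tool might send an xAPI statement to an LRS.
# The endpoint URL and credentials below are placeholders.
import base64
import json
import uuid

import requests  # third-party: pip install requests

LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"          # placeholder
AUTH = base64.b64encode(b"activity-provider-key:secret").decode()  # placeholder credentials

statement = {
    "id": str(uuid.uuid4()),
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "https://example.com/courses/excavator-basics",
               "definition": {"name": {"en-US": "Excavator Basics"}}},
}

response = requests.post(
    LRS_ENDPOINT,
    headers={
        "Authorization": f"Basic {AUTH}",
        "Content-Type": "application/json",
        "X-Experience-API-Version": "1.0.3",  # version header required by the xAPI spec
    },
    data=json.dumps(statement),
)
response.raise_for_status()  # the LRS returns the stored statement ID(s) on success
```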

 

2. Integrating content? Or integrating data? How are you normalizing disparate data sources for analyses?
Technically, how were you able to capture the experiences from all the tools? Did you have to build connectors for all of them if they didn't have xAPI compatibility?

The simple answer is that a lot of tools already had some xAPI connectors, but a few of them didn’t. We have a hard rule that if you wanted to be considered as a vendor in our ecosystem, your tool had to align with our xAPI requirements. So, our vendors [who didn’t support xAPI yet] would have to work with Watershed and others to get up to speed on what the [xAPI] requirements mean and how to integrate their content into the Watershed tool. Luckily, we've got some great experts at Watershed who were able to lend their knowledge, skills, and expertise, so that portion moved very quickly, and a lot of the companies we were considering were able to pick it up and integrate their content very quickly.
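To illustrate what that vendor work typically involves (a generic sketch, not any specific vendor's connector), a tool without native xAPI support essentially needs a thin translation layer that maps its own events, here a made-up event shape, onto xAPI statements before forwarding them to the LRS:

```python
# A hedged sketch of a vendor "connector": translate a proprietary event
# (hypothetical shape) into an xAPI statement dict.
from datetime import datetime, timezone

def to_xapi(event: dict) -> dict:
    """Map a hypothetical vendor event to an xAPI statement."""
    return {
        "actor": {"mbox": f"mailto:{event['user_email']}"},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced",
                 "display": {"en-US": "experienced"}},
        "object": {"id": event["content_url"],
                   "definition": {"name": {"en-US": event["content_title"]}}},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Example usage with a made-up vendor event:
vendor_event = {
    "user_email": "learner@example.com",
    "content_url": "https://vendor.example.com/items/42",
    "content_title": "New Excavator Walkaround",
}
print(to_xapi(vendor_event))
```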

 

3. The prescriptive bit in a user’s calendar (prep for your sales call with X training), is that current state or something for the future? If current state, can you share more about whose xAPI data powers that, what tools are used, etc.?

Prescriptive analytics is the future—that’s where we need to go. We're currently in the digital insights phase, where we're learning all the paths and the information from the tools. When we eventually get into that AI workflow, that's where we'll want to use the data to start prescribing learning.

Before we can unlock the prescriptive side, we need to understand everyone’s experiences with the content that we have available and start building out maps on what we think is a logical approach. Then we will move into the AI workflows.

For Caterpillar, the AI workflow is probably a late 2018-2019 initiative for when users approach a product they’re getting ready to sell. In the old days, we would say we'd put an iBeacon on the product and just ping your phone as you get closer to it, so we could push the right information on features and benefits. That's actually not the right way to go long term.

Now, we're getting so good at AI and at digital insight, we think we can eventually look at your experiences and know what you're doing through the week, and start pushing the right content before you even get to the product— that's the better path. So, we're moving away from discussions around pushing content by proximity, while really looking to unlock that [prescriptive part] in the next 18 months.

4. When creating HTML/HTML5 content, are you using a particular tool to develop this content?
Bravo on the HTML5 based content and the path recording. What tool are you building content with?

We produce quite a bit of content for ourselves because we have the product knowledge. So, when we release a brand-new excavator, we can't buy that content, we have to build it.

We use Inkling as our authoring tool, which allows us to build that HTML/HTML5 output. It's a WYSIWYG, click-and-drag type application. And if you know how to code in HTML or HTML5, it'll also let you do that directly. Much like Articulate and all the other cool, easy tools, Inkling is very easy to use. In fact, we used to buy another tool that took us two-and-a-half days to train people on. Now with Inkling, we can hand the tool to pretty much anyone and, within two hours, they're pretty savvy and using it to produce e-pubs (or what you and I call eLearning).

5. How are you recording the xAPI items of time viewed, rank, and more with videos?

All the tools we deploy have great analytics, but keep in mind we're taking an ecosystem approach. As you can see from the tools shown in the slides, we specifically picked tools that help us get to root data. Meaning, they allow us to get down to the smallest details—information that’s usually locked up in the tools. So, using xAPI, the information from Kaltura, Inkling, and other tools is all brought together into this single tool, called Watershed. Watershed lets us put a learner's full experience together, interrogate the content from one tool, and compare it to the data from all the others.

So now I can step back, create a full picture of how a learner progressed through the ecosystem, and visualize that path. For example, “Mike Miller originally started in Inkling, then he jumped into video, then he jumped over to this tool, and finally he jumped over here.” So I'm able to learn how Mike maneuvered through the ecosystem.

We can now do a much better job of visualizing data—and that's really what it's all about. Once you can visualize the data, then you can interrogate it. It's about visualization first and then diving down into the details where you see outliers.
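As a rough illustration of the kind of root data a video platform can expose (not Kaltura's actual output), an xAPI statement for a video view can carry watch time and progress in its result, which is what lets an LRS report on time viewed and rank content. The verb and extension IRIs below follow the community xAPI video profile; the learner and video IDs are illustrative:

```python
# A sketch (not Caterpillar's actual implementation) of an xAPI statement
# a video platform could emit so an LRS can report time viewed.
video_statement = {
    "actor": {"mbox": "mailto:learner@example.com"},
    "verb": {"id": "https://w3id.org/xapi/video/verbs/played",
             "display": {"en-US": "played"}},
    "object": {"id": "https://video.example.com/media/engine-teardown",
               "definition": {"name": {"en-US": "Engine Teardown Walkthrough"}}},
    "result": {
        # ISO 8601 duration: how long the learner actually watched
        "duration": "PT4M35S",
        "extensions": {
            # fraction of the video viewed, per the xAPI video profile
            "https://w3id.org/xapi/video/extensions/progress": 0.82
        },
    },
}
```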

6. Do you have specific learning analyst positions? Did you partner with IT to write the xAPI code?

That's a great question. It’s important to point out that Caterpillar is a dealer-facing company that supports 172 dealers worldwide. To support that global network, I have nine learning centers with about 150 people across them. Those 150 people are organized as follows: we have learning consultants, instructional designers, instructors, an ecosystem support team, and project managers.

The learning consultant role works directly with dealers to help them evaluate where they are, where they need to go, and how to leverage our ecosystem and all the tools they have available to the next level. These learning consultants are similar to academic planning counselors for incoming college or university students—they help you lay out a plan.

The instructional design team is more focused on the development of content and that experience, while the instructors function as you would expect.

The ecosystem support team is focused on keeping the tools and solutions alive and running. Our organization really isn't doing much coding, so the ecosystem support team works with our vendors to tell them what our needs are, where we need the tools to evolve, and how we're going to get there. They are really strategic in helping us move forward on that experience management architecture. I will tell you that experience management is the big thing to push for as you look to the future.

And then the last role is project management. It's important for us to be able to run all the initiatives that we have on this side of the equation so we have project managers who are keeping us accountable.

 

7. Are you recording ILT via xAPI? How?

For instructor-led training, we go back to the system of record. Once a learner goes through an instructor-led course, the instructor will update the learning record with that information. We haven’t gotten to the point where we’re videotaping or recording the sessions, but we have introduced tablets into the classroom to start tracking activity. So now, when service technicians go to many of our learning centers, they’re given iPads that have a lot of activities that they’ll progress through while in the ILT class.

The service technicians may even take an iPad down to the labs where they're working with the equipment—interacting with the engine equipment and the iPad back and forth. We want to track those movements and activities. These ILT courses are coming online now, and all that data will be captured—but it’s up to the instructor to go in and say who has passed and to put the scores back in.
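As a hedged sketch of what that instructor-recorded data might look like once it reaches an LRS (Caterpillar's actual statement design isn't shown in the broadcast), an ILT completion can be expressed as a single xAPI statement with a result and the instructor in the context. The verb IRI comes from the ADL verb registry, and the IDs are illustrative:

```python
# A hedged sketch of an ILT record: after class, an instructor (or the
# system of record) could emit a statement like this marking attendance
# and a score. All names and IDs are illustrative.
ilt_statement = {
    "actor": {"mbox": "mailto:technician@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/attended",
             "display": {"en-US": "attended"}},
    "object": {"id": "https://learning.example.com/ilt/hydraulics-lab",
               "definition": {"name": {"en-US": "Hydraulics Lab (ILT)"}}},
    "result": {"success": True, "score": {"scaled": 0.9}},
    "context": {"instructor": {"mbox": "mailto:instructor@example.com"}},
}
```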

 

8. How can you track content that is job aid quality in xAPI?

For the content we’re producing ourselves, we're using a tool called Inkling. Inkling is an authoring tool that allows you to produce e-pubs with a pure HTML/HTML5 output. Everything we host is in the cloud. So, to use our content, learners log into the system in the cloud and consume it there.

And in the case where someone doesn’t have internet access, Inkling also has an app version that allows you to take the content offline. When you reconnect, it takes all the activities that happened in the offline environment and loads it to the system. So, whether a learner is in a connected or offline environment, we are able to collect all that information. Again, we're getting more than just a record of “did they pass or fail” or “what was their score.” We're getting all the activities as they engage and interact with that package.
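Inkling's offline sync is proprietary, but the general pattern described here (queue statements locally while disconnected, then flush them to the LRS as a batch on reconnect) can be sketched in a few lines. The queue file, endpoint, and omitted authentication below are all placeholders:

```python
# A minimal sketch of the offline pattern: statements are queued locally
# while disconnected and flushed to the LRS once the device reconnects.
import json
import pathlib

import requests  # pip install requests

QUEUE_FILE = pathlib.Path("pending_statements.jsonl")          # local offline queue (placeholder)
LRS_STATEMENTS_URL = "https://lrs.example.com/xapi/statements"  # placeholder endpoint

def record(statement: dict) -> None:
    """Append a statement to the local queue while offline."""
    with QUEUE_FILE.open("a") as f:
        f.write(json.dumps(statement) + "\n")

def flush() -> None:
    """On reconnect, send every queued statement to the LRS, then clear the queue."""
    if not QUEUE_FILE.exists():
        return
    statements = [json.loads(line) for line in QUEUE_FILE.read_text().splitlines() if line]
    if statements:
        # Real use would also include an Authorization header.
        requests.post(
            LRS_STATEMENTS_URL,
            json=statements,  # the statements resource accepts a batch (JSON array)
            headers={"X-Experience-API-Version": "1.0.3"},
        ).raise_for_status()
    QUEUE_FILE.unlink()
```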

9. Are you recording the self-assessment idea for the learners in xAPI?
What tool did you build the gap-finder assessment in?
Do the students have access to the minute data that the trainers have? Is it as visual as well?

Just like any other learning tool, we have an assessment engine that allows us to evaluate somebody’s skills, knowledge, and behaviors as they're entering a program. It is much like the pre-assessment students complete before they go to college or university.

This tool allows us to understand what knowledge learners have and, based on the results, where their gaps are and what content, classes, or experience they’ll need to close those gaps. We let them see that entire list and work with their leaders to put together a plan to close those gaps in a set period of time.

Perhaps they have 62 items they need to complete to close those gaps—that could take a couple of years. Or, maybe they only have two gaps to close and can do everything in one period. We don't build the individual learning plans because we want the leader to be engaged in the conversation. So, once we identify where a learner needs to go and what's required to get there, we turn it over to the individual and the leader to work on the priorities toward obtaining the objective.

 

10. Could xAPI potentially work with email systems, like Outlook?

Answer provided by Watershed: Yes, potentially. xAPI is just a mechanism for transferring data over the internet, so really any application that can be connected to the internet can in theory be tracked by xAPI. 
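No such Outlook integration is described in the broadcast, but as a purely hypothetical example, an instrumented email client could report an opened message as an ordinary xAPI statement. The verb comes from the public Tin Can registry, and all IDs are made up:

```python
# Purely hypothetical: what a statement for opening an email might look
# like if an email client were instrumented to emit xAPI.
email_statement = {
    "actor": {"mbox": "mailto:learner@example.com"},
    "verb": {"id": "http://id.tincanapi.com/verb/viewed",
             "display": {"en-US": "viewed"}},
    "object": {"id": "https://mail.example.com/messages/abc123",
               "definition": {"name": {"en-US": "Weekly product update email"}}},
}
```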

11. How can we ensure data being shared via xAPI through multiple sources isn't a violation of data privacy laws? (especially in Europe)

Answer provided by Watershed: In the context of GDPR, consider an organization using xAPI to share learning records from Learning Record Providers with Learning Record Stores. Each of those providers is free to share data across their platforms so long as they have agreements in place with the client organization determining their adherence to data privacy regulations.

If a Data Processor, be it an LRP or LRS, cannot commit to adhering to data privacy regulations, then the client organization (Data Controller) would be at risk of violating data privacy regulations by continuing to work with them. Additionally, each system generating or collecting xAPI data should ensure its sub-processors adhere to data privacy in the same way its clients do. For example, at Watershed our clients must ensure we adhere to privacy laws, and we must do the same with our providers, such as Amazon Web Services.


 

Don't Miss Out!

The xAPI xAPRIL page will be updated with loads of case studies, prizes, and tools throughout the month of April. Make sure to bookmark it and check back often.
