Real-World Learning Analytics: Applied Industrial Technologies [Pt. 2]

Building on part one of his blog post, Andy Webb of Applied Industrial Technologies shares how a learning analytics platform has helped increase efficiency within his organization and explains how to get executive buy-in.

How do you find baselines for learning data?

Operationally speaking, we have business definitions around most performance metrics (e.g., sales growth, margin, on-time delivery, etc.). With the LRS, we can now correlate high performers with their related learning scores. The range of learning scores among top performers is often a good source for benchmarks.
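
As a rough sketch of that idea (the data, column names, and the top-quartile cut-off below are all hypothetical, not Applied's actual figures), a benchmark band can be read off the learning scores of the top business performers:

```python
import pandas as pd

# Hypothetical export joining a business KPI (margin %) to assessment scores.
df = pd.DataFrame({
    "associate": ["A", "B", "C", "D", "E", "F"],
    "margin_pct": [31.0, 24.5, 29.8, 22.1, 30.4, 26.7],
    "learning_score": [0.92, 0.71, 0.88, 0.64, 0.95, 0.77],
})

# Treat the top quartile on the business metric as the "high performers" ...
top = df[df["margin_pct"] >= df["margin_pct"].quantile(0.75)]

# ... and use the range of their learning scores as the benchmark band.
low, high = top["learning_score"].min(), top["learning_score"].max()
print(f"Benchmark learning-score range: {low:.2f} to {high:.2f}")
```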

How have you automated reports within your department to escape “Excel hell”? How much time (approximately) has this saved you?

Ironically, the reports that took the longest to create in Excel are the ones we now use the least in LRS Land. During our ERP deployments, we had a full-time business analyst pulling together LMS reports. It was a painful, manual process that required a lot of data cleansing. We had to use VLOOKUPs and phone calls (gulp!) to clear up discrepancies and connect system utilization reports to training information.

Our learning analytics platform (LAP) builds much slicker stoplight charts that fully automate the process.

Instead of building Excel reports on a full-time basis, our analyst gets to probe the data to uncover stories and experiment with the information.

Sometimes you need to show results fast. Do you have suggestions for quick wins with basic evaluation?

In our situation, the LAP’s chart-sharing features are really handy. We can create a new card, drag a subset of names into a group, build a chart, and throw some data together, all in five to 30 minutes. It becomes a live chart with a URL.

Prior to the LAP, it would take a day to construct report requirements, more time to validate the data (e.g., database spot checks, emails, phone calls), and several hours to gather results. When updates were required, we had to go back to step one…almost. Because the data was gathered from disparate systems, it needed to be realigned with time-consuming manual VLOOKUPs.

Now, we trust all the data and can just focus on the meaning of the outputs. Our prior LMS required manual updates for active associate information (e.g., status, role and reporting changes, etc.). Connecting our HRIS info to the LRS was priceless.

What’s been one of the biggest surprises you’ve found in your data? What did it tell you and how did it affect your program?

We used a 2x2 grid to compare what associates understand with how well they perform. Surprisingly, there were a number of unconsciously competent performers: they performed at a high level but scored low on financial acumen.

In other words, they understood the tasks and were disciplined about them, but did not understand the financial impact and ramifications of their actions. This becomes a coaching opportunity. If they can better understand the financial implications of what they do, we would expect returns in other disciplines as well.
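
A minimal sketch of that 2x2 view; the thresholds and labels below are placeholders, not the actual cut-offs Andy's team used:

```python
# Hypothetical cut-offs; real ones would come from your own benchmarks.
PERF_THRESHOLD = 0.75    # normalized business-metric performance, 0-1
ACUMEN_THRESHOLD = 0.70  # financial-acumen assessment score, 0-1

def quadrant(performance: float, acumen: float) -> str:
    """Place an associate in the 2x2 grid of performance vs. financial acumen."""
    if performance >= PERF_THRESHOLD and acumen >= ACUMEN_THRESHOLD:
        return "high performance / high acumen"
    if performance >= PERF_THRESHOLD:
        # The surprise group: strong results, weak financial acumen scores.
        return "high performance / low acumen -> coaching opportunity"
    if acumen >= ACUMEN_THRESHOLD:
        return "low performance / high acumen"
    return "low performance / low acumen"

print(quadrant(0.88, 0.52))  # high performance / low acumen -> coaching opportunity
```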

Data can tell a story if it’s used correctly. How do you design programs that provide the data to answer questions and tell your story?

Yeah, prior to our LAP, there was a big chasm between the LMS and our business metrics. Without data, all we had were assumptions. The most dangerous tools in the assumption tool belt are hope and intuition—and hope isn’t a reliable methodology.

For example, if we just consider completion status and try to make business assumptions about financial comprehension, there are too many blanks to fill in. Just because someone did not complete a course doesn’t mean we can draw conclusions about their comprehension.

Here’s our simple design recipe: We created competencies for our key roles. Each competency has a key metric. You can train to competencies and you can measure KPI performance. Connecting the dots provides a clear path to learning and business insight. Involving executives in the KPI selection process and their SMEs in the training helped validate our efforts to the business.
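
In data terms, the recipe boils down to a mapping from each competency to its key metric and its training. The role, competencies, courses, and KPIs below are illustrative placeholders, not Applied's actual model:

```python
from dataclasses import dataclass

@dataclass
class Competency:
    name: str
    kpi: str               # the single business metric the competency is measured against
    course_ids: list[str]  # the training mapped to the competency

# Illustrative example for one key role.
ACCOUNT_MANAGER = [
    Competency("Pricing discipline", kpi="gross_margin_pct", course_ids=["FIN-101"]),
    Competency("Order accuracy", kpi="on_time_rate", course_ids=["OPS-201"]),
]

# Connecting the dots: for each competency you can pull both the assessment
# results for its courses and the KPI trend for the same associates.
for c in ACCOUNT_MANAGER:
    print(f"{c.name}: train via {c.course_ids}, measure via {c.kpi}")
```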

A well-designed pre-/post-test (or assessment) can provide a wealth of information about learner comprehension. A problem for us (and many others) is that most authoring tools and LMSs supply only a final score. We wanted to break down each question and understand the related gaps. Later, we also sought to visually map understanding to actual performance metrics.

For our latest LRS effort, we designed a financial acumen assessment around seven competencies that were connected to specific KPIs. Within each competency, three levels of questions (developmental, competency, and mastery) were developed with SMEs. We weighted the results based on question difficulty.
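
One plausible way to weight results by question difficulty; the exact scheme isn't spelled out here, so the level weights below are assumptions:

```python
# Hypothetical weights per question level.
LEVEL_WEIGHTS = {"developmental": 1.0, "competency": 2.0, "mastery": 3.0}

def weighted_competency_score(responses: list[tuple[str, float]]) -> float:
    """responses: (level, score) pairs, with each score in [0, 1]."""
    total = sum(LEVEL_WEIGHTS[level] for level, _ in responses)
    earned = sum(LEVEL_WEIGHTS[level] * score for level, score in responses)
    return earned / total

# A perfect developmental answer, half credit at competency level, a mastery miss.
print(weighted_competency_score([("developmental", 1.0), ("competency", 0.5), ("mastery", 0.0)]))
# -> 0.33..., because misses on harder questions pull the score down more.
```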

A learning pet peeve of mine can be found embedded in most quiz-building tools of today’s rapid eLearning movement. If you get part of a multiple-choice question right, you should receive credit for what you understand…right? Otherwise, we’re not really measuring competency (and learners may feel the question is unfair). So partial credit became a personal maxim I held onto, despite the extra effort needed to explain it and build an elaborate scoring system that underlies the course.
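
Here is one common partial-credit scheme for a multi-select question: reward correct selections, penalize wrong ones, and never go below zero. It is a sketch of the idea, not necessarily the exact formula behind Applied's scoring system:

```python
def partial_credit(selected: set[str], correct: set[str]) -> float:
    """Return a fraction of credit in [0, 1] instead of all-or-nothing scoring."""
    right = len(selected & correct)
    wrong = len(selected - correct)
    return max(0.0, min(1.0, (right - wrong) / len(correct)))

# Learner picks two of the three correct options plus one distractor.
print(partial_credit({"A", "B", "D"}, correct={"A", "B", "C"}))
# -> 0.33 instead of the flat zero most rapid-authoring tools would report.
```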

In addition to the learning locker concept, xAPI holds the hope of custom learning design and measurement without the constraints of conventional SCORM architecture. If you can envision a concept and write a logical statement, you should be able to connect data points to reach Nirvana. For our team, being able to successfully model comprehension results (partial credits) illustrated delivery on the promise of xAPI.
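
For instance, a per-question partial-credit result can travel to the LRS as an ordinary xAPI statement, with result.score carrying the fractional credit. The actor, activity IDs, and scores below are hypothetical placeholders:

```python
# Sketch of an xAPI statement recording partial credit on one assessment question.
statement = {
    "actor": {"mbox": "mailto:associate@example.com", "name": "Example Associate"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/answered",
        "display": {"en-US": "answered"},
    },
    "object": {
        "id": "https://example.com/activities/financial-acumen/q7",
        "definition": {"type": "http://adlnet.gov/expapi/activities/cmi.interaction"},
    },
    "result": {
        # Two of three correct selections: 0.67 partial credit, not pass/fail alone.
        "score": {"scaled": 0.67, "raw": 2, "min": 0, "max": 3},
        "success": False,
        "completion": True,
    },
}
# POST the statement to the LRS's /statements endpoint with your usual xAPI client.
```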

What are suggestions for getting organizational buy-in with data?

Showing off your progress and results is an important political move that can accomplish several objectives (e.g., validation, inquiry, approval, etc.). You need to do this during each phase of the project and invite feedback.

Tip: When pitching an LRS or LAP to senior management, review your prospective vendor’s portfolio of charts and paste your company’s KPIs over the most relevant graphics. On paper, illustrate how you can construct the data. Personalizing a mockup for leadership makes it more meaningful, like seeing their favorite KPIs on a marquee. It’s fun to watch the lights go on when they see how the tool can be leveraged for their own purposes. Ideas start to fly.

After we built out the learning measurement phase of our project, my boss retired and an outsider replaced her. His first exposure to the tool came during a webinar where I was showing it off to field operations to encourage LRS adoption. A lot was on the line, including (potentially) the future of what had taken almost a year to construct.

Rather than making obvious data conclusions, I asked the field if the graphic representations matched their experience and expectations. The business dialogue became more important than the completion status or LRS itself. Mission accomplished.

Later, my new boss said, “I’ve worked in quality improvements, and this [LRS] stuff is what you’d crawl through crushed glass to get.”

Up Next: L&D Spotlight on Halliburton

We’ll continue our real-world learning analytics spotlight next week and talk with Amir Bar, who is an instructional designer and learning product developer for Halliburton. Be sure to sign up for our blog to have the next post sent straight to your inbox.


eGuide: 5 Steps to Getting Started with Learning Analytics

Now that you understand the basics of analyzing learning experiences, it's time to start applying them in your own learning program. And it's easier than you might think. In fact, there’s a lot you can do with simple metrics and the data you have right now. We've also created the following guide to help you get started!

