Real-World Learning Analytics: Halliburton

This week, we're featuring Amir Bar, an instructional designer and learning product developer, as he shares his experience working with Halliburton and his professional learning and development (L&D) insights on using learning analytics to build and maintain a successful learning program.

Do you use any traditional non-learning systems to gather interesting learning data?

We are leveraging SurveyGizmo (now called Alchemer) as a learning tool, in addition to using it for more standard survey purposes. Combining an online survey tool with the ability to embed video content from our video platform, Kaltura, and to add questions that assess learning allows us to create almost any kind of learning activity.

Because it’s easy to modify the design of surveys quickly, we can learn in real time based on feedback and results. This enables us to optimize our eLearning modules and ensure they fulfill their objectives. Having an xAPI plug-in for SurveyGizmo makes accessing and using this data extremely easy.
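As a rough illustration (not Halliburton's actual setup), the sketch below shows the kind of xAPI statement such a plug-in might generate for a completed survey-based activity and posts it to a Learning Record Store (LRS). The endpoint, credentials, activity details, and pass mark are all assumptions for the example.

```python
import requests

# Hypothetical LRS endpoint and credentials -- replace with your own.
LRS_STATEMENTS_URL = "https://lrs.example.com/xapi/statements"
LRS_AUTH = ("lrs_key", "lrs_secret")


def send_survey_result(email, activity_id, activity_name, score, max_score):
    """Build one xAPI statement for a completed survey-based activity and post it to the LRS."""
    statement = {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{email}"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-US": "completed"},
        },
        "object": {
            "objectType": "Activity",
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
        "result": {
            "score": {"raw": score, "max": max_score},
            "success": score / max_score >= 0.8,  # assumed 80% pass mark
        },
    }
    response = requests.post(
        LRS_STATEMENTS_URL,
        json=statement,
        auth=LRS_AUTH,
        headers={"X-Experience-API-Version": "1.0.3"},
    )
    response.raise_for_status()
    return response.json()  # the LRS responds with the stored statement ID(s)


# Example: record a learner scoring 9/10 on a video-based knowledge check.
send_survey_result(
    "learner@example.com",
    "https://example.com/activities/video-knowledge-check",
    "Video knowledge check",
    9,
    10,
)
```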

In addition to the survey tool, we gather usage data from some of our internal proprietary software, which allows us to understand if the training content we developed for Halliburton employees actually affects usage and achieves its goal.

Sometimes it’s not obvious that there are missing pieces in our data. How do you find holes in your data and what are some ways to reduce the likelihood of repeating the issue?

Knowing your data is the key, and understanding what is missing and what is not needed is the most challenging part of the process. I wish there was a predefined formula, but based on my experience, it is an ongoing and dynamic process in search of solutions.

These solutions then have to be challenged, and more questions asked as part of an iterative process. We found it extremely important to be connected to the field and to understand how people are working and learning.

The better we understand the process as applied in reality, the better we can evaluate our data and see if we are missing anything.

Focus groups and testing solutions on a small scale are two methods that we are applying in our development process.

How have you automated reports within your department to escape “Excel hell”? How much time (approximately) has this saved you?

Adopting more online tools has allowed us to take better advantage of automated reports. The challenge has been keeping track of the different tools and types of reports.

We see great value in using an xAPI dashboard where we can combine all the data that we have and then share the right data with the right people. This approach not only saves time in creating reports but also ensures that the feedback data reaches the right people. We hope that this type of reporting will have more impact on the organization than traditional reporting.

What are a few small ways people can improve their learning programs every week (or on a regular basis)?

The key to improving any learning program is providing more high-quality content. This can be achieved in two ways. First, there is a lot of knowledge created in the company on a daily basis that may not be captured and shared.

For example, webinars or demos can easily be captured and then shared via eLearning tools. Given that the content of these webinars has already been produced, the added effort of capturing and sharing them is relatively low.

The second way is simply asking for feedback. We try to collect feedback on satisfaction with our online modules at the end of each activity. We simply ask employees whether they think it was beneficial. This feedback can then be used to improve the content.

What are your thoughts on predictive analytics for learning? What are some simple ways to prepare an organization for basic predictive analytics?

Predictive analytics for learning is only a small part of a much bigger picture. What I find fascinating is applying the analytics of “doing” (where xAPI can be leveraged). When we take a process and start to analyze how people complete it and interact with it, we get insight into where new learning is needed and how valuable it can be. We found that if we start by asking what learning/training is needed, we might hear that nothing is needed.

But if we help people in the organization capture employees’ experiences, then we can help them see exactly where additional training is needed.

The role of L&D departments is going to change dramatically in the years to come, with learning and training capabilities becoming more embedded in the operations of the company.

Because it will be possible to show how training affects performance, it will become much easier to calculate the value of training, so I think L&D teams will have a much more significant role within the company's operations.

One way to get started with this approach is to start experimenting with xAPI. Being able to visualize work and learning experiences on one dashboard provides a lot of insight and motivation to explore the relationship between performance and training in an easy and fast way.
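For anyone who wants to try this, a minimal sketch (again with a placeholder LRS endpoint and credentials) could pull recent "completed" statements back out of the LRS and tally them per activity, which makes a reasonable first slice of data for a dashboard. A production LRS would typically require following the "more" link to page through larger result sets.

```python
from collections import Counter

import requests

# Hypothetical LRS endpoint and credentials -- replace with your own.
LRS_STATEMENTS_URL = "https://lrs.example.com/xapi/statements"
LRS_AUTH = ("lrs_key", "lrs_secret")


def count_completions(since="2024-01-01T00:00:00Z"):
    """Fetch recent 'completed' statements from the LRS and tally them per activity."""
    response = requests.get(
        LRS_STATEMENTS_URL,
        params={
            "verb": "http://adlnet.gov/expapi/verbs/completed",
            "since": since,
            "limit": 500,
        },
        auth=LRS_AUTH,
        headers={"X-Experience-API-Version": "1.0.3"},
    )
    response.raise_for_status()
    statements = response.json()["statements"]
    return Counter(s["object"]["id"] for s in statements)


# Example: print a quick text "dashboard" of completion counts per activity.
for activity_id, completions in count_completions().most_common():
    print(f"{completions:5d}  {activity_id}")
```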

Where do you go to see examples of innovation in the learning space outside your organization?

I attended SXSW EDU in 2016 and listened to a session with Sal Khan from Khan Academy. I think that what they are doing around SAT training is fascinating. They basically created an online platform that helps students get ready for the SAT without the need for expensive prep programs.

I think that is the kind of goal we are trying to achieve in our industry by offering employees new ways to learn anyplace and anytime while reducing the overall cost of training.

I later heard on the radio that Khan Academy opened a brick-and-mortar school (they call it the Lab School). While some people questioned this move, there are clear strategic benefits. Getting to know your audience is key to enabling the automation of learning, and conducting research and development with content on a smaller group first is a key step in creating effective online learning that addresses a large group of learners.

What are your goals or plans for your learning analytics program?

We are currently working on combining the experiences from using our software, our video platform, and our online learning into one dashboard. This will allow us to understand how people are using the software, where they have gaps in leveraging the software, and how we can create the right learning products to fill these gaps.

These learning products could then be delivered as part of a seamless user experience. Once we identify the learning needs, we want to be able to leverage both our internal content and external learning content offered by well-known training companies in our industry.
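As a rough sketch of what that combined view could look like (the file names and columns below are invented for illustration, not Halliburton's actual data), the example joins software usage, video viewing, and eLearning results on a shared user ID and flags heavy software users who have little or no training activity.

```python
import pandas as pd

# Hypothetical exports from each system -- file names and columns are assumptions.
software_usage = pd.read_csv("software_usage.csv")    # user_id, feature, uses_per_week
video_views = pd.read_csv("video_views.csv")          # user_id, video_id, minutes_watched
elearning = pd.read_csv("elearning_results.csv")      # user_id, module_id, score

# Roll each source up to one row per user before joining.
usage_per_user = software_usage.groupby("user_id")["uses_per_week"].sum().rename("feature_uses")
video_per_user = video_views.groupby("user_id")["minutes_watched"].sum()
learning_per_user = elearning.groupby("user_id")["score"].mean().rename("avg_module_score")

# One combined table per user: software usage alongside video and eLearning activity.
dashboard = pd.concat([usage_per_user, video_per_user, learning_per_user], axis=1)

# Heavy software users with no eLearning results are candidates for new learning content.
gaps = dashboard[
    (dashboard["feature_uses"] > dashboard["feature_uses"].median())
    & (dashboard["avg_module_score"].isna())
]
print(gaps.sort_values("feature_uses", ascending=False))
```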

Up Next: L&D Spotlight on Bridgestone Americas

Stay tuned as we continue our real-world learning analytics spotlight and talk with Monica Griggs, who worked as the director of commercial training at Bridgestone Americas. Don't miss out! Be sure to sign up for our blog to have the next post sent straight to your inbox.


eGuide: 5 Steps to Getting Started with Learning Analytics

Now that you understand the basics of analyzing learning experiences, it's time to start applying them in your own learning program. And it's easier than you might think. In fact, there's a lot you can do with simple metrics and the data you have right now. We've created the following guide to help you get started!
