Calling for Change: Learning Analytics with Tim Dickinson

It was a real treat to watch Tim Dickinson, former Watershed Director of Strategy and now Global Head of Learning Systems & Innovation at Novartis, present on learning analytics at the Learning Technologies conference in London.

Tim used his session to share real-world use cases from the Novartis L&D team and the analysis they conducted, focusing on simple applications of learning analytics. He encouraged us to reflect on how to use data to inform business decisions rather than on the technicalities of getting the data in the first place. Tim’s session explored three questions:

  1. What is our call to change?
  2. What are we doing about it?
  3. What have we learned?

Change Is Calling: Why is learning analytics needed?

This year, there was a real sense at Learning Technologies that L&D teams are recognizing the need for change. You’ve probably heard about what happened at TikTok (spoiler: they sacked their whole L&D team), and our industry has been talking about impact measurement and getting a seat at the table for a long time now. It’s time to turn that talk into action.

For Tim and the team, that call for change is underscored by research conducted by Orsolya Hein, Head of Portfolio Strategy at Novartis. She looked into employees’ learning and development experience and into the size and utilization of the organization’s L&D content library.

From their employees, Orsolya learned there’s plenty of space for improvement in supporting their learning and development:

  • 80% lack visibility into suitable career opportunities.
  • 67% depend on their managers for key career opportunities.
  • 50% are unaware of future skills they need to develop.
  • 43% struggle to find personalized learning.

In other words, some big chunks of the workforce lack support to identify their next career move and struggle to know what they need to learn or how to learn it.

Studying the course catalog helped the team understand why employees struggle to find personalized learning. It also painted a picture of an overgrown course library where the right content is hard to find, and most content is underutilized:

  • More than 300,000 learning objects are in their LMS catalog—with 20,000 new objects added each month.
  • Approximately two-thirds of that content is underutilized.
  • 94% of the portfolio has fewer than 50 completions.
  • 53% of courses have no description.

In other words, there’s a lot of scrap learning content, while useful training content is difficult to find.
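For teams wanting to run a similar audit on their own catalog, here is a minimal sketch in Python. It assumes a hypothetical CSV export of the LMS catalog; the file name and the object_id, description, and completions columns are illustrative, not Novartis’s actual schema:

```python
import pandas as pd

# Hypothetical LMS catalog export; file and column names are illustrative.
catalog = pd.read_csv("lms_catalog_export.csv")  # columns: object_id, description, completions

total = len(catalog)

# Share of the portfolio with fewer than 50 completions (the long tail).
long_tail = (catalog["completions"] < 50).mean()

# Share of objects with a missing or blank description.
no_description = catalog["description"].fillna("").str.strip().eq("").mean()

print(f"{total:,} learning objects in the catalog")
print(f"{long_tail:.0%} have fewer than 50 completions")
print(f"{no_description:.0%} have no description")
```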

Answering the Call: Making changes informed by L&D data

Tim and the team are responding to these results with a few data-led initiatives, including reviewing vendor management and content intelligence processes.

Vendor management – taking a deeper dive to understand “vanity” metrics

With so many learning objects provided by multiple content vendors, it is critical to evaluate each vendor’s content to ensure it delivers value.

Tim warned us against taking vendor-provided metrics at face value. For example, one vendor reported that there were 35,000 unique active users in the last 12 months.

That sounds fairly good (it represents about a third of total employees), but it doesn’t necessarily tell the whole story. Tim encouraged us to dig deeper and ask questions of vendor-provided metrics. In this case, Tim asked:

  • Who are these users? Does this include people who left the company after completing the training? Or people in the extended enterprise who are not considered employees?
  • How is “Active” defined? Does this include everybody who logged into the platform once, or did they need to complete some training? How many people regularly use the content each month to develop their skills?
  • What’s left? Taking these considerations into account, Tim and the team came up with their own definition of an active user based on current employees who regularly complete content. They ended up with a figure for active users that was about a quarter of the one quoted by the vendor.

Understanding how many people actually use a content library and actively engage with its resources is essential to inform conversations with content vendors around license numbers and costs. It also helps inform decisions about whether to continue licensing content from particular vendors.
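To make the stricter definition concrete, here is a minimal sketch in Python, assuming a hypothetical completion-event export and an HR roster of current employees. The file names, column names, and the three-distinct-months threshold are all assumptions for illustration, not the team’s actual criteria:

```python
import pandas as pd

# Hypothetical inputs; file and column names are assumptions for illustration.
events = pd.read_csv("vendor_completions.csv", parse_dates=["completed_at"])  # user_id, completed_at
roster = pd.read_csv("hr_current_employees.csv")                              # user_id

# 1. Keep only current employees (drops leavers and extended-enterprise users).
events = events[events["user_id"].isin(roster["user_id"])]

# 2. Keep only the last 12 months of activity.
cutoff = events["completed_at"].max() - pd.DateOffset(months=12)
recent = events[events["completed_at"] >= cutoff].copy()

# 3. "Regular" use: completions in at least 3 distinct months (an illustrative threshold).
recent["month"] = recent["completed_at"].dt.to_period("M")
months_active = recent.groupby("user_id")["month"].nunique()
active_users = (months_active >= 3).sum()

print(f"Active users under the stricter definition: {active_users:,}")
```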

Content intelligence – the importance of metadata

Because 53% of courses lacked descriptions, learners didn’t have the information they needed to decide whether to take those courses. The gap also meant that systems lacked the course metadata needed to recommend content to learners or return accurate search results.

In fact, working with the team from content intelligence platform Filtered, Tim and the team found that while vendor-provided content had a tagging precision of around 80–90%, internally created content was closer to only 60%.

That’s because internally created content didn’t have sufficient metadata for tagging, so systems could neither recommend it automatically to learners nor surface it in search.
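The Filtered analysis itself isn’t described in detail, but tagging precision can be estimated from a manually reviewed sample of auto-applied tags. A minimal sketch, assuming a hypothetical review file (the file and column names are illustrative):

```python
import pandas as pd

# Hypothetical sample of auto-applied tags checked by human reviewers.
# Assumed columns: content_id, source ("vendor" or "internal"), tag, tag_is_correct (bool).
sample = pd.read_csv("tag_review_sample.csv")

# Precision per content source: the share of applied tags reviewers judged correct.
precision = sample.groupby("source")["tag_is_correct"].mean()
print(precision.map("{:.0%}".format))
```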

These insights informed several new processes for managing content:

  • A decision process around when not to invest in new content because an existing asset can address the learning need.
  • A process for retiring end-of-life content.
  • Steps for deciding whether to build or buy content based on its purpose, target audience size, expected shelf life, and activation plan (a toy version is sketched after this list).
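Novartis’s exact decision criteria aren’t detailed in the session, but a toy build-or-buy rule might weigh the four factors named above like this; the thresholds and the pay-off logic are purely illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class ContentProposal:
    purpose: str             # e.g. "compliance" or "skill development"
    audience_size: int       # expected number of learners
    shelf_life_months: int   # how long the content stays relevant
    has_activation_plan: bool

def build_or_buy(proposal: ContentProposal) -> str:
    """Toy heuristic: build in-house only when the investment can pay off."""
    if not proposal.has_activation_plan:
        return "hold: define an activation plan first"
    if proposal.audience_size >= 500 and proposal.shelf_life_months >= 12:
        return "build"  # a large, durable need can justify custom content
    return "buy"        # otherwise license from a vendor library

print(build_or_buy(ContentProposal("skill development", 2000, 24, True)))  # -> build
```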

Getting Everybody on the Call: Democratizing content and data

Four of the five top skills goals are general skills that vendor content libraries commonly address (e.g., project management, leadership, change management, and communication).

Clinical research, however, is a specialized skill with fewer external training content options. Because Novartis has in-house expertise, the team addresses this skill primarily with internally created content.

In fact, user-generated content represents 13% of the content in the LXP portfolio. Tim calls this a democratization of content creation: empowering and equipping internal experts to create helpful content without requiring significant resources from L&D.

With so much training content, especially user-generated content, there is high demand for impact measurement analytics—well above and beyond what the L&D team can provide directly. Instead, Novartis is beginning to democratize learning analytics and reporting in the same way they’ve democratized content creation in their LXP.

Democratizing learning analytics means that the Learning Insights team can focus their efforts on the highest-priority measurement projects. To help decide which projects the team picks up, Dushyant Pandey, Global Head of Learning Insights at Novartis, sets four tests for learning impact measurement projects:

  1. Priority. The program must be a high priority for the business.
  2. Alignment. The goals of the program must be aligned with business goals.
  3. Performance. There must be benchmarks for expected performance gains.
  4. Cost. There must be benchmarks for anticipated expenses.

Defining and enforcing these tests helps ensure programs are designed with impact measurement in mind before they qualify for an impact measurement study.
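As a rough illustration, the four tests could be applied as a simple pre-screening checklist. In this sketch, the field names and the all-four-must-pass rule are assumptions, not the team’s actual process:

```python
# A minimal sketch of the four tests as a pre-screening checklist.
def qualifies_for_impact_study(program: dict) -> bool:
    tests = [
        program.get("business_priority") == "high",            # 1. Priority
        program.get("aligned_with_business_goals", False),     # 2. Alignment
        program.get("performance_benchmarks_defined", False),  # 3. Performance
        program.get("cost_benchmarks_defined", False),         # 4. Cost
    ]
    return all(tests)

example = {
    "business_priority": "high",
    "aligned_with_business_goals": True,
    "performance_benchmarks_defined": True,
    "cost_benchmarks_defined": False,  # no cost benchmarks yet
}
print(qualifies_for_impact_study(example))  # -> False
```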

Making the Right Call: Lessons learned from learning analytics

Having explained how data is used to inform L&D processes and decisions, Tim ended his presentation with a summary of five lessons learned:

  1. People, Process, Technology. These are the three key ingredients needed for learning analytics. (Based on our Measuring the Business Impact of Learning (2022) research, we would also add “budget” as another essential ingredient.)
  2. Metadata. Having good metadata for your content is really important so learners can find it and you can report on it appropriately.
  3. Data integration. Connecting data from all your learning platforms with HR and business data is essential for comprehensive learning analytics.
  4. Democratized measurement. The L&D team can’t do everything; you need to equip the business to measure learning themselves.
  5. Learning is the No. 1 talent attraction criterion. Learning is essential and has a significant impact on the business.

5 Trends from Learning Technologies 2022

Tim’s valuable and insightful session was just one of many at Learning Technologies this year. Find out more about what’s hot in L&D right now from Ash Laurence in “5 Trends from Learning Technologies.”
