A Shiny Apple Is Not a Tasty Apple: Why L&D's Success Metrics Are Wrong

I’m a big fan of apples. I love them in a crumble or baked with raisins in the middle. But not all apples are equally tasty. An apple may look good from the outside, but taste bland and flavorless inside.

And the same can be said for learning: it might look great on the surface, but is it really serving its intended purpose? In this post, we’ll discuss how L&D often prioritizes style over substance when defining success, and then explore metrics that actually prove learning’s impact.

Let’s go back to my apple analogy for a moment. Red Delicious apples look amazing but taste anything but delicious. That’s because farmers deliberately bred strains of the apple with a beautiful, shiny red skin that hides the bumps and bruises that might put off a potential customer.

But the genes behind those undesirable yellow stripes are also the genes responsible for flavor, so breeding them out left the Red Delicious with a boring, bland taste. Farmers focused on surface-level looks, not the core purpose of the apple: its taste!

And just like those farmers, L&D teams often prioritize surface-level measures of success, such as:

  • Number of new courses published
  • Number of hours of learning completed
  • Learner satisfaction

Sure, these metrics ‘prove’ how busy and productive L&D has been. But where’s the focus on learner outcomes, or on the subsequent business impact?

If you focus solely on these surface-level metrics, where is your motivation to ensure the learning opportunities you provide are actually helping people improve? Worse, could chasing these metrics actively hinder performance improvement, as learners are lost in an ever-growing sea of scrap learning, too busy clicking “next” to develop their skills?

Does the C-Suite Even Ask L&D for Impact Metrics?

One of our clients shared that they struggled to make time for learning analytics and for measuring the business impact of their courses. They aren’t incentivized to make time for learning measurement, or even to design courses that have a business impact in the first place. Instead, they’re incentivized to create great experiences that people enjoy; you could argue that whether those people actually learn anything is irrelevant.

It’s a story we’ve seen repeated. You might assume that organizations further along the learning analytics maturity model are measuring for impact. But even genuinely forward-thinking, progressive learning leaders tell us they can be frustrated when the insights they report on don’t lead to any meaningful changes in learning programs or delivery.

This story played out with startling clarity in research we conducted with Chief Learning Officer in the fall of 2021. With input from over 400 L&D and talent management professionals, the survey results revealed a clear divide between the metrics the C-suite wants and the metrics it is given.

Delve into this a little deeper with our report Adopting Learning Analytics: Closing the C-Suite/L&D Language Gap.

Ensure L&D’s Seat at the Table by Embedding “Impact Metrics” into Learning Design.

So how did we get here, and how can we move forward?

The C-suite can’t ask for better or more in-depth metrics if they don’t know what to ask for or how to ask for it. So they just stick with what they already know: completion rates, net promoter scores, and smile sheets. As a result, L&D is left reporting on metrics that provide little to no real evidence of how their initiatives impact the bottom line. And if the C-suite doesn’t expect L&D to prove business impact, then we’re stuck with these default measures.

However, if learning objectives are embedded from the start, learning programs can be designed around impact metrics such as: if Learner X completes Program Y, we should see a defined percentage increase in Z (insert your applicable business KPI). For example, if the sales team completes a new negotiation program, we might expect a measurable lift in average deal size over the following quarter.

Not only does this approach give L&D meaningful goals to build their programs around, it also allows them to prove to the business that learning can deliver real impact. Reporting becomes more insightful and focused, but (and it’s a big “but”) this only works if there is appetite from the C-suite.

The “seat at the table” that L&D longs for depends on a two-way exchange: L&D proving impact with learning analytics, and those who drive business goals embedding these targets from the top down.

L&D’s Targets Are Wrong and Damaging (But We Can Fix That).

By setting the L&D team’s targets around the amount of content produced, completions, and learner satisfaction, the C-suite incentivizes the team to prioritize employee contentment over employee development.

These targets are not only unhelpful, but can also actively hinder employee development:

  • Targets around content production lead to L&D teams producing more content without stopping to consider whether that content is useful, resulting in huge, mostly unused content libraries. As a side note, this ‘production-line’ approach encourages learning to be churned out without considering whether existing content could be refined, repurposed, or simply updated.
  • Targets around completions and learner satisfaction create a false impression that people are learning and developing, when in reality that content consumes learners’ time and energy that could have been spent on activities that actually help them improve and deliver business outcomes.

What Do Good L&D Targets Look Like?

So, how does L&D go about getting the perfect apple—one that could tempt Snow White, while passing the taste test?

Instead of focusing on surface metrics, the L&D team should set targets that go to the “core” of learning and development and show whether people are improving. These include general metrics such as:

  • Employee promotion rates. People are developing sufficiently to progress in their careers.
  • New skills qualification rates. People are demonstrating new skills and earning credentials.
  • Increased employee retention rates. People are developing career paths and, as a result, feel fulfilled and stay with the organization.

L&D’s targets should also include specific items relating to the organization’s business goals. For example, the business plans to offer a new service and needs to train and hire a certain number of employees to deliver that service. L&D can set a target around ensuring those employees have the required skills.

What Difference Would Better Targets Make for L&D?

Better targets that focus on performance improvement—rather than content production, completion rates, and learner satisfaction—should have a profound effect on how L&D operates.

First, this change in focus should lead to genuine performance improvement. Second, it should result in a decline in course production, for two reasons:

  1. There’s no longer a goal around the number of courses produced.
  2. It takes time and effort to design impactful training content rather than content that’s designed for the sake of it.

Producing less training content might seem disconcerting if you’re used to targets around content production. However, a smaller set of resources designed around performance improvement should be significantly more impactful than a large quantity of content created without performance in mind. This is especially true if, like many organizations, you already have an excess of resources.

It’s Time to Put the “Learning” and “Development” Back into L&D.

Shifting L&D from content production to performance improvement requires a change in targets and incentives. And if L&D leaders want a “seat at the table,” this change is essential. Simply tracking completions is an outdated measure of success. It’s time to put the “learning” and “development” back into L&D!
