Healthcare Training Measurement: 3 Real-World Examples

Now that we’ve covered the roles both learning ecosystems and learning analytics play in healthcare training measurement, it’s time to explore real-world examples. CHRISTUS Health and Nebraska Medicine are two healthcare organizations using their ecosystems and Watershed to measure the effectiveness and impact of their training initiatives.

As you read through the examples, consider how you could save time, money, and frustration by using your data to pinpoint specific areas of focus and address them swiftly.

For instance, instead of a lengthy eLearning course or a full day of in-class training, consider a quick shoulder-to-shoulder session, a targeted piece of microlearning, or even a simple intervention between managers and associates via Slack or text.

Arming managers with an understanding of how training programs and interactions affect outcomes is as simple as looking at the data.

CHRISTUS Health: Creating an award-winning talent development ecosystem

CHRISTUS Health is a healthcare organization with approximately 45,000 employees and 600 centers across the United States, Mexico, and South America. They recently won a Brandon Hall award for excellence in learning, largely due to their vision for enhancing the employee experience through technology.

CHRISTUS Health is focused on becoming a High Reliability Organization (HRO)—an initiative that strives to achieve zero harm to patients and associates.

To help realize this vision, the Talent Management team aimed to build a learning ecosystem that better meets the needs of learners, utilizes their large quantity of content, and provides streamlined reporting to leaders so they can make data-driven decisions.

Build a measurable foundation first.

Because CHRISTUS had multiple learning platforms, they decided to build a single entry point for associates, the learning experience platform (LXP) EdCast, to streamline access to learning resources.

And, to get the highest-fidelity data from the various systems under the LXP hood, CHRISTUS opted to build an xAPI-centric ecosystem with Watershed’s Learning Record Store (LRS) as the data hub.
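To make that concrete, here’s a minimal sketch of what reporting a single learning event to an LRS looks like with xAPI. The endpoint URL, credentials, and activity details are hypothetical placeholders rather than CHRISTUS’s actual configuration; the statement structure (actor, verb, object) and version header come from the xAPI specification.

```python
import requests

# Hypothetical LRS endpoint and credentials: placeholders, not a real configuration.
LRS_ENDPOINT = "https://example-lrs.watershedlrs.com/api/statements"
AUTH = ("lrs_key", "lrs_secret")

# A minimal xAPI statement: who (actor) did what (verb) to which activity (object).
statement = {
    "actor": {
        "name": "Jane Doe",
        "mbox": "mailto:jane.doe@example.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.org/activities/hro-safety-module",
        "definition": {"name": {"en-US": "HRO Safety Module"}},
    },
}

# Every platform in the ecosystem reports activity in this shared format,
# which is what makes the LRS a single hub for learning data.
response = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()
```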

Their first step toward improving the learner experience was to lay a foundation that would enable them to measure it.

Standardized access to the learning data would help the learning team understand more about the learning experience and how that experience connects back to business drivers. For example:

  • If a person completes a certain number of hours of a learning activity, what does their performance look like? And is there a correlation between performance and the learning activity? (See the sketch after this list.)
  • If a person consumes a certain number of hours of learning per month, is this person typically a high performer?
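As a hedged illustration of the first question, here’s how an analyst might test for that correlation once learning hours and performance scores are pulled out of the LRS. The column names and data are invented for the sketch:

```python
import pandas as pd

# Invented sample data; in practice, learning hours come from LRS exports
# and performance scores from an HR or quality system.
df = pd.DataFrame({
    "associate_id": [101, 102, 103, 104, 105],
    "learning_hours_per_month": [2.0, 5.5, 1.0, 8.0, 4.5],
    "performance_score": [71, 84, 65, 92, 80],
})

# Pearson correlation between monthly learning hours and performance.
r = df["learning_hours_per_month"].corr(df["performance_score"])
print(f"Correlation between learning hours and performance: {r:.2f}")

# Flag associates with above-median learning time but below-median performance;
# they may need a different kind of intervention than more coursework.
hours_median = df["learning_hours_per_month"].median()
score_median = df["performance_score"].median()
lagging = df[(df["learning_hours_per_month"] > hours_median)
             & (df["performance_score"] < score_median)]
print(lagging)
```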

Further, access to this data would enable the Talent Management team to answer operational questions, such as how to:

  • Evaluate content partners and vendors based on feedback and usage data.
  • Provide insight into what content associates were electively searching and consuming.
  • Identify what types of content sources were being consumed for both formal and informal learning; whether there were duplicate content pieces; and what content was preferred.

So this example shows the importance of laying the right foundation for tracking data across a learner-centric ecosystem. But what does measurement look like once everything is connected? Let’s take a look at Nebraska Medicine to find out.

Nebraska Medicine: Creating training content for every level of the organization

Nebraska Medicine is an award-winning healthcare organization that consists of two hospitals, 39 specialty and primary care clinics, and 8,000 employees. Their research and efforts in the fight against highly infectious diseases have been globally recognized for years.

In keeping with the theme of enhancing the learner experience, Nebraska Medicine’s L&D team recognized the impossible challenge of a one-size-fits-all system. After all, the hospital staff’s educational backgrounds range from basic education to the doctorate level.

As a result, the L&D team needed to provide training content that fit every level of education in a way that suited each individual learner’s needs (i.e., designing content for specific roles and specialties).

This meant not only relevant, relatable content, but also the technology to distinguish between different learners and serve up adaptive, dynamic content.

For example, a nurse practitioner who showed competency on a mock code blue has effectively tested out of specific areas of a generic CPR/AED training.

Because Nebraska Medicine’s LRS records that successful mock code blue, their system is smart enough to exempt the nurse practitioner from the redundant training. And that’s just scratching the surface of the efficiency improvements the L&D team has been able to identify.
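Here’s a rough sketch of what that exemption logic might look like against a standard xAPI statements endpoint. The “passed” verb ID is the standard one from the ADL registry; the endpoint, credentials, and activity ID are hypothetical placeholders, and a production system would more likely rely on the LRS vendor’s reporting tools:

```python
import json
import requests

# Hypothetical endpoint, credentials, and activity ID.
LRS_ENDPOINT = "https://example-lrs.watershedlrs.com/api/statements"
AUTH = ("lrs_key", "lrs_secret")
HEADERS = {"X-Experience-API-Version": "1.0.3"}
MOCK_CODE_BLUE = "https://example.org/activities/mock-code-blue"

def has_passed_mock_code_blue(email: str) -> bool:
    """Ask the LRS whether this learner has a 'passed' statement on record."""
    params = {
        "agent": json.dumps({"mbox": f"mailto:{email}"}),
        "verb": "http://adlnet.gov/expapi/verbs/passed",
        "activity": MOCK_CODE_BLUE,
    }
    resp = requests.get(LRS_ENDPOINT, params=params, auth=AUTH, headers=HEADERS)
    resp.raise_for_status()
    return len(resp.json().get("statements", [])) > 0

# If the competency is already on record, skip the overlapping modules.
if has_passed_mock_code_blue("jane.doe@example.org"):
    print("Exempt from the generic CPR/AED competency modules.")
else:
    print("Assign the full CPR/AED training.")
```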

The team continues to improve the experience for their learner population. This includes:

  • Tracking results and intervening to provide additional training where needed.
  • Looking at trends to see whether benchmarks are being hit. If they aren’t, the team can investigate whether there are training issues to address, including whether certain learning environments stand out as better or worse than others (e.g., loud rooms vs. quiet rooms, high-traffic areas vs. secluded areas). (See the sketch after this list.)
  • Incorporating workflows to track performance and measure if people are using and applying their knowledge.
  • Identifying efficiencies in reporting. For example, an audit committee can access information on a regular basis without relying on a siloed, single-facility process. When data no longer requires manual, per-facility reporting, access to it becomes far more efficient (e.g., the democratization of data mentioned in the previous blog post).
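To illustrate the benchmark check in the second bullet, here’s a minimal sketch that flags training environments falling below a target pass rate. The benchmark and the pass-rate figures are invented:

```python
import pandas as pd

# Invented pass rates by training environment; in practice these aggregates
# would come from LRS reporting.
results = pd.DataFrame({
    "environment": ["quiet room", "loud room", "high-traffic area", "secluded area"],
    "attempts": [120, 115, 98, 40],
    "passes": [112, 88, 70, 37],
})
results["pass_rate"] = results["passes"] / results["attempts"]

BENCHMARK = 0.85  # hypothetical target pass rate

# Environments below the benchmark are candidates for investigation:
# is it the content, the setting, or the delivery?
below = results[results["pass_rate"] < BENCHMARK]
print(below[["environment", "pass_rate"]])
```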

Accountability through transparency

Remember, using data to visualize performance can be incredibly impactful for large healthcare systems with multiple locations. Being able to visualize and benchmark how locations are performing in relation to goals gets quick buy-in from stakeholders.

Furthermore, having these capabilities enables you to pinpoint opportunities for learners’ personal growth without wasting time.

At a glance, you’re easily able to see where:

  • associates stand on key performance measurements;
  • divisions or geographic locations stand against benchmarks or one another; and
  • L&D can drive improvement.

But it all starts with data. So when looking to improve your learner experience and impact in healthcare, take a step back and make sure you have not only access to that data, but also the systems that will help you aggregate and visualize it.
