Spring 2021
CTDO Magazine

Do You Measure Up?

Thursday, April 15, 2021

My journey to improve my learning measurement capabilities was necessary, albeit bumpy.

Here’s the truth: I wasn’t measuring up when it came to consistently measuring and demonstrating the business value of our L&D programs. And I’m fairly confident that I’m not the only learning or talent development leader who has felt that way.


In fact, according to the Association for Talent Development’s 2019 research report Effective Evaluation: Measuring Learning Programs for Success, only 40 percent of talent development professionals surveyed believe their learning evaluation efforts helped them meet their organization’s business goals. And only 16 percent measured the return on investment of learning programs.

So, while I may not be alone, as the chief learning officer of an urban pediatric healthcare system, it is my responsibility to get better at measuring learning’s impact on business outcomes. Here’s how I did that and the lessons I have learned along the way.

The path to measurement

My journey to get better at measurement started in 2014, when my team and I codified the first iteration of our “Measurement and Reporting Guidelines.” The document outlined an easy-to-follow, three-step process to categorize a project, plan the measurement, and determine reporting needs.

The first step is to categorize a project as high, mid, or low profile based on strategic alignment, visibility, and audience size. The second step is to determine the measurement requirements based on those categories.

High-profile projects are measured at the business outcomes or return on investment level, while low-profile projects can be measured solely at the mastery level (or Level 2 in Kirkpatrick’s model). The final step suggests reporting methods, whom to report to, and the frequency of reporting based on the project category.
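
As a rough illustration of how those three steps might fit together, here is a minimal sketch; the profile criteria, audience-size threshold, mid-profile mapping, and reporting cadences are hypothetical assumptions, and only the high- and low-profile measurement levels come from the guidelines described above.

```python
# Hypothetical sketch of the three-step "Measurement and Reporting Guidelines"
# logic. The criteria, the 500-learner threshold, the mid-profile mapping, and
# the reporting cadences are illustrative assumptions, not the actual document.
from dataclasses import dataclass


@dataclass
class Project:
    name: str
    strategic_alignment: bool  # tied to an organizational goal?
    high_visibility: bool      # executive sponsorship or broad attention?
    audience_size: int


def categorize(project: Project) -> str:
    """Step 1: classify the project as high, mid, or low profile."""
    score = sum([
        project.strategic_alignment,
        project.high_visibility,
        project.audience_size > 500,  # threshold is an assumption
    ])
    return {3: "high", 2: "mid"}.get(score, "low")


def measurement_plan(profile: str) -> dict:
    """Steps 2 and 3: map the profile to measurement and reporting needs."""
    plans = {
        "high": {"measure_to": "business outcomes / ROI",
                 "report_to": "executive sponsors", "frequency": "quarterly"},
        "mid":  {"measure_to": "behavior change (Kirkpatrick Level 3)",
                 "report_to": "program owners", "frequency": "semiannually"},
        "low":  {"measure_to": "mastery (Kirkpatrick Level 2)",
                 "report_to": "learning team", "frequency": "at close-out"},
    }
    return plans[profile]


if __name__ == "__main__":
    project = Project("Suicide-screening training", True, True, 1200)
    profile = categorize(project)
    print(profile, measurement_plan(profile))
```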

As we rolled out the measurement guidelines, it became clear that my team and I needed upskilling first. Many of our organization’s learning professionals had a first career in a healthcare-related specialty, and some do not have academic preparation in adult education.

We created a system of development workshops, one-on-one coaching, and consulting practice integration to help us develop the skills necessary to be better at measuring project and program outcomes. We made the case for change to our team and provided exemplars of effective measurement plans as well as a library of measurement options for each of Kirkpatrick’s levels. From a leadership perspective, we had everything in place: clear expectations plus the skills and tools needed to execute. We were well on our way.

Measuring success

During the past several years, my team and I have achieved some success. For example, we developed a strong measurement plan for a large suicide-screening training initiative for the nursing staff. Through automated reports from the electronic medical record, we could quickly see that inpatient nurses were performing the suicide screening for more than 95 percent of new admissions.

We also saw that the emergency department was struggling with the new process, which prompted us to further analyze and modify the process for that team.
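
The kind of automated report behind both findings can be imagined along these lines; the field names and sample records below are hypothetical, but the idea is to roll screening completion up by unit and flag anything that falls short of the 95 percent mark.

```python
# Hypothetical sketch of an automated screening-compliance check.
# Field names and sample records are illustrative, not real EMR data.
from collections import defaultdict

admissions = [
    {"unit": "inpatient", "screened": True},
    {"unit": "inpatient", "screened": True},
    {"unit": "emergency", "screened": False},
    {"unit": "emergency", "screened": True},
]

totals = defaultdict(int)
screened = defaultdict(int)
for record in admissions:
    totals[record["unit"]] += 1
    screened[record["unit"]] += record["screened"]

for unit, count in totals.items():
    rate = screened[unit] / count
    flag = "" if rate >= 0.95 else "  <-- below 95% target, investigate"
    print(f"{unit}: {rate:.0%} of admissions screened{flag}")
```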

We began to hear feedback as well. For example, a teenager came into the hospital for ankle surgery and shared their thoughts of self-harm with the nurse during the screening. The nurse was able to get the individual the help and resources they needed. Without that screening, the teen may not have received the necessary support.

Metrics and stories like those show our value in changing behaviors and helping the organization meet its healing mission.

Another success was the annual publication of a learning dashboard. It was largely focused on activity metrics, such as learning hours and formats, but we started publishing outcomes for high-profile projects and programs too. The most recent iteration includes compliance, learner reaction, and a few outcomes for our diversity, equity, and inclusion work as well as for our emerging leader program.

The learning curve

Yet, although my team and I have made great strides, the journey has not been seamless and painless. As a leader, I have failed along the way. But those failures have been opportunities to learn and grow.

For example, I had completely failed to appreciate the learning curve and nuance of the talent analytics space. All the industry articles made talent analytics sound so easy. Business literature could consolidate the process down to four simple steps or five innovative ways, but putting meaningful measurement into practice was much more complicated.

First, my team and I didn’t own the data. Even our engagement survey data was “owned” outside our team, so simply getting to the data involved some work and building bridges.

Second, our systems didn’t talk to each other easily, and merging learning and program data with other people measures has been an ongoing challenge.

Further, I had the stark realization that talent analytics and systems integrations came with an entirely new lexicon and body of knowledge to learn. Topics such as data mining, data visualization, and xAPI integrations were blind spots; I hadn’t appreciated my own ignorance.
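
For anyone facing the same blind spot, an xAPI "statement" is simply a structured record of a learning experience, expressed as actor, verb, and object. The sketch below shows the general shape; the learner, activity identifier, score, and timestamp are invented for illustration.

```python
# A minimal, illustrative xAPI (Experience API) statement.
# The actor, activity ID, score, and timestamp are made up.
import json

statement = {
    "actor": {"name": "Example Nurse", "mbox": "mailto:nurse@example.org"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.org/activities/suicide-screening-training",
        "definition": {"name": {"en-US": "Suicide-Screening Training"}},
    },
    "result": {"success": True, "score": {"scaled": 0.96}},
    "timestamp": "2021-04-15T10:30:00Z",
}

# A learning record store (LRS) would receive this statement as JSON over HTTP.
print(json.dumps(statement, indent=2))
```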

The bandwidth dilemma

Another key failure in our measurement journey was the bandwidth dilemma.

Measurement takes time and effort. So often, individuals are pushed quickly to the next project or deliverable.

I frequently found myself asking, “Is the juice worth the squeeze?” Is pushing on measurement follow-up worth risking the next project timeline or team member engagement?

I vividly recall the look a direct report gave me when I asked for an outcomes report on a program after she’d just listed about a dozen items she was actively working on. That was not my proudest leadership moment.

Even our learning dashboard was a bandwidth drain. Condensing data from multiple sources into an engaging, visual presentation was a tedious manual process.

Working on automating our data integrations and visualizations using business intelligence software took time away from keeping the dashboard updated, and vice versa. Consequently, the learning dashboard that is still posted is a year out of date (although the pandemic takes some of the blame).
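
To give a sense of what that consolidation involves, a sketch like the one below replaces manual copy-and-paste between systems with a single join; the sources, columns, and figures are hypothetical, and the pandas library stands in for whatever business intelligence tooling is available.

```python
# Hypothetical sketch of consolidating dashboard data from two sources.
# Program names, columns, and values are illustrative only.
import pandas as pd

lms_activity = pd.DataFrame({
    "program": ["Emerging Leaders", "DEI Workshop"],
    "learning_hours": [1840, 960],
    "completions": [46, 320],
})

program_outcomes = pd.DataFrame({
    "program": ["Emerging Leaders", "DEI Workshop"],
    "outcome_metric": ["internal promotion rate", "inclusion index"],
    "outcome_value": [0.35, 4.2],
})

# One merge replaces what used to be a tedious manual step.
dashboard = lms_activity.merge(program_outcomes, on="program", how="left")
print(dashboard.to_string(index=False))
```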

Leaders face hard choices, and frequently, the bandwidth dilemma is one of the hardest to navigate.

For me and my team, our measurement failures often fell into three categories: measurement plans were delayed, altered, or forgotten because of bandwidth challenges; they were killed or reshaped by technology or data constraints; or, as we raced to keep up with the pace of change, we were slow to identify meaningful outcomes at the onset of projects.


Lessons learned

As I reflect on my team’s and my continuing measurement journey, I have identified several lessons that, hopefully, other leaders can use in their own measurement efforts.

Talent leaders need to upskill themselves in the discipline of data science. As I mentioned, getting more sophisticated in program measurement is not as easy as “four simple steps.”

Options for your personal upskilling include everything from free vendor-supplied webinars and materials to inexpensive massive open online courses in analytics from such providers as Coursera and edX.

Leading business schools offer more expensive but deeper academic certificate programs in business analytics. And industry associations, including the Association for Talent Development, offer meaningful certificate programs.

Following execution, leaders need to provide space for measurement activities. The bandwidth dilemma is real—plan accordingly.

Ditch the grand, multipronged measurement plans for a realistic plan that focuses on a few key measures. Build measurement activities into project plans, and allot hours for post-implementation measurement where possible.

Watch the cart and the horse. I tried to evolve my team’s measurement and reporting capabilities before I had a fuller understanding of the discipline and before we had the systems and data capabilities to make it easy.

In many ways, I contributed to a level of disdain for higher-level measurement because it was so hard to combine data into real insights. I put the cart before the horse.

Get your systems in place to the greatest extent possible before you place unrealistic expectations on your team.

It’s a journey, not a destination. Our measurement journey has been full of potholes and delays along the way. But we have made progress and learned other valuable lessons.

For example, my team and I now place a higher value on different types of measurement such as anecdotes and stories that add color to the numbers. Don’t forget to pause on your journey and assess how far you’ve come and what you’ve learned.

It’s true—none of us always measure up. There are certainly many things my team and I have to be proud of, and we will continue to reflect and learn from our shortcomings.

For next steps, I will keep looking for ways to improve and to use the lessons I have learned to move the needle forward. I feel no shame in failing while pushing into uncertain territory, unless I repeat mistakes I should already have learned from.

Read more from CTDO magazine: Essential talent development content for C-suite leaders.

About the Author

David Campbell is vice president and chief learning officer at Children’s Health in Dallas, Texas, the leading provider of pediatric healthcare in North Texas. Having spent 10 years as a Registered Nurse, David brings a unique perspective to the healthcare CLO role. He’s been recognized by CLO Magazine, Association for Nursing Professional Development, and ATD for his work to modernize healthcare learning and development. Contact him at [email protected].
