
ATD Blog

Ask a Trainer: Post-Course Evaluations

October 18, 2021


Hi Tim,

I’m a new instructional designer on a newly created training and development team. Over the last several months, as we’ve created new instructor-led and e-learning courses, we’ve started discussing the need to integrate post-course surveys into the training experience.


Like most surveys, ours would aim to gauge our learners’ satisfaction with the course materials, the effectiveness of the facilitator, and the value they found in the content. However, I must admit that when I’ve worked on teams that have done this in the past, we haven’t always been good about aggregating, analyzing, and acting on the results.

We’re wondering if we should require the survey before the learner is marked complete within the LMS. What questions should we ask? Should we use a different strategy?

What are your thoughts on post-course evaluations?


Thanks for sharing your question. It’s an interesting one, because there are a lot of conflicting opinions about the value of post-course evaluations. In Kirkpatrick’s Four Levels of Evaluation, a Level 1 evaluation measures how the learner reacted to the training. As you mentioned in your question, this usually involves delivering a survey that asks learners how they liked the training experience, how effective the facilitator was, how valuable the content is to their jobs, and so on.

That all sounds well-intentioned, right? But, I have to be honest: I believe most post-course evaluations are a waste of time. I know that must sound extreme to some folks, but I promise you it’s not. You see, during more than 10 years working in this industry, I’ve found that most learning organizations do a lousy job of evaluating learner reaction.


The truth is, I think you’re asking yourself the wrong questions. Instead, I think you need to take a step back and re-evaluate what you’re hoping to accomplish with your post-course evaluations in the first place.

Here are three questions I think you should ask:

1. What will I do with the data?

The first question you need to ask yourself is what you will do with the data once you’ve collected it. Unfortunately, we often force our learners to complete surveys and then do nothing with the information we’ve collected. In fact, you admitted in your question that you haven’t always done a good job of analyzing and reacting to the results of your post-course surveys.

So, if we’re collecting data and doing nothing meaningful with it, it raises the question of why we’re forcing our learners to take surveys in the first place. Yes, it may give your learners a sense of “we’re listening” or “your opinions matter,” but those feelings will wear off when they realize none of their feedback is being implemented.

As you think about your post-course evaluation strategy, start by identifying exactly what you hope to accomplish with the data you collect and what actions you will take with it.


2. Are learners qualified to evaluate the things you’re asking?

The next question you need to ask yourself about your post-course evaluation strategy is whether your learners are qualified to evaluate the things you’re asking them to evaluate. Most post-course surveys ask learners to evaluate the effectiveness of the facilitator and the content and whether the experience was a good use of their time.

On the surface, these questions seem to make sense. However, are your learners really in a position to evaluate whether the facilitator did a good job facilitating, or whether the content was effective? Is it possible for your learners to have disliked the facilitator even though that facilitator did a good job facilitating? Or to have hated the content even though it was well designed?

My point in asking all of these questions is that your learners’ satisfaction or dissatisfaction with the facilitator or the learning content doesn’t necessarily reflect how effective that facilitator or content actually was.

And so, as you think about evaluating your learning content, consider whether there are more effective and accurate ways of evaluating course effectiveness. Perhaps that means conducting Level 2 or Level 3 evaluations to measure learner knowledge or on-the-job behavior. It may also mean implementing facilitation standards that your trainers are measured against.

3. Is it too early for learners to evaluate their reactions?

The final question you need to ask yourself about your post-course evaluation strategy is whether it’s too early for your learners to accurately evaluate their reactions. Similar to my previous point, many post-course surveys ask learners to evaluate the value of the learning content and its relevance to their jobs.

Asking these questions seems to make sense, as you want to know whether your learners found the information useful. However, these questions are presented to learners immediately after they’ve received the training. If our goal is to help learners perform better on the job, how can they evaluate whether the content was effective when they haven’t yet had the opportunity to apply it?

And so, as you think about your post-course evaluation strategy, consider the timing of when you survey your learners. Instead of mandating a survey immediately after a course is completed, perhaps you give your learners a few weeks to digest and apply the skills taught before you survey them. This allows them to provide more informed feedback.

The Bottom Line

I know I gave you a lot to think about, and I know much of it sounds very cynical. But I also know your intentions are in the right place with your desire to conduct post-course evaluations, and that’s a good thing! However, it’s important that you take a step back and re-evaluate what you’re hoping to accomplish. Implementing a post-course survey is a waste of time—not just your time, but also your learners’ time—if you’re not asking the right questions or doing anything meaningful with the data collected.

So, I hope this sparks some ideas to help you determine the right strategy for measuring and evaluating the effectiveness of your learning content.

Tim


What other tips do you have for post-course evaluations? Share them by commenting below.


Do you have a learning question you’d like me to tackle? You can email it to [email protected]. Also, make sure to visit the Ask a Trainer Hub to check out all your questions and my answers.
