"We all need people who will give us feedback. That's how we improve"

When Bill Gates tells us we all need feedback, it's hard to argue. We've written before about the importance of customer reviews and the benefits they can bring to your business, from more sales to better customer retention. However, gathering feedback is just one small piece of the puzzle. Afterwards, it's time to start putting it all together and actually listening to your feedback. This post walks through some common feedback methods and performance measures, and shows how you can use them to evaluate and develop your training.


How Feedback Can Be Gathered

In order to listen to feedback, it goes without saying that we have to actively gather it. While we could sit and wait for our learners to come to us, in reality this is unlikely to happen. According to customer service expert Ruby Newell-Legner, a typical business can expect to hear from only 4% of its dissatisfied customers. That means it can take time for an issue to surface, but fortunately, when it does, there are huge benefits in solving it.

According to the White House Office of Consumer Affairs, a happy customer who had a problem and got it resolved will on average tell between 4 and 6 people. That's 4-6 strong brand advocates for your business, and the chances are they'll know people who could benefit from your training. That's more course sales, without any marketing or sales involvement! However, if we're to achieve these goals, we need to gather feedback in a way that allows us to easily evaluate it, communicate it, and change our training for learners.


Online Surveys

When searching for feedback, an online survey is often the best way to get responses. Online surveys empower students to fill in a survey at a time that suits them, and allow them to be honest. Think about things from the learner's perspective: it's entirely natural to be fearful of handing truthful feedback to an instructor in class. Online surveys counteract that by offering anonymity (if selected by the training provider) and less chance of confrontation or fear of judgment. That benefit also extends to instructors, who may have poured their soul into creating content that didn't meet a learner's expectations.

Conducting surveys online also carries the benefit of restricting response options, creating feedback that's easier to act on. Typically, these responses take the form of a basic Likert scale, or feed into a Net Promoter Score (NPS). Combined, it becomes quick and easy to break down the results question by question and get actionable insight.
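To make that concrete, here's a minimal sketch of how structured responses roll up into a single Net Promoter Score. It assumes the standard 0-10 NPS question and thresholds, and isn't tied to any particular survey tool; the example responses are made up.

```python
# Minimal sketch: turning 0-10 "How likely are you to recommend this course?"
# responses into a Net Promoter Score.
# Standard NPS convention: 9-10 = promoter, 7-8 = passive, 0-6 = detractor.

def nps(scores):
    """Return the Net Promoter Score (-100 to +100) for a list of 0-10 ratings."""
    if not scores:
        raise ValueError("No responses to score")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical responses from one course cohort.
responses = [10, 9, 9, 8, 7, 6, 10, 4, 9, 8]
print(nps(responses))  # 5 promoters, 2 detractors over 10 responses -> 30
```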

And of course, online surveys are often better for resource management, both in terms of money and time. That becomes especially true when surveying a large student body. Learn how we integrate with SurveyMonkey to help you tackle training evaluation by clicking here.

Paper Surveys

The phrase "The pen is mightier than the sword", was first recorded by Edward Bulwer-Lytton in 1839. Curiously, that same author also found fame for one of the worst opening sentences to a book in history. It's a saying that has stood the test of time, and in terms of feedback, sometimes interview notes and written responses get the best results.

Wait, what? In this digital age, why pursue paper? The obvious place to start is that ending a class with a written survey catches the learner at the optimum time: their recall of your training should be at its best. Research suggests that students remember as little as 10% of course content one week later, so it stands to reason that they won't recall much about the delivery either. A paper survey is perhaps less convenient for the learner, but it provides an instant snapshot of class opinion, whereas online surveys can easily be abandoned or ignored. We also have to be mindful of the presence of digital immigrants and digital natives in the same room, and gather accurate feedback from both.

Step Summary

Experiment with online and paper surveys, and figure out what works best for you. Are the results much the same, or is there a bias or opinion that only comes up in one medium? Understanding this data will enable you to think more about your learning delivery and maximise learner engagement.

Acceptance and Qualifying

Accept Feedback

Learning to accept that your training has an issue, or that students aren't engaging, is difficult. It's hard not to take it personally when a learner says they don't like you, your training, or the training you oversee! Unfortunately, we all know that some people simply respond better to different people and different methods. That's outside of our control, and it's important to accept it. Fortunately, what you can control is the feedback that you accept. Now that you've collected your feedback, it's time to assess the responses and gather your thoughts.

Key Actions

Work through the points below to reach your conclusion:

  • Listen: Try your best to put yourself in the shoes of the learner. If you were in their place, how would you engage with the training? Perhaps try filming your courses and observing class behaviour. Are the instructors engaging with their classes, or are they losing the crowd? See if there's a pattern in the feedback you're evaluating, or ask respondents to elaborate further. Don't forget, feedback is a gift to be cherished, so make sure your respondents know their time is valued!
  • Reflect: Think about how the feedback reflects on the training you offer. Will students be happy to tell colleagues and friends about it, or will they be moaning about it for weeks to come? Consider how word of mouth amongst learners will impact your brand image and industry reputation, and whether that affects your business's future. Are you missing an opportunity to succeed? Or worse, are you allowing a disaster to unfold?
  • Decide: If you're going to ignore the feedback given, make sure you're clear on the reasons why, and that it's justifiable. To figure out if your decision is valid, consider how the respondent would feel if you explained your decision to not act on their feedback - would they understand, or disagree? If they would disagree, perhaps reflect again and see if there's a marginal improvement you could make. If you decide to act upon the feedback, it's time to build your action plan.

Creation of an Action Plan

Deciding to act on feedback is a key part of the journey, but figuring out how to make changes is often the most difficult part. To do this, let's introduce the ICE scoring system. Attributed to growth specialist Sean Ellis, and increasingly common in startups and the marketing profession, ICE enables management to evaluate the potential of each change and make decisions accordingly. Here's how it works:

  • Impact: What is the potential impact of making this change?
  • Confidence: How confident am I that we'll succeed in making this change?
  • Ease: How easily can we make this change?

Scoring

Scoring follows a 1-10 system (10 for the most impactful/confident/easily implemented option, 1 for the least). Break down each feedback request or proposed solution, and assign each of the three components a score out of 10. Then, add up the total for each request.
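If you track your feedback requests in a spreadsheet, the same arithmetic applies; as a rough sketch, here's how the scoring and the ranking used in the next step might look in code. The change requests and their scores below are purely hypothetical.

```python
# Minimal sketch of ICE scoring: each proposed change gets an Impact,
# Confidence, and Ease score out of 10, and the total decides its priority.
# The requests and scores are hypothetical examples.

changes = [
    {"request": "Record sessions and share them online", "impact": 8, "confidence": 7, "ease": 6},
    {"request": "Shorten the final module", "impact": 5, "confidence": 8, "ease": 9},
    {"request": "Rebuild the course around blended learning", "impact": 9, "confidence": 5, "ease": 3},
]

for change in changes:
    change["ice_total"] = change["impact"] + change["confidence"] + change["ease"]

# Work through the list from the highest total to the lowest.
for change in sorted(changes, key=lambda c: c["ice_total"], reverse=True):
    print(f'{change["ice_total"]:>2}  {change["request"]}')
```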

Implement

From the highest number to the lowest, work through your list of changes. This should help your newly evolved training hit the ground running, and help you identify the most effective changes. Not only does this show learners that you value their feedback and will change your training, but it empowers your team with quick wins from the start.

Why do we need quick wins? As with all new habits and changes, the early steps can be the hardest. That's why having a process that delivers early success is so vital to effective process change. Not only this, but you'll also see what your particular learners value most. For example, while many blogs and training journals advocate a blended learning approach, perhaps your cohort prefers a particular medium and isn't being heard?

Review, Refine, Retest

Review

As with all positive changes, waiting to see the difference is often the most frustrating part. That's why it's important to regularly review, refine, and retest your changes across different cohorts. Ask yourself the following questions when evaluating your training and instructors:

  • Looking back at the last round of feedback, have I made changes with a positive impact, in line with my learners' requests? If not, why not?
  • Having made changes, how can I enhance them further? For example, you may have started uploading your content online, but maybe you should also start a course Facebook group to connect with your learners?
  • When I retest the learners, what key success indicators should I be looking for? What actions can I take to improve instructor performance, and learner satisfaction?

Self Evaluation

Finally, it's time to evaluate yourself, and how you personally respond to feedback as a manager or leader. That could be feedback on your performance, or feedback on the training you provide. Whether you're hands-on with training or not, it's important that you too get the chance to improve. It can be hard to do, but being mindful of your own performance is a key part of modern CPD. Doing so will enable you to benchmark your standards, identify your weaknesses, and move beyond them.

Want to get a snapshot of your training right now? Download and complete our training scorecard, and find out what level of success your training delivers today.

Download Your Training Scorecard
