Most “end of program” feedback surveys for online courses focus on one thing only: testimonials.
But as an impact-driven entrepreneur, you care about more than just testimonials.
You want to collect honest feedback on your course so you can boost completion rates to 80-90% and make sure your students are actually getting the results you promised them.
Typical surveys focus more on testimonials than actual feedback.
A typical feedback survey in the online business space includes questions like this:
- Which problems did you struggle with before joining [COURSE]?
- What was hard or frustrating about those problems?
- How did [COURSE] help you solve those problems?
- What results did you get from [COURSE]?
- How is your life or business different now because of [COURSE]?
- What concerns did you have before joining [COURSE]?
- What would you say to someone with those same concerns who is considering joining [COURSE]?
- What was your favorite part of [COURSE]?
- How can we improve [COURSE]?
Take a close look at the survey questions above. What do you notice?
I notice that 90% of the questions focus on collecting testimonials… And there’s only ONE question that helps you get ideas for improving the course. And that question is quite vague as well, so you likely won’t get AMAZING responses from it.
It’s as if the feedback question is there just because it has to be there… It’s an afterthought.
And sadly, most entrepreneurs straight up ignore the answers to this question because the answers are often hard to read, and they much prefer hearing glowing testimonials they can use in their sales copy.
Sure, you care about the testimonials from your students so you can prove that your online course works. But more than that, you care about the success of your students.
You want to create the best possible course on the topic in your industry, and to do that, you know you need to collect feedback from your paying customers to improve it.
I too have been guilty of sending out testimonial surveys instead of true feedback surveys to my audience. That’s what I was taught I “should” do, so I just went along with it.
But a few years ago, when I realized just how important it was for me to create online courses that actually deliver on the promises they make, I knew I needed to find a better way to write feedback surveys.
There are no great examples of feedback surveys out there :(
I searched the internet for feedback and “program evaluation” surveys, but most of the questions in those survey examples seemed to be along the lines of:
- “How would you rate the course overall?”
- “How useful were the course materials?”
- “How clearly did the instructor explain the course material?”
I imagined the kind of answers and data these questions would get me, and I wasn’t satisfied.
Even if someone says “the course materials were useful” or “the instructor didn’t explain the course materials well”, how does that help me improve the course? I wouldn’t be much closer to knowing what specifically isn’t working, or how to improve it.
That’s when I decided to take things into my own hands.
I experimented with different survey questions that focused on getting qualitative feedback from my readers that would help me improve my courses, rather than just fishing for testimonials.
Today, I’ll share some of my ideas, findings, survey questions and examples with you.
Introducing Impact-Driven Feedback Surveys
I recently finished my first round of a copywriting coaching program called Copywriting Genius, which had a 66% completion rate. Not a bad start, but not where I want it to be either.
Since this was the first time I ran a group copywriting coaching program, I treated it as a “BETA” program, and experimented with different coaching call formats.
On some calls, we did more live copywriting drills, on others we did more live “hot seats”, and on others I walked students through some of my own copy live on the call.
We also had different features in the program:
- Live training calls (with live copywriting drills / missions, followed by hot seats)
- Additional Copy Missions (PDFs that students could work through by themselves)
- A 60-minute 1on1 copy coaching call
- Unlimited video copy critiques via email
- A Copy Critique vault with all the recorded copy critiques
- Live coaching call recordings
I had a hunch about which of these worked well for my students, which of them they actually used, and which of them they found helpful, but relying on hunches alone is not enough to create the best possible online course.
I wanted to get cold, hard data, and hear from my students what worked for them. This would help me keep the features that worked, and remove or change ones that didn’t.
I also taught my students a variety of different topics through the weekly live coaching calls:
- Week 1: Rapid Research
- Week 2: Twisting The Knife
- Week 3: Painting The Dream
- Week 4: Remarkable Content
- Week 5: Less is More
- Week 6: Flow
- Week 7: Captivating Leads
- Week 8: Writing Voice
Again, I had a hunch which of these my students found helpful, but I wanted to hear it from them.
This would help me replace the topics that weren’t as effective for my students with better, more useful topics in the future.
I threw in a few more feedback questions, and finally, I wrote the survey in a tone that would help me collect honest, not sugar-coated feedback (more on that in a sec).
Copywriting Genius Feedback Survey Example
I ended up creating this survey:
Let’s walk through it step by step.
Survey Introduction: A call for honest feedback
One problem I noticed with trying to get honest feedback is that many students go into a “testimonial mode” when they fill out surveys for online programs.
They unconsciously and unintentionally say a lot of good things about the programs, share all the amazing results they achieved with the program, and focus on the positives, instead of the negatives.
That’s because they want to look good (some even want to be seen as great testimonials, especially in the online business space), and they’re afraid of offending the online course creator.
The problem? This kind of sugar-coated feedback isn’t all that useful for actually improving your online course.
Therefore, I decided to share some guidelines for my students as they filled out the survey. I told them to share their honest feedback and to not worry about hurting my feelings.
I also told them to be honest about which results the course was really responsible for.
I find it inauthentic to say that a course was responsible for a $50,000 product launch when the person who went through it also went through 3 other courses and had a business coach at the same time.
Video Testimonial Ideas
In the survey introduction, I also included the video testimonial instructions to make it easy for my students to record video testimonials:
I included these instructions because I noticed that my students often wanted to record video testimonials for me, but always asked me what they should talk about in them.
Notice how I framed these as suggestions, not rules. I gave my students a few ideas in case they didn’t know where to start, but didn’t want to make them feel obliged to follow them.
The Feedback Matrix: How to see what works and what doesn’t
Next, I created three different Feedback Matrixes.
First, I asked my students which parts of the program they actually used.
This helped me get data on the features my students actually used (so I could make them a core part of my program), and the features that sounded good to me but that students didn’t end up using.
Next, I asked them how helpful they found specific features.
This helped me find any potential disconnects in my program:
- If many students used the features, but few found them helpful, it meant I needed to improve these features to make them more helpful.
- If few students used the features, but the ones who did found them very helpful, it meant I needed to better communicate why these features are so useful, or make it easier for my students to use them.
- If few students used the features, and few found them helpful, it meant I could cut those features out of the program in the next iteration.
- If many students used the features, and many found them helpful, I would keep those features in the program in the future.
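The quadrant logic above can be sketched in a few lines of code. This is just an illustrative sketch; the feature names, scores, and the 50% threshold below are made-up assumptions, not my actual survey data:

```python
# Classify each program feature by two scores: the fraction of students
# who used it, and the fraction who found it helpful (both 0.0-1.0).
# The threshold and sample data are hypothetical, for illustration only.

def classify_feature(used: float, helpful: float, threshold: float = 0.5) -> str:
    """Map a (usage, helpfulness) pair onto one of the four quadrants."""
    if used >= threshold and helpful >= threshold:
        return "keep"     # widely used and widely helpful
    if used >= threshold:
        return "improve"  # used a lot, but not helpful enough
    if helpful >= threshold:
        return "promote"  # helpful, but students aren't using it
    return "cut"          # neither used nor found helpful

# Hypothetical survey results
features = {
    "Live training calls":  (0.9, 0.8),
    "Copy Missions (PDFs)": (0.3, 0.7),
    "Copy Critique vault":  (0.2, 0.2),
}

for name, (used, helpful) in features.items():
    print(f"{name}: {classify_feature(used, helpful)}")
```

With real survey data, a table like this makes the keep / improve / promote / cut decision for each feature almost automatic.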
I then asked the same question, but targeted it at the lessons I taught during the weekly live trainings:
This way, I could see which lessons my students found helpful (I would keep these in the next iteration of the program) and which could be more helpful (I will replace these in the next iteration).
What I love about the feedback matrixes is that they’re super easy to fill out as a student, give you really granular quantitative feedback as a course creator, and help you dissect which parts of your program are working and which aren’t.
Next, I asked my students how likely they are to recommend this program to a friend:
This would help me measure the overall satisfaction with the program. If the program is as good as I want it to be, most of the students will say “very likely”. If not, I have work to do.
Now I’ll admit that this question isn’t optimal. The optimal question to test the effectiveness of the program would be to ask if they already recommended this program to a friend. It would give me more accurate, honest data.
The reason I chose not to use this question with this iteration of the program is that I’ve only launched it once, on a small scale. These were my first students in the program. It wouldn’t make sense for them to recommend it to their friends, because even if they did, their friends would have no option of joining it.
I noticed that people typically recommend programs to their friends as they’re being launched. This means I could get better data through this question by following up with my original students 3 or 6 months after they’ve gone through the program (as that’s when they’d actually have a chance of recommending it to others).
Next, I did use a handful of testimonial questions in this survey, as I did want to collect testimonials for my upcoming launch of the program.
First, I asked my students what their favorite part of the program was:
I could use some of their responses in my sales copy, but I could also use this as an extra data point to know what’s working in the program.
Next, I asked my students about their tangible and intangible results with my program:
These would make for great testimonials, but they’d also help me measure the results that my students are getting, and benchmark them against the promises I make in the copy (and then tweak these promises accordingly).
My final research question was an open-ended question asking my students what they would improve about the program:
This would help me come up with new ideas on how to improve the program, as well as highlight any additional weak points in it.
Because I strive to be as transparent and authentic as possible in my business, I asked my students if they’re ok with me using their survey responses in my copy:
If they responded with yes, I’d use some of their quotes in my copy in the future. If they responded with not sure, I’d draft up the copy and send it to them for review. If they responded with no, I’d keep their responses for internal research purposes only.
I then asked my students to provide all the necessary materials for the testimonials (their name, title, website, headshot…). This would save us a lot of time back and forth down the line.
I also asked them for their email to get in touch with them if necessary (although I do already have their emails on file since they joined the program, so this question isn’t that necessary).
Finally, I gave my students an option to share a video testimonial with me in case they wanted to.
Notice how I make it really clear that this is completely optional.
This also gives me another interesting data point of overall satisfaction with the program – if a lot of my students send me testimonial videos, I know I’m going in the right direction.
Ideas for improvements
In this post, I wanted to share my survey with you and the process behind writing it, to give you some ideas on how you can create your own feedback surveys.
As I actually wrote the post, though, I reflected on the survey and found quite a few ways to improve it.
Because this is a fairly unexplored field, I’ll continue testing different surveys and survey questions until I develop a feedback survey that really works for me. I know I won’t wake up and write the perfect survey, but I can iterate my way into it, and make every survey better than my last one.
One of the drawbacks I instantly saw with the survey is that it might take longer to complete than I thought it would.
While my survey software said it should take about 8 minutes to complete, and my estimation was that it would take my students 5-10 minutes to complete it, the actual time might be longer. One of my students spent a whole 60 minutes filling it out!
This might mean that I receive less feedback than I’d like (though the feedback I do get will be deep).
To account for this, I can do a few things in the future:
- Split the feedback and testimonial surveys (more on that below)
- Make it an expectation for students to fill out a feedback survey when they join the BETA program (I did that, so I could refer back to it)
- Set better expectations, and tell my students up-front that this survey might take them 30 minutes or more to complete
- Set a deadline to fill out a survey (this worked well for me in the past)
- Have my assistant follow up with students who don’t fill out the survey in time and gently remind them to do it (this worked well in the past)
For context, this is the email I sent out to my students to ask them to fill out the survey:
I could work some of the ideas / messaging into the email above.
The BIG solution: Create separate surveys for feedback and testimonials
Looking back at my survey, it feels like I tried to accomplish two things at once instead of one.
I tried to collect feedback AND testimonials with a single survey, which makes it a bit messy.
If I cut out the parts of the survey where I asked for testimonial-related questions, I could either make the survey shorter (easier to fill out), or add a few additional questions to collect even deeper research.
I’m definitely experimenting with that the next time I write a feedback survey.
Additional feedback question ideas
In terms of additional questions to ask, here are some examples that come to mind:
- What did you expect to get out of the program when you signed up for it?
- Where did the program fall short of your expectations?
- Where did the program exceed your expectations?
- Which features fell short of your expectations?
- Which features exceeded your expectations?
- Which parts of the program did you put a lot of effort and time into, but didn’t seem to produce the results you wanted?
(I’d love to get more ideas from you – please do share them in the comments below!)
Surveys are the beginning of the research process, not the end
Surveys are a great way to collect feedback from your online course students, especially if you have hundreds of students in your courses.
They can give you some great quantitative data, and hint at some deeper insights.
The drawback of the surveys is that they typically don’t help you uncover the deepest insights for WHY certain features didn’t work as well as you wanted them to, or how to improve them.
To make up for that, I layer a few additional research methods on top of the surveys:
- Follow-Up Emails: Whenever you see an interesting response in the survey, reach out to the student with follow-up questions via email, so they can elaborate on their thoughts.
- Feedback Calls: Schedule 1on1 calls with your students to dive deeper into their survey responses.
- Live Feedback: At the end of live trainings, ask your students what they liked about them, and what they’d like to see more of in the future.
I try to get on a feedback call with at least 5-10 students every time I run an online program, especially if it’s in an early development stage.
Through those calls, I uncover the deeper insights that help me improve my programs. The surveys I use are only the beginning, not the end.
Let’s build an impact-driven business – together.
I’m on a mission to connect impact-driven online entrepreneurs and build a better online course industry.
An industry that focuses on the success and results of your students first – rather than just sales conversions, ROI and maximizing revenue.
If you’re an impact-driven online entrepreneur, join the Impact Over Revenue movement by signing up to my email list through the box below.
You’ll receive more useful techniques for building an impact-driven online business, as well as My Ultimate Guide to Starting a 6-Figure Online Business where I show you what it REALLY takes to build an online business in 2020 (the RIGHT way).
What about you? Which questions do you use in your feedback surveys to get good data for improving your online courses?