Companies everywhere use surveys to measure customer satisfaction. But are their surveys good? When thinking about how to improve survey results, the usual path a company goes down is this.
First, a company buys a license to a platform like Qualtrics, Medallia, or SurveyMonkey.
Then they Google for examples of survey questions and brainstorm more with their colleagues. Finally, they load the questions into the platform and release their survey into the world.
Most of these surveys are poor, and companies don’t know how to improve their survey results. As quality guru W. Edwards Deming said, “If you do not know how to ask the right question, you discover nothing.”
So, what’s missing?
- A Persuasive Invite
- A Weighting Factor
- A Critical Quality Review
Do these three things and you will improve your survey results dramatically. What constitutes dramatic improvement?
The data generated by your survey will provide a truly accurate view of the customer experience. And the data will be actionable. You’ll gather clear facts enabling you to make decisive plans and gain consensus around specific next steps.
1: A Much-Overlooked Step, the Persuasive Invite
The only way to get a fair representation of your customers is to get them to take your survey.
That means how you invite your customers to take your survey matters. After all, if your customers aren’t interested in your request, they won’t respond. Period. End of story.
For example, here’s an email that begs for a low response rate:
Bank of America makes a few significant blunders. First, screaming at me in all-caps is impersonal, rote, and cold. It says, “we did a mail merge and didn’t bother to correct the list.”
Next, what about my call experience with ASHLEY does Bank of America want to know? How long I waited on hold? Whether she resolved my issue?
Since it is unclear what Bank of America wants me to rate, it’s hard to answer the question. Anything that is even mildly difficult causes customers to bail. And why is the word “experience” used anyway? It fattens the sentence for no good reason.
And then, there is the issue of including the first survey question. We do it all the time—but only for short surveys. This survey was anything but short. So that one question up front feels like false advertising. Don’t do that. For top response and completion rates, be transparent with your ask.
Compare this to a better survey invite, one that consistently achieves a greater than 28% response rate:
This email invite garners high response because it calls on five proven principles.
- The customer has a clear reason to respond (they want to close out the ticket).
- Linking to the survey twice emphasizes what the customer is being asked to do. Usability studies show this heightens response. Adding action words to the command (“click here”) reinforces what you want customers to do and boosts response even further.
- Several of Robert Cialdini’s persuasion principles work well in email invites, for example, appeals to social instincts and authority. Here, the Cialdini principle at work is the word “because.” It’s one of those words that are proven to influence action.
- Next, as direct marketers have done for decades (because it works), the invite ends with a strong P.S.
- And finally, a disarming approach (the good, the bad…) lets customers know their input is genuinely valued.
How can you use this to improve your survey results? Treat the invite as part of your survey strategy, not an afterthought, and you’ll be on the right track.
Remember, in general, the more customers you hear from, the more representative your data will be.
2: A Weighting Factor
When it comes to the customer experience, not all aspects are equal. Some matter more than others.
For example, sometimes what the customer values most is getting an answer to their question quickly. In other situations, the customer may prioritize getting tips along with a thorough explanation. Or perhaps courtesies upstage everything else. That’s why, for your satisfaction score to be accurate, it needs a weighting factor.
There are two ways to establish a weighting factor.
First, you can do a correlation study to determine what aspects of the experience tend to have the greatest impact on your satisfaction scores. Or, as in this example, you can ask your customers to show their priorities.
If you learn that delivery has a 3x outsize impact on satisfaction, then use a multiple of 3 for aspects of your customer experience that involve delivery.
3: A Critical Quality Review
I’ve written about this here, so I’ll keep this brief.
Executives tend to look for what’s working well rather than what’s missing. This is true for almost everything, including customer surveys!
To improve your survey results, make sure that your questions are written to encourage customers to give their honest feedback. Don’t lead them to the answer that you want to hear!
For example, don’t ask, “How satisfied are you with your call experience with ASHLEY?”
This assumes the customer is at least somewhat satisfied.
Instead, ask your customers to rate Ashley’s expertise. Or ask about whether she resolved the problem.
Survey questions should be objective and specific. It’s all too easy to be vague and leading—that’s why you need an outsider to check each question for biases and generalities. That could be a Customer Experience Consultant, someone from another department, or even a panel of your own customers.
Are you thinking about how to improve survey results? Start with these three points. They will dramatically improve your results and make your surveys well worth the effort.
At Interaction Metrics, we love to write actionable surveys that get to the heart of the customer experience. Armed with objective data and critical insights, we help our clients increase customer loyalty.
Interested? Set a time to talk with Martha Brooke, Chief CX Consultant, here.