Last Updated: May 14, 2025
Survey Overload
I know you get asked to take customer feedback surveys all the time, because I do. Even the shortest business trip results in at least 5 surveys: Delta wants to know about your flight; Hilton wants to know about your stay; Enterprise asks about your car rental; and on and on.
But the most prevalent of all surveys is that one at the bottom of your sales receipt, the request from Apple, Kohl’s, Nordstrom, Target and virtually all retailers to “tell us how we did.”
What We Did

We thought it would be interesting to know what level of science and engagement the nation’s largest retailers bring to their surveys. The surveys say they want to know about our experiences as a customer, but do they really want to know? Or, is this just PR spin?
What We Found
Well, friends, unfortunately… it’s PR spin. The nation’s largest retailers run tragically poor customer feedback surveys. They’re bad for customers, bad for companies, and a waste of time and money all the way around.
We examined surveys for 51 of the largest US retailers, finding that retailers like Nordstrom and Wal-Mart waste customers’ time—and their own—with critically flawed satisfaction surveys.
Why it Still Matters Today
While this research was conducted in 2016, the problems it exposed are even more relevant today. The study uncovered how point-of-purchase surveys, like those you get after buying a phone charger or a pair of jeans, often fail to capture anything meaningful.
And here’s the kicker: most customer surveys fail to meet basic design standards.
- One study found that 94.6% of 37 surveys had at least one violation of established best practices.
- Another review of 20 surveys concluded that only 15% were high quality—the rest needed major improvements or weren’t recommended at all.
So what are these big retailers, like Amazon, Apple, Wal-Mart, Kohl’s, and Target, doing wrong? Are there lessons that can be learned from their mistakes? And how can you make your survey better than some of the biggest companies in the world?
Since the nation’s largest retailers are producing shockingly flawed surveys, the findings from this study should be considered by all companies that use customer satisfaction surveys.
Two Big Problems
The problems weren’t subtle. Two issues showed up again and again:
- Bias. Most surveys were loaded with subtle (and not-so-subtle) bias, leading to skewed results.
- Disrespect. Surveys didn’t seem to care about the customer’s actual experience—or their time.
Based on an objective evaluation of 15 survey elements, the surveys scored an average of 43, a clear F grade.
What’s Wrong With Bias?
Let’s look at the problem of bias. There were five types of biases in these surveys, each negatively affecting data accuracy in different ways.
#1. Leading Questions
Known within psychology as priming, leading questions are designed to elicit a particular response. Ace Hardware asked: “How satisfied were you with the speed of our checkout?” This question is phrased in a way that assumes the customer is at least somewhat satisfied. In our study, we found that 32% of all questions were leading!
#2. Forced Wording
The Gap asked customers: “Rate your agreement with the following statement: The look and feel of the store environment was very appealing.” “Appealing” is a weird word. Customers are more likely to think “it’s a mess,” “that was fun,” or “it’s well-organized.” Furthermore, the question would seem to have an agenda behind it—as in Gap executives want to hear that their store environment was very appealing.
#3. Faulty Scales
Wal-Mart asked its questions on a 1-10 scale. This scale introduces two problems:
- First, there is an even number of selections and therefore no true midpoint: selecting a 5 would imply a lower-than-neutral score, while selecting a 6 would imply a higher-than-neutral score.
- Second, there is no zero, and some experiences are just that: zeroes, not sort of poor, plain old bad.
#4. Double-Barreled Questions
This is where one question asks about multiple topics, usually two questions compressed into one. Lowe’s asked customers: “Were you consistently greeted and acknowledged in a genuine and friendly manner by Lowe’s associates throughout the store?” Here, we see four questions in one. Yikes! To improve, Lowe’s should divide this question into four, or better yet, consider what they really want to know and devise a clearer way to ask it.
#5. Question Relevance
Ace, Gap, JC Penney, and O’Reilly Automotive all asked about their associates’ product knowledge (e.g., “Please rate your satisfaction with the employee’s knowledge of parts and products”)—and none of these retailers offered an N/A option. It’s likely that a large portion of shoppers never asked an associate a question and so would have no way of providing accurate feedback.
What Else We Found
On top of the myriad data accuracy issues, our Point-of-Purchase Survey Study showed that retailers have little regard for their customers.
For example, Wal-Mart asked 4 introductory questions irrelevant to the customer’s experience, and required the input of 2 receipt codes. Really? That’s a hassle.
But the biggest, most consistent engagement mistake? Many of the customer feedback surveys were just too long—the average length was 23 questions. A survey should certainly never take longer than the interaction itself; in fact, it should take less time.
Family Dollar asked a whopping 69 questions in their survey—at 10 seconds a question, that’s over ten minutes spent reflecting on items that cost a buck.
On the other hand, 7-Eleven had the best survey—it was 13 questions, none of which were leading or used biased wording.
Designing a quality customer feedback survey is a process, requiring multiple edits to reach the best version. Throwing in every question is how NOT to design a survey. Think about what you want to know, and carefully craft your questions.
Set Clear Expectations
It’s also important to set expectations at the outset, communicating how long the survey will take, and then meeting that expectation. Nordstrom advertised their survey as 2 minutes, but with 25 questions it took closer to 5 minutes.
Most retailers didn’t provide any estimate of survey length, and instead simply let their customers click into the abyss.
3 Tips For Better Surveys
To execute a customer feedback survey that’s better than just about every major retailer, get serious about accuracy and engagement:
- Ensure your survey collects accurate and actionable data. Eliminate biases such as leading questions, forced wording, and faulty scales.
- Make every question clear and relevant to the customer.
- Show the customer that you respect and value their time by designing a survey that only asks what’s necessary and that states at the outset how long it will take.
If you follow even a few of the guidelines we’ve provided here, your survey will be leagues ahead of the biggest companies in the world.
The retailers selected for this study were the National Retail Federation’s (NRF) largest retailers, omitting supermarkets and membership stores. The surveys were collected between June 23 and July 27, 2016.
We Can Help!
For additional hints about how to improve the quality of your customer feedback, get our Genius Tips. And if you’re interested in learning more about this first-of-its-kind Point-of-Purchase Survey Study, ask us for the complete report.
If your company is relying on outdated or biased surveys, it’s time to rethink your approach. Want a second opinion? We’re happy to review your survey—or help you build one that gets real results. Get in touch!