Listening to your customers is smart, as long as you ask thoughtful questions that elicit useful insights. Unfortunately, too many companies go through the motions, conducting customer surveys just to say they did. That’s a big, BIG blunder. To avoid it, customer experience teams need to establish clear goals and put on their research hats. This means tackling customer experience with a research mindset.

Surveys designed without a research mindset are a waste of time and energy for businesses and their customers.

What’s more, when a company fails to approach its survey with a research mindset, it generates junk data, and the survey can even tarnish its brand image. Consider, for example, the latest survey I received from Lively, Inc.

Lively, Inc.’s Dead-in-the-Water Survey

I recently signed up with Lively, Inc., a San Francisco-based startup that provides Health Savings Accounts (HSAs). I created my account but had questions about how to fund it, so I decided to check with my accountant first.

A few weeks later, even though I still hadn’t funded my account or used any other features on the company’s website, I received an email survey asking how likely I would be to recommend Lively to a friend.

How could I answer this question when I hadn’t used their service yet? Did the Lively team think it was important to determine how likely customers are to recommend them based solely on the experience of setting up an account?

Here’s what I suspect: Lively didn’t have a goal with their survey, and they weren’t really trying to learn anything from it. Instead, they assumed that interacting with customers via a survey WAS the end goal.

Bad Surveys Are Common

To be fair, Lively’s co-founders Alex Cyriac and Shobin Uralil (now CEO and COO, respectively) don’t claim to be survey experts. Cyriac’s background includes a stint at a payroll and compliance company, while Uralil worked in commercial energy efficiency. Because they are not researchers (and most CEOs are not), they probably don’t realize that sending half-baked surveys is a mistake. And they don’t know what they don’t know: that a well-designed survey can be a source of learning that yields significant business insight.

Lively is not alone in sending “check-the-box, we’re-done” surveys that aren’t driven by research. GoDaddy, the goliath web-hosting company, recently sent me a survey asking how likely I was to recommend a trademark service that it might offer sometime in the future. How could I possibly know whether I’d recommend a service that doesn’t exist yet?

It’s worth repeating: sending a survey is NOT the end goal. The goal of a survey is to learn.

What Lively Could Learn

For example, Lively could learn why customers open accounts but don’t fund them. Their hypotheses about why this happens might include that customers are:

  1. Confused about how to use the service
  2. Comparing Lively to other options before depositing money
  3. Finding superior options through their employer, a bank, or another company

These are all possibilities worth testing—but that requires a research mindset, so let’s look at that.

Qualities of the Research Mindset

You’ll know you have a research mindset when you: have a goal with a hypothesis, match methods to the investigation of that hypothesis, and confirm or deny that hypothesis through measurement.

For example:

  • If Lively’s goal is to learn about the web interface and the experience of signing up, they should ask about that.
  • If they want to learn why customers don’t fund their account, they should ask about that.

Once they know what the goal of their survey is, they should hypothesize about potential answers to those questions. Then, they should design a 2-3 question survey combining simple rating questions with an open-ended question.

The open-ended question enables them to uncover unknown themes from the verbatims. They could apply intelligent coding (possibly driven by AI, although probably driven by analysts) to measure those emergent themes. This would give Lively a complete, measured view that addresses a goal and provides useful business insight.
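As a minimal sketch of what analyst-driven coding of verbatims can look like, here is a keyword-matching approach in Python. The codebook below is purely illustrative (it is not Lively’s actual codebook); in practice, analysts derive the themes from reading the responses first.

```python
from collections import Counter

# Hypothetical codebook: emergent themes mapped to indicative keywords.
# Analysts would refine these after reviewing a sample of verbatims.
CODEBOOK = {
    "confusion": ["confusing", "unclear", "how do i", "don't understand"],
    "comparison": ["comparing", "other options", "shopping around"],
    "employer_plan": ["employer", "through work", "my bank"],
}

def code_verbatims(responses):
    """Count how many open-ended responses touch on each theme."""
    counts = Counter()
    for text in responses:
        lowered = text.lower()
        for theme, keywords in CODEBOOK.items():
            # A response counts once per theme, even if several keywords match.
            if any(kw in lowered for kw in keywords):
                counts[theme] += 1
    return counts

responses = [
    "The funding process is confusing.",
    "Still comparing Lively to other options.",
    "I get an HSA through work already.",
]
print(code_verbatims(responses))
```

Even a simple tally like this turns free-text answers into measurable theme frequencies that can be tracked survey over survey; more sophisticated (AI-assisted) coding follows the same basic shape.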

What Lively got right is that they measured something, namely Net Promoter Score (NPS). What they got wrong is that the measurement wasn’t relevant or interesting at that specific point in the customer experience.
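For context, the NPS arithmetic itself is simple: respondents rating 9–10 are promoters, 0–6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A quick sketch (the sample scores are made up):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

scores = [10, 9, 8, 7, 6, 3]  # hypothetical responses on the 0-10 scale
print(nps(scores))  # 2 promoters, 2 detractors out of 6 -> 0.0
```

The math is the easy part; the hard part, as the Lively example shows, is asking the question at a moment when customers actually have an experience to rate.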

More about Your Research Mindset

Here are a few tips for how to adopt a research mindset.

  • Formulate thoughtful hypotheses: Once you have established your goals, create hypotheses that could be disproven by the information you gather. To be able to disprove something requires you to be very clear about what you are testing and requires thoughtful planning.
  • Use the right tool for the job: Keep in mind that surveys are an important and versatile feedback tool, but they are not always the best method. For example, if a company is interested in what services it could add to increase customer retention, interviews or focus groups may be the best methods.
  • Avoid bias: Regardless of the method you use, avoid leading or biased questions. Your job is to elicit accurate data that you can use to grow your company. Even the Net Promoter Score (NPS) question has some bias: it assumes customers are at least somewhat likely to recommend.
  • Target different customer groups differently: Your customers are not homogeneous and can be at vastly different stages of their experience with your company. With that in mind, tailor your questions to target customer segments (e.g., new customers, committed customers, former customers).

Customer-centric companies often emerge as market leaders, so the payoff is there. Achieving customer centricity requires intellectual curiosity and a commitment to asking important questions in worthwhile ways.

Lively, find a better question for your new customers. And for all companies: Let’s make the world better through better customer listening!


Written by the analysts at Interaction Metrics, we highlight the latest developments in the fast-changing world of CX.