3 reasons customer satisfaction surveys fail

Customer satisfaction surveys don’t always work. Perhaps you have first-hand experience with a survey that failed and, looking back, still don’t know why. But rest assured, you are not alone. We run into many people who have run surveys before and been unsatisfied with the results. They feel like they did all of the right things. Many have:

  • Spent a ton of time crafting specific questions based on what they wanted to know about their clients/prospects.
    • They may have searched Google to find ‘expert’ survey questions.
    • They may have used a research firm or consultant to help them create their questions.
  • Used cutting-edge technology to create and send the survey.

But something was obviously missing because after the survey results came back, they were left feeling like all of their effort/time/resources were wasted. What happened?

In our experience, there are three major reasons customer satisfaction surveys fail.  Typically the sender:

  • Didn’t have a significant number of people respond;
  • Had people respond but didn’t get the information they hoped for; and
  • Received great data but didn’t know how to create actionable steps from it.

For each of these issues, there are many variables that could have caused the survey to fail. Today, we will pull back the curtain on some of the most common.

Reason 1 – Survey Response Rate Was Low

If your survey went out by email (this is what we see most often), typically there are four areas to analyze for issues.

  • The subject line for your mailing.
  • The number of messages sent.
  • The motivation you provide for people to respond (the email’s content).
  • The technology you used to send out the customer satisfaction survey.

Subject line:

Often the most unloved, yet powerful portion of an email communication is the subject line.  You simply must spend time on it.  And if your mailing list is large enough (5,000 recipients or so), you MUST use split testing on the subject line.

We’ve seen open rates increase by more than 50 per cent based entirely on changing a few words in the subject line. Imagine how many more surveys will be completed when 50 per cent more people open the mailing.
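As a rough sketch of how a subject-line split test is evaluated, here is the arithmetic behind an open-rate "lift" figure. The counts below are purely illustrative, not real campaign data:

```python
# Hypothetical subject-line split test: compare open rates of two variants.
# The send and open counts below are illustrative, not real campaign data.

def open_rate(opens, sent):
    """Fraction of delivered messages that were opened."""
    return opens / sent

variant_a = {"sent": 2500, "opens": 300}   # original subject line
variant_b = {"sent": 2500, "opens": 465}   # reworded subject line

rate_a = open_rate(variant_a["opens"], variant_a["sent"])
rate_b = open_rate(variant_b["opens"], variant_b["sent"])

# Relative improvement of B over A -- the "lift" a split test reports.
lift = (rate_b - rate_a) / rate_a

print(f"A: {rate_a:.1%}, B: {rate_b:.1%}, lift: {lift:.0%}")
```

In this made-up example, the reworded subject line lifts opens from 12 per cent to 18.6 per cent, a 55 per cent relative improvement – the kind of gain the paragraph above describes.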

The number of messages sent:

The average professional receives about 100 emails per day. How easy is it for them to miss your message (even if they are interested in looking at it)?  When it comes to sending out a survey via email, a single send will never do.

A reminder email should always be sent several days later to the people who have not filled out the survey. This will dramatically increase your recipients’ likelihood of seeing and engaging with your email.
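In practice, the reminder list is simply the original send list minus everyone who has already responded. A minimal sketch, with made-up addresses:

```python
# Sketch of building a reminder list: everyone on the original send list
# who has not yet submitted the survey. Addresses are illustrative.

original_send = ["ana@example.com", "ben@example.com", "cho@example.com"]
responded = {"ben@example.com"}  # tracked by your survey tool

# Preserve the original send order while dropping responders.
reminder_list = [addr for addr in original_send if addr not in responded]

print(reminder_list)
```

Most email/survey platforms handle this filtering for you; the point is that only non-responders should receive the follow-up, so nobody who already took the survey gets nagged.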

Motivation (the email’s content):

Providing substantial motivation in your email’s content is paramount to a good response rate. Make sure survey recipients realize how valuable their feedback is, and how it will help your company become an even better resource for them.

Let them know that you will follow up with them (BUT ONLY if you truly will) if they have questions or concerns.

Always, always, always use a nice-looking Call to Action button in your email, coded in HTML (don’t use an image). A larger button is better since so many people will likely open the email on their phone. Speaking of which…
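First, though, a rough illustration of such a button: an inline-styled HTML link rather than an image, so it renders even when images are blocked. The URL, colors, and sizes below are placeholders, not anything from CustomerCount:

```python
# Illustrative HTML call-to-action button built as an inline-styled link,
# not an image. The URL and styles are placeholders, not real values.

survey_url = "https://example.com/survey"  # hypothetical survey link

cta_html = (
    f'<a href="{survey_url}" '
    'style="display:inline-block;padding:16px 32px;'   # generous tap target for phones
    'background:#0a7d32;color:#ffffff;text-decoration:none;'
    'border-radius:4px;font-size:18px;">'
    "Take the 3-minute survey</a>"
)

print(cta_html)
```

Because the button is text plus CSS instead of an image, it stays tappable and legible on a phone regardless of the recipient’s image settings.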

The survey technology:

The technology you use for sending out the survey must be mobile-friendly. That means everything, from the email to the form on the website, needs to look good on a phone without constant zooming. You will lose an enormous number of survey-takers if this aspect of your technology falls flat.

Our analytics show that nearly 40 per cent of people are now using mobile devices to take our clients’ surveys. But here is the kicker. For some of those clients, the numbers are near 80 per cent! Imagine if our clients’ emails or surveys weren’t mobile-friendly – they would lose a ton of people between the email and the survey.

Thankfully, all of our clients’ surveys are mobile-optimized out of the gate. If your survey tool doesn’t provide this for free, you may want to consider looking for a new survey tool.

Reason 2 – Didn’t Receive Information They Needed

This typically comes down to one thing: even good marketers struggle to construct questions that will get them the information they actually need. That’s why, when you search Google for ‘customer surveys’, some of the most popular related phrases are ‘customer survey questions’ and ‘customer survey template’.

This person is actively looking for advice (in the form of a Google search). However, he doesn’t realize that the generic advice he’ll receive from these results won’t provide questions that are specific and meaningful to his organization.

So how do you get company-specific questions?

You could turn to a survey consultant, but consulting fees are often steep. You could also rely on the company that provides your surveys (when needed, we provide this as a complimentary service for our customers).

But our best recommendation is to form an internal panel that provides feedback from stakeholders in each department (marketing, sales, client services, etc.). Their feedback gives you excellent insight into what questions the survey should help answer. It also helps create the buy-in that will allow for actionable steps following the survey.

Reason 3 – Received Great Data, Now What?

Of the three, I think this reason that customer satisfaction surveys fail is the most frustrating. The person putting together the survey seemingly did everything right. She crafted a great subject line and email content that engaged recipients enough to take action, and she created meaningful questions that survey takers were willing to answer.

But now, she is stuck. Why? Because, as with many who create surveys, this is the first time she has all of this disparate data in front of her and it is overwhelming.

Whether the results are positive or negative, the marketer has to think about how they translate into actionable next steps, and how to speak to her survey recipients going forward based on the results. In other words – it is hard.

Setting your goals

But it is only difficult because the survey creator forgot to set up these next steps at the same time as the survey questions. Going into any new survey project with one or more goals for the outcome is essential to measuring its success.

With a first survey, a company’s goal may simply be to establish baseline customer satisfaction results that it can then work internally to improve. A different company may want a better feel for its clients’ demographics. A company’s goal could be one of many things, but it is ALWAYS specific to the company sending the survey.

With a goal in mind, the nitty-gritty details don’t seem quite so overwhelming.  Our clients find that the custom reports included within the CustomerCount® system are able to help them understand survey data without hiring a data scientist.

How?  Because our reports are role-based and show the information that the person logging into the system is looking for.

As an example, a manager of a call center may want to see how all of his people are doing in one report while the call center representative just wants to see his own metrics.

Both have the same idea – they want to take actionable next steps. However, by making the data specific to the person logging in, neither has to weed through details that are too broad or too granular to act on.
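The filtering idea behind role-based reporting can be sketched in a few lines. The roles, names, and scores below are illustrative only, not how the CustomerCount® system is actually implemented:

```python
# Sketch of role-based report filtering: a manager sees every rep's
# survey results, while a rep sees only their own. All data is made up.

surveys = [
    {"rep": "dana", "score": 9},
    {"rep": "eli",  "score": 7},
    {"rep": "dana", "score": 8},
]

def report(user, role):
    """Return only the survey rows the logged-in user should see."""
    if role == "manager":
        return surveys                       # whole team, one report
    return [row for row in surveys if row["rep"] == user]  # own metrics only

print(len(report("dana", "manager")))  # all three responses
print(len(report("dana", "rep")))      # just dana's two responses
```

The same underlying data serves both users; the report simply narrows to what each role can act on.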

Have you ever had a bad experience sending out a customer satisfaction survey?

If nothing else, we hope this article shows you that the problems you faced were things we have heard before at CustomerCount.  Thankfully, they are all things that can be solved by simply having a plan in place and the technology to execute it well.  If you have questions or would like to learn more about CustomerCount, please reach out today by calling us at 317-816-6000 or by contacting us now.
