You invest a good few hours into writing and designing your NPS survey, carefully picking the questions and wording them just how you want them.
When you’re finally done, you choose who to send it to and fire it off, crossing your fingers for positive responses. Then you wait…
…And you wait, and you wait some more. Hardly anyone fills it out. Why did this happen?
When sending out surveys, most companies worry about being inundated with reports of bad customer experiences. Despite your service team’s best efforts, the post-purchase part of the customer journey is always a blind spot unless the customer reaches out again or reconverts.
But behind the unpleasant possibility of negative responses from customers lies a potentially more disappointing outcome: A lack of responses altogether. If only a very small subset of your customer base responds, whether they report a good, bad, or neutral experience, you can’t do much with that data.
How can you structure your surveys to get a higher rate of good-quality responses, so your survey results actually reflect the general sentiment of your customer base? What is a good survey response rate to aim for? What’s a good NPS score? And what are some reasons why your surveys may not be performing as expected? Let’s dive in.
Jump to:
What is a good survey response rate?
Why are survey response rate and quality important?
How to improve survey response rate
Why do I have a low survey response rate?
Why is NPS score important?
How to get the most accurate picture of customer sentiment
When it comes to survey response rates, there is no one number everyone should shoot for; your NPS response rate goals should be based on your industry, audience, and survey delivery method. Survey response rate benchmarks can vary depending on who you ask and where you look, and are influenced by a number of factors, so it’s best to view your benchmark as a range or a minimum percentage to hit.
Also, it’s important to keep in mind that response rate alone doesn’t fully reflect the quality of survey data. Surveys are meant to gauge insights from your customer base in its entirety, so responses should be representative of your customer demographics. More on that in a minute.
According to CustomerGauge, the average NPS survey response rate for B2B brands is 12.4%, though rates can vary anywhere between 4.5% and 39.3%. And Delighted, a feedback program by Qualtrics, has reported that companies sending out NPS surveys should look to hit a response rate of at least 20%. The average survey response rate reported by Delighted users (for all surveys, not just NPS surveys) ranges from 6% to 16% depending on the survey channel.
Factors to consider when setting an NPS survey response rate benchmark include:
Most of the time, a low number of survey responses relative to the number of sends means the data is less accurate. The point of surveys is to use a smaller group to represent the opinions of a larger one: your entire customer base or a significant subset of it. Too few responses mean the data may not be statistically valid.
The lower your response rate, the higher your margin of error will be. Simply put, a margin of error represents how closely your survey results reflect the views of your customer base as a whole, and offers a range for how much your calculations may differ from reality.
Without getting too into the weeds with statistical formulas, the margin of error is calculated from your sample size (the number of responses), your population size, and your chosen confidence level.
A small sample size relative to your audience size is unlikely to be representative of your entire customer base, so your margin of error will have to be larger.
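To make this relationship concrete, here is a minimal Python sketch of a proportion-based margin of error. It assumes a 95% confidence level (z = 1.96) and the most conservative proportion (p = 0.5), and applies a finite population correction since respondents come from a fixed customer base; the 5,000-send scenario is a hypothetical illustration, not a figure from the article.

```python
import math

def margin_of_error(sample_size, population_size, p=0.5, z=1.96):
    """Estimate the margin of error for a survey proportion.

    Assumes a 95% confidence level (z = 1.96) and p = 0.5,
    the most conservative choice. The finite population
    correction shrinks the margin when the sample covers a
    large share of the customer base.
    """
    standard_error = math.sqrt(p * (1 - p) / sample_size)
    fpc = math.sqrt((population_size - sample_size) / (population_size - 1))
    return z * standard_error * fpc

# Hypothetical: a 12.4% response rate on 5,000 sends = 620 responses.
print(f"{margin_of_error(620, 5000):.1%}")  # → 3.7%
print(f"{margin_of_error(100, 5000):.1%}")  # → 9.7% (fewer responses, wider margin)
```

As the second call shows, shrinking the sample from 620 to 100 responses roughly triples the margin of error, which is why a low response rate undermines how confidently you can generalize survey results.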
The following is the formula for calculating margin of error: