When we think about tracking customer satisfaction via surveys, the analysis almost always focuses on the survey responses themselves: how many said they were satisfied, what is driving satisfaction, and so on. (See a related post on 4 ways to report customer satisfaction.)

Not shocking (and of course we should look at the results of the questions we ask), but there is another layer of data that can be analyzed: the data about the survey itself.

First and foremost is response rate. (Quick review: response rate is the proportion of people who respond to an invitation to take a survey; read more here.) A healthy response rate matters in its own right because it reduces non-response bias (i.e., our concern that the people who do not respond may be very different from those who do), but it's also a proxy for engagement. The more engaged your customers are with your organization, the more likely they are to participate in your research. Therefore, tracking response rate as a separate metric on your overall customer dashboard can add depth to your understanding of your current relationship with customers (or citizens or members or…).

So, you're probably now asking, "What response rate corresponds to high engagement?" Short answer: it depends. Industry, subject matter, type of respondent, sampling, and so on can all affect response rates. So while I'll offer some general rules of thumb, take them with a grain of salt:

  • Less than 10%: Low engagement
  • 10-20%: Average engagement
  • 20-50%: High engagement
  • 50%+: Rock-star territory

Yes, we've seen response rates above 50% on our clients' surveys.
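For readers who like to see the arithmetic, here is a minimal sketch (in Python, with made-up numbers) of how you might compute a response rate and map it onto the rough buckets above. The thresholds simply encode the rules of thumb from this post; they are not hard cutoffs.

```python
def response_rate(completed_responses: int, invitations_sent: int) -> float:
    """Response rate = number of people who responded / number of people invited."""
    return completed_responses / invitations_sent

def engagement_bucket(rate: float) -> str:
    """Map a response rate onto the rough rules of thumb from this post."""
    if rate < 0.10:
        return "Low engagement"
    elif rate < 0.20:
        return "Average engagement"
    elif rate < 0.50:
        return "High engagement"
    else:
        return "Rock-star territory"

# Hypothetical example: 4,000 invitations sent, 900 completed surveys
rate = response_rate(900, 4000)
print(f"{rate:.1%} -> {engagement_bucket(rate)}")  # 22.5% -> High engagement
```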

The important caveat here is to be wary of survey fatigue. If you over-survey your customers, response rates will decrease over time as people tire of taking surveys (shocking, right?). How much is too much varies with the survey's length and subject matter, but surveying monthly (or more frequently) will almost certainly cause fatigue, while surveying yearly (or less frequently) probably will not. Every one to 12 months? It's a gray area. (Feel free to contact us for an opinion on your specific case.)

Another potential source of survey metadata you can use to assess engagement is the depth of responses to open-ended questions. The easiest way to measure this is to use word count as a proxy: the more respondents write, the more they presumably care about what they are telling you.
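As a rough illustration, here is a short sketch of the word-count approach, assuming you have a simple list of open-ended responses pulled from a survey export (the responses and variable names below are made up):

```python
# Hypothetical open-ended responses from a survey export
open_ended_responses = [
    "Fine.",
    "The billing process is confusing and I had to call support twice "
    "before anyone could explain the charges.",
    "",  # blank or skipped answers are common and worth counting separately
]

# Word count per response as a simple proxy for depth of engagement
word_counts = [len(r.split()) for r in open_ended_responses]
answered = [c for c in word_counts if c > 0]

print("Average words per answered response:",
      round(sum(answered) / len(answered), 1) if answered else 0)
print("Share who skipped the question:",
      f"{word_counts.count(0) / len(word_counts):.0%}")
```

Tracked over time, averages like these can show whether customers are investing more or less effort in telling you what they think.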

For example, we did a large voter study for a state agency, and when we asked voters about their priorities on the service topic in question, we received paragraph-length responses. This, combined with other results, showed just how engaged they were with the topic (though not necessarily with the agency in this case) and how much thought they had given it. That was useful, and somewhat surprising, information for our client.

The next time you’re looking at survey data, be sure to look at more than just the responses themselves.