How do you improve online surveys?  Ask anyone who creates or takes surveys and you’ll probably get an opinion.  We’ve even chimed in on the topic in the past. (And on another Seth post.)

Seth Godin recently posted Five tips for better online surveys, and, as usual, we have our own two cents to contribute.

1. Every question you ask is expensive. (Expensive in terms of loyalty and goodwill). Don’t ask a question unless you truly care about the answer. This means that a vague question with vague answers (extremely satisfied…acceptable…extremely dissatisfied and no scale to compare them to) is a total waste of time. What action will you take based on that? It’s smarter to ask, “how much would you say lunch was worth?”

We agree that every question is expensive.  Every question should serve a purpose – answering a research goal, providing flow to the survey, or gathering “needed” demographics.

2. Every question you ask changes the way your users think. If you ask, “which did you hate more…” then you’ve planted a seed.

We agree, but we’re not sure whether Seth is suggesting to plant or not to plant.  Of course, poorly worded questions bias the respondents.  And (to use the example) asking, “which did you hate more…” implies you hated at least one.  We think it’s better to ask about both ends of the spectrum: “which did you like or dislike?”  (Which I think means we’re more on the side of: don’t plant.)

3. Make it easy for the user to bail. If you have 20 questions (that’s a lot!) make it easy to quit after five and have those answers still count. If you waste my time and then don’t count my answers, see #2.

If everyone started treating surveys as something you could answer the first two questions of and then abandon, all data would be crap.  Incompletes typically aren’t counted because you can’t cross-tab any of their results.  If everybody bails, it probably means your survey was poorly designed.

There should be an implicit agreement between survey designers and survey takers: designers agree to include only questions that provide useful data, to engage the respondent as much as possible, and to provide an accurate up-front estimate of the time required; takers agree to pay attention, answer questions conscientiously, and finish the survey.

4. Make the questions entertaining and not so serious, at least some of them. Boring surveys deserve the boring results they generate.
5. Don’t be afraid to shake up the format. Instead of saying, “Here are ten things, rank them all on a scale of one to five…” why not let people compare things? “We had two speakers, Bob and Ray. Who was better?”

Agree again, though be careful what you are actually comparing.  If you ask “is Bob or Ray better?”, is the answer useful?  Maybe that is all you need to know (who is preferred), but you’ll have no idea why.  Is Bob better looking?  Was Ray’s topic more interesting?  We think a corollary that cuts across several of these points is something like “Ask more specific questions.”  Maybe instead of “how satisfied were you with the customer service,” ask “did your server refill your drinks as often as you needed?”  Some of the boredom with surveys is that everyone has been asked the same six questions 100 times.  Asking specific questions makes it obvious to the participant (and the surveyor) what actions might come about as a result of the response (and, personally, it makes me feel like surveyors care about my opinion again).

Back to the larger point, we agree that making question formats more interactive can help keep respondents engaged.  (Quirk’s Marketing Research Review recently had an article on this as well; free registration required.)