Five Ways to Improve Survey Response Rates

No one wants to take a survey. We’ve all probably lost count of how many times we’ve stopped answering one partway through, and we could all list a variety of reasons why: it was too long, the questions felt too personal or sensitive, we had other priorities at the time, or the survey just didn’t seem worth our time.

When designing a survey, it’s tempting to try to get as much information out of respondents as you can, but remember that you want people to actually take your survey. You’re asking for their time and attention, and you need both before you can get at the content of their responses. The following considerations can go a long way toward getting people to devote their full time and attention to answering your survey.

  1. Limit length – This is hard. We want to get as much information as possible, but the more questions you ask, the longer the survey will take to complete, and the more likely respondents are to stop midway through. It’s better to have thoughtful answers to a few questions than no answers to a lot of questions. A good target length is 15 minutes; that translates to about four banks of five quantitative questions (20 unique questions) plus two open-ended qualitative questions.

  2. Identify what you really want to know – Chances are you’ll want to ask more questions than our suggested 22-question limit, so you’ll need to separate your ‘must haves’ from your ‘nice to haves’. Distinguish the questions you’d like answered from the questions you really need answered. Keep your top two must-have qualitative questions, then assemble your banks of quantitative questions, starting with your must-haves and adding in any nice-to-haves for which you have room.

    If you still find yourself with too many questions, explore whether another data-gathering tactic might work. Focus groups, for instance, can be an excellent way to take the pulse of a variety of topics where precise statistics are not needed. When assessing something like the responsiveness of a heating or cooling system, data points like temperature can be measured directly, without asking building occupants about their thermal comfort. In fact, in this scenario, measuring temperature directly may yield more accurate results by avoiding potentially confounding factors, such as someone who runs cold having control over the thermostat. In a situation like that, one person’s preferences could skew the results, with many people reporting feeling too hot even if the heating and cooling systems are performing optimally.

  3. Limit demographic questions – Demographic questions can make people very uncomfortable. The more you ask, the more your respondents will suspect that their answers could be traced back to them. Limit your demographic questions to things you genuinely need to know in order to conduct your analysis. It can also be a good idea to include a sentence or two explaining why you’re asking. A little clarity can go a long way toward dispelling suspicions of nefarious intent.

  4. Target questions to a specific audience – It’s okay to ask questions that may not apply to everyone, but if you do, make sure that questions meant for a subset of your audience only go to that subset. Presenting questions to people they don’t apply to is a great way to make respondents feel like you’re misusing their time. When you ask people questions they shouldn’t answer, you’re not just asking them to stop filling out your survey, you’re also asking them to start it back up a few questions later. A respondent might easily stop, but getting them to start again is a significantly bigger ask. Fortunately, platforms like SurveyMonkey let you implement skip logic that displays a question only if certain requirements are met, such as a particular response to an earlier demographic question. This lets you target questions to the appropriate audience without putting unnecessary questions in front of everyone else. A rough sketch of the idea follows.
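
    Conceptually, skip logic is just a condition attached to a question: show it only if an earlier answer qualifies. Below is a minimal, hypothetical Python sketch of that idea; the question names and data structure are illustrative only, not SurveyMonkey’s actual interface.

        # Hypothetical sketch of skip logic: a follow-up question is shown only
        # when the response to an earlier screening question meets a condition.
        questions = [
            {"id": "uses_transit",
             "text": "Do you commute by public transit? (yes/no)"},
            {"id": "transit_satisfaction",
             "text": "How satisfied are you with transit service? (1-5)",
             # Only show this question if the earlier answer was "yes".
             "show_if": lambda answers: answers.get("uses_transit") == "yes"},
        ]

        answers = {}
        for q in questions:
            condition = q.get("show_if")
            if condition and not condition(answers):
                continue  # skip questions that don't apply to this respondent
            answers[q["id"]] = input(q["text"] + " ").strip().lower()

        print(answers)

    In practice you would configure the same condition through the survey platform’s question settings rather than writing code, but the effect is identical: respondents only ever see the questions that apply to them.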

  5. Understand your audience – User mindset can have a significant impact on how people answer questions, so think strategically about your audience and the conditions that may affect their ability to fill out a survey in a meaningful way. Who are you asking? When are you asking? For instance, if you attempt to survey a student population the week before finals, your response rate will likely be drastically lower than if you asked at a less busy time. Even if your response rate ends up being excellent, the timing of the survey may make the conclusions misleading, since responses will reflect the mental state of a population dealing with far more stress than it would at other times of the year.

    As another example, the week after bonuses go out is a time when people tend to feel good about their company. Giving an employee satisfaction survey then may yield a high response rate, but it will also skew your responses and undermine the validity of any conclusions.

    In sum, we want data, but how we go about collecting it will influence whether the data we get is meaningful and can be reliably used to make good decisions. The key is to remember your audience and design with them in mind at all times.