There are a million ways a survey project can go wrong.
We see it happen all too often. An energetic startup founder designs a survey he thinks will change the way he does business—turning his “gut instinct” into validated, data-driven insights about his customers.
Then, after weeks of work, the data comes back—and it’s anything but helpful. Worst case, it’s 100% useless because of an error in the way the survey was designed.
We hate seeing this happen. So we’ve put together a list of industry secrets for designing high-quality market research surveys. Some may seem like common sense, but others are likely to surprise you.
Remember, we at PeopleFish review every survey we field for our clients, and we won’t ever field something that will yield useless data. That said, we can’t deep-dive into the nuances of every question—ultimately, only our clients know exactly what they need to learn from their target market. That’s where these tips come in handy!
Tips for Writing Surveys
Writing a survey is harder than it seems. Here are some tips—some obvious, some not—on writing a survey that will yield useful findings for your team.
- Design your surveys in a word processor (Microsoft Word, Google Docs, etc.). It’s too easy to get thrown off by all the bells and whistles in a survey platform and lose track of your primary research questions and goals.
- Use short, simple sentences. No exceptions.
- No double-barreling. Don’t try to fit two questions into one.
- Stick to single-select, multi-select and rank-order question types. Only use other types if absolutely necessary, and don’t require respondents to rank more than seven items.
- Use scales wherever possible. For example, less Yes/No and more Always/Sometimes/Never. Use only five- or seven-point scales, and fully label all scale points (don’t label only the endpoints with unlabeled numbers in between).
Tips for Cleaning Data
At PeopleFish, we clean our clients’ survey data for them before sending over results. So for those of you working with us, there’s no need to worry about fully understanding these steps. But we thought it might be helpful to explain our standard data-cleaning methodology—we follow each of these steps for every survey we field.
- Check for speeders. Use Excel to calculate the sample’s median time spent taking the survey, then flag everyone who finished in less than half the median time. Some exceptions apply, depending on the survey logic and the total length of the survey.
- Check for flatliners. These are respondents who selected the same scale point across multiple scales. While this doesn’t necessarily mean they should be disqualified, flag anyone who appears to have “flatlined” in this manner.
- Check for gibberish and logical inconsistencies. This is self-explanatory—respondents who write nonsense in open-ended boxes, or who contradict themselves across answers, need to be flagged.
- Add up the flags for each respondent, and discard the worst offenders. Use your judgment to decide what the flag threshold should be.
- Track the edits you make while cleaning data. Start a new sheet after each of the three flagging steps above, so that someone else can review your work and decide whether they agree with your flag thresholds.
- Your sample size shouldn’t change much through your survey. It’s OK to keep some partials, but by and large, the number of people who answered the first question in your survey should be close to the number who answered the last.
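The flagging steps above can be sketched in a few lines of code. This is a minimal illustration, not PeopleFish’s actual pipeline: the respondent records, field names, and thresholds are all hypothetical, and the word-count check is only a crude stand-in—real gibberish and logical-inconsistency checks need human review.

```python
from statistics import median

# Hypothetical respondent records: completion time in seconds, answers
# across several five-point scales, and one open-ended text answer.
respondents = [
    {"id": 1, "duration": 300, "scales": [4, 2, 5, 3, 4], "open_text": "Liked the pricing page."},
    {"id": 2, "duration": 90,  "scales": [3, 3, 3, 3, 3], "open_text": "asdfgh"},
    {"id": 3, "duration": 280, "scales": [2, 4, 3, 5, 2], "open_text": "Checkout was confusing."},
    {"id": 4, "duration": 100, "scales": [5, 5, 5, 5, 5], "open_text": "Great."},
]

def flag_respondents(rows, max_flags=1):
    """Return (kept_ids, discarded_ids) after tallying quality flags."""
    cutoff = median(r["duration"] for r in rows) / 2  # speeder threshold
    kept, discarded = [], []
    for r in rows:
        flags = 0
        if r["duration"] < cutoff:            # speeder: under half the median time
            flags += 1
        if len(set(r["scales"])) == 1:        # flatliner: same point on every scale
            flags += 1
        if len(r["open_text"].split()) < 2:   # crude gibberish proxy (review by hand!)
            flags += 1
        (discarded if flags > max_flags else kept).append(r["id"])
    return kept, discarded
```

With this toy data, respondent 2 (a speeder who flatlined and wrote gibberish) and respondent 4 (a flatliner with a one-word answer) cross the threshold and are discarded; keeping each flagging pass on its own sheet, as suggested above, lets a reviewer second-guess any of these cutoffs.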
Tips for Reporting Data
Most of our clients report their data to someone—investors, their boss, or prospects via marketing collateral. Here are some tips for making maximal use of your survey findings in the form of a report.
- Remember that when you segment (cut) your data, the sample sizes for each segment drop. Keep this in mind, and don’t segment if it means indefensible sample sizes within segments.
- Condense figures and graphs wherever possible. Don’t make clients switch between pages or slides to view obvious comparisons between questions—put like topics and similar questions together.
- Don’t impute meaning into respondents’ answers. Just report the facts. Draw conclusions, of course, but that’s different from imputing meaning. Conclusions come with caveats and some degree of confidence. Imputing meaning is asserting more about your respondents’ answers than is actually there.
- Write a Key Findings section at the beginning of your report. Most clients won’t read much beyond this anyway, so be sure it contains the most salient findings and directly answers their research questions.
- As when writing the survey, use short and simple sentences in your report. No exceptions.
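The segmentation warning above is easy to check mechanically. Here is a minimal sketch: the segment labels are made up, and the n ≥ 30 floor is one common rule of thumb, not a universal standard—pick a threshold you can defend to your audience.

```python
from collections import Counter

# Hypothetical age segments for 100 respondents.
segments = ["18-24"] * 12 + ["25-34"] * 45 + ["35-44"] * 38 + ["45+"] * 5

MIN_N = 30  # illustrative floor for reporting a segment on its own

counts = Counter(segments)
reportable = {seg: n for seg, n in counts.items() if n >= MIN_N}
too_small = {seg: n for seg, n in counts.items() if n < MIN_N}
```

Segments that land in `too_small` are better merged into broader groups or omitted than reported with indefensible sample sizes.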