Some Best Practices for Market Research Surveys

There are a million ways a survey project can go wrong.

We see it happen all too often. An energetic startup founder designs a survey he thinks will change the way he does business—turning his “gut instinct” into validated, data-driven insights about his customers.

But the data comes back, after weeks of work, and it’s anything but helpful. Worst case, it’s 100% useless because of some error in the way the survey was designed.

We hate seeing this happen. So we’ve put together a list of industry secrets for designing high-quality market research surveys. Some of these may seem like common sense, but others are likely to surprise you.

Remember, we at PeopleFish review every survey we field for our clients, and we won’t ever field something that will yield useless data. That said, we can’t deep-dive into the nuances of every question—ultimately, only our clients know exactly what they need to learn from their target market. That’s where these tips come in handy!


Tips for Writing Surveys

Writing a survey is harder than it seems. Here are some tips—some obvious, some not—on writing a survey that will yield useful findings for your team.

  1. Design your surveys in a word processor (Microsoft Word, Google Docs, etc.). It’s too easy to get thrown off by all the bells and whistles in a survey platform and lose track of your primary research questions and goals.
  2. Use short, simple sentences. No exceptions.
  3. No double-barreling. Don’t try to fit two questions into one (e.g., “Is our product affordable and easy to use?”).
  4. Stick to single-select, multi-select and rank-order question types. Only use other types if absolutely necessary, and don’t require respondents to rank more than seven items.
  5. Use scales wherever possible. For example, less Yes/No and more Always/Sometimes/Never. Use only five- or seven-point scales, and fully label all scale items (don’t label just the two endpoints and leave numbers in between).

Tips for Cleaning Data

At PeopleFish, we clean our clients’ survey data for them before sending over results. So for those of you working with us, there’s no need to worry about fully understanding these steps. But we thought it might be helpful to explain standard data-cleaning methodology—we follow each of these steps for every survey we field.

  1. Check for speeders. Use Excel to calculate the sample’s median time spent taking the survey, then flag everyone who finished in less than half the median time. Some exceptions apply, depending on the survey logic and the total length of the survey.
  2. Check for flatliners. These are respondents who selected the same scale item across multiple scales. While this doesn’t necessarily mean they should be disqualified, flag anyone who appears to have “flatlined” in this manner.
  3. Check for gibberish & logical inconsistencies. This is self-explanatory: respondents who write nonsense in open-ended boxes, or who contradict themselves across answers, need to be flagged.
  4. Add up the flags for each respondent, and discard the worst offenders. Use your judgment to decide what the flag threshold should be (see the sketch after this list).
  5. Track the edits you make while cleaning data. Start a new sheet after flagging for each of the three items listed above, so that someone else could conceivably review what you did to see if they agree with your flag thresholds.
  6. Your sample size shouldn’t change much through your survey. It’s OK to keep some partials, but by and large, the number of people who answered the first question in your survey should be close to the number who answered the last.
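If you’d rather script steps 4 and 5 than track them by hand, here’s a minimal sketch in Python with pandas rather than Excel. It assumes you’ve exported your data to a CSV and recorded each check as its own boolean column; the file name, column names, and threshold below are all hypothetical.

```python
import pandas as pd

# Hypothetical export: one row per respondent, one boolean column per check.
df = pd.read_csv("survey_export.csv")

flag_cols = ["speeder_flag", "flatliner_flag", "gibberish_flag"]  # assumed names
df["total_flags"] = df[flag_cols].sum(axis=1)

# The threshold is a judgment call; here, anyone with two or more flags goes.
FLAG_THRESHOLD = 2
clean = df[df["total_flags"] < FLAG_THRESHOLD]

# Save a separate file at each stage so a reviewer can retrace your steps.
clean.to_csv("survey_cleaned.csv", index=False)
print(f"Kept {len(clean)} of {len(df)} respondents")
```

Writing out a fresh file after each pass mirrors the “new sheet per flag” habit above: anyone reviewing your work can see exactly which respondents each threshold removed.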

Tips for Reporting Data

Most of our clients report their data to someone: investors, their boss, prospective customers (as marketing collateral), and so on. Here are some tips to make sure you get maximal use out of your survey findings in report form.

  1. Remember that when you segment (cut) your data, the sample sizes for each segment drop. A 300-person sample split evenly across four age brackets leaves only 75 respondents per segment. Keep this in mind, and don’t segment if it means indefensible sample sizes within segments.
  2. Condense figures and graphs wherever possible. Don’t make clients switch between pages or slides to view obvious comparisons between questions—put like topics and similar questions together.
  3. Don’t impute meaning to respondents’ answers. Just report the facts. Draw conclusions, of course, but that’s different from imputing meaning. Conclusions come with caveats and some degree of confidence. Imputing meaning is asserting more about your respondents’ answers than is actually there.
  4. Write a Key Findings section at the beginning of your report. Most clients won’t read much beyond this anyway, so be sure it contains the most salient findings and directly answers their research questions.
  5. As when writing the survey, use short and simple sentences in your report. No exceptions.

As always, we’re open to any questions about these items. Hit us up on Twitter, or inquire here about launching a survey for your business.

How to Clean Your Market Research Survey Data

You’ve fielded a market research survey.

For weeks, you wrote and rewrote your survey questions. You paid for a SurveyMonkey license and spent hours learning how to program your survey. You leveraged dozens of industry connections to get survey answers — a hard-earned set of 300 respondents.

Getting here wasn’t easy.

But unfortunately, you’re not done. Before drawing findings from your survey, you need to clean your data. This is absolutely essential for maintaining the quality of your research. Here are the three most important things to look for when cleaning your survey data.*

Speeders

These are respondents who took your survey too fast. You identify them by comparing each respondent’s time spent against the median time spent taking your survey.

The rule of thumb here is to disqualify responses from anyone who completed your survey in less than half the median time. There are some exceptions, like if your survey includes a logic branch that had certain respondents answering just a few questions. But in general, anyone going more than twice as fast as the median respondent is likely someone who sped through the survey without giving the questions much thought.

You can identify speeders by downloading your survey data into Excel, then subtracting the “time started” from the “time completed.” Most survey platforms I’ve used record this information. If yours doesn’t, you may have to skip this flag, but be sure to check for the following two.

In general, not more than 10% of your survey sample should be discarded for speeding.
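If you’d rather script this than work in Excel, here’s a minimal sketch in Python with pandas. It assumes your export includes start and end timestamps; the file and column names are hypothetical.

```python
import pandas as pd

# Assumed column names; most platforms record start and end timestamps.
df = pd.read_csv("survey_export.csv",
                 parse_dates=["time_started", "time_completed"])

# Duration = time completed minus time started.
df["duration"] = df["time_completed"] - df["time_started"]

# Rule of thumb: flag anyone who finished in under half the median time.
df["speeder_flag"] = df["duration"] < df["duration"].median() / 2

# Sanity check: much more than 10% flagged suggests a problem elsewhere.
print(f"{df['speeder_flag'].mean():.1%} of respondents flagged as speeders")
```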

Flatliners

These are people who gave the same answer to every (or almost every) question in a series. For example, say that you asked four open-ended questions about price (like a Van Westendorp question set). A flatliner answered the same thing (say, $10) for each of those four questions.

If you notice that more than 10% of your survey respondents are being flagged as flatliners, look more closely at the questions you’re including in your flatliner scan. It may be that some respondents only appear to be flatlining when they are, in fact, giving honest answers. This can happen because of the way you’ve ordered your answer options (for example, if “student,” “18–22 years,” “unmarried,” and “no kids” are all the first answer options on questions asking about employment, age, marital status, and children).

You can only determine this by looking at respondents’ individual answers. But don’t bother with this if less than 10% of your survey sample is flatlining.
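As a starting point, here’s a minimal sketch of a flatliner scan in Python with pandas, assuming a block of scale questions stored in columns whose names below are hypothetical.

```python
import pandas as pd

df = pd.read_csv("survey_export.csv")  # assumed file name

# Hypothetical column names for a block of scale questions asked in a row.
scale_cols = ["q5_scale", "q6_scale", "q7_scale", "q8_scale", "q9_scale"]

# A respondent with exactly one distinct answer across the whole block
# "flatlined" on it; nunique(axis=1) counts distinct values per row.
df["flatliner_flag"] = df[scale_cols].nunique(axis=1) == 1

print(f"{df['flatliner_flag'].mean():.1%} of respondents flagged as flatliners")
```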

Gibberish and Contradictory Answers

These types of responses can be harder to spot. They require you to look, line by line, at answers to open-ended questions in order to identify ones that 1) are gibberish (e.g., “dk3i8sw”) and/or 2) don’t correspond to other answers in that row. For example, if someone says they are single at the beginning of the survey, then mentions their “wife” or “husband” in a later open-ended question, delete that respondent. They are not being honest, and you want data you can stand behind.
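There’s no substitute for reading the answers yourself, but a crude first pass can surface the obvious cases. Here’s a sketch in Python with pandas, built on two assumptions of mine: a simple no-vowels/letters-mixed-with-digits heuristic for gibberish, and hypothetical column names for the contradiction check.

```python
import re
import pandas as pd

df = pd.read_csv("survey_export.csv")  # assumed file name

# Crude heuristic: flag open-ended text with no vowels, or with digits
# jammed into letters (e.g., "dk3i8sw"). It will miss things and produce
# false positives, so treat it as a shortlist to review, not a verdict.
def looks_like_gibberish(text: str) -> bool:
    if not isinstance(text, str) or not text.strip():
        return False  # blanks are a separate issue from gibberish
    no_vowels = re.search(r"[aeiouAEIOU]", text) is None
    mixed_junk = re.search(r"[a-zA-Z]\d|\d[a-zA-Z]", text) is not None
    return no_vowels or mixed_junk

df["gibberish_flag"] = df["open_end_q"].apply(looks_like_gibberish)  # assumed column

# Contradiction example from above: "single" respondents who mention a spouse.
mentions_spouse = df["open_end_q"].str.contains(r"\b(?:wife|husband)\b",
                                                case=False, na=False)
df["contradiction_flag"] = (df["marital_status"] == "Single") & mentions_spouse
```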


If you designed your survey well, data cleaning shouldn’t result in discarding more than 15% of your responses. If you’re worried you’re throwing out too many, take a closer look at the ones you’re throwing out. Consider keeping a few that show other signs of being good, honest answers, or that flagged on only one of the three criteria listed above.

*PeopleFish analysts clean every one of our clients’ survey datasets according to the criteria set forth in this article. Data cleaning is a standard part of our survey project offering. Nevertheless, it’s helpful for researchers to understand how survey data is cleaned, both for their own knowledge and should they want to conduct a market research survey independently of a firm like PeopleFish.