Questionnaire design: Size isn’t everything
Hmmm, not very deep is it? What do you expect to discover from your four or five questions? […] I can’t see how you will gain data of any worth from that. […] I reckon it took longer for me to click through the answers than it did to come up with the questions… [Survey respondent comment]
You try to do your best by respondents, but you can’t always get it right for everyone. This time, I wrote a survey that was too short. I have to admit I’ve not had a respondent criticise me for that before!
The Market Research Society Code of Conduct says that:
B.14 Members must take reasonable steps to ensure that personal data collected are relevant and not excessive
This means that when you are writing a questionnaire you shouldn’t ask for demographic details that you don’t really need or won’t actually use. In my opinion it is right to take that a stage further and apply it to all questions in a questionnaire: don’t include any questions that you don’t really need or won’t actually use. Sometimes this means your questionnaire needs to be five questions long, and sometimes it needs to be 50 questions long. But my philosophy of questionnaire design is that it should be as short as you can reasonably make it. Any fool can write a rambling questionnaire that covers every possible element of a subject; condensing this into a shorter, more focused (and consequently more ethical) questionnaire is where the skill lies. As various famous dead people are said to have put it: “Please forgive me for writing such a long letter; I didn’t have time to write a short one.”

My point is that size isn’t everything when it comes to questionnaires, and sometimes short but sweet is the better tool for the job. Quantitative projects are designed to collect statistics, and are usually about breadth (surveying more people in order to get more robust statistics) rather than depth (asking a lot of questions). The value is in asking the right questions of a pre-determined population and looking for patterns in their answers.
I used to manage an omnibus survey, which is a representative population survey that runs on a frequent basis and on which clients buy space for their questions. Writing for an omnibus survey demands very focused questionnaire design skills, and you usually end up preparing a whole load of very short questionnaires with a handful of questions in each. Why? Because the client pays by the question, so they want to keep it short! Doing this job means I have seen the results from loads of very short questionnaires. Consequently, I know that genuinely useful business decisions can be (and are) made based on one or two very well thought out questions.
Imagine you were running an advert on TV about cakes, and you wanted to know what proportion of the population had seen it. Well, you could use one question to find that out. Get yourself a representative sample of 1,000 people, show them the advert, and ask “Have you seen that advert before?” Now you know that 50% of the population have seen your advert. You can decide whether that is good enough, and what your advertising department should do next. Job done, one question.
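A quick back-of-the-envelope sketch of why a sample of 1,000 gives a usably robust statistic: the standard 95% margin of error for an observed proportion. (The figures here are illustrative, not from any real survey; the function name is my own.)

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for an observed proportion p
    from a simple random sample of n respondents (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative: 50% of a 1,000-person sample say they have seen the advert.
moe = margin_of_error(0.5, 1000)
print(f"50% +/- {moe * 100:.1f} percentage points")  # roughly +/- 3.1 points
```

So one well-targeted question on 1,000 people pins the true figure down to within about three percentage points either way, which is usually plenty for a yes/no business decision.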
Alternatively you could ask that same question alongside a whole host of demographic questions. To the respondent this looks like a really short questionnaire but the researcher can get a huge amount of depth from the findings by cross-tabulating everything against each other. Yes, you might have only asked one question about whether you have seen an advert about cakes, but you can use the data to conclude that your advert is doing a better job of reaching (say) women, 25-34 year olds and people in Scotland. Then you can compare that to your target market and use that to plan your next campaign and track its success.
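The cross-tabulation idea can be sketched with a toy example. Everything here (the respondent records, field names, and helper function) is invented purely for illustration; a real project would use a proper analysis package, but the principle is just counting within groups:

```python
from collections import Counter

# Hypothetical respondent records: one survey question plus demographics.
responses = [
    {"seen_advert": "Yes", "gender": "Female", "region": "Scotland"},
    {"seen_advert": "Yes", "gender": "Female", "region": "England"},
    {"seen_advert": "No",  "gender": "Male",   "region": "Scotland"},
    {"seen_advert": "Yes", "gender": "Male",   "region": "England"},
    {"seen_advert": "No",  "gender": "Female", "region": "England"},
]

def crosstab(rows, answer_key, demo_key):
    """Percentage answering 'Yes' within each demographic group."""
    totals, yeses = Counter(), Counter()
    for row in rows:
        group = row[demo_key]
        totals[group] += 1
        if row[answer_key] == "Yes":
            yeses[group] += 1
    return {g: round(100 * yeses[g] / totals[g], 1) for g in totals}

print(crosstab(responses, "seen_advert", "gender"))
print(crosstab(responses, "seen_advert", "region"))
```

One answer column, crossed against each demographic in turn, tells you which groups your advert is reaching best, without a single extra question being asked of the respondent.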
What I’m saying is it isn’t about how much you ask, but rather it is about asking the right questions to the right people and undertaking the right analysis. In situations such as these you could ask more questions but often there is no need. And if there is no need, it is ethically more appropriate not to.
Keeping the questionnaire as short as possible benefits the respondent because it means you are not wasting their time, or harvesting their innermost personal thoughts for no good reason. A shorter questionnaire takes less time to fill in, which tends to result in better respondent engagement and better completion rates than you might have got from a longer one. Usually, these outcomes are seen as being a good thing by researcher and respondent alike.
That’s all well and good in theory.
In practice respondents have no way to judge whether the questionnaire is a reasonable length and fit for purpose. They don’t know how long and tedious and repetitive it could have been. They don’t have access to full details of the rationale behind why the researcher wrote the questionnaire in the way that they did, and what they plan to do with the results. This means that respondents are inclined to feel that no-one could possibly need to know all of the seemingly random things asked in a long questionnaire, and that a short questionnaire could not possibly cover everything anyone might need to know on a subject.
Nevertheless, if the survey has come from a credible researcher, a rationale and results plan will have been formulated. A client usually wants to do a survey because they need evidence to make a business decision. They consequently usually come to a researcher with a business question in mind that will elicit this evidence, such as “I want to know what proportion of consumers think that / do that / buy that / eat that [delete as appropriate].” The researcher will take this question and use it to design an unbiased questionnaire that will provide the required independent evidence, and the findings will be used by the client to make their business decision. But all of this business stuff that is decided pre-questionnaire design is not communicated to the respondent. This means that sometimes a questionnaire about A, B and C makes the respondent think about X, Y and Z, and the respondent consequently thinks that the researcher has been remiss in not including X, Y and Z in the survey. What is more likely is that the researcher and client have talked about X, Y and Z and have decided not to put them in for some reason. Maybe the client knows all about X already. Maybe they know Y cannot feature in their decision-making so there is no point in asking. Maybe they don’t sell Z so they don’t care about it. Maybe their budget only covers A, B and C. Maybe a questionnaire covering A, B and C is already 15 minutes long, so there is no space for squeezing in X, Y and Z this time. Many, many possibilities.
Thing is, this feels so far removed from the questionnaire at the point of filling it in that it is natural that a respondent doesn’t think about it and instead sees what *hasn’t* been asked as a failing. And where a questionnaire is short, the volume of content that *hasn’t* been asked seems enormous. I can see why a short questionnaire could make a respondent feel cynical, even if I think they are misguided!
In the case of the criticised survey above, I simply didn’t need to know any more than what I asked. I actually tried to think of more stuff, but nothing else was relevant to what I intended to do with the findings. And for the record, the survey was ten questions long (not “four or five”), so it must have been really well written to make it feel even shorter :o)