I don’t know how many opinion researchers are themselves members of online survey panels. I’m a member of a couple, mainly because I like to see what other researchers are doing in terms of survey and questionnaire design; to keep up with the current state of the art. Of course, that means seeing what not to do, as well as what I should adapt to my own surveys.
I read lots of blogs from market researchers describing problems with questionnaire design (things like poor scales, biased questions, etc.), but I rarely see anyone mention my biggest pet peeve with surveys. This problem makes it difficult to do that most basic thing, the thing that I try to do with every survey I take: answer the questions as honestly as I can. It’s perhaps easiest to illustrate by way of example.
Recently, I was completing a survey in which the screening questions were related to car ownership. The survey asked if I owned a car (I do) and then asked whether I purchased or leased the vehicle (the former). The next question asked whether I planned to purchase a vehicle in the next 3 years. Again, the answer was yes. The survey then went on to ask a series of questions about what I was looking for in a car dealer. The problem? I do not intend to buy my next vehicle from a dealer. When my current car is paid off (in about a year), I plan to drive it for another year or so, and then sell it and buy a used car for cash. I have no plans to go to a dealer at all. The designer of the questionnaire I was completing clearly never considered this possibility. The questions assumed I would be shopping for a new vehicle at a dealership, and there was no way to skip the dealership-related questions.
Another example: I am a member of a survey panel for a gas station chain. I purchase gas from this chain occasionally, but always by paying at the pump. On a couple of occasions when pump payment was not working, I went inside the store to pay, but I never buy (or even look at) anything inside the store; I just pay for my gas and go. I have completed a number of surveys for this chain asking about my preferences in terms of what the stores should sell: should they sell sandwiches (and what type?), coffee (at what price?), and so on. Again, I never have the opportunity to say that I don’t shop at the convenience store and have no interest in what they are selling, so I either have to answer every question “don’t know” or answer as though I were a shopper. But, if I do the latter, are the answers I give meaningful?
I believe that a lot of what drives this type of questionnaire design is an effort to broaden the response base for the key questions, to ensure they have as many respondents as possible and thus reduce the margin of error for those questions. The problem is that many of the respondents they end up with are people like me: very occasional users, or non-users, of the products and services they are asking about. To take another example I have experienced first-hand: if I visited a fast-food restaurant once, seven months ago, because I was out on my motorcycle and needed a (aptly named, in my case) “butt break”, does that mean I should be answering a long series of questions about what I want to see in fast-food restaurants, just because I screened in by virtue of having visited a fast-food restaurant in the past year?
It’s not so bad if the survey offers an “other” write-in option where you can describe your situation, so the researcher can decide whether they still want to keep you. More often, though, you simply have to slog through question after question that offers no response choice reflecting your actual situation.
One thing I’ve learned from this is to always include an open-ended “comment” question at the end of my surveys so that, if I have inadvertently made a similar error, the respondent can tell me that the questions I have included simply are not relevant to their particular situation.
Have you had similar experiences completing surveys? Other survey pet peeves? Let me know in the comments!