A recent rainy day offered us a good reminder that question order can make a big difference in how people respond to a survey. In fact, a classic psychology experiment explored the impact that the weather has on how people answer survey questions.
When we consider how many factors influence the data we collect from our surveys, it becomes clear that we should take every reasonable step to keep our data clean.
Respondents are affected not only by the order in which they read survey questions, but also by the order of the available answers and the type of scales we use. Logic and randomization can help us combat these data pollutants.
Common Types of Survey Question Bias
Question order bias: questions that come early in a survey can influence how people answer questions later in the survey. For example, if we ask several specific satisfaction questions and then a more general satisfaction question, answers to the general question are likely to be skewed.
Assimilation effect: When this occurs, the response to a later question is more similar to the responses to earlier questions than it would have been if it had preceded those questions or been asked on its own.
When someone indicates a high level of satisfaction with a product in an early question, their subsequent responses will probably show a high level of satisfaction too.
Contrast effect: In this phenomenon, the response to a later survey question diverges further from the response to an earlier question than it would have if it had preceded that question or been asked on its own.
A respondent may feel the need to balance out an extremely positive response to an early question with a very negative answer later in the survey.
Response bias: In self-administered surveys, respondents tend to prefer the first few options in a list (a primacy effect). On phone or in-person surveys, options that come later in the list tend to be selected more often (a recency effect).
Acquiescence bias (yea-saying): In general, this bias makes people more likely to agree with a statement when given the option to agree or disagree.
Demand characteristics: This is a tricky one to combat. When people know they are part of a survey, they may try to figure out its purpose and answer in a way they think fits that purpose.
Desirability bias: Similarly, respondents tend to answer survey questions in ways that present desirable behaviors and characteristics while denying undesirable ones.
Extreme responding: This type of response pattern refers to people’s tendency to pick the most extreme options in a scale question. Some cultures, however, exhibit the opposite behavior, preferring to select the neutral option on a scale whenever it’s available.
Randomizing Survey Questions and Answers
There are several ways to employ randomization to combat these various forms of survey question bias, from the order in which you show your questions and pages all the way down to the answer options themselves, as the short sketch after this list illustrates.
Random page order within a survey: If your survey spans multiple pages, you can adjust the order that they are presented to respondents. This works particularly well if your survey has distinct sections that lend themselves to different orders.
Random survey question order: This is probably the most commonly used type of randomization; it simply displays the survey questions in a random order. While useful, it can be complicated when combined with logic.
Random answer options: As we saw, response choice order has a big impact on the way people answer survey questions. If you can show your answer options in a random order, you’ll go a long way towards eliminating this type of bias in your survey data.
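The exact mechanics depend on your survey platform, but if you were scripting a survey yourself, all three levels of randomization might look something like the minimal Python sketch below. The survey structure, field names, and the `randomize_options` flag are hypothetical, and ordered rating scales are deliberately left unshuffled since their option order carries meaning.

```python
import random

# Hypothetical survey definition: pages, questions, and answer options.
# Structure and names are illustrative, not tied to any specific survey tool.
survey_pages = [
    {
        "page": "Product satisfaction",
        "questions": [
            {"id": "q1", "text": "How satisfied are you with the product overall?",
             "options": ["Very satisfied", "Satisfied", "Neutral", "Dissatisfied", "Very dissatisfied"],
             "randomize_options": False},  # ordered scale: keep option order intact
            {"id": "q2", "text": "Which feature do you use most often?",
             "options": ["Reporting", "Dashboards", "Exports", "Integrations"],
             "randomize_options": True},   # unordered list: safe to shuffle
        ],
    },
    {
        "page": "Support experience",
        "questions": [
            {"id": "q3", "text": "Which support channel did you use?",
             "options": ["Email", "Phone", "Chat", "Knowledge base"],
             "randomize_options": True},
        ],
    },
]

def randomized_presentation(pages, seed=None):
    """Build a per-respondent presentation order: random page order,
    random question order within each page, and shuffled answer options
    only where shuffling makes sense."""
    rng = random.Random(seed)  # seed per respondent so the order is reproducible
    presentation = []
    for page in rng.sample(pages, len(pages)):                      # random page order
        rendered = []
        for q in rng.sample(page["questions"], len(page["questions"])):  # random question order
            options = (rng.sample(q["options"], len(q["options"]))
                       if q["randomize_options"] else list(q["options"]))
            rendered.append({"id": q["id"], "text": q["text"], "options": options})
        presentation.append({"page": page["page"], "questions": rendered})
    return presentation

# Example: build a unique presentation order for respondent 42
for page in randomized_presentation(survey_pages, seed=42):
    print(page["page"])
    for q in page["questions"]:
        print(" ", q["text"], q["options"])
```

Seeding the generator per respondent is one way to keep each person's order stable if they leave the survey and return, while still varying the order across respondents.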
Historical Experiments Showing Issues With Survey Question Order
Back in 1983, Norbert Schwarz and Gerald Clore published a study examining how mood relates to more global evaluations of general well-being.
Essentially these researchers wanted to know if being in a good mood leads people to report that their life, overall, is going great (or conversely, if being in a bad mood leads people to report that their life, overall, actually isn’t so hot).
The experiment was conducted in the spring on either rainy or sunny days. A researcher in Chicago called participants randomly selected from the University of Illinois at Urbana-Champaign phone directory and asked the following questions:
1. First, on a scale of 1 to 10, with 10 being the happiest, how happy do you feel about your life as a whole?
2. Thinking of how your life is going now, how much would you like to change your life from what it is now? This is also on a scale of 1 to 10. Ten means “change a very great deal” and one means “not at all.”
3. All things considered, how satisfied or dissatisfied are you with your life as a whole these days? (with number 10 being the most satisfied).
4. And, how happy do you feel at this moment? Again, 10 is the happiest.
That’s all the questions I have. Thank you for your time and cooperation.
For half of the participants, these survey questions constituted the entire interview.
For the other half of the participants, the researcher asked “By the way, how’s the weather down there?” before asking the four interview questions.
Overall, participants in the study reported feeling better on sunny days and worse on rainy days (no surprise there).
What is surprising, however, is that when the researcher asked only the interview questions (without mentioning the weather), participants rated their life overall in a manner consistent with the current weather conditions.
On sunny days they reported being happier, were less interested in making life changes, and said they were generally satisfied with their life. On rainy days participants reported being less happy, were more interested in making life changes, and said they were generally less satisfied with their life.
Interestingly, when the researcher began the interview by mentioning the weather, participants reported being happy, were not particularly interested in making life changes, and said they were satisfied with their life overall regardless of whether it was rainy or sunny.
These participants answered questions similarly to the sunny day respondents in the “no weather prime” condition.
So, attending to the fact that it’s raining outside helps people separate their bad mood (short-term affect) from more global evaluations of their life overall. These kinds of unintended carry-over effects are important to keep in mind when designing surveys.
References:
Schwarz, N., & Clore, G. L. (1983). Mood, misattribution, and judgments of well-being: Informative and directive functions of affective states. Journal of Personality and Social Psychology, 45(3), 513–523.