If you’ve ever taken an online academic survey, you’ve probably seen the sponsoring university’s name, seal, or logo in a banner at the top of the screen. These banners are typically included by default when researchers create a survey with their institution’s licensed survey software. For instance, when we create a new survey under Florida State University’s Qualtrics license, the FSU name and seal are automatically displayed at the top of every page:
The default view when we create a new survey using FSU’s Qualtrics license.
We wanted to know whether these banners influence the answers people provide to researchers. Existing work on surveys conducted in person or over the phone suggests that respondents often answer questions in a manner they think will be favorable to their interviewer. For instance, respondents tend to report more support for gender equality when interviewed by a woman rather than a man, more support for racial equality when interviewed by an African American rather than a white person, and stronger religious beliefs when interviewed by someone wearing religious garments rather than secular dress.
How we conducted our research
Do university logos create response biases that are analogous to those from in-person interviews? To find out, we fielded three studies, recruiting survey respondents from Amazon’s Mechanical Turk—the most popular subject pool for survey experiments. In each study, we randomly assigned respondents to one of three experimental groups that differed only in the image displayed at the top of the screen. This approach allowed us to examine whether people who viewed a banner image from a particular university responded differently from people who viewed one from a different university or from people who viewed no image at all.
Study 1: No effect on social issues or religiosity
We began with a sample of 1,646 respondents, comparing the effects of banners from Notre Dame, a prominent Catholic university; Ohio State, a nearby public university; or no image at all. If respondents tried to please survey sponsors, we would expect those who viewed the Notre Dame banner, compared to the others, to report more conservative views on abortion and birth control, or more frequent participation in religious practices. Instead, we found no meaningful differences across the three groups.
Study 2: No effect on feelings toward social groups
Aside from religious affiliation, universities also vary in their ideological reputations. For example, UC Berkeley has long held a reputation as a beacon of liberal values, whereas Liberty University consistently ranks among the most conservative campuses. Yet in a sample of 2,339 respondents, responses to questions about feelings toward liberals, conservatives, and other social groups were remarkably similar regardless of whether the respondent viewed a banner from Berkeley, a banner from Liberty, or no banner at all.
Of course, some people care more than others about what a stranger might think of them. People who care most about presenting themselves in a positive light should be most responsive to the survey banners. To identify such presentation-conscious respondents, we included a short survey battery before we revealed the banner images. Even among the most presentation-conscious respondents, we found no meaningful differences across our three treatment groups.
Study 3: No effect on effort
Universities also vary in prestige, which may influence the effort that people put into their survey responses. To examine this possibility, we recruited 2,212 respondents and compared effort across groups that viewed a banner from Harvard, a banner from Fitchburg State, or no image. Again, we found no apparent effects, whether measured by the number of “don’t know” responses on a political knowledge quiz, the time taken to complete the survey, or the length and complexity of responses to an open-ended essay question.
- No evidence that banners affect responses: We examined the effects of banners from some of the most recognizable universities, using items previously linked to various forms of response bias. If banner-induced sponsorship effects were widespread, we would expect some systematic differences between our treatment groups. One might imagine larger effects among regionally proximate respondents, alumni, or other subsets of individuals who would be particularly sensitive to such cues. Although such effects are possible, our results indicate that these groups do not make up a large enough share of typical MTurk samples to produce detectable sponsorship effects in our experiments (and, presumably, in other social science research using MTurk samples). For these reasons, we believe that the threat of bias from such banners is small.
- Researchers should still omit banners: It’s possible that some other combination of universities, survey questions, or respondents would lead to response bias. Since we cannot feasibly evaluate all of these innumerable combinations, we recommend that scholars err on the side of caution by omitting logos from their survey interfaces. Although our results suggest no harm in including them, they also reveal no benefits (such as increased effort). Since our results may not apply to all contexts, scholars can simply avoid the potential for bias by omitting institutional logos.
- Researchers should provide screenshots of their survey interface: In conducting our study, we were disappointed to learn that researchers rarely include screenshots of their survey interface in their written reports. This frequent omission prevented us from determining how often researchers include university logos in their surveys. It also connects to broader problems discussed within the field, because it reduces research transparency and inhibits replication. A further advantage of omitting institutional logos is that researchers can then include screenshots without revealing their identities during anonymous peer review.