Explaining Information Effects in Collective Preferences
sometimes draw attention to the potential for slight changes in the wording of questions to overdetermine responses (e.g., Schuman and Presser 1981), to race-of-interviewer effects (e.g., white respondents are more supportive of affirmative action policies when the questions are posed by African-American rather than white interviewers; e.g., Schuman and Converse 1971; Singer and Presser 1989), and to the differences obtained by open-ended questions that ask respondents to provide answers in their own words versus forced-choice questions that require respondents to pick an opinion from a small list of alternatives (e.g., Schuman and Scott 1987; Singer and Schuman 1988).

The survey research community has long been attentive to the possibility that survey results are conditioned in ways subtle and gross by the methods of sampling and asking questions. A sizeable literature has developed over the past sixty years to address concerns about the effects of question wording, order, and mode of presentation. While earlier research aimed to provide practical guidance to pollsters by describing common question design and response problems (e.g., Payne 1951; Rugg and Cantril 1944), more recent work has offered new insights on these problems by clarifying the psychological processes giving rise to survey response effects (e.g., Schwarz and Sudman 1996; Tanur 1992; Tourangeau, Rips, and Rasinski 2000; Zaller 1992a; Zaller and Feldman 1992). An important finding from this work that bears on the present discussion is that ill-informed people can be more prone than others to a wide range of response effects (Narayan and Krosnick 1996; Krosnick 1991; Krosnick and Alwin 1987; Schuman and Presser 1981; although see Knäuper 1999).
Because response effects introduce systematic errors into surveyed opinions, and because these errors are especially likely to bias the opinions of the ill informed, response effects should contribute to the magnitude of information effects in collective preferences.[1] Variations in the size of information effects across question topics thus could be produced merely by the presence or absence

[1] This expectation assumes that response effects among actual survey respondents would be reduced to an unknown extent in a simulated world of "fully informed" respondents. I assume not that respondents with high levels of political knowledge invariably engage in purely "optimal" systematic processing of the survey instrument, but merely that imputing higher levels of ability to all respondents should mitigate response effects brought about by heavy reliance on heuristic processing. It follows that variance in the size of information effects revealed by the simulations should be related to the amount of "error" in individual opinions produced by response effects in the survey data.
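The simulation logic described above — holding demographics fixed while imputing the maximum level of political knowledge to every respondent, then comparing the resulting collective preference to the observed one — can be sketched as follows. This is a minimal illustration, not the paper's actual estimation procedure: the response model, its coefficients, and the knowledge and education scales below are all hypothetical stand-ins; in practice the model would be estimated from survey data.

```python
# Hypothetical sketch of a "fully informed" collective-preference
# simulation. All coefficients and data here are illustrative only;
# in practice a response model would be estimated from real surveys.
import math

def p_support(knowledge, educ):
    """Probability of a 'support' response from an assumed logistic
    model with a knowledge x education interaction (knowledge on a
    0-10 scale, education in years). Coefficients are made up."""
    z = -3.0 + 0.15 * knowledge + 0.10 * educ + 0.01 * knowledge * educ
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical respondents: (knowledge score, years of education).
respondents = [(2, 10), (3, 12), (5, 12), (7, 16), (9, 18)]

# Observed collective preference: average support at each
# respondent's actual knowledge level.
observed = sum(p_support(k, e) for k, e in respondents) / len(respondents)

# Simulated "fully informed" preference: same respondents and
# demographics, but knowledge imputed to the scale maximum (10).
informed = sum(p_support(10, e) for _, e in respondents) / len(respondents)

# The information effect is the gap between the two collective preferences.
info_effect = informed - observed
print(f"observed={observed:.3f} informed={informed:.3f} effect={info_effect:.3f}")
```

Note that the sketch leaves the demographic terms untouched, matching the footnote's assumption: only ability is imputed, so any remaining gap reflects how knowledge conditions responses, not who the respondents are.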

Author: Scott Althaus.
Page 5 of 53






