Framing Public Discussion of Gay Civil Unions

Authors: Vincent Price, Lilach Nir, and Joseph Cappella
Page 34 of 38



Framing Public Discussion
34
The Discussion Events
Most monthly discussion “events” consisted of three parts: a pre-discussion survey,
online discussion, and a follow-up post-discussion survey. Participants in the main discussion
panel (N = 915) were asked to do all three parts, whereas those in the control panel (N = 139)
completed only the survey portions.
Participants logged on to their “discussion rooms” at pre-arranged times, using their Web
TV devices, television sets, and infrared keyboards. The full TV screen was used. Participants
typed their comments and, when they hit the “enter” key on their keyboards, would post these
comments to all other group members present in the room. All discussions were moderated by
project assistants working out of the Annenberg Public Policy Center at the University of
Pennsylvania, and were carefully coordinated and scripted to maintain consistency across groups.
Prompts and questions were “dropped” by moderators into the discussions at pre-arranged times.
The full text of all discussions, including time-stamps for each comment, was automatically
recorded. Discussions were lively and engaging, and participants contributed on average
between 200 and 300 words per event.
The first event, with discussions held in mid-April, focused on getting acquainted and
identifying issues of main concern to participants. The second, held in mid-May, focused on
educational issues, and the third event, in mid-June, dealt with issues of crime and public safety.
The fourth, held at the end of July and in early August, centered on participants’ views of
campaigning. The main campaign season involved three further discussions. Right after Labor
Day in September, groups viewed and then talked about advertisements from each campaign.
Following the first presidential and vice-presidential debates in October, groups discussed the
candidates’ stands on health care and taxes, and how effective they thought each campaign had
been to that point. In the week prior to the election, groups talked about a variety of other issues
that had surfaced during the campaign. With the election results still in doubt, groups met again
in early December to discuss the electoral process, how each candidate and the press were
handling the disputes over the election, and the role of the Electoral College.
Given the pre-discussion and post-discussion surveys every month, the project amounted
to a 28-wave panel study for the discussion group, and a 19-wave panel study for the survey-only
control group. Given this extraordinary level of burden, it is not surprising that cooperation rates
were far from perfect. However, the majority of study participants did complete most surveys.
Survey cooperation rates were generally similar for both the discussion and control groups,
hovering at around 70 percent early in the project and declining over time to about
60 percent at the project’s end.
By far the most demanding elements of the project were the online discussions
themselves. Participation in these discussions began at about 40 percent and declined to
roughly 30 percent toward the end, producing groups that averaged
between 5 and 6 participants each. There was a fair degree of turnover in attendance from one
event to the next. By the end of the eighth event in December, over 70 percent of the discussion
group (663 respondents) had attended at least one of the online discussions, and roughly 40
percent (or 350) had attended half or more of the events.
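The attendance percentages reported above can be verified directly from the panel size and attendance counts given in the text; the following minimal Python check uses only those reported figures (no new data):

```python
# Arithmetic check of the attendance figures reported in the text.
# All counts are taken directly from the passage above.

DISCUSSION_PANEL = 915   # main discussion panel (N = 915)
ATTENDED_ANY = 663       # attended at least one online discussion
ATTENDED_HALF = 350      # attended half or more of the eight events

share_any = ATTENDED_ANY / DISCUSSION_PANEL    # about 0.725, i.e. "over 70 percent"
share_half = ATTENDED_HALF / DISCUSSION_PANEL  # about 0.383, i.e. "roughly 40 percent"

print(f"Attended at least one event: {share_any:.1%}")
print(f"Attended half or more:       {share_half:.1%}")
```

Both shares match the rounded figures in the text (72.5 percent and 38.3 percent of the 915-person discussion panel).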

