
On the Importance of Importance: An Examination of Weighting Evaluation Ratings with Importance Ratings.




Abstract:

Many authors have suggested that in making decisions about objects, people combine their evaluations of an object's facets with the perceived importance of those facets. Others have suggested that evaluations themselves carry importance information through the extremity of the ratings. We sought to test these positions by examining 6 different impact measures (2 of which were importance scales).
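To make the contrast concrete (notation ours, not the authors'): for an object with k facets, facet evaluations e_i, and perceived importances w_i, the weighting view models the overall attitude as an importance-weighted sum, whereas the extremity view holds that the unweighted evaluations already carry that information, since more important facets tend to be rated more extremely. In LaTeX notation:

    A_{\text{weighted}} = \sum_{i=1}^{k} w_i \, e_i
    \qquad \text{versus} \qquad
    A_{\text{unweighted}} = \sum_{i=1}^{k} e_i

In this study, k = 6 facets per product.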

Method

4,934 respondents (2,585 male, 2,349 female) participated, randomly drawn from the Harris Poll Online panel. In an online survey, each rated two concepts for a new fictitious product. Each product had 6 facets, rated on 2 evaluative scales (good-bad; like-dislike). In addition, each respondent was assigned one of 6 facet impact scales (5- or 6-category importance scales; 5- or 6-category influence scales; 5- or 7-category change-in-likelihood-to-buy scales). Four overall criteria were assessed for each product (overall evaluation, liking, purchase intent, recommendation).

Results

We performed a series of regressions examining the predictive utility of the evaluative scales for the 4 criteria. All models were significant. The models using the impact measures alone were also significant, although they generally explained much less variance (the change-in-likelihood-to-buy scales came closest to the results for the evaluative scales). We then combined the facet impact information with the facet evaluation scales and regressed the criteria on these combined variables. We found no significant gain in variance accounted for, despite trying a variety of combination techniques.
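As a rough illustration of this kind of comparison (a minimal sketch on simulated data, not the authors' code, scales, or results), the snippet below contrasts the adjusted R-squared from regressing a criterion on facet evaluations alone with that from regressing it on multiplicatively importance-weighted evaluations, one of several possible combination rules:

    # Minimal sketch: compare predictive utility of facet evaluations alone
    # versus importance-weighted evaluations (simulated data, hypothetical scales).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n, k = 500, 6                                           # hypothetical: 500 respondents, 6 facets

    evals = rng.integers(1, 8, size=(n, k)).astype(float)   # facet evaluations, e.g. 7-point good-bad
    imps = rng.integers(1, 6, size=(n, k)).astype(float)    # facet importance, e.g. 5-category scale
    criterion = evals.mean(axis=1) + rng.normal(0, 1, n)    # stand-in for an overall criterion

    def adj_r2(y, X):
        # OLS with an intercept; return adjusted R-squared.
        return sm.OLS(y, sm.add_constant(X)).fit().rsquared_adj

    r2_evals = adj_r2(criterion, evals)                     # evaluative scales alone
    r2_weighted = adj_r2(criterion, evals * imps)           # evaluation x importance combination
    print(f"evaluations only: {r2_evals:.3f}   importance-weighted: {r2_weighted:.3f}")

Other combination rules (e.g., additive or standardized weighting) change only how the weighted predictor matrix is built; the model comparison itself is the same.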

Discussion

In the area of new product evaluation, it appears that impact information may not enhance predictive utility beyond that obtained from evaluative scales alone. We are seeking to replicate these findings in other content areas.

Author's Keywords:

importance, survey design, predictive utility

Association:
Name: American Association for Public Opinion Research
URL: http://www.aapor.org


Citation:
URL: http://www.allacademic.com/meta/p116227_index.html

MLA Citation:

Krosnick, Jon, Randall Thomas, Ellie Powell, Rachel Lafond, and Susan Behnke. "On the Importance of Importance: An Examination of Weighting Evaluation Ratings with Importance Ratings." Paper presented at the annual meeting of the American Association for Public Opinion Research, Sheraton Music City, Nashville, TN, Aug 16, 2003 <Not Available>. 26 May 2009 <http://www.allacademic.com/meta/p116227_index.html>.

APA Citation:

Krosnick, J. A., Thomas, R. K., Powell, E., Lafond, R. C., & Behnke, S. (2003, August 16). "On the Importance of Importance: An Examination of Weighting Evaluation Ratings with Importance Ratings." Paper presented at the annual meeting of the American Association for Public Opinion Research, Sheraton Music City, Nashville, TN <Not Available>. Retrieved 2009-05-26 from http://www.allacademic.com/meta/p116227_index.html

Publication Type: Conference Paper/Unpublished Manuscript
Review Method: Peer Reviewed

Similar Titles:
Examining the Importance of Organizational Structure On Voluntary Medical Error Reporting Over Time

The Rural Conundrum: Examining E-Rate Implementation in Rural America

The Importance of Allowing for Diversity of Opinion in the Examination of Third-Person Perceptions

Video Game Ratings Accuracy: Evaluating the Entertainment Software Review Board (ESRB) video-game rating system


 