Measuring Principals’ Decision-Making Knowledge and Skills through Cases

Abstract:

Cases have been heralded as effective tools to teach prospective administrators how to interpret and diagnose problems and to help assess their critical thinking skills (Sykes & Bird, 1992). To do so, however, it is key that cases be designed to elicit specific evidence of the knowledge and skills in question (Mislevy, Steinberg, Almond, & Lukas, 2006).
An assessment’s construct validity derives from evidence that its content represents key knowledge and skills, that it elicits responses from respondents demonstrating that knowledge, and that it predicts scores on other measures of the same knowledge and skills (Cronbach, 1988). Cognitive task analyses can be used to elicit such evidence about assessments by focusing on key aspects of responses: actions, prerequisite knowledge, descriptions, if-then decision rules, and thinking (Jonassen, Tessmer, & Hannum, 1999).
We present our process and findings from examining the construct validity of newly designed instructional cases that assess administrators’ decision-making expertise. First, we reviewed key research and theory to identify the decision-making knowledge and skills around which to design the cases. Second, we conducted a cognitive task analysis session with six expert in-service principals (four elementary, one middle, and one high school). The experts first wrote summaries of their individual processes for making larger school-wide decisions. We then debriefed these processes in a focus group, after which each expert completed a leadership case and took notes explaining how the case required them to use their decision-making process. Finally, in a follow-up focus group the experts compared the instructional case’s decision-making steps to the individual process they had initially summarized.
Experts’ responses provided confirmatory evidence of the construct validity of the cases for assessing decision-making skills and knowledge. Their individual summaries matched the four-step decision-making model inherent in the instructional cases, even though they carried the steps out in an iterative rather than a step-wise fashion. The experts also placed greater emphasis on communicating the decision to others, and on the need to do so multiple times. Finally, the experts validated the last step of the cases’ decision-making model, reflecting upon the decision and its effectiveness. Experts’ broader reflections on the experience of completing a case showed that participants felt it was an accurate simulation of their on-the-job decision-making.
We discuss the results of this cognitive task analysis of expert decision makers and the implications for refining cases and other learning materials that give novices rigorous practice in making decisions about the broader school issues they will face on the job.

Association:
Name: UCEA Annual Convention
URL: http://www.ucea.org


Citation:
URL: http://citation.allacademic.com/meta/p378358_index.html

MLA Citation:

Dexter, Sara, and Pamela Tucker. "Measuring Principals’ Decision-Making Knowledge and Skills through Cases." Paper presented at the annual meeting of the UCEA Annual Convention, Anaheim Marriott, Anaheim, California, <Not Available>. 2014-11-28 <http://citation.allacademic.com/meta/p378358_index.html>

APA Citation:

Dexter, S. L., & Tucker, P. D. "Measuring Principals’ Decision-Making Knowledge and Skills through Cases." Paper presented at the annual meeting of the UCEA Annual Convention, Anaheim Marriott, Anaheim, California, <Not Available>. 2014-11-28 from http://citation.allacademic.com/meta/p378358_index.html

Publication Type: Symposium Paper


Similar Titles:
Measuring the Group Decision Making Process in a Simulated Crisis Decision-Making Environment

How did the teacher practical knowledge impact the teaching decision-making: A case study based on the novice and expert teacher

Knowledge Limits in Sustainability Decision-making? The Case of Synthetic Biology


 