Designing and Using Program Evaluation as a Tool for Reform

Abstract:

Patton (1996) argues that evaluation should be broadly conceptualized as a knowledge-generating activity. We believe evaluation can shape our understanding of every aspect of a program and its participants. The Pseudonym University Educational Leadership Team and its P-12 practitioner, community, and business-leader partners collaboratively developed a comprehensive, ongoing evaluation plan to assess a newly implemented field-based graduate program, the success of its participants, and faculty effectiveness. Our evaluation began during the conceptualization and design process and continues through all major program processes, including admissions, orientation, and professional learning opportunities for field-based coaches and current school leaders. We designed opportunities for ongoing evaluation of program and faculty effectiveness, student engagement, and outcomes. For example, through coursework, student cohorts are tasked with authentic group think-tank projects through which content knowledge and skill acquisition, application of learning, cognitive complexity, and professional dispositions are assessed by faculty, P-12 leaders, peers, and the students themselves. These projects culminate in products presented to stakeholders, thereby acting as a high-stakes outcome evaluation. Our evaluation plan is complex and comprehensive, utilizing data produced through the normal work of the program along with data collected through specially designed instruments and protocols. We draw on the expertise of partners and stakeholders, field-based coaches, and the insights of students to help us evaluate individual courses, field internships, our graduate participants, faculty, and program implementation, and we utilize multiple data sources such as questionnaires, examinations, observations, presentations, portfolios, reflections, and checklists based on governing standards and best-practice research (Orr, 2006).
As Patton advocates, our evaluation system is designed to integrate rich and varied sources of information, generating knowledge useful for informing, defining, reforming, and transforming (Reed, Kochan, Ross, & Kunkel, 2001) our program, participants, faculty, and stakeholders. In order to sustain a high-quality program, evaluation is part of our program’s culture (Sanders, 2002), allowing us to engage in continuous improvement.

Association:
Name: UCEA Annual Convention
URL: http://www.ucea.org


Citation:
URL: http://citation.allacademic.com/meta/p274983_index.html

MLA Citation:

Ross, Margaret. "Designing and Using Program Evaluation as a Tool for Reform." Paper presented at the annual meeting of the UCEA Annual Convention, Buena Vista Palace Hotel and Spa, Orlando, Florida, <Not Available>. 13 Dec. 2013 <http://citation.allacademic.com/meta/p274983_index.html>.

APA Citation:

Ross, M. E. "Designing and Using Program Evaluation as a Tool for Reform." Paper presented at the annual meeting of the UCEA Annual Convention, Buena Vista Palace Hotel and Spa, Orlando, Florida, <Not Available>. Retrieved December 13, 2013, from http://citation.allacademic.com/meta/p274983_index.html

Publication Type: Symposium Paper

Get this Document:

The citation or full document may be available, free or for a fee, from the sources below:

All Academic Inc. (associated document available; access fee may apply)
UCEA Annual Convention (associated document available; access fee may apply)


Similar Titles:
Political Economy of IMF Program Design: Why do Some IMF Programs Require More Reforms than Others?

Designing, implementing, and evaluating programs focused on country-led reforms

Developing effective tools for program evaluation: Experience from the evaluation of the USAID National Book Program in Egypt

