Making Assessment Matter: Changing Cultures, Improving Teaching, and Transforming Departments

Authors: Michelle Deardorff and Paul Folger
Page 3 of 18



department chair onto his colleagues or from a higher administrator onto a department and be successful. Unfortunately, the assessment literature in its discussion of departmental assessment frequently ignores the quandary of the department resistant to assessment and instead focuses on idealized implementation processes. This idealized model, referred to in this chapter as “Structural Implementation” or “Mission-Based” assessment, requires the department to identify key learning objectives that are reinforced through the curriculum and systematically evaluated. This approach is predicated upon departmental and university support (American Association for Higher Education 1992; Astin 1993; Banta et al. 1996; Palomba and Banta 1999).
A recent survey of political science departmental chairs found that over 50% of the responding undergraduate departments and 45% of responding departments with graduate offerings engage in some form of assessment (Kelly and Klunk 2003). However, it was also clear from this study that “the development of learning assessment strategies by departments does not seem to follow the ‘ideal type’ learning assessment models” (455). In fact, many of the respondents to the survey indicated that they implemented assessment instruments without adopting specific measurable learning objectives.
As anyone who has ever witnessed formal or informal faculty discussions of assessment plans realizes, one of the dominant critiques of assessment is that ideal circumstances frequently do not exist. Department chairs are often charged with producing assessment plans and measurable outcomes without collegial support or the resources necessary to garner such support; little has been written on leveraging departmental support for common assessment goals (Deardorff 2007). Regardless of the assessment environment in which a department operates, the only way a program will gain faculty support for assessment is by demonstrating how assessment provides the tools that enable us to better practice our profession.
The assessment literature clearly demonstrates that there is no single way to create a compelling assessment plan. As the chapter in this volume by John Ishiyama demonstrates, political science departments vary widely in the learning outcomes they identify and in the assessment techniques they use. Factors such as selectivity in admissions, the existence of graduate programs, and the culture of the institution all impact the type of assessment engaged in by the department.
Consequently, the process and structure of assessment must reflect the needs and idiosyncrasies of both institutional and departmental communities. Consideration of such characteristics as the percentage of tenured versus non-tenured faculty members in the department, the balance between

