Page 78 - International Journal of Process Education (Special Issue)
description), though some are analytical (performance characteristics presented individually for more specific, fine-grained measurement). Eight of the most useful and polished performance measures are published in holistic form in the Book of Measures (Pacific Crest, 2013): professionalism, self-assessing, learning, problem solving, critical thinking, teaming, quantitative reasoning, and risk-taking. Pacific Crest shared a methodology for creating performance measures in the Handbook for Designing and Implementing Performance Measures (Apple & Krumsieg, 2002; see Figure 2).

The Faculty Guidebook (2007) includes modules that share increasing expertise on performance measures:

Fundamentals of Rubrics (Bargainnier, 2007)

Performance Levels for Learners and Self-Growers (Myrvaagnes, 2007)

Identifying Performance Measures for a Program (Parmley & Apple, 2007)

Performance Levels for Assessors (Jensen, 2007)

Overview of Measurement (Burke & Bargainnier, 2007)

Constructing a Table of Measures (Racine, 2007)

Measuring Quality in Design (Cordon, Beyerlein, & Davis, 2007)

In addition to modules explicitly focused on measuring performance, other aspects of the scholarship in the Faculty Guidebook support the creation and application of performance measures. The module Theory of Performance (Elger, 2007) offers a comprehensive model of performance, making it possible to analyze and appreciate that the measured level of performance is the result of the interaction of the components of that performance. The
Figure 2 Methodology for Creating Performance Measures

1. Build a team. Include at least ten individuals from a minimum of seven disciplines, including the sciences, applied sciences, social sciences, humanities, professional schools, and performing arts.

2. Identify a facilitator. The facilitator must be neutral with respect to any discipline-specific bias.

3. Divide into work teams. Each team should include three or four persons from varying disciplines.

4. Write a descriptive definition of the skill. Each team writes a two- or three-sentence description of the specific skill for which the measure is being developed.

5. Synthesize into a descriptive paragraph. The facilitator leads a session using the sentences from the previous step to create a paragraph that accurately and completely describes the learning skill being measured.

6. Identify a skill expert. Identify a person who possesses an "expert" level of proficiency with the specific skill. Let the behaviors of this expert serve as a model.

7. Brainstorm factors. Brainstorm factors that account for variability in the performance with respect to the specific learning skill.

8. Produce a top 10 list. Reduce the list of brainstormed factors from the previous step to a new list containing the top ten factors in rank order of importance. Match or pair up the top ten items.

9. Identify five qualitative labels. The labels you choose should correspond to performance levels ranging from "novice" to "expert."

10. Develop five statements that clarify behaviors. These statements identify behaviors associated with the different performance levels. Use the criteria and factors identified for Level 5 (expert) first, followed by Level 1 (novice), then Level 3, Level 2, and lastly Level 4.

11. Write parallel statements. Write parallel statements for each of the five levels of performance. Modify statements according to the appropriateness of behavior for that level.

12. Test the classification levels. Test by assessing the performance of people at each level in different contexts. Use several assessors to improve quality and help determine which behaviors can be defined in a better way.
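The five-level structure produced by steps 9 through 11 can be sketched as a simple data structure: one skill name, one descriptive paragraph, and five qualitative labels each paired with a parallel behavior statement. This is a minimal illustration only; the skill, labels, and statements below are hypothetical placeholders and are not drawn from any published measure.

```python
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    """A holistic performance measure: a descriptive paragraph plus
    five leveled behavior statements ranging from novice to expert."""
    skill: str
    description: str
    # level number -> (qualitative label, parallel behavior statement)
    levels: dict[int, tuple[str, str]]

    def label(self, level: int) -> str:
        """Return the qualitative label for a given level (1-5)."""
        return self.levels[level][0]

# Hypothetical example; labels and statements are illustrative only.
teaming = PerformanceMeasure(
    skill="Teaming",
    description="Works with others toward a shared goal.",
    levels={
        1: ("Novice", "Participates only when directed."),
        2: ("Apprentice", "Contributes when roles are assigned."),
        3: ("Practitioner", "Coordinates tasks with teammates."),
        4: ("Proficient", "Facilitates the team's shared process."),
        5: ("Expert", "Builds team capacity across contexts."),
    },
)

print(teaming.label(5))  # Expert
```

Writing the Level 5 and Level 1 statements first, as step 10 prescribes, anchors the two ends of the scale before the intermediate statements are made parallel.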
76 International Journal of Process Education (February 2016, Volume 8 Issue 1)