
Expertise, Consumer-Oriented, and Program-Oriented Evaluation Approaches

John H. Curry, Ph.D.

Expertise-Oriented Approach

•The oldest type of formal evaluation

•Relies on professional expertise to judge the quality of an institution, program, product, or activity

Types of Expertise-Oriented Evaluations

•Formal review system

•Informal review system

•Ad hoc panel review

•Ad hoc individual review

Formal Review Systems

•Mostly used in accreditation

•Examine existing structure of organization

•Examine published standards

•Follow a specified schedule

•Increasing emphasis on outcomes

•Use the opinions of multiple experts

•Status of those being evaluated is affected by the results

Informal Review Systems

•Used primarily for evaluations that lack published standards or a specified review schedule

•Use multiple reviewers

•Status of those being reviewed is affected by results

•Examples: peer review of articles; thesis or dissertation committees

Ad Hoc Panel Reviews

•Occur at irregular intervals when circumstances demand

•Reviews not related to institutionalized evaluation or standards

•Usually one-shot evaluations prompted by a particular, time-bound need for evaluative information

Examples of Ad Hoc Panels

•Panels to develop standards

•Funding agency review panels

•Blue ribbon panels

Ad Hoc Individual Reviews

•Review of any entity by an individual selected for his/her expertise

•Usually to judge value or make recommendations

•Example: Employment of a consultant to review an educational, social, or commercial program

Consumer-Oriented Evaluation

•Much like expertise-oriented

•Helps inform decisions on what to purchase or trade

•Judges the quality of a product, establishing its value, merit, or worth

•The audience is broader: the purchasing public, who are generally not known to the evaluator

Consumer-Oriented, cont.

•Developed by Scriven

•Much like needs assessments

•Functional analysis of product

•IMPORTANT to:

•IDENTIFY the criteria correctly

•DEVELOP standards to judge those criteria

•COLLECT data

•SYNTHESIZE information to make a final judgment (see the sketch after this list)
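
As a concrete illustration of the last two steps, here is a minimal sketch that scores one hypothetical product against weighted criteria and synthesizes the ratings into a single judgment. The criterion names, weights, standards, and ratings are invented for illustration and are not taken from Scriven's checklist; Python is used only to keep the arithmetic explicit.

# Hedged sketch: weighted-criteria synthesis for a consumer-oriented evaluation.
# All criteria, weights, standards, and ratings below are hypothetical placeholders.

# Each criterion gets a weight (relative importance) and a standard
# (minimum acceptable rating on a 1-5 scale).
criteria = {
    "need":               {"weight": 0.30, "standard": 3},
    "field_trials":       {"weight": 0.25, "standard": 3},
    "cost_effectiveness": {"weight": 0.25, "standard": 4},
    "extended_support":   {"weight": 0.20, "standard": 2},
}

# Ratings collected for one hypothetical product (the COLLECT step).
ratings = {"need": 4, "field_trials": 3, "cost_effectiveness": 5, "extended_support": 2}

def synthesize(criteria, ratings):
    """Combine ratings into an overall weighted score and flag unmet standards."""
    overall = sum(c["weight"] * ratings[name] for name, c in criteria.items())
    unmet = [name for name, c in criteria.items() if ratings[name] < c["standard"]]
    return overall, unmet

overall, unmet = synthesize(criteria, ratings)
print(f"Weighted score: {overall:.2f} out of 5")
print("Criteria below standard:", ", ".join(unmet) if unmet else "none")

A weighted sum is only one way to synthesize; an evaluator might instead treat any unmet standard as disqualifying, which is why the sketch reports both.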

Sample Consumer-Oriented Criteria

•Need

•Market

•Performance (Field Trials)

•Performance (Consumer)

•Performance (Critical Comparisons)

•Performance (Long-Term)

•Performance (Side Effects)

•Performance (Process)

•Performance (Causation)

•Performance (Statistical Significance)

•Performance (Educational Significance)

•Cost-effectiveness

•Extended Support

Consumer-Oriented Examples

•Consumer Reports

•US News and World Report

Program-Oriented Approaches

•Focus on learning key features of the program, which then help the evaluator decide which questions the evaluation should ask

•Most common type: The objectives-oriented evaluation

Objectives-Oriented Evaluation

•Distinguishing feature: the objectives of some activity are specified, and the evaluation then determines the extent to which those objectives are achieved

Tylerian Evaluation

•Steps:

•Establish broad goals or objectives

•Classify the goals or objectives

•Define the objectives in behavioral terms

•Find situations in which achievement of objectives can be shown

•Develop or select measurement techniques

•Collect performance data

•Compare performance data with behaviorally stated objectives

•Generally discredited today as oversimplifying a complex system

Provus Discrepancy Evaluation

•Steps:

•Agree on standards (objectives)

•Determine whether a discrepancy exists between performance and standards

•Using that information, decide whether to improve, maintain, or terminate the program or some aspect of it (see the sketch below)
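
To make the discrepancy check concrete, here is a minimal sketch that compares observed performance against the agreed standards and reports where a shortfall exists. The standard names and figures are hypothetical and illustrate only the comparison step, not Provus's own instrumentation.

# Hedged sketch of the discrepancy step in the spirit of Provus's model.
# Standards and performance figures are hypothetical placeholders.

standards = {                  # agreed standards (objectives)
    "completion_rate": 0.90,   # proportion of participants completing the program
    "mastery_rate":    0.80,   # proportion meeting the learning objective
    "satisfaction":    4.0,    # mean rating on a 1-5 scale
}

performance = {                # observed program performance
    "completion_rate": 0.93,
    "mastery_rate":    0.71,
    "satisfaction":    4.2,
}

def discrepancies(standards, performance):
    """Return the gap between observed performance and each standard (negative = shortfall)."""
    return {name: performance[name] - target for name, target in standards.items()}

for name, gap in discrepancies(standards, performance).items():
    status = "meets standard" if gap >= 0 else f"discrepancy of {abs(gap):.2f}"
    print(f"{name}: {status}")

# The evaluator then uses this information to recommend improving, maintaining,
# or terminating the program, or the specific aspect that falls short.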

Theory-driven evaluation

•Program theory is “the construction of a plausible and sensible model for how a program is supposed to work.”

•Programs fail to achieve goals for two different reasons:

•The program isn’t delivered as planned and, therefore, isn’t ready to be tested (implementation failure)

•The program is delivered as planned, and the results indicate that the program theory was incorrect (theory failure)

Theory-driven evaluation, cont.

•Steps:

•Engage relevant stakeholders

•Develop a first draft of program theory

•Present the draft to stakeholders for discussion, reaction, and input

•Conduct a plausibility check

•Communicate findings to key stakeholders

•Probe arrows for model specificity

•Finalize program impact theory

Goal-Free Evaluation

•The evaluator purposely avoids becoming aware of the program goals

•Predetermined goals are not permitted to narrow the focus of the evaluation study

•Goal-free evaluation focuses on the actual outcomes rather than intended program outcomes

•The goal-free evaluator has minimal contact with the program manager and staff

•Goal-free evaluation increases the likelihood that unanticipated side effects will be noted

Reference

•Fitzpatrick, J., Sanders, J., & Worthen, B. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Upper Saddle River, N.J.: Pearson Education.
