Developing a Model of Trainer Evaluation
Leslie W. Zeitler, LCSW
May 2010: 13th Annual National Human Services Training Evaluation Symposium
"Problem/Brainstorm Session"

- Not a presentation
- An invitation to contribute to thinking on this topic…
Context

Child Welfare Training Evaluation in CA:
- Oversight by Macro Evaluation Team (subcommittee of Statewide Training & Education Committee)
- Began conceptualization in 2002
- Began actual evaluations in 2005
- 1st Strategic Plan (through 2009): focused on child welfare worker and supervisor knowledge/skill evaluations
Data Currently Collected in CA:

Under the 1st strategic plan (2004-2009) and continuing through the 2nd strategic plan (2009-2012):
- Trainee demographics
- Trainee satisfaction (at the regional/county level, but not statewide)
- Trainee knowledge (acquisition of knowledge)
- Trainee skill (application of knowledge in the classroom via embedded evaluations)
Current (2nd) Strategic Plan for CW Training Evaluation in CA:

In the 2nd strategic plan, starting to look at additional factors that affect trainees and training:
- Attitudes, values
- Stereotype threat pilot
- Item analysis by trainer
- Development of a model of trainer evaluation
- Etc.
Models of Trainer Evaluation?

- Do you have a systematic method or model for evaluating trainers?
- If so, is it a model based in theory? If so, which model?
Factors for Consideration:

Need to identify:
- Purpose(s) of trainer evaluation
- Elements/dimensions for evaluation
- Methods of evaluation
- Measurement format
- Uses of data
Purpose

Purpose(s) of evaluating trainers:
- Improve quality of training
- Standardize training delivery
- Establish basis for trainer fees (on a statewide level)
- As part of ongoing professional development for trainers
- For personnel purposes (promotion, probation, firing)
I.D. Elements/Dimensions for Evaluation:

Dimensions of trainer performance:
- From the 2001 APHSA/NSDTA Trainer Competency Model?
- From Institute for Human Services competencies?
- From other (competency) models?
Identify Methods of Evaluation:

- Observation? By whom? Peers? Supervisors?
- Self-reflection?
- Trainee feedback?
- Aggregate data from trainees' tests?
- What would be the advantages and disadvantages of each method of evaluation?
Identify Measurement Format:

- Likert scale? Anchors? (How would we go about anchoring any scale?)
- What tools have other training evaluators used?
- Sample tools:
  - Kentucky (Debbie Dever)
  - Indiana (Evoke Communications/Quay Kester)
Identify Uses of Collected Data:

- How often are trainer evaluations done?
- Who gets to see the trainer evaluation data?
- What decisions could result from ongoing collection and analysis of trainer evaluation data?
  - High stakes? Medium stakes? Low stakes?
- What analyses could be done?
  - In aggregate, to see improvement over time?
  - Etc.
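As a rough illustration of the "in aggregate, over time" idea above, the sketch below averages trainee ratings per trainer per quarter. All names, values, and the 5-point Likert format are hypothetical, invented for the example; they are not from the presentation.

```python
from statistics import mean

# Hypothetical trainee ratings on a 5-point Likert scale.
# Each record: (trainer, quarter, rating) -- illustrative only.
ratings = [
    ("Trainer A", "2009-Q1", 3), ("Trainer A", "2009-Q1", 4),
    ("Trainer A", "2009-Q2", 4), ("Trainer A", "2009-Q2", 5),
    ("Trainer B", "2009-Q1", 2), ("Trainer B", "2009-Q2", 3),
]

def aggregate(records):
    """Mean rating per (trainer, quarter), to track change over time."""
    groups = {}
    for trainer, quarter, score in records:
        groups.setdefault((trainer, quarter), []).append(score)
    return {key: mean(scores) for key, scores in groups.items()}

for (trainer, quarter), avg in sorted(aggregate(ratings).items()):
    print(f"{trainer} {quarter}: {avg:.2f}")
```

The same grouping could be keyed by curriculum item instead of quarter to support the "item analysis by trainer" idea from the 2nd strategic plan.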
Anything Else?

- What else, if anything, needs to be considered when designing a trainer evaluation model?
- Involvement of trainers in the design process?
- Etc.
Additional Thoughts?

Contact Leslie W. Zeitler at: [email protected]
California Social Work Education Center