Lecture 12: Evaluating Training Programs
TRANSCRIPT
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike License. Your use of this material constitutes acceptance of that license and the conditions of use of materials on this site.
Copyright 2006, The Johns Hopkins University and William Brieger. All rights reserved. Use of these materials permitted only in accordance with license rights granted. Materials provided “AS IS”; no representations or warranties provided. User assumes all responsibility for use, and all liability related thereto, and must independently review all materials for accuracy and efficacy. May contain materials owned by others. User is responsible for obtaining permissions for use from third parties as needed.
Evaluating Training Programs
William Brieger, MPH, CHES, DrPH
Johns Hopkins University
Section A
Overview and Definitions of Evaluation
Evaluation Defined
Evaluation is the comparison of an object of interest against a standard of acceptability
In this case, the object of interest is the trainees’ performance or achievement of the tasks spelled out in the training objectives
The standard of acceptability is also the level of achievement specified in the objectives
Evaluation is an ongoing process
Accountability
According to Management Sciences for Health, providing training to staff has many costs:
The resources involved in preparation and delivery
Travel and lodging for participants
Staff time away from the workplace
Managers Need to Know
To justify these costs, managers need to feel confident that the training will make a difference in staff performance
That staff members have:
Not only acquired new knowledge, attitudes, and skills from the training
But can, and do, put them into practice back on the job
Types of Training Evaluation
Evaluation begins with
Needs assessments
Baseline evaluations
Input evaluations
And continues with
Process evaluation
Photo: USAID, The BASICS
Types of Evaluation
Outcome evaluations (this module)
To assess new or improved knowledge, attitudes, and skills (KAS) after training
Photo: USAID, The BASICS
Types of Evaluation
Impact evaluations (next module) of:
Job performance
Organizational performance
Program performance
Demographic/health indicators
Photo: USAID, The BASICS
Outcome Evaluation: What Are We Looking For?
Evidence that trainees have acquired new knowledge, attitudes and skills (KAS)
Feedback from trainees on their perceived gains or gaps
What Else Are We Looking for?
Trainee impressions on quality of sessions
Specific comments on adequacy of all arrangements
Logistics: space, facilities, time, comfort …
Presentation: level of comprehensibility, appropriateness, challenge
Tools for Outcome Evaluation
Post-test questionnaire
Observation of trainee performance
Feedback forms
FGDs among participants
Meeting of training committee to:
Consider the results of the above and
Summarize the process evaluations
Section B
Pre- and Post-Tests
Pre- and Post-Tests
A pre-test can be planned as part of the activities during the very first training session
Questions should be based on:
Diagnostic findings and training objectives
Such tests focus mainly on knowledge
The same test is repeated at the last session to determine gains in knowledge
Types of Questions
Types of questions include
Open-ended questions
Multiple-choice questions
Attitude/opinion statements
Balance in Questions
Tests need to achieve a balance between
Covering the scope of training knowledge and
Being simple enough to complete in 30 minutes or less
Open-Ended Questions
Examples
List all the ingredients that can be mixed together for a homemade solution for oral rehydration
Describe the danger signs of dehydration
Mention the steps for making homemade ORS
Open-ended interviews are useful for low literacy trainees
Multiple-Choice Questions
From the list, check all the items that can be mixed together for a homemade solution for oral rehydration
Sugar
Soft drink
Salt
Vegetables
Water
Attitude Items at Pre-Test
| Statement | Agree | Uncertain | Disagree |
| --- | --- | --- | --- |
| Mothers can be trusted to mix ORS at home | | | |
| It is best for the mother to bring a child with diarrhea to the clinic for management | | | |
| It is inconvenient to teach mothers at clinic to manage diarrhea | | | |
| Telling mothers how to make ORS is adequate to ensure they know what to do | | | |
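As a sketch of how such Agree/Uncertain/Disagree items could be scored, the hypothetical Python below sums item scores, reverse-scoring negatively worded statements. The scheme (Agree = +1, Uncertain = 0, Disagree = −1) and the choice of reverse-keyed items are assumptions, not from the lecture:

```python
# Hypothetical scoring sketch for attitude items like those above.
# Assumed scheme (not from the lecture): Agree=+1, Uncertain=0,
# Disagree=-1, with negatively worded items reverse-scored.

SCORES = {"agree": 1, "uncertain": 0, "disagree": -1}

def attitude_score(responses, reverse_keyed=()):
    """Sum item scores, flipping the sign of reverse-keyed items.

    responses: dict mapping item id -> "agree"/"uncertain"/"disagree"
    reverse_keyed: item ids whose statement is negatively worded
    """
    total = 0
    for item, answer in responses.items():
        value = SCORES[answer.lower()]
        if item in reverse_keyed:
            value = -value  # a favorable attitude means disagreeing here
        total += value
    return total

# Items 3 and 4 above read as negatively worded, so "disagree" is favorable.
pre = {1: "uncertain", 2: "agree", 3: "agree", 4: "agree"}
post = {1: "agree", 2: "agree", 3: "disagree", 4: "disagree"}
print(attitude_score(pre, reverse_keyed={3, 4}))   # -1
print(attitude_score(post, reverse_keyed={3, 4}))  # 4
```

Comparing such totals before and after training is what the attitude-change analyses later in this lecture do.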
Patent Medicine Vendor Training in Nigeria
Knowledge scores
A PMV training committee was formed
PMVs requested lessons on common illnesses and skills such as reading a prescription
Either the clerk or the owner attended, not both
Pre-/post-test enhanced by using a control
Ensuring There Was Really a Gain
The two sets of PMVs were not identical: 33 took the pre-test, 37 took the post-test
A paired t-test was calculated for the 28 individuals who took both tests
Their pre-test (46.0%) and post-test (70.8%) scores showed a significant gain at post-test
t = 12.161, p < 0.001
Thus the need for identifiers to match pre- and post-tests
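The matched-pairs analysis above can be sketched in a few lines of Python. The scores here are invented for illustration; they are not the actual PMV data:

```python
# A minimal paired t-test sketch, illustrating why matched identifiers
# matter: the statistic is computed on each individual's gain.
import math

def paired_t(pre, post):
    """Return the paired t statistic for matched pre/post scores."""
    assert len(pre) == len(post), "tests must be matched by identifier"
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample standard deviation of the differences (n - 1 denominator)
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

pre = [40, 50, 45, 55]   # hypothetical pre-test scores (%)
post = [60, 70, 68, 75]  # the same individuals' post-test scores
print(round(paired_t(pre, post), 2))  # 27.67
```

In practice a statistics package (e.g., `scipy.stats.ttest_rel`) would also report the p-value; the point of the sketch is that each trainee's pre- and post-test must be linkable.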
PMV Training in Kenya: Quality Assurance Project
Goal: To equip PMVs with customized job aids to communicate new malaria guidelines to private drug outlets
All were taught to educate customers and train other PMVs on malaria recognition and prompt treatment
Testing Specific Areas of Knowledge
Specific Malaria Knowledge (%), by Intervention Status

| Main Malaria Messages | Intervention (n=101) | Control (n=151) | Total (n=252) |
| --- | --- | --- | --- |
| Fever is main symptom of malaria | 94.1 | 94.7 | 94.4 |
| Fansidar is more effective than other drugs | 92.1 | 85.4 | 88.1 |
| Fansidar is not too strong for children * | 69.3 | 47.4 | 56.3 |
| Fansidar and panadol is correct treatment * | 88.1 | 64.2 | 73.8 |
| Fansidar is a single dose treatment * | 83.2 | 64.9 | 72.2 |
| Fansidar can be sold in shops * | 76.2 | 43.7 | 56.7 |
| Continue feeding child with malaria | 95.0 | 90.1 | 92.1 |
| Don’t sell another drug if child gets worse * | 91.1 | 71.5 | 79.4 |
| Don’t always sell what is demanded * | 87.1 | 65.6 | 74.2 |
| Shouldn’t sell smaller doses of drug * | 95.0 | 82.8 | 87.7 |

\* Significant at p < .01
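The significance stars for intervention-vs-control rows are the kind of result a two-proportion test produces. A minimal sketch, using counts reconstructed (approximately) from the "Fansidar can be sold in shops" row; the exact test used in the study is not stated in the lecture:

```python
# Two-proportion z-test sketch for an intervention-vs-control row.
# Counts are back-calculated from 76.2% of n=101 and 43.7% of n=151,
# so they are approximate illustrations, not the study's raw data.
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(77, 101, 66, 151)  # ~76.2% vs ~43.7% correct
print(round(z, 2))
# |z| > 2.58 corresponds to p < .01 (two-tailed), consistent with the star
```

Rows without a star (e.g., "Fever is main symptom of malaria", 94.1 vs 94.7) fail this threshold because the groups answered almost identically.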
Trainees Respond Differently
Teacher Training for Visual Acuity Testing
Ibadan, Nigeria
17-point knowledge test
What structure of the eye is responsible for image formation?
What are three common causes of blindness?
Name two ways of detecting visual problems in pupils
Health Worker Attitude Changes
Community-directed treatment (CDT) in onchocerciasis control
Attitude statements:
Communities are quite capable of managing the distribution of ivermectin
Onchocerciasis control is best run by the district (as opposed to state or national levels)
More Attitudes Statements
Community involvement in ivermectin distribution saves the time of the health worker for doing other things
Health workers in this community cannot handle ivermectin distribution because they are over-worked
Health workers do not believe that community-directed distribution of ivermectin is the best way to make ivermectin available to the people
Health Worker Attitudes Change
[Bar chart: mean attitude score (axis −10 to 20) by group/time — Pre-Test, Post-Regular, Post-Enhanced]
Teachers Self-Efficacy for Testing Visual Acuity
| Task | Very Capable | Capable | Unsure | Not Capable |
| --- | --- | --- | --- | --- |
| Recognizing a child with visual problems | | | | |
| Using a visual acuity chart | | | | |
| Interpreting chart/test results | | | | |
| Knowing when to make a referral | | | | |
Changes in Self-Efficacy Scores
One often expects knowledge gains in both groups
Exposure to testing procedures creates some knowledge
In this case, exposure to the skill test at pre-test decreased the control teachers’ confidence
Tests: A Summary
Pre-/post-tests document knowledge gains
Matching results ensures we can tell whether individuals actually made significant gains
Using a control (e.g., people coming for a future round of training) ensures effect was due to program
Tests: A Summary
Looking at individual questions/sections helps identify problem areas
Finally, comparisons are needed by trainee background
Section C
Observation, Feedback, and Analysis
Observation Checklists
When we were developing training objectives, we mentioned the words:
Behavior, observable, measurable
A checklist enables us to observe and measure trainee behavior
The checklist spells out the specific steps or tasks one would expect the trainee to perform at the end of the program
Patient Counseling Checklist
| Task | Fully | Partially | Not Done |
| --- | --- | --- | --- |
| Establishes a cordial relationship with the client through greetings and introductions | | | |
| Uses open-ended interviewing to encourage the client to speak fully for him/herself | | | |
| Practices active listening; is attentive and does not interrupt | | | |
Training Primary School Teachers to Test Visual Acuity
Materials and instructions were provided
On the table is a Snellen visual acuity chart, an opaque cover, a frame inserted with +1.00 power lenses, and a meter rule
Please assess the visual acuity of these pupils at an appropriate distance, giving clear and audible instructions
Test Instructions and Results
Record your findings
Make an assessment of the state of each child's visual acuity and ocular condition on the sheet of paper provided
There were eight steps
Feedback Forms
In addition to obtaining information on trainee knowledge and skills, trainers can obtain feedback on satisfaction with the quality of the program
Trainees can rate aspects of the program on one-page forms, as seen in the next slide
These forms are filled out anonymously
Simple Feedback Forms
Check the column that best shows your opinion of the program (1 = never; 5 = often)

| Item | 1 | 2 | 3 | 4 | 5 |
| --- | --- | --- | --- | --- | --- |
| 1. I feel that I will be able to use what I learned | | | | | |
| 2. The program was presented in an interesting manner | | | | | |
| 3. The training facilities met my needs | | | | | |
| 4. The program covered the promised objectives | | | | | |
| 5. The trainers encouraged participation and questions | | | | | |
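Tallying such forms is simple arithmetic; the sketch below computes the mean rating per item and flags low-scoring items for the training committee. The ratings and the 3.5 cut-off are illustrative assumptions, and each form is assumed to rate every item:

```python
# Sketch for summarizing anonymous feedback forms like the one above:
# mean rating per item, flagging items below a chosen threshold.
# The forms and the 3.5 threshold are invented for illustration.

def summarize_feedback(forms, threshold=3.5):
    """forms: list of dicts mapping item number -> rating (1-5).
    Assumes every form rates every item.
    Returns (mean rating per item, item numbers below threshold)."""
    items = sorted({k for form in forms for k in form})
    means = {i: sum(form[i] for form in forms) / len(forms) for i in items}
    flagged = [i for i, m in means.items() if m < threshold]
    return means, flagged

forms = [
    {1: 5, 2: 4, 3: 2, 4: 4, 5: 5},
    {1: 4, 2: 4, 3: 3, 4: 5, 5: 4},
    {1: 5, 2: 3, 3: 2, 4: 4, 5: 5},
]
means, flagged = summarize_feedback(forms)
print(flagged)  # [3] -- only the facilities item falls below 3.5
```

A flagged item points the committee at a specific fix (here, the venue) rather than a vague sense that "feedback was mixed".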
Plus Open-Ended Questions
What did you find most useful in the program?
What did you find least useful?
What suggestions do you have for improving the program?
Focus Group Discussions (FGDs)
FGDs not only provide feedback to the trainers, but also allow the trainees to integrate their experiences
FGDs should have no more than six members
These should be scheduled toward the end, possibly over lunch
The room or space used should offer privacy
Using FGDs
One valuable use of the FGD method is to bring together trainees who observers thought were too quiet during sessions
This would give them a chance to air their views and provide evidence of learning
Value of FGDs
FGDs offer an opportunity to think about the future application of knowledge and skills gained
Photo by Kabiru Salami
In FGDs, Think about “Back on the Job”
It is not enough to document that trainees have acquired new KAS before they leave
Silberman suggests trainees consider how they will apply the new KAS when they return
Back on the Job
This discussion is an evaluation of the practicality and feasibility of the new KAS
And a way to learn problem-solving skills
Engaging in realistic discussion identifies obstacles for applying new knowledge and implementing new skills
Planning for “Back Home”
Developing action plans based on these discussions helps one apply new KAS
Including a plan to monitor oneself
This will help prevent disappointment when enthusiastic returning trainees meet a wall of indifference or jealousy
Having the Tools
Training needs to equip people with skills, knowledge, attitudes
This may not be enough
Village health worker (VHW) trainees are given drug boxes so that they can start work immediately
Finally, Use Test Results and Feedback
Training committee should summarize results as soon as possible
Identify gaps that need attention during follow-up visits and correspondence
Improve the design of the next program
Copyright 2005, Bill Brieger and The Johns Hopkins University. All rights reserved. Unless otherwise stated, all photos are the property of Bill Brieger.