
Florida State University

EME 6357: Evaluation of Instruction and Training

in Human Performance Technology (HPT)

Program Evaluation Project

Final Report

BioPro for Teachers: A program designed to assist with the

professional development of teachers of high school Biology.

Byron S. Small

Email: [email protected]

Spring, 2014

Professor: Dr. Aubteen Darabi


Table of Contents

List of Figures

List of Tables

Introduction

Purpose and rationale

BioPro foundations

Program components

Program setting

Process and procedures for delivering training

Program inputs

Anticipated output and outcomes

Organizational support

Logic Model Presentation

Methodology

Evaluation Questions

Formative (Process) Evaluation Questions

Summative (Outcome) Evaluation Questions

Data Collection Methods & Measures

Process (Formative) Evaluation
    Qualitative Measures
    Quantitative Measures

Outcome (Summative) Evaluation
    Qualitative Measures
    Quantitative Measures

Communication & Dissemination of Evaluation Results

Conclusion

References

Appendices
    Appendix A: Iterative Analysis of Training System According to its HPT Functions
    Appendix B: HPT-based holistic logic model for evaluation of training programs
    Appendix C: Sample interview questions
    Appendix D: Kirkpatrick's Four Levels of Evaluating Training
    Appendix E: BioPro's general professional development program goals and plans


List of Figures

Figure A: Systemic View of a Training Program in HPT Context

Figure B: Organizational hierarchical structure depicting PD programs

Figure C: Basic System Components


List of Tables

Table 1: Adapted Logic Model for the BioPro for Teachers program

Table 2: BioPro for Teachers: Training Program Inputs

Table 3: BioPro for Teachers: Training Outputs and Outcomes

Table 4: Evaluation Management Plan

Table 5: Evaluation Communication Plan


BioPro for Teachers:

A program designed to assist with the professional development of teachers of high school Biology.

Introduction

Every year, teachers from within the public education system in The Bahamas come together to participate in a one-week professional development (PD) training session aimed at enhancing the general skills, knowledge and attitudes of their various disciplines. These annual sessions, however, have reportedly not lived up to expectations. Oftentimes they have amounted to little more than complaining sessions about what is wrong in the profession, rather than concrete suggestions about the way forward or the professional enhancement of the participating educators. The majority of the teachers leave these sessions feeling disappointed, unfulfilled and denied the opportunity for true PD. The big question then is: why is this the case? After conducting a needs analysis, the government has ascertained that the source of the concerns about these sessions is linked to the training modules themselves. It has therefore commissioned this program evaluation to comprehensively examine the shortcomings and pitfalls of this week-long workshop. This evaluation will focus on a particular subject area (high school Biology), with the future intention of examining how findings for this chosen discipline relate to the other disciplines in the PD workshop.

Purpose and rationale

High school Biology teachers, like those of many other disciplines, are expected to

remain abreast of new ideologies, discoveries and innovative ways of analyzing data and/or

experimenting with models. Additionally, current global and societal trends dictate national

curricula and thus inherently shape subject syllabi. Recognizing the need for continuous PD, the

government has instituted an internal training program to address this concern. BioPro for Teachers is an annual professional development training program that has been designed to assist high school Biology teachers (the key stakeholders) in developing the required proficiencies and outlook in their field of study. This is supported by Kirkpatrick and Kirkpatrick (2006), who suggest that "the most common reason for evaluation is to determine the effectiveness of a program and ways in which it can be improved." This report will systematically explore the essential components of this training initiative, and the program's efficacy will be examined using a systemic approach to evaluating training programs in a human performance technology (HPT) context, as diagrammed in Figure A.

Figure A: Systemic View of a Training Program in HPT Context

BioPro foundations

Housed in the Ministry of Education (MOE), this program is a part of a larger

overarching objective by the MOE to "organize appropriate professional development activities

for teachers and other staff, to address their specific needs."¹ Because this training initiative is a subset of a larger system, it is subject to the business functions, purpose and mission of the umbrella organization. In essence, the acceptance of this training program as a subsystem of a larger, performance-driven system substantiates the preferred systemic approach to this goal-based evaluation.

¹ Ministry of Education's draft 10-year education plan, 2009.

Program components

The fundamental components of this training program will be defined using the iterative

analysis technique presented in Appendix A. Essentially, this approach focuses on the functions

of the program that are used to develop or improve, not only the educators' knowledge of

pertinent information, skills and abilities, but also their attitudes toward the profession.

Appreciating that this training initiative is also used to improve workplace performance, its

efficacy will be determined by the extent to which the participants can demonstrate that learning

has taken place by exhibiting the transfer of acquired knowledge and skills to the work

environment. Finally, the program is ultimately designed to show that the training efforts of the

MOE are yielding positive results that will translate into improved organizational performance

and standards. This iterative approach helps to clearly delineate the impact of the training

transfer process from the trainee, to the workplace and ultimately to the organization.

Program setting

BioPro, generally, falls under the auspices of the MOE but, more specifically, is a function of the PD unit of the human resource section. Figure B illustrates this setup.

Figure B: Organizational hierarchical structure depicting professional development programs. [Organizational chart; labels include Ministry of Education, Department of Education, Human Resource Section, Professional Development, School Maintenance, School Leadership, Administration, and Policies and Procedures.]


Zimmerman and Holden (2009) emphasize that "understanding the organizational infrastructure supporting a program to be evaluated is essential to knowing how high of a priority that program is to the organization's ongoing efforts." With this supporting view, it was pleasing to see that PD is listed as a major function of the human resource department within the MOE.

Additionally, this program, along with many others, is part of the PD opportunities that are offered to all educators in the public education system. The training sessions are held at varying locations, and all required resources, inclusive of facilitating/ancillary staff, program designers and managers as well as the trainers, are provided by the MOE. Teachers are simply required to attend the respective workshops within their discipline, and the training is delivered face-to-face at designated locations. The use of technology is strongly encouraged in these classroom-style learning environments, where there are clearly defined instructors and registered participants. The sessions are not organized as certificate courses and prerequisite qualifications are not required, but attendance counts toward what is known as 'professional development' hours/points. Trainee evaluations, however, if conducted, amount to nothing more than a level one² assessment of Kirkpatrick's four levels of evaluation (Appendix D). The efficacy of this limited scope for assessing training goals will be further explored via this evaluation.

² Level 1: Reaction. Generally measures what participants think or feel about the training.

Process and procedures for delivering training

All teachers within the public education sector are invited to the workshops, and the sessions are further subdivided by fields of study. Training is typically conducted at the end of each academic year in late June or, at the latest, by the first week of July. Each subject area or discipline, during its respective sessions, generally follows the ideal goals and plans outlined in Appendix E, but at some point during the training the focus is directed toward content-specific


material. Moreover, the training program is designed to address current trends and issues related

to the educators' respective fields. Another goal of the training is to discuss, devise and develop

realistic objectives for the upcoming school year with suggestions and recommendations for

implementation.

Program inputs

For a training initiative of this magnitude, and through a systems thinking approach, there

are any number of variables that can be viewed as inputs to the program and must therefore be

addressed in this evaluation. Figure C illustrates the basic components of a system, and from observation there is a clear correlation between the operations of a system and its interrelated parts and those of a training program. In fact, this link establishes the framework for this evaluation and backs the 'systems thinking' approach that has been employed throughout this evaluation. The intention, ultimately, is to present the systemic justification and/or rationale for

the various processes and components of the program by tangibly communicating their

significance to the training's efficacy. Against this backdrop, we begin with the inputs to the

system. For this program evaluation, inputs have been segmented into two basic groupings: either the components that are associated with the training itself, or the support gained by, or through, the sponsoring organization.

Figure C: Basic System Components³

³ Source: A. Darabi (2014), class slides (EME 6357).


As a first example of training inputs, we look at the trainers. Trainers are expected to be

well-qualified, knowledgeable and well-respected in their various fields, and as can be

anticipated with any professional development program, the participants would also expect their

instructors to be very skilled and highly competent. It is important to note that educators (as

trainees) often envisage themselves being trained by instructors who are also able to clearly

demonstrate mastery of the subject content that they are expected to deliver even though highly

skilled trainers need not necessarily possess the comprehensive knowledge of subject matter

experts (SMEs).

Secondly, as this program is designed to cater to its participants, it is abundantly clear that without the trainees there simply would be no training. Though the details about trainee expectations will be presented later, the essential idea here is that if there are prerequisites to participating in this program, they ought to be clearly delineated and fulfilled by the time participants reach this phase of the program. Essentially, the trainees must be assessed beforehand to determine their readiness for the program.

Thirdly, another major component of the inputs to the program is the instructional content of the training sessions. The validity of this significant variable must be

explored to examine the relevance of the content to the program's objectives and, by extension,

the overarching goals for conducting training. The essence of why content is so important is

subsumed in the question of why training is important. Since training is particularly designed to

address some shortcomings of knowledge, skills and/or attitudes (KSA), the content therefore,

must be relevant and appropriate to the overall goals to effectively accomplish these objectives.

Other training inputs that are also important to this evaluation effort are the program's

structure and process. For this evaluation, it is important to speak to the functionality of the


structure that has been put in place to bring the training to fruition. Since we must be concerned

with the agency that has authorized and established BioPro for Teachers as well as the persons

responsible for managing this initiative, it is clear why the program's structure must be

addressed. Viewing structure as a critical input variable also reveals why we examine the

resources that have been allocated for the program as well as how they will be used and

managed. In addition to this, structure makes us look at the physical location or venue of the

training and critically assess the facilities with a view to determining whether or not the

environment is conducive to learning.

Accepting that process is another important variable to consider tells us that this

evaluation must take a look at the policies and procedures that support the implementation of the

program and explore how the program will be maintained and monitored. Finally, the 'process'

encompasses the strategies used to select prospective participants. Since trainees are not

randomly selected, this evaluation will cause management to appreciate why the trainee selection

process is an important component to the success of the training initiative.

Anticipated output and outcomes

This goal-based evaluation will follow a two-pronged approach to assessing BioPro's

efficacy. First, we will examine the physical training environment including resource

availability, allocations and use. These elements will form the basis of our formative

evaluations. Secondly, the number of participants trained, staff usage and involvement as well as

the complement of trainers will be factored into this segment of the evaluation.

Once the program itself has been assessed, this report will focus on the program's

efficacy through a summative evaluation. This will take into account and focus on, but will not

be limited to, the remaining three levels of assessment articulated by Kirkpatrick's four-level


evaluation model (Appendix D). Thus for all intents and purposes, this evaluation will address

the trainees' ability to demonstrate that the targeted KSAs have been transferred and that trainees

are exhibiting observable behavior to substantiate this claim (levels two and three). In addition

to this, we will make efforts through this report to present data to substantiate whether or not this

training initiative is yielding the anticipated organizational results.
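To make the scope of this two-pronged approach easier to see at a glance, the short sketch below lays out Kirkpatrick's four levels alongside the phase of this evaluation that addresses each. The sketch is purely illustrative (Python is used only as a convenient notation and is not part of the MOE's tooling); the level names follow Kirkpatrick and Kirkpatrick (2006), and the phase assignments summarize the plan described above.

```python
# Illustrative mapping of Kirkpatrick's four levels to the phases of this
# evaluation. Level 1 is already covered by the end-of-workshop reaction
# surveys; levels 2-4 are the focus of the summative (outcome) evaluation.
KIRKPATRICK_LEVELS = {
    1: ("Reaction", "formative phase: end-of-training survey"),
    2: ("Learning", "summative phase: pre/post tests and interviews"),
    3: ("Behavior", "summative phase: workplace observations"),
    4: ("Results", "summative phase: organizational outcomes"),
}

if __name__ == "__main__":
    for level, (name, phase) in KIRKPATRICK_LEVELS.items():
        print(f"Level {level} ({name}) is addressed in the {phase}")
```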

Organizational support

Because BioPro is a subset of the MOE, it will be accommodated and facilitated only to

the extent to which it receives the needed organizational support from its parent body. It is

therefore incumbent upon the MOE to make provisions for the transfer of any newly acquired

skill or knowledge to the workplace. Otherwise what value would any training initiative bring to

the organization if the new competencies cannot be seamlessly assimilated into the work

environment. Furthermore, any summative evaluation to determine the program's efficacy is

hinged on the participants' ability to transfer newly learnt skills, abilities or knowledge to the

workplace. In fact, assessing this transferability will be one of the major components of this

evaluation. Posavac (2011) suggests using variables from varying sources as this helps to

provide useful and valid evaluations, and with this in mind, the work environment itself will also

serve as another major setting for this evaluation revealing an intention to go beyond the

instructional environment.

Organizational support also goes beyond exploring the organization's mission, vision and goal

statements. Conceptually, it extends to include the resources that have been allocated to

facilitate the program. To what extent is the parent organization involved in raising the

necessary funding to conduct the training sessions? This is an important concern and must be


addressed during this phase. Support also addresses the efficacy of communication channels,

program procedures and guidelines, feedback and overall expectations.

Logic Model Presentation

To provide clarity, a logic model⁴ is a graphical representation of the logical relationships between the various program components. It is a tool commonly used by evaluators to examine a program's efficacy. By logically connecting resources to inputs, activities, outputs and

outcomes, this pictorial tool depicts the fundamental relationships between the various system

components.

Table 1 represents a basic graphical illustration of the outlook of this program evaluation

in the form of a Logic model. This initial diagram was designed to present the vital components

of the program using a systems approach. The inputs, when processed, produces outputs and

outcomes. The conceptual map of Table 1 represents an overview of the main variables of the

training program that must be addressed. It should be noted that though the main components of

training inputs such as training structure, processes, content and trainer competencies are

inherent in the basic logic model of Table 1, it is evident that a more detailed presentation would also be necessary to fully represent correlational variables. Because this simplified version lacks the depth needed to effectively address the objectives of this evaluation, a more comprehensive approach was sought; this can be found in Appendix B.

As a result, the ensuing methodological framework for the evaluation of BioPro for Teachers will loosely follow the ideas generally outlined in the adapted Logic model of Table 1 but, more specifically, will draw on the components of the "HPT-based holistic Logic Model for Evaluation of Training Programs" (Darabi, 2014) detailed in Appendix B. Table 1 will be used as a guide only when details are not necessary.

⁴ Logic models are also referred to as logical frameworks.

Table 1: Adapted Logic Model for the BioPro for Teachers program

INPUTS (human capital and financial resources): trainers; staff; participants; content; training materials; funding.

PROCESSES (use of resources and capital): train participants to achieve general and content-specific objectives; use workshop content aids and paraphernalia to deliver training; use resources to facilitate the training sessions.

OUTPUTS (measurable units from training sessions): number of trainers; staff member complement; number of participants; types and amount of job aids; amount of funding.

OUTCOMES (behavioral changes in participants): exhibits newly acquired skills, knowledge or attitudes; speaks of job satisfaction; trainees' students favorably impacted; exhibits more professional behavior.

IMPACT (effects of participants on the organization and communities): fewer work-related complaints; improved student performance; professionally represents the trade/industry; advocates the value of professional development through training.
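For readers who find a structured listing easier to scan than the tabular layout, the adapted logic model of Table 1 can also be written out as a simple data structure. The sketch below is illustrative only; the stage names and entries are copied from Table 1, and the code itself is not an artifact of the BioPro program.

```python
# Illustrative only: the adapted logic model of Table 1 as a dictionary,
# with each stage of the model holding the elements listed in the table.
BIOPRO_LOGIC_MODEL = {
    "inputs": ["trainers", "staff", "participants", "content",
               "training materials", "funding"],
    "processes": ["train participants toward general and content-specific objectives",
                  "use workshop content aids to deliver training",
                  "use resources to facilitate the training sessions"],
    "outputs": ["number of trainers", "staff complement", "number of participants",
                "types and amount of job aids", "amount of funding"],
    "outcomes": ["newly acquired skills, knowledge or attitudes",
                 "expressed job satisfaction",
                 "favorable impact on trainees' students",
                 "more professional behavior"],
    "impact": ["fewer work-related complaints", "improved student performance",
               "professional representation of the trade/industry",
               "advocacy for professional development through training"],
}

if __name__ == "__main__":
    for stage, elements in BIOPRO_LOGIC_MODEL.items():
        print(f"{stage.upper()}: {'; '.join(elements)}")
```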

Methodology

The analysis presented the major aspects of the program and issues to be considered.

This phase will provide the methodological framework for gathering the necessary data.

Objectively, this component will provide an overview of the strategies that have been used to

systemically address the extent of BioPro's effectiveness. This segment therefore, is critical to

the overall evaluation effort. Essentially, it describes the methods that have been employed to


drive the data collection efforts and demonstrate how the analysis has influenced the formulation

of relevant and appropriate evaluation questions. It therefore gives attention to the pertinent

aspects of the program by:

• Identifying the shortcomings of the program's modality by examining its content delivery

options - whether these are lecture style presentations, interactive or hands-on sessions, or

through the active involvement of the participants with accomplishing the tasks at hand.

• Examining the stakeholders in the delivery of this PD workshop with a view to assessing

instructional designer competencies, as well as instructor skills and knowledge of their

respective subject areas.

• Determining the involvement of the various stakeholders in the design, delivery and

management of the various aspects of bringing the workshops to fruition.

• Analyzing the content of each module of the sessions with a view to determining authenticity,

validity and suitability.

• Exploring the use and management of program resources (human, financial and physical).

• Examining anticipated outputs and/or outcomes of the various sessions, and the workshop as a

whole, to determine if objectives/goals are being attained.

• Examining workshop environments to determine if settings are conducive to productive

learning and participation.

• Assessing overall trainer competencies as well as knowledge and mastery of the subject

matter.

• Exploring the trainee selection process as well as examining trainee readiness.

The resulting evaluation questions of Table 2 will provide the justification for the

methods used in this report and will also be helpful in determining why certain strategies were


incorporated. Furthermore, since planning is crucial to the evaluation process, due diligence was observed to ensure that all components of the program being assessed are scrutinized with a view to mitigating concerns that might threaten the efficacy of the training.

Evaluation Questions

Pertinent evaluation questions provide the foundational framework for a productive

evaluation exercise. It is important therefore, that these questions are reliable and are guided by

sound principles that employ the best possible strategies to receive relevant and appropriate

responses. The answers to these direct questions help to validate the purpose, structure, process

and content of the training program. Essentially, it has been noted by evaluators, and further summarized by M. Baehr (2004), that evaluation methodology⁵ consists of four main

components:

1. Defining the parameters of the evaluation;

2. Designing the methods used for the evaluation;

3. Setting standards and collecting the evidence; and

4. Reporting and making decisions.

One way to define the parameters is by designing a list of comprehensive questions that

provide the support for the evaluation's overall goals. These questions ought to be incorporated

into the evaluation to give credence to its purpose. They help to broaden the scope of an issue or

concern and are general in nature. Commonly referred to as divergent questions, they seek to determine why an evaluation is needed, what purpose its results will serve, to whom the results will be reported, how the information will be used, and how its findings/results will be implemented.

⁵ Source: Evaluation Methodology. http://www.pcrest3.com/fgb/efgb4/1/1_4_7.htm


On the flip side of this brainstorming exercise, where creative divergent questions are formulated, are the narrowly focused convergent questions. Convergent questions typically seek 'yes/no' responses, or other monosyllabic answers, and thus do not necessarily engage any creative thinking processes or long discussions of issues. For

this evaluation exercise, we will mainly follow the principles of convergent questioning to select

essential questions from a comprehensive list of divergent questions that will be presented, in

samples, throughout this report. The issues presented in the logic model in Appendix B will be

explored using both divergent and convergent questioning techniques as this methodological

report presents the formative and summative evaluations of the BioPro for Teachers training

program. Furthermore, the evaluation questions previously mentioned, referencing 'why, what, who, and how', will be used to augment the significance of this evaluation methodology section.

Formative (Process) Evaluation Questions

It is important to note that during the evaluation process, the evaluator should be

concerned about the integrity of the program being assessed. The extent to which this is

ascertained is outlined in the expected training inputs and outputs diagrammed in the Logic

model presented in Appendix B. This model serves as a template for determining the essential

elements needed to conduct the training sessions. This includes the human capital and financial

resources that will be used as inputs, and the measurable units from anticipated outputs.

Appendix B also presents some of the major areas for consideration during this formative

evaluation phase of the BioPro for Teachers program. During this formative phase, the evaluator must be concerned with answers to questions such as the samples outlined in Table 2, subdivided into their relevant segments:


Table 2: BioPro for Teachers: Training Program Inputs (Key Evaluation Questions)

Training Structure. Evaluation issue: Is the training structured/set up according to the objectives of the plan? Measurement indicators:
• How many trainers will be required?
• How many participants have enrolled in this course?
• Who in the Ministry of Education (MOE) is responsible for overseeing this training program?
• What resources are needed to facilitate the training?

Training Process. Evaluation issue: Is the process being followed according to the plans? Measurement indicators:
• What procedures have been implemented to monitor the training processes?
• Is the training design in alignment with policies, regulations and standards?
• How are the trainees selected for participation?

Training Content. Evaluation issue: Is the content relevant? Measurement indicators:
• Is the content of the training aligned with the professional development (PD) goals of the MOE?
• Is the content current, relevant and applicable to all high school Biology teachers?

Trainers' Competence. Evaluation issue: Are the trainers competent? Measurement indicators:
• What qualifications must the trainers possess to successfully complete training objectives?
• Will the trainers be selected from a pool of documented professional instructors?

Trainees' Prior KSAs. Evaluation issue: Are the trainees training-ready? Measurement indicators:
• Do the participants have the necessary prerequisite skills and/or knowledge to begin training?
• What are the participants' views of these mandatory training sessions: necessary, helpful or not?

MOE's PD Support. Evaluation issue: What resources have been allocated for training? Measurement indicators:
• How much funding has been allocated for the program's materials?
• How many staff members will be employed to assist with the program's operations?
• What skills would be required of staff?


Summative (Outcome) Evaluation Questions

At the end of the program, it is anticipated that participants should be able to demonstrate

that they have been trained. That is, they should exhibit behavior that substantiates observable

changes in either knowledge acquisition, abilities and/or attitudes. A summative evaluation is

designed to provide this report. It would require an evaluator to investigate the rationale behind

the short and long term outcomes of the training as outlined in the Logic model.

In determining the outcomes in the short-term, a selected number of high school Biology

teachers will be personally interviewed and another chosen few will be observed in their

respective classroom settings. Another set of randomly selected teachers will be observed at

least three months following the training to ascertain the transfer of training and thus the

program's efficacy.

Key questions that seek this information, all beyond Kirkpatrick's level one evaluation (Appendix D) and again categorized accordingly, are sampled in Table 3.

Table 3: BioPro for Teachers: Training Outputs and Outcomes (Key Evaluation Questions)

Training Outputs. Evaluation issue: Did the training reach the intended participants? Measurement indicators:
• How many participants are attending the training?
• How many trainers are being utilized, and are they sufficient?
• How many training sessions are being conducted?
• Is technology being used extensively? If so, to what extent; if not, why not?

Expected Training Outcomes (short term). Evaluation issue: Can the trainees exhibit KSA acquisition? Measurement indicators:
• Can participants demonstrate that they have acquired some new knowledge/skill as a result of the training?
• Did the training address job satisfaction?
• Do participants feel more competent or confident in fulfilling the requirements of their job?

Resulting Changes from Training (long term). Evaluation issue: Are participants demonstrating the transfer of training to the environment? Measurement indicators:
• What new KSAs are trainees exhibiting on the job?
• Would participants recommend the training program to colleagues?
• Are trainees more satisfied with their job because of the training?

Data Collection Methods & Measures

BioPro for Teachers is a training program designed specifically for high school biology

teachers and it is within this context that the foundations for the data collection exercise will be

performed. As the coordinating evaluator, to gather the data desired from the evaluation

questions, I intend to:

• Randomly choose five teachers who will be interviewed through face-to-face discussions (a sampling sketch follows this list);

• Randomly select eight teachers who will be interviewed via telephone conversations (these do not include the face-to-face interviewees);

• Sit in on at least three science teachers' focus group sessions from different high schools

• Design and distribute a survey to the participants in the program, for completion at the conclusion of the training, mainly to satisfy a Kirkpatrick level one evaluation;

Page 22: Program Evaluation Projectbyronsmall.weebly.com/uploads/3/1/7/0/31700761/smallb... · 2018. 9. 10. · Florida State University . EME 6357: Evaluation of Instruction and Training

EME 6357: Final Report 17

• Design and distribute a pre- and post-test questionnaire that addresses the main objectives of the training. A post-training questionnaire to ascertain content retention will be administered no earlier than one week following the training;

• Observe four teachers at their respective workplaces. The intention is to further

ascertain knowledge or skill acquisition and to assess the transfer of training.
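A minimal sketch of how the interview and observation samples could be drawn is given below. The sample sizes (five, eight and four) come from the plan above; the roster, the fixed random seed and the assumption that the observation group does not overlap with the interviewees are illustrative choices only.

```python
import random

def draw_samples(teacher_roster, seed=2014):
    """Draw the groups named in the data collection plan: five face-to-face
    interviewees, eight telephone interviewees, and four teachers to be
    observed at their workplaces (kept separate here for simplicity; the plan
    itself only requires the two interview groups to be distinct)."""
    rng = random.Random(seed)      # fixed seed so the draw can be reproduced
    pool = list(teacher_roster)
    rng.shuffle(pool)
    return {
        "face_to_face_interviews": pool[:5],
        "telephone_interviews": pool[5:13],
        "workplace_observations": pool[13:17],
    }

if __name__ == "__main__":
    # Hypothetical roster of high school Biology teachers.
    roster = [f"Teacher {i:02d}" for i in range(1, 41)]
    for group, teachers in draw_samples(roster).items():
        print(group, "->", ", ".join(teachers))
```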

The data associated with the aforementioned collection methods are outlined in the Evaluation Management Plan (Table 4) presented below.

Table 4: Evaluation Management Plan (EMP)

Each entry lists the evaluation question, the information required, the information source, the collection method, and the analysis procedures and criteria.

Process (Formative) Evaluation Questions:
• Are the trainers competent? Information required: Are they qualified to teach? Source: Instructor database. Collection method: Computer systems network. Analysis: Scrutiny of qualifications and interviews.
• Is the content relevant? Information required: What is being taught? Source: Instructor manuals. Collection method: Computer systems network. Analysis: Analysis of content.
• Number of participants? Information required: Total enrollment. Source: Program Coordinator. Collection method: Computer systems network. Analysis: Determine level of participation.
• Participants' prior skills or knowledge? Information required: Prerequisites to training/course. Source: Training Coordinator. Collection method: Skills questionnaire. Analysis: Pre-test to acquire trainees' prior knowledge.
• Funds allocated for materials? Information required: Resource fund allocations. Source: Budget reports. Collection method: Computer systems network. Analysis: Review budget reports.
• Number of staff required? Information required: How many are required? Source: Program Coordinator. Collection method: Telephone interviews and prior data. Analysis: Benchmark or assess need.

Summative (Outcome) Evaluation Questions:
• Training addressed job satisfaction? Information required: Gained from survey results. Source: Program participants. Collection method: Interview five teachers. Analysis: Numbers who express satisfaction.
• Recommend training to colleagues? Information required: Results of interviews. Source: Program participants. Collection method: Interview five teachers. Analysis: Numbers who recommend.
• Can demonstrate acquired knowledge or skill? Information required: Gained from observations. Source: Posttest prepared for program participants. Collection method: Classroom observations. Analysis: Percentage of those who exhibit skill or knowledge.
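For completeness, each row of the Evaluation Management Plan can be captured as a simple record, which makes the plan's six columns explicit. The class below is purely illustrative and is not one of the evaluation instruments; the example row is copied from the table above.

```python
from dataclasses import dataclass

@dataclass
class EMPRow:
    """One row of the Evaluation Management Plan (Table 4)."""
    question_type: str
    evaluation_question: str
    information_required: str
    information_source: str
    collection_method: str
    analysis_procedure: str

# Example row copied from the formative portion of Table 4.
trainer_competence = EMPRow(
    question_type="Process (Formative)",
    evaluation_question="Are the trainers competent?",
    information_required="Are they qualified to teach?",
    information_source="Instructor database",
    collection_method="Computer systems network",
    analysis_procedure="Scrutiny of qualifications and interviews",
)

if __name__ == "__main__":
    print(trainer_competence)
```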

Process (Formative) Evaluation

Evaluating the process is a necessary course of action to examine the strategies being

incorporated to achieve end results. In determining the initial suitability of the trainers, for example, the evaluation team will have access to a database of qualified trainers (who may not necessarily be SMEs, although when the trainees are educators this may be preferred), from which it is anticipated that, after scrutiny, the best instructors will be chosen. It is expected that this process will ensure that instructors are selected whose skill sets are appropriate to their training modules. The key question 'Are the trainers competent?' is an attempt to gather all the necessary data that would be used to establish competency.

Another example of questions that would explore the evaluation process can be observed

in determining how the financial resources have been allocated to accomplish the objectives of the


training. How much has been budgeted for the required materials and/or associated

paraphernalia to successfully deliver the training modules? These are just some of the questions

that will be raised at the meeting with the Executive Board since funding would come under their

purview.

Qualitative Measures (Process Evaluation)

As pointed out in the data collection methods, interviews and focus groups will serve as

major strategies that will be used to collect qualitative data in response to evaluation questions.

Since interviews are versatile tools for data collection, they can be used to address complex issues. Interviewing is a preferred strategy for this evaluation because, rather than focusing on statistical data, interviews provide considerable leeway for participants' personal interpretations of events. They also allow the evaluator to establish a rapport with stakeholders, further ensuring that the data gathered are more reliable. A sample copy of the questions can be found in Appendix C.

Quantitative Measures (Process Evaluation)

The BioPro for Teachers program brings high school Biology teachers from all over the

country together in one forum for one week. However, upon completing the training these

teachers all return to their respective locations, making subsequent evaluations quite the

challenge. Unfortunately, for Kirkpatrick's levels three and four to be adequately addressed, evaluation teams must be dispersed to the various locations, making this evaluation exercise more costly than would be preferred. But to make this evaluation a worthwhile endeavor, these teams must be formed and dispersed accordingly. Samples of the types of questions that will be asked,

and subsequently tabulated for quantitative analysis, can be found in Appendix C.


Outcome (Summative) Evaluation

Outcomes are concerned with the short and long term impact of the training program.

Beginning with the transfer of either knowledge, skills and/or attitudes to the participants

themselves in the present time, an organization also looks at the longer term impacts of the

training to the job and the institution as a whole. This level of the evaluation concerns itself

therefore with the acquisition of the intended KSAs and the transfer of such attributes to the

work environment.
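Since KSA acquisition will be measured partly through the pre- and post-test questionnaires described in the data collection plan, the following sketch illustrates how matched scores could be summarized as gain scores. The scoring scale and participant data are invented for illustration and do not represent actual results.

```python
def summarize_gains(pre_scores, post_scores):
    """Compute each participant's gain (post minus pre) and the average gain
    for matched pre-/post-test questionnaire scores."""
    gains = {name: post_scores[name] - score for name, score in pre_scores.items()}
    average_gain = sum(gains.values()) / len(gains)
    return gains, average_gain

if __name__ == "__main__":
    # Invented scores out of 20, for illustration only.
    pre = {"Teacher A": 11, "Teacher B": 14, "Teacher C": 9}
    post = {"Teacher A": 16, "Teacher B": 15, "Teacher C": 14}
    gains, average = summarize_gains(pre, post)
    print(gains)
    print(f"Average gain: {average:.1f} points")
```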

Qualitative Measures (Outcome Evaluation)

The qualitative data from the training will be gathered using selected interview techniques. Focus groups, surveys and interviews will be used

to measure the outcomes of organizational impact during this summative phase. Samples of the

types of questions that will be asked, as it relates to gathering the appropriate data to validate

organizational concerns of training transfer, can be found in Appendix C.

Quantitative Measures (Outcome Evaluation)

The quantitative data for the outcomes of the training will also be gathered from surveys and interviews. Samples of the types of questions that will be

asked can also be found in Appendix C. All qualitative data will be compiled into some

quantitative format so that results can be referenced using some form of numeric comparison.

For example, we would want to document the percentage of participants who would recommend

the program to their colleagues. Though the information here will be gathered through

interviews, that data will be tabulated to yield comparative information that can be used to make

more informed decisions about trainee satisfaction and transfer.
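As a concrete illustration of converting interview responses into a numeric comparison, the sketch below tabulates the percentage of interviewees who would recommend the program to colleagues. The responses shown are invented; only the question itself comes from Appendix C.

```python
def percent_recommending(responses):
    """Return the percentage of interviewees answering 'yes' to the question
    'Would you encourage your colleagues to attend the training?'"""
    yes_count = sum(1 for answer in responses if answer.strip().lower() == "yes")
    return 100.0 * yes_count / len(responses)

if __name__ == "__main__":
    # Invented answers for the five face-to-face interviewees.
    answers = ["yes", "yes", "no", "yes", "yes"]
    print(f"{percent_recommending(answers):.0f}% of interviewees would recommend the program")
```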


Communication & Dissemination of Evaluation Results

All reports will be presented either formally, in writing that addresses the major issues of the program, or through the interviews and surveys designed for this purpose. The rationale for the strategies used is summarized in the Evaluation Communication Plan (ECP) as

shown in Table 5.

Table 5: Evaluation Communication Plan (ECP)

Each entry lists the issues to be addressed, the report audience, the report content, the report format, the report schedule, and the presentation context.

Process (Formative):
• Challenges and/or concerns. Audience: Client. Content: Progress. Format: Face-to-face presentation. Schedule: Quarterly. Context: Executive Board meeting.
• Role in program. Audience: Program staff. Content: Revision. Format: Face-to-face presentation. Schedule: When needed. Context: Staff meeting.
• Program's content. Audience: Instructors. Content: Revision. Format: Phone calls. Schedule: Monthly. Context: Instructors' meeting.
• Program readiness. Audience: Participants. Content: Updates. Format: Handouts. Schedule: Bi-annually. Context: Workplace.

Outcome (Summative):
• Challenges and/or concerns and final report. Audience: Client. Content: Final. Format: Face-to-face presentation with formal report. Schedule: Upon completion. Context: Executive Board meeting.
• Views, support value and structure. Audience: Program staff. Content: Final. Format: Face-to-face presentation. Schedule: Upon completion. Context: Staff meeting.
• Challenges and/or concerns and final report. Audience: Instructors. Content: Final. Format: Face-to-face presentation with formal report. Schedule: Upon completion. Context: Instructors' meeting.
• Involvement and opinions. Audience: Participants. Content: Final. Format: Formal report. Schedule: Upon completion. Context: Workplace.


Once the more relevant and appropriate questions have been advanced and answered, and

the preferred strategies for acquiring data have been identified, this evaluation effort moves into

the implementation stage. It is incumbent upon the stakeholders, particularly the supporting

organization, to review the results of this report and consider its implications for future training

programs.

Conclusion

This evaluation report is designed to explore the components of BioPro for Teachers with

a goal of outlining the many variables in the program that may either support or hamper the

success of the program. Adopting a systems thinking approach shaped the foundations of this

systemic goal-based evaluation. It is important to note that although Kirkpatrick's four levels of evaluation played an important role in determining many of the data-gathering strategies, the principles of interviewing and of administering tests, questionnaires and surveys were drawn from other collection methodologies. Once all data are tabulated, a summative report will be presented

to the Professional Development Unit at the MOE for consideration at its bi-annual executive

meeting that has been convened to review this report. It must be noted that the ideas and data

presented in this report are the intentions of the evaluator and thus an anticipated path to

conducting this program evaluation. It is also anticipated that all data gathered to compile this

report will remain confidential and will be used only for its intended purpose: to evaluate the

efficacy of the BioPro for Teachers' training initiative.


References

Darabi, A. (2014). HPT-based holistic Logic Model for Evaluation of Training Programs [Class handout]. College of Education, Florida State University, Tallahassee, Florida.

Darabi, A. (2014). Suggested Format for Evaluation Plan Summary [Class handout]. College of Education, Florida State University, Tallahassee, Florida.

Darabi, A. (2014). Systemic View of a Training Program in HPT Context [Class handout]. College of Education, Florida State University, Tallahassee, Florida.

Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating Training Programs: The Four Levels (3rd ed.). Berrett-Koehler Publishers.

Posavac, E. J. (2011). Program Evaluation: Methods and Case Studies (8th ed.). Boston: Prentice Hall.

Zimmerman, M. A., & Holden, D. J. (2009). A Practical Guide to Program Evaluation Planning: Theory and Case Examples. Thousand Oaks, CA: SAGE Publications.


Appendices

Appendix A: Iterative Analysis of Training System According to its HPT Functions

[Diagram not reproduced.] Derived from class notes, A. Darabi (2014) [Class Handout].


Appendix B: HPT-based holistic logic model for evaluation of training programs

(Adapted from A. Darabi (2014) [Class Handout])

Training Structure: Addresses BioPro's functional structure for producing training services. Questions whether the training is structured according to the objectives of the plan, e.g., how many trainers will be required and how many participants have enrolled in this course.

Training Process: Focuses on the design and implementation of the training process. Evaluates whether the process is being followed according to the plans, e.g., how are the trainees selected for participation?

Training Content: Looks at the instructional design and the development of the training content. Concerns include whether the content is current, relevant and applicable to all high school Biology teachers.

Trainers' Competencies: Keys in on the trainers' qualifications (KSAs) required for completing the training and accomplishing training goals. Should ask whether the trainers are competent.

Trainees' Prior KSAs: Trainees' competencies (prior KSAs), including motivation and incentives. Must explore whether the trainees are training-ready, e.g., do the participants possess prerequisite KSAs to begin training?

Supporting Environment for desired changes: Support for the training by organizational mission, vision, goals, allocated resources, available information/guidelines/feedback, open communication, clear expectations, etc. Questions can include: What resources have been allocated for training?

Activities and participation: What BioPro is designed to accomplish and who the program is targeting. This includes examples like the number of trainees trained, the number of trainers used and the number of sessions that were necessary.

Short- and long-term impact: Assesses the short and long term impact of activities for producing the training. Here the concern is with the transfer of training to the work environment: for the short term, by asking whether the trainees are able to exhibit KSA acquisition; and for the long term, by addressing whether participants are demonstrating training transfer to the workplace and its impact on their environments.


Appendix C: Sample interview questions (face-to-face and/or telephone interactions)

Sample interview (face to face or telephone) questions for Participants

1) What did you find to be the most valuable part of the training?

2) Why do you think it was important to attend the training?

3) Would you encourage your colleagues to attend the training? If so, why? If not, why not?

4) Did you get what you were expecting to receive from the training sessions?

5) Were you satisfied with the facilities?

6) Did you feel that the trainers were competent at performing their expected duties?

Sample Quantitative interview (face to face or telephone) questions for

Training Outputs

1) How many participants took part in the training?

2) How many trainers were used?

3) How many staff members were used to assist with running the program?

4) Were there sufficient job aids and/or handouts for each training session?

Sample set-up questions for Program Coordinators

1) How will you go about creating the training program?

2) How will you go about categorizing the content for the program?

3) What is the most significant component of the training?

4) What are you expecting the participants to gain from the training sessions?


Appendix D: Kirkpatrick's Four Levels of Evaluating Training

[Diagram not reproduced. The four levels are: Level 1, Reaction; Level 2, Learning; Level 3, Behavior; Level 4, Results.]

Source: http://americanstudentsinbritain.blogspot.com/2011_07_01_archive.html


Appendix E: Example of BioPro's general professional development program goals and plans

Source: http://education-2020.wikispaces.com/Professional+Development