MC Barons
Science Olympiad Invitational
Evaluation Plan
Renee Kowalchik
MC Science Olympiad Coach
Prepared and Submitted for
EIDT 6130
Program Evaluation
Program Analysis
Last year I allowed myself to be talked into being the coach for the school’s first Science
Olympiad team. Less than a year later, I find myself planning for our school to host a Science
Olympiad Invitational. If we are to be a competitive team, we will require a budget and a way to
practice before competing at the regional competition. The students involved in the team and I
hope that the Invitational will accomplish both of those goals.
The Invitational will consist of 25 to 30 teams of fifteen students from schools in the
region. It takes place on a single Saturday; there are 23 events in which two or three people from
each team compete. Awards are given for the top four teams in each event, and the top six teams
overall. The whole event is being planned, organized, and implemented by the students who are
members of the Science Club and me. While this is the first time the Science Club has attempted
such an event, the social studies department has hosted a similar, successful event for the past
two years. Their success has fostered favorable perceptions of the Science Olympiad
Invitational.
The two main goals for hosting an Invitational are to make money for the Science Club
and to give our Science Olympiad team a chance to practice before the actual competition. The
students who were involved this year decided they loved the experience of competing at the
Olympiad; however, the team competed with no money for supplies or learning tools to help them. By
charging a fee for other teams to compete in the invitational, we can make enough profit to
support the team and the rest of the Science Club activities. Also, an Invitational is basically a
scrimmage to help prepare for the real competition. The teams will follow all of the rules and
expectations outlined by the National Science Olympiad organization, and in doing so will have
the opportunity to test their machines, see sample test questions, and get ideas and suggestions
on how to improve from the other teams and coaches.
A final goal for this project is that the Science Olympiad Invitational will become an
annual event. This goal is more complex than the others in my opinion. It requires a cost-benefit
analysis, the “identification and assessment of all the benefits that are anticipated from the
project plus all of the costs for performing the project” (Portny et al., 2008, p. 31). The benefits I
expect are those mentioned above, while the costs include both money and time. The time for
organizing, communicating, and implementing this project is significant. One of the most
important questions to answer when this is completed is, “Was it worth it?”
There are several groups of stakeholders, each having “different perspectives, different
interests, and different concerns about the program” (Fitzpatrick, Sanders, & Worthen, 2011, p.
64). The primary stakeholders for the Invitational are the students involved in the Science
Olympiad competition, the Science Club members, and the other coaches/teams who may
participate. The teams involved in the competition all have the same interest: to prepare for the
regional competition. The Science Club members will be involved in the planning and
implementation, and will benefit from the money raised at the event. Other stakeholders, like the
high school administrators, teachers, and custodial staff, will be affected by the Invitational, but
not to the extent of the other groups. The teachers and custodial staff will be concerned with the
use of the rooms in the school, as will the administration. In addition, the administration has
another concern, which is the liability involved when that many people are on school property.
The contextual environment in which the Invitational is being held will have an effect on
its success and on how an evaluation is received (Fitzpatrick et al., 2011). As mentioned before,
there has been a similar event, the National History Day competition, which the school has hosted
in the past. This has led to a favorable attitude among the faculty and administration toward this type of
endeavor. Even though the program is in its planning stages, I have made sure that all
stakeholders have been kept up to date with progress by meeting with administrators, students
and maintenance staff, and by following up these conversations with e-mail updates. This has
helped to keep the stakeholders involved and aware of the decisions that are being made. So far,
everyone has been very supportive and positive, but I am aware that there is always the
possibility that that could change (Yarbrough, Shulha, Hopson, & Caruthers, 2011).
There is one ethical challenge that concerns me as I evaluate this program, and that is my
own bias. I am heavily invested in the success of this Invitational. Most of the responsibility for
the planning and implementation is mine, and naturally I want it to be successful. “Evaluators
should be alert to examining how their relationships with those who operate or manage the
program can influence the choices or decisions made” (Portny et al., 2008, p. 99). The students
involved will also have biases that need to be overcome. They directly benefit from the
Invitational, and have a close relationship with me as well. It will be difficult for them to be
critical of my performance for fear of hurting my feelings. Including an outside viewpoint, in this
case another Science Olympiad coach and Invitational host, may reduce the effect that the biases
mentioned may have on the evaluation (Fitzpatrick et al., 2011).
Evaluation Model Comparison
EXPERTISE AND CONSUMER-ORIENTED APPROACHES

Advantages:
- Expertise-oriented evaluations are more objective, as they are completed by someone with experience in the subject who is not involved in the program.
- Consumer-oriented methods allow the evaluator to be the lone decision-maker for the evaluation, giving her nearly complete control over the evaluation decisions (Fitzpatrick, Sanders, & Worthen, 2011).

Disadvantages:
- I do not have access to an expert who could evaluate this program. The one person who might qualify is actually helping to plan the program with me, and so would not be completely objective in his evaluation.
- The audience for this evaluation is not broad enough to fall into the category of consumer-oriented. While there are certainly “customers,” they are specific and qualify more as stakeholders than as an evaluation audience.
- I would prefer to have a more inclusive team to help with the evaluation (me, students, volunteers, etc.), and none of us qualify as experts, as this is the first time the program is being implemented.

PROGRAM-ORIENTED EVALUATION APPROACHES

Advantages:
- Uses logic models as a basis for evaluation. The development of a logic model is helpful when planning a new program like the one on which my project is based, and can help with development, data collection, and evaluation of the program.
- Uses objectives to determine the success or failure of a program.
- Involves stakeholders in discussion of program objectives, and has validity in that it evaluates outcomes that are clearly identified and agreed upon (Fitzpatrick et al., 2011).

Disadvantages:
- There are specific objectives for the program being developed, but because the program is new, I don’t want to miss any possible benefits (or drawbacks) because the evaluation focuses only on objectives identified at the beginning of the program.

DECISION-ORIENTED EVALUATION APPROACHES

Advantages:
- The goal of this type of evaluation is to improve programs, which is exactly what I need for the program I am developing.
- The evaluation focuses on the different stages of program development and works well when evaluation starts early in program development.
- Systematic, organized process; very logical (Fitzpatrick et al., 2011).

Disadvantages:
- The focus is on decisions, not stakeholders.
- Not very flexible if significant changes occur during program implementation.

PARTICIPANT-ORIENTED EVALUATION APPROACHES

Advantages:
- Stakeholders are an integral part of decision-making and can be involved in all parts of the evaluation process.
- Stakeholder involvement improves validity and makes it more likely the evaluation will be utilized (Fitzpatrick et al., 2011).
- Works well with the CIPP structure; complements the different stages by adding stakeholder concerns and ideas to the process.
- For my program, I already have stakeholders identified (my student helpers/volunteers) who will be planning and implementing the program. It makes sense to have them be part of the evaluation as well.

Disadvantages:
- Stakeholders have limited time and may not be interested in participating in the evaluation.
- Objectivity may be called into question because those planning/implementing the program are also intimately involved in evaluating it.
Select an Evaluation Model
For my evaluation of the Science Olympiad Invitational, I plan to use the CIPP (Context,
Inputs, Process, and Product) framework in the decision-oriented approaches as the main
methodology for conducting the evaluation. In addition, I would like to incorporate the P-PE
(Practical Participatory Evaluation) model from the Participant-oriented approaches. In my
opinion, merging these two specific methodologies will help limit the negatives of each
individually while strengthening the advantages.
A decision-oriented approach makes the most sense in this evaluation: as the name
implies, it gathers information to be used to inform decisions (Fitzpatrick,
Sanders, & Worthen, 2011). And that is exactly what I need as I move forward in this totally new
project. Also, an evaluation model that looks at the whole process from beginning to end as the
project moves from planning to completion will be most beneficial. I am in the planning stages
for the Invitational I will be evaluating, so I can complete the context evaluation and input
evaluation this summer as my planning progresses. In the fall and leading up to the actual event,
I can use the process evaluation to track how well the process of getting other teams registered is
going and how the scheduling and the communication with all involved happens. And finally,
once the Invitational is complete, the product evaluation can be used as a summative evaluation
of the whole process mostly to help determine if it is worth the time and effort to make this an
annual event. My one concern in choosing this model is that the planning for the evaluation of
the different stages is done before the program has even been implemented, and if significant
changes to the program are required as implementation takes place then the evaluation will need
to be modified as well. It is important to remember to allow for some flexibility in the plan.
I would also like to include two main groups of stakeholders in this process, which means
adding a participant-oriented model to the evaluation process. I like the P-PE method as it relies
on "primary stakeholders; that is, those who are most interested in results and in a position to use
them" (Fitzpatrick, Sanders, & Worthen, 2011, p. 205). There are plenty of stakeholders in this
project, but most will not really care to be involved in the evaluation or about the evaluation
results. There are two primary stakeholders that I think would be beneficial to include. First,
there is a core group of students that I would like to involve in the evaluation, as they are the
reason the Invitational is being hosted and will be helping with the work to make it happen. And
in truth they are already involved as they are the ones who have helped to get permission to host
the Invitational in the first place. I would also like to include the educator who runs an
invitational at a school in another region, and who has offered his help and guidance through my
planning and implementation of this project. He has been tasked by the Science Olympiad
organization to encourage more coaches to host invitationals and could greatly benefit from the
results of my evaluation. Choosing to add the P-PE methodology within the structure of the CIPP
model will help to structure the way stakeholders are incorporated into the evaluation plan.
Evaluative Criteria
Evaluation Questions:
1. Did the facilities meet the needs of the program?
2. Was communication throughout the event (registration through conclusion) timely and easy to understand?
3. Did participation in the Invitational improve the Science Olympiad Team?
4. Did the event raise enough money to support all functions of the Science Club, including the Science Olympiad Team?
5. Should this program become an annual event?
I chose these five questions because they address the initial goals and objectives
identified when the program was proposed. The needs identified were to help the Olympiad team
to gain experience for improvement and to raise funds to support the different functions of the
Science Club. Each of these questions aims to determine if the Science Olympiad Invitational
will meet those goals on an annual basis. Because the main methodology for the evaluation is a
decision-based one, the questions focus on the decisions that will need to be made (Fitzpatrick,
Sanders, & Worthen, 2011) regarding the Invitational, specifically, will it continue annually and
if so, how to continue to improve the program.
The stakeholders I plan to involve in determining evaluation questions are the students
involved in the planning of the program and the other coach who has experience in running an
Invitational and who is volunteering to help me through the process. These two are the primary
stakeholders, and they are willing to be involved in the evaluation in addition to the planning and
implementation of the program itself. Other stakeholders (parents, coaches, custodians, and
administrators) will be part of the evaluation process through questioning, but will not contribute
directly in the decision-making for the evaluation process.
I will work closely with the primary stakeholders to determine the evaluative criteria and
standards as well, as “criteria should not be viewed as a checklist, nor received unquestioningly
from on high” (Baer, 1997). I need the other coach’s experience to make up for my own
lack of experience in determining what the important criteria for success may be. The students are
an integral part of determining criteria as they are the ones who will benefit directly from the
Invitational; as such they should have significant input on what “will be used to judge the
program and standards for success” (Fitzpatrick et al., 2011, p. 332).
In choosing the questions and criteria with the help of these stakeholders, there will need
to be discussions to determine the “levels of program performance that are acceptable” before the
evaluation actually begins (Fitzpatrick et al., 2011, p. 333). The standards for the facility use,
things like teams having their own rooms for home base and the appropriate space for specific
events, will need to be determined as well as standards for communication within the registration
process, during the program, and in the final announcement of winners and presentation of
results. There will also need to be standards for what needs to be accomplished compared to the
effort in planning and implementation, to determine if the event should become an annual one. It
is important for questions to deal with all aspects of the program from implementation to the
outcomes (Laureate Education, n.d.).
The one aspect of the Invitational that is not being evaluated is the date when it is being
held, even though the date may have significant impact on the Invitational. The reason for
leaving this out of the evaluation is because even if it does have an impact, it cannot be changed.
The date was chosen because it needed to happen at least a few weeks before the regional
competition, and it had to be on a Saturday because the whole building is needed for at least 9
hours, making a weekday impossible. Other Invitationals are held during specific weeks, and I
do not want to compete with them for teams; due to other obligations and scheduled events in the
school, I was left with one date. It makes no sense to spend resources debating the effects of the
specific date when it cannot be changed and will always be the first Saturday in February.
Evaluation Reporting Strategies
Stakeholder: Science Olympiad Team

Reporting Strategy:
1. Open meeting/discussion to review data and draw conclusions.
2. Preparation of the evaluation report/website. This will include photos/videos from the Invitational in addition to the traditional parts of an evaluation report, all arranged in a visually appealing way.

Rationale: These are my most involved students, and so they are a key part of the evaluation team. They are the group that will be impacted the most by the Invitational’s success or failure, and have the most interest in the program. Representatives of this group will help to refine the report for the rest of the stakeholders, giving them a means of contributing to the process (Hall, Ahn, & Greene, 2012). The report will be presented on a website, which will accomplish several goals. First, it will give the diverse groups of stakeholders access to the information (Greene, 2001). Second, it is a less formal way of presenting the evaluation results, which I think is more applicable to this situation. And finally, it can be formatted to include all of the necessary parts of an evaluation report while displaying them in a way that is more interesting and appealing than a printed report (Fitzpatrick et al., 2011).

Implications: This group is one of the primary stakeholders; they have the most to benefit from this program (Fitzpatrick, Sanders, & Worthen, 2011). If the Invitational succeeds, they will be better prepared for the actual competition and have funding for next year. In addition, the Invitational will become an annual event. If it does not succeed, new sources of funding and ways to prepare for the competition must be found.

Stakeholder Involvement: This group will be used often for data collection throughout the evaluation. They will be the most involved students, as they will be part of the planning and implementation and will participate in the Invitational as well. In addition, this group of students will be an integral part of analyzing data and drawing conclusions (Greene, 2001).
Stakeholder: Science Club Students

Reporting Strategy:
1. Email invitation to view the website overview of the program, including evaluation data and conclusions.
2. Presentation at a monthly meeting of the Science Club.

Rationale: A visually appealing presentation of the program and its evaluation will make it more likely to be read and used (Fitzpatrick et al., 2011). I can create a website that will efficiently reach most of the stakeholders and be interesting to view. In addition, because of the significant impact the Invitational’s success or failure has on this group, a group from the evaluation team (likely me and some of the more involved students) will deliver a presentation of results and conclusions. Students are not likely to read a written report, and so an oral presentation will be more effective at gaining the attention of the group (Fitzpatrick et al., 2011).

Implications: This group, although involved in the Invitational only as volunteers and workers, has a significant stake in the success of the Invitational. This program is to be the funding for the activities the club has planned, and so if the Invitational does not succeed a new source of funding must be found.

Stakeholder Involvement: This group will not be involved to a significant extent in the reporting of the evaluation results. I plan to use this group as my “beta testers” for the evaluation report/website to make sure everything is functioning, clear, and understandable before I send the link to the other stakeholder groups to view the website.
Stakeholder: Visiting Teams (Coaches/Students)

Reporting Strategy:
1. Email invitation to view the website overview of the program, including evaluation data and conclusions.

Rationale: For many of the same reasons listed above, I have chosen this format for the visiting teams. In addition, technology makes it much easier and more cost-effective to reach the groups with whom I will no longer have direct contact once the Invitational is over (Fitzpatrick et al., 2011).

Implications: The teams that participate in the Invitational are looking for a way to prepare for the actual competition. The success of the Invitational will impact their ability to prepare in the future, depending on whether or not the Invitational becomes an annual event.

Stakeholder Involvement: While the visiting teams will be surveyed and interviewed for data collection, they will not be part of the reporting process other than to receive the finished report via the website.
Stakeholder: Volunteers/Parents

Reporting Strategy:
1. Email invitation to view the website overview of the program, including evaluation data and conclusions.
2. Article in the district newsletter to promote and summarize the program and evaluation, and to share the web address for the evaluation report.

Rationale: Again, most of the reasons above apply here as well. The technology reaches people that we do not have direct contact with, and the website is an aesthetically appealing way to present the information. The district newsletter will reach parents and community members who otherwise may not know about the Invitational, and will have enough information to educate those who do not have access to the website, as technology is not always accessible to everyone (Fitzpatrick et al., 2011).

Implications: The parents and volunteers have an interest in seeing the Science Olympiad team be successful. The evaluation will affect them indirectly, as they will either be asked to volunteer in the future or help fund the group in another way.

Stakeholder Involvement: While the parents and volunteers will be surveyed and interviewed for data collection, they will not be part of the reporting process other than to receive the finished report via the website.
Stakeholder: Dave (my helper)

Reporting Strategy:
1. Meet to discuss data and draw conclusions.
2. Email invitation to view the website overview of the program, including evaluation data and conclusions.

Rationale: Dave has been (and will be) a source of support and information as I plan and implement this program. He will be very helpful in collecting and analyzing data to draw conclusions for the evaluation. Dave will be an integral part of the evaluation team and will have significant input in the production of the evaluation report/website.

Implications: Dave has been tasked by the Science Olympiad organization to get more schools to host Invitationals. This case study, and the outcomes of the evaluation, may be helpful to him at other schools. While case studies like this are not always generalizable in total (Fitzpatrick et al., 2011), there may be individual lessons learned here that can be taken and applied elsewhere.

Stakeholder Involvement: Dave will be an integral part in analyzing data and drawing conclusions, as well as in the production of the evaluation report/website.
Stakeholder: Administration

Reporting Strategy:
1. Email invitation to view the website overview of the program, including evaluation data and conclusions.

Rationale: Again, most of the reasons above apply here as well. The technology reaches people that we do not have direct contact with, and the website is an aesthetically appealing way to present the information.

Implications: The administration wants the good PR from this program, and so has an interest in seeing the final report presented in such a way as to maintain a positive result.

Stakeholder Involvement: The administration will not be part of the evaluation reporting process other than to have access to the final report/website.
Stakeholder: Custodial Staff

Reporting Strategy:
1. Email invitation to view the website overview of the program, including evaluation data and conclusions.

Rationale: Again, most of the reasons above apply here as well. The technology reaches people that we do not have direct contact with, and the website is an aesthetically appealing way to present the information.

Implications: The custodial staff has a vested interest in how the building gets used and the work they will be required to do before, during, and after the Invitational.

Stakeholder Involvement: While the custodial staff will be surveyed and interviewed for data collection, they will not be part of the reporting process other than to receive the finished report via the website.
Stakeholder: Science Olympiad Organization

Reporting Strategy:
1. Email invitation to view the website overview of the program, including evaluation data and conclusions.

Rationale: Again, most of the reasons above apply here as well. The technology reaches people that we do not have direct contact with, and the website is an aesthetically appealing way to present the information.

Implications: The Science Olympiad organization would like to encourage more schools to host an invitational. A positive outcome that is well documented and easily accessible could help them meet their goal.

Stakeholder Involvement: The Science Olympiad organization will not be part of the evaluation process and/or report other than to have access to the finished report/website.
Stakeholder: Women’s Club of Manheim

Reporting Strategy:
1. Email invitation to view the website overview of the program, including evaluation data and conclusions.
2. Presentation at a monthly meeting of the Women’s Club.

Rationale: Again, most of the reasons above apply here as well. The technology reaches people that we do not have direct contact with, and the website is an aesthetically appealing way to present the information. In addition, I would like to be able to thank this group in person for their support, and so a face-to-face presentation would be most appropriate in this case.

Implications: There is no direct impact on the Women’s Club from this program. However, because they have granted us money to support the Invitational, they have an interest in its success or failure.

Stakeholder Involvement: I have recently been awarded a $500 grant from this local organization to support the Science Olympiad Invitational. While not required, I think it would be in our best interests to share information about how the program was implemented, with a focus on how their contribution made this possible. This group will not be directly involved in evaluation decisions, but does have an interest in the outcomes of the program and evaluation.
Values, Standards, and Criteria

The values upon which evaluation decisions are based will be those that are linked to the success of the students involved in the Invitational. Primarily, the students in question are those who will participate as members of the school Science Olympiad team, although a secondary group of students that will also be considered are those from visiting schools and those in the Science Club. Success of the students in this case is determined by the level of preparation for the actual Olympiad competition that the Invitational provides, for both my students and those from other schools. Although the different stakeholder groups may emphasize different criteria in judging success, it can be agreed by those involved that the success of the students is the main focus of the program. Therefore, a “successful” Invitational is one that will produce a Science Olympiad team that is well prepared for the regional Science Olympiad competition. Including the different stakeholder groups in the determination of evaluation values and criteria also reflects an implicit valuing of a democratic process (Hall, Ahn, & Greene, 2012).
Criteria, and the standards by which they are measured, must be determined before the data collection begins in order to avoid disagreements later (Fitzpatrick et al., 2011). The criteria upon which success for this program will be judged include use of the facilities, clear communication throughout the event, improvement in the Science Olympiad participation, and amount of money raised. Standards for these criteria will be determined through conversations with the different stakeholder groups, which will give them the opportunity to contribute to the evaluation process (Hall, Ahn, & Greene, 2012). Flexibility in the criteria and standards will be important with this being a new program (Fitzpatrick et al., 2011). Changes in the original plan may happen at any time, and the evaluation plan may need to change to reflect that.
Potential Ethical Issues

1. I plan to post photos/videos of the events, along with evaluation summaries, for the different stakeholder groups to access. This will make the information accessible to the visiting teams, the parents, the Women’s Club, and the public in general, which creates an ethical issue: the privacy of the individuals whose photos are being published (my students, visiting students, and other coaches and volunteers). I plan to deal with this by including a form in the registration packet indicating that photos/videos may be used for this purpose, so that everyone is aware of this plan.

2. There are several groups of stakeholders that have a significant interest in the evaluation having a positive outcome. It will be important, first, not to be swayed by these groups to present data as more positive than it really is, and, second, to be tactful with any negative conclusions that may result from the evaluation (Fitzpatrick et al., 2011).
References
Baer, W. C. (1997). General plan evaluation criteria. Journal of the American Planning
Association, 63(3). Retrieved from http://www.tandf.co.uk/journals/titles/01944363.asp
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2011). Program evaluation: Alternative
approaches and practical guidelines (4th ed.). Upper Saddle River, NJ: Pearson
Education.
Greene, J. C. (2001). Dialogue in evaluation: A relational perspective. Evaluation, 7(2), 181-
187. Retrieved from http://sagepub.com
Hall, J. N., Ahn, J., & Greene, J. C. (2012). Values engagement in evaluation: Ideas, illustrations,
and implications. American Journal of Evaluation, 33(2), 195-207.
http://dx.doi.org/10.1177/1098214011422592
Laureate Education. (n.d.). Evaluating programs in instructional design: Developing logic
models (outcomes models) [Video file]. Retrieved from https://class.waldenu.edu
Portny, S., Mantel, Jr., S. J., Meredith, J. R., Shafer, S. M., Sutton, M. M., & Kramer, B. (2008).
Project management. Hoboken, NJ: John Wiley & Sons.
Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2011). The program
evaluation standards: A guide for evaluators and evaluation users. Thousand Oaks, CA:
SAGE Publications.