TRANSCRIPT
Evaluation 101
Everything You Need to Know to Get Started Evaluating Informal Science Education Media
Saul Rockman and Jennifer Borse - Rockman et al
Disclaimer:
• This is not a course.
• It is not a certification program.
• No postsecondary degrees will result from your attendance.
• But we do hope that you try this at home!
This presentation is largely based on the NSF Framework for Evaluating Impacts of Informal Science Education Projects (Friedman, Ed., 2008) and our website: http://evaluationspringboard.org/science
Two perspectives:
• The P.I.
• The Evaluator
Chapter 1: Intro to Evaluation

Why do an evaluation?
• Ensure that your product/program is successful…and…
• Prove that it is successful!

What you used to think was a necessary evil…still is!
What is an evaluation?
• Formative Evaluation: helps ensure that your product/program is successful…and…
• Summative Evaluation: proves that it is successful!
“Evaluation is not just for preparing good proposals; it is also an integral part of running good projects.”
Lynn Dierking, in “Framework for Evaluating Impacts of Informal Science Education Projects” (2008)
Formative Evaluation
Focused on the development and improvement of a project:
• Are components of the project being carried out as intended? If not, what has changed and why?
• Is the project moving according to the projected timeline?
• What is working well? What are the challenges?
• Is the budget on track?
• What needs to be done to ensure progress according to plan?
Summative Evaluation
Measures the outcomes and impacts of a project:
• Were the project’s goals met?
• What components of the project were most effective?
• What specific impacts did the project have on intended audiences (as well as secondary audiences)?
Summative Evaluation: Informal Education and Outreach Framework

Impact Category | Public Audiences | Professional Audiences
Awareness, knowledge, or understanding (of) | STEM concepts, processes, or careers | Informal STEM education/outreach research or practice
Engagement or interest (in) | STEM concepts, processes, or careers | Advancing the informal STEM education/outreach field
Attitude (towards) | STEM-related topics or capabilities | Informal STEM education/outreach research or practice
Behavior (related to) | STEM concepts, processes, or careers | Informal STEM education/outreach research or practice
Skills (based on) | STEM concepts, processes, or careers | Informal STEM education/outreach research or practice
Other | Project specific | Project specific

Source: Friedman, A. (Ed.) (March 12, 2008). Framework for Evaluating Impacts of Informal Science Education Projects [online], p. 11. Available at http://insci.org/resources/Eval_Framework.pdf
Three Big Questions:
1. Are you doing what you said you were going to do?
2. How well is it (the project, program, or initiative) going?
3. Does what you are doing have an impact? (See NSF Framework Chapter 3 for more about “impact.”)

Now you’re ready to start!
Chapter 2: Doing an Evaluation

Step 1: Prepare
• Set the stage
• Gather background information
• Develop a Logic Model

Logic Model: What’s going to happen, and why? And don’t forget your Stakeholders!
Logic Model for the ISE Program
(Flow: INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES → STRATEGIC IMPACTS)

INPUTS: Grant recipient; collaborators and consultants; other stakeholders; NSF.

ACTIVITIES:
• Activities that target public audiences: mass media, exhibits, learning technologies, youth/community programs.
• Activities that target professional audiences: seminars/conferences, professional development, materials/publications.

OUTPUTS: Number of attendees, members, users, viewers, visitors, and participants.

OUTCOMES: Awareness, knowledge, or understanding of STEM concepts, processes, or careers; engagement or interest in STEM concepts, processes, or careers; attitude towards STEM-related topics; behavior resulting from engagement; new skills based on engagement; new knowledge and practices that advance the informal education field.

STRATEGIC IMPACTS: The intended strategic impact on the target audience.
Step 2: Design Plan
• Framing questions
• Organizing tools

Identify key questions that will guide the evaluation: What do you want to know? (Remember the Three Big Questions.) Consider: audience, resources, and time.
Constructs: concepts that can be measured
Indicators: examples of success that can be measured
Intended Impacts, Indicators, and Evidence (worksheet; a hypothetical filled-in row appears below)

Category of Impact | Potential Indicators | Evidence That Impact Was Attained
Awareness, knowledge, or understanding of STEM concepts, processes, or careers | … | …
Engagement or interest in STEM concepts, processes, or careers | … | …
Attitude towards STEM-related topics or capabilities | … | …
Behavior resulting from experience | … | …
Skills based on experience | … | …
Other (describe) | … | …
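To make the worksheet concrete, here is a minimal sketch of one filled-in row as a small Python data structure. The indicator and evidence shown are invented examples, not from the presentation:

```python
# Hypothetical filled-in worksheet row; the indicator and evidence are invented.
row = {
    "category_of_impact": "Engagement or interest in STEM concepts, processes, or careers",
    "potential_indicators": ["Visitors seek out the exhibit's companion website after their visit"],
    "evidence_impact_attained": ["Post-visit survey and web analytics show repeat visits to the site"],
}
for field, value in row.items():
    print(f"{field}: {value}")
```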
Question/Method Matrix: a grid that maps each evaluation question to the data-collection method(s) that will answer it.
Step 3: Select Methods

What you measure with depends on what you are measuring.

• Quantitative methods
  • Surveys or questionnaires
  • Objective tests of comprehension
  • Gate counts, television ratings, website hits
  • Time spent in exhibits
  • Number of posts to a website or comments/questions
  • Strengths: analyze a large quantity of data; findings are more generalizable
• Qualitative methods
  • In-depth interviews
  • Focus groups
  • Observations
  • Analysis of authentic data and user/visitor-created products
  • Strengths: get a more in-depth understanding; helps with interpretation of quantitative data
Tips
Think about what you want to know and be able to say at the end of your evaluation.
Tips
Consider what data you may already have access to, i.e., existing data (a sketch for summarizing such data follows this list):
• Gate counts
• Website hits and tracking data
• Television ratings
• User-created products
• Data from past research or evaluation efforts
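If you already have tracking data, a few lines of code can summarize it. A minimal sketch, assuming a hypothetical CSV file gate_counts.csv with columns date and visitors:

```python
# Minimal sketch: summarizing existing attendance data.
# Assumes a hypothetical CSV file "gate_counts.csv" with columns: date, visitors
import csv
from statistics import mean

with open("gate_counts.csv", newline="") as f:
    counts = [int(row["visitors"]) for row in csv.DictReader(f)]

print(f"Days of data:    {len(counts)}")
print(f"Total visitors:  {sum(counts)}")
print(f"Average per day: {mean(counts):.1f}")
print(f"Busiest day:     {max(counts)}")
```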
Tips
Don’t limit yourself to one method; consider using multiple methods (mixed-methods evaluations).
Step 4: Collect Data

Surveys: things to consider:
• Types of questions
• Word choice
• Sampling (see the sketch after this list)
• Pilot!
• Anonymity
• Personal questions
• Keep it short
• Strategies for collection
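For the “Sampling” bullet above, a minimal sketch of drawing a simple random sample of survey recipients. The visitor list here is stand-in data:

```python
# Minimal sketch: simple random sampling of survey recipients.
import random

# Stand-in data: in practice this would be your real contact list.
visitors = [f"visitor{i}@example.org" for i in range(500)]

random.seed(42)                          # fixed seed makes the draw reproducible
sample = random.sample(visitors, k=50)   # 50 recipients, without replacement
print(sample[:5])
```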
Types of Survey Questions

Yes/No (or alternative response)
• Example: “Cats are better than dogs.” ( ) Yes ( ) No ( ) No response (or N/A)
• Choice: Choose one. You may want to add an “other” or “not applicable.”
• Considerations: Responses need to be exclusive of each other and inclusive of all possible responses. Flat data; simple to interpret and aggregate.
• Resulting stats: % yes; % no; % who respond.

Multiple Response
• Example: Check all that apply: “Cats are better than dogs.” “Cats and dogs are good pets.” “Cats and dogs can get along.” “People have different perspectives on cats and dogs.”
• Choice: Choose one or more responses.
• Considerations: If data are totaled, the sum of responses may equal more than 100% if more than one response is allowed.
• Resulting stats: % of respondents who choose each alternative; % who respond.

Scaled
• Example: a. “Cats are better than dogs.” Agree 1 2 3 4 5 6 7 Disagree. b. “Most often, dogs are better pets than cats.” Never 1 2 3 4 5 6 7 Always.
• Choice: Choose one number from a continuum.
• Considerations: You can report an average. Types of scales include agreement, comfort level, skill level, and frequency. Whether you use a 3-, 4-, …, or 10-point scale may affect your results.
• Resulting stats: You may report the % who chose any one particular number, but more often the mean is reported; % who respond.

Category or Interval Scale
• Example: “Cats are better than dogs.” ___ Agree ___ Agree a little ___ Disagree a little ___ Disagree
• Choice: Choose one of several choices. You may want to add an “other” or “not applicable.”
• Considerations: Whether you use a 3-, 4-, …, or 10-point scale may affect your results. Some believe you can report an average. Responses need to be exclusive of each other and inclusive of all possible responses.
• Resulting stats: % who chose any one particular category, and, though controversial, often the mean is reported; % who respond.

Ranking or Ordering
• Example: “Please rank the following from one to four, with one being the best and four being the worst: Cats, Dogs, Giraffes, Platypuses.”
• Choice: Rank the choices with numbers on a scale (e.g., 1-4).
• Considerations: Allows direct comparison between choices. Sometimes people do not feel comparison is appropriate. It is important to choose similar choices.
• Resulting stats: Average rank for each choice across participants; % which make all lists; % who respond.

Open-ended
• Example: “Please explain why cats are better than dogs in the space below.”
• Choice: Write a response.
• Considerations: Difficult to interpret and aggregate; data must be coded to sort. Good for quotes. Agreement here means more than agreement with pre-determined choices. The question must be clear enough to get the types of answers desired.
• Resulting stats: Categorize the responses and report the % of responses that fall into each category; % who respond.
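A minimal sketch of computing the “resulting stats” described above, using made-up responses for a yes/no item and a 7-point scaled item:

```python
# Minimal sketch: computing survey stats from made-up responses.
from statistics import mean

# Yes/No item: "Cats are better than dogs." (None = no response)
yes_no = ["yes", "no", "yes", "no", "no", None]
answered = [r for r in yes_no if r is not None]
print(f"% yes:         {100 * answered.count('yes') / len(answered):.0f}%")
print(f"% no:          {100 * answered.count('no') / len(answered):.0f}%")
print(f"% who respond: {100 * len(answered) / len(yes_no):.0f}%")

# Scaled item (1 = Agree ... 7 = Disagree): the mean is most often reported.
scaled = [2, 5, 4, 7, 3, 4]
print(f"Mean rating:   {mean(scaled):.2f}")
```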
Step 4: Collect Data

Assessments: A, B, C, D, E. Get your #2 pencils ready!
• Objective assessment: only one right answer
• Subjective assessment: more than one correct answer
Step 4: Collect Data

Interviews & Focus Groups
• Interviews: structured vs. in-depth; one-on-one or pairs
• Focus groups: small group (8-12); find group consensus; encourage diversity in responses
Step 4: Collect Data

Observations
• Who, what, where, when, how?
• Use a rubric or structured protocol to ensure consistent data collection (see the sketch below)
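One way to keep observation data consistent is to fix the fields every observer records. A minimal sketch; all field names and the rubric scale are hypothetical:

```python
# Minimal sketch of a structured observation record; fields are hypothetical.
from dataclasses import dataclass, asdict

@dataclass
class Observation:
    observer: str     # who collected the data
    location: str     # where
    timestamp: str    # when
    group_size: int   # who was observed
    activity: str     # what they were doing
    engagement: int   # how engaged, rated 1 (low) to 5 (high) per the rubric

obs = Observation("JB", "water table exhibit", "2024-05-01 10:15",
                  group_size=4, activity="group experiment", engagement=4)
print(asdict(obs))
```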
Step 5: Analyze Data
• Quantitative
  • Prepare the data: code, enter, and check for errors
  • Run analyses: what differences and patterns do you see? (See the sketch after this list.)
• Qualitative
  • Coding: start general…then get more specific
  • Use instruments and goals to guide analysis
• Integrate/Synthesize
  • Use data from different sources to get the big picture and draw conclusions
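A minimal sketch of both analysis paths; the scores, scale, and category codes below are invented for illustration:

```python
# Minimal sketch: preparing and analyzing evaluation data (all data invented).
from collections import Counter
from statistics import mean

# Quantitative: prepare the data (check for errors), then run analyses.
pre  = [3, 4, 2, 5, 3, 9, 4]   # 1-5 scale; the 9 is a data-entry error
post = [4, 5, 4, 5, 4, 5, 5]

def clean(scores, lo=1, hi=5):
    """Drop values that fall outside the instrument's scale."""
    return [s for s in scores if lo <= s <= hi]

pre, post = clean(pre), clean(post)
print(f"Pre mean: {mean(pre):.2f}   Post mean: {mean(post):.2f}")

# Qualitative: code open-ended responses (general first), then tally the codes.
coded = ["interest", "knowledge", "interest", "attitude", "interest"]
for code, n in Counter(coded).most_common():
    print(f"{code}: {n} response(s)")
```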
Step 6: Take Action!
• Report
  • Be clear and concise; provide adequate evidence for claims and enough explanation to make sure readers understand your interpretation of the data
  • You don’t have to report on every piece of data or every finding: know when to say when!
• Make Recommendations/Changes
  • Be specific
  • Plan for further evaluation after changes are made
Stacking the Deck for Your Evaluation
• Start early. Leave time for aggregating your data; data collection and aggregation can be very time consuming.
• Design your evaluation paying careful attention to your proposal goals.
• Consider your project’s realm of influence broadly. Your project is likely to reach many different audiences; think broadly when you are considering who to gather data from.
• Be realistic: think about measurable outcomes. Given time and logistical constraints it can be difficult to measure certain impacts or outcomes, so be realistic when you are setting your evaluation goals. Although your project may be successful in changing many things, try to concentrate on those for which you can collect data.
• Start small and keep things simple. Brainstorm ideas for evaluation activities which measure the broad scope and specific focus of your goals, then design evaluation tools as simply as possible, documenting how much time it will take to develop, collect, and analyze the data and write the report. Be realistic.
• Build it in. As you plan your evaluation timeline, try to incorporate the administration of all data collection activities into everyday activities.
Take a look at our website:
http://evaluationspringboard.org/science