AERA Annual Meeting 2004, San Diego, April 14, 2004
Optimizing Evaluation Quality & Cost Effectiveness
Evaluating the University of Texas Master Teacher Summer Institute (MTSI)
Rahel Kahlert, MPAEvaluator, Dana Center, University of Texas at Austin
Mary Walker, Ph.D.Director of the MTSI, University of Texas at Austin
Slide 2
About the Master Teacher Summer Institute
• Three weeks of training for in-service teachers, attending in teams or individually
• Academic setting in which participants earn graduate credit
• Inquiry/problem-based approach to learning in collaborative groups
• Goal: to increase teachers’ content and pedagogical knowledge in physics
Slide 3
About inquiry-based physics
• Physics by Inquiry was developed by McDermott and PERG at the University of Washington
• In-service teachers learn to teach science as an inquiry-based process
• One week each on:
  • Electric circuits
  • Optics
  • Kinematics (developed by Marshall & Castro)
Slide 4
Evaluation standards used
• Service orientation
  • Identify stakeholders
  • Make the evaluation useful for decision makers
• Cost effectiveness
  • Limited resources available
  • Maximizing resources
• Balance different evaluation standards and their purposes
Slide 5
Data sources
Triangulation of data
• To collect stakeholder perceptions
  • Application materials (use of project records to reduce data collection)
  • Focus groups
  • Open-ended surveys
  • Follow-up interviews
• To measure teacher content knowledge
  • Pre-assessments
  • Post-assessments
Slide 6
Stages of data collection
From snapshot to continuous data collection
• Previous evaluations:
  • One survey at the end of the MTSI
  • Snapshot approach
• Current evaluation has several stages:
  • Prior to the Institute
  • During the Institute
  • At the end of the Institute
  • Follow-up several months later
Slide 7
Qualitative data
• Focus groups paired with surveys (pluses and minuses)
• Open-ended short surveys
  • ++ Capture each teacher’s individual voice
  • ++ Provide reflection time
  • – – Not interactive
• Focus groups
  • ++ Interactive, allowing the evaluator to “dig deeper”
  • ++ Crystallize main themes and controversial issues
  • – – Individual voices might be overlooked
• Combining both maximizes the advantages of each
Slide 8
Quantitative data
• Use of pre-existing validated assessments
  • Cuts the cost of developing assessments
  • Reduces threats to validity
  • Baseline data available
  • Quasi-control groups available
• Methods
  • Test of the difference between two paired means
  • Grouping test items into clusters for detailed content analysis
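The paired-means test mentioned above can be sketched in a few lines. This is a minimal illustration, not the presenters' actual analysis: the scores are hypothetical, and a real analysis would also look up a p-value for the resulting t statistic.

```python
from statistics import mean, stdev
from math import sqrt

def paired_t_statistic(pre, post):
    """Paired t-test statistic for matched pre/post assessment scores.

    Tests whether the mean of the paired differences (post - pre)
    differs from zero. Returns (t, degrees of freedom = n - 1).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n)), n - 1

# Hypothetical pre/post content-assessment scores for eight teachers
pre = [42, 55, 38, 61, 47, 50, 44, 58]
post = [58, 63, 52, 70, 60, 57, 55, 66]
t, df = paired_t_statistic(pre, post)
print(round(t, 2), df)
```

Pairing each teacher's pre- and post-score (rather than comparing group means) controls for the wide spread in baseline knowledge, which is why the paired form of the test fits a pre/post evaluation design.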
Slide 9
Sample finding
• Inquiry-based approach
  • Addresses teachers’ need for hands-on activities
  • Promotes teachers’ (and students’) visual and conceptual understanding
• Follow-up evaluation shows:
  • Teachers used the actual activities in their classrooms
  • They shared these activities with other teachers
Slide 10
Sample finding
• The two aspects of inquiry-based physics
  • Helps to increase teachers’ content knowledge
  • Shows how teachers can use inquiry-based instruction to help students learn
• Follow-up evaluation shows:
  • Teachers perceived that the first aspect was emphasized over the second
  • Teachers were anxious about applying the method in their own classrooms
Slide 11
Sample finding
• Alignment
  • Topics were horizontally and vertically aligned
  • Close collaboration among MTSI faculty
  • No collaboration with MTSI Calculus
• Follow-up evaluation shows:
  • Teachers took the initiative in promoting alignment within their departments
  • Teachers organized professional development around alignment
Slide 12
Sample recommendation
Highlight the pedagogical aspect of inquiry-based teaching
• Teach how to ask inquiry-based questions
• Step back and explain why a section was taught in a certain way
• Teachers were taught through the inquiry-based process, but wanted more emphasis on how to apply this process in their own classrooms
Slide 13
Sample recommendation
• Carefully address participants’ mixed experience levels
• For example: Provide technology training both before and during the MTSI to address teachers’ mixed technological knowledge and interests
• For example: Consider a pre-training session for basic mathematics concepts for teachers with weaker math backgrounds
Slide 14
Next steps
• Current question:
  • How does the MTSI benefit teachers’ content knowledge and pedagogical knowledge?
• Future question:
  • Does an increase in teacher content and pedagogical knowledge through the inquiry-based MTSI have a positive impact on student learning?
• Currently relying on teacher self-reports
Slide 15
Conclusion
• It is essential that the evaluator and stakeholders (e.g., the program director) work closely together
  • To ensure the usefulness of the evaluation
  • To avoid duplicating work
• Several steps help cut costs:
  • Use project-records data
  • Use validated, existing assessments
  • Use focus groups rather than individual interviews
Slide 16
Contact information
Mary Walker
Rahel Kahlert