The Research Design
Research for Better Schools
Philadelphia, PA
Jill Feldman, Ph.D., Director of Evaluation
What research questions will we ask about MSRP impact?
1. Does MCLA affect core subject teachers’ knowledge and use of research-based literacy strategies?
2. What are the separate and combined effects of MCLA and Read 180 on students’ reading achievement levels, especially students identified as struggling readers?
3. What are the separate and combined effects of MCLA and Read 180 on students’ achievement in core subjects, especially students identified as struggling readers?
What outcome measures will we use?
1. Iowa Test of Basic Skills (ITBS): Vocabulary, Fluency, Comprehension
2. TCAP: Reading, Social Studies, Science, Mathematics
3. Gateway and End-of-Course Assessments: ELA, Mathematics, and Science
What research questions will we ask about MSRP implementation?
1. To what degree do the implemented MCLA & R180 treatments match the intended program standards and features?
2. What contextual district and school level factors may be influencing the implementation of MCLA & R180?
3. How do the professional development events, materials, or structures present in the control schools compare to what is present in the treatment schools?
Research Design for MCLA
• 4 matched pairs of schools (N = 8) randomly assigned to treatment (MCLA) or control (no MCLA) condition
• Content area teachers in cohort 1 participate in MCLA in Years 1 and 2
• Control group teachers (cohort 2) participate in MCLA in Years 3 and 4
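The pair-level randomization described above can be sketched as follows. This is a minimal illustration of the design, not the actual assignment procedure RBS used; the school labels are hypothetical, since the four matched pairs are not named in the deck.

```python
import random

def assign_matched_pairs(pairs, seed=None):
    """Within each matched pair of schools, randomly assign one school
    to the treatment (MCLA) condition and the other to control."""
    rng = random.Random(seed)
    assignment = {}
    for school_a, school_b in pairs:
        if rng.random() < 0.5:
            treated, control = school_a, school_b
        else:
            treated, control = school_b, school_a
        assignment[treated] = "MCLA"
        assignment[control] = "control"
    return assignment

# Hypothetical labels for the 4 matched pairs (8 schools total)
pairs = [("A1", "A2"), ("B1", "B2"), ("C1", "C2"), ("D1", "D2")]
result = assign_matched_pairs(pairs, seed=1)
```

Randomizing within pairs guarantees the design's balance: each condition gets exactly one school from every pair.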
MCLA: Random Assignment of Schools
MCLA: Exploring Efficacy
Attempts to address questions about whether or not MCLA can work
• Depends upon rapid turnaround of data collected
• Relies upon formative feedback to guide program revisions
• Requires close collaboration among project stakeholders
– To develop measures
– To share information and data
– To communicate regularly about changes and challenges
– To troubleshoot and cooperatively address challenges
Research Design for Read 180TM
• Random assignment of “eligible” students enrolled at 8 SR schools, where eligibility means:
– No prior participation in READ 180™
– Two or more grade levels behind in reading
– Scores in bottom quartile on state assessment (TCAP)
• READ 180™ is the treatment
• Counterfactual (business as usual*) is the control
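The three eligibility rules above can be sketched as a simple screening function. The field names are illustrative assumptions, not the district's actual data elements.

```python
def is_eligible(student):
    """Apply the three READ 180 lottery eligibility rules (hypothetical
    field names): no prior READ 180, >= 2 grade levels behind in
    reading, and bottom quartile on the state assessment (TCAP)."""
    return (
        not student["prior_read180"]
        and student["grade_levels_behind"] >= 2
        and student["tcap_quartile"] == 1
    )

# Illustrative records: only student 1 satisfies all three rules
students = [
    {"id": 1, "prior_read180": False, "grade_levels_behind": 3, "tcap_quartile": 1},
    {"id": 2, "prior_read180": True,  "grade_levels_behind": 3, "tcap_quartile": 1},
    {"id": 3, "prior_read180": False, "grade_levels_behind": 1, "tcap_quartile": 2},
]
eligible_pool = [s["id"] for s in students if is_eligible(s)]
```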
Read 180: Random Assignment of Students
Read 180: Exploring Effectiveness
Attempts to address questions about whether or not Read 180 will work…
• Provides evidence about what happens when R180 is implemented “off the shelf” (without formative evaluation support)
• Requires MCS to set aside local need for feedback to address questions of importance to field
• Establishes a one-way firewall between MCS and RBS
Please review the safety card in the seat pocket…
1. Balance local knowledge of students’ needs within the identified “eligible pool” without creating selection bias
2. Address high rates of student mobility
3. Accurately describe the counterfactual
4. Obtain parental consent (and students’ assent) to administer the ITBS
5. Design procedures to prevent crossover
6. Deal with (inevitable) startup delays
Air Traffic Control: Did Random Assignment Work?
Are the student groups comparable?
• Students eligible for READ 180™: N = 2,277
• Total students in 8 SR schools: N = 6,170
• Students eligible as % of total: 36.9%
• No differences in race, gender, ethnicity, or poverty level between conditions
• Higher % of ELLs in control group (87 of 1,337 students, or 6.5%) than in R180™ (35 of 940 students, or 3.7%)
• Higher % of Sp Ed 8th graders in R180™ group (28.2%) vs control (20.9%)
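A standard large-sample two-proportion z-test can flag whether a baseline imbalance like the ELL difference above is larger than chance. This is a sketch using the counts reported on the slide, not necessarily the balance test RBS actually ran.

```python
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two independent proportions,
    using the pooled estimate for the standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)            # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# ELL counts from the slide: control 87 of 1,337; READ 180 35 of 940
z = two_proportion_z(87, 1337, 35, 940)
```

At the conventional 5% level (|z| > 1.96), the reported ELL gap between conditions would register as statistically significant, which is why it is called out as a comparability concern.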
What, how, and from whom should data be collected?
• Use multiple measures and methods
– Interview developers, instructors, coaches, & principals
– Surveys of teacher knowledge and attitudes
– Focus group discussions with teachers
– Evaluator observations of PD sessions
– Evaluator observations of classroom implementation
• Use data to challenge/confirm findings from single sources
• Share findings with key stakeholders to determine whether:
– data collected are appropriate to support decision making
– evaluation findings reflect actual experiences
– revisions to the logic model, IC map, and/or instruments are needed
Helen/Bob’s piece here…
The Flight Plan
The MCLA Program Logic Model

Inputs: Funding, staff, curriculum resource center (CRC), facilities, incentives, research materials

Activities
Principals
• Attend four three-hour Principal Fellowship sessions each year for two (or four?) years
• Participate in motivational, recruitment, and celebratory events
• Discuss MCLA at faculty meetings
• Conduct walkthrough observations
• Provide opportunities for teacher collaboration
• Allocate space for CRC materials
Teachers
• Attend # weekly MCLA trainings
• Develop and implement 8 CAPs per year?
• Meet with coaches for feedback to improve implementation of MCLA strategies
• Integrate use of leveled texts to support development of content literacy among struggling readers
Students
• Use MCLA strategies to read/react to content-related text (independently? In collaborative groups? Neither? Both?)

Outputs
Principals
• # hours of Principal Fellowship participation
• # of MCLA events attended
Teachers
• # of hours of MCLA training attended
• # hours of coaching (contacts)
• # of CAPs implemented? Observed? Videotaped?
• # of new lesson plans integrating literacy in content area lessons
• # and type of materials checked out of CRC
Students
• # classes taught by teachers participating in MCLA
• # MCLA strategies students learn
• # (freq?) of MCLA strategy use

Memphis Content Literacy Academy Evaluation Logic Model

Short-term Outcomes
Principals
• Awareness of and interest in staff implementation of MCLA concepts and strategies
Teachers
• Increased knowledge of MCLA strategies
• Improved preparedness to use research-based literacy strategies to teach core academic content
• Increased use of direct, explicit instruction to teach research-based comprehension, fluency, and vocabulary strategies in content area classes
• Integrated use of MCLA strategies to support development of content literacy
Students
• Increased familiarity with and use of MCLA strategies when engaging with text
• Increased internalization of literacy strategies
• Increased interest in school/learning

Long-term Outcomes
Principals
• Improved school climate
• School-wide plans include focus on content literacy
• Improved instructional leadership
Teachers
• Increased effectiveness supporting students’ content literacy development
• Continued collaboration among a community of teachers to develop and implement CAPs
Students
• Improved reading achievement and content literacy: 10% increase in students scoring proficient in Reading/LA and other subject areas of TCAP; mean increase of five NCEs on ITBS (comprehension? vocab?)

Overall: higher quality teaching & student achievement
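The student achievement target above is stated in NCEs (normal curve equivalents), a percentile-based scale with a mean of 50 and a standard deviation of about 21.06. As context for that "five NCE" goal, here is a minimal sketch of the standard percentile-to-NCE conversion; it is a reference formula, not an RBS analysis tool.

```python
from statistics import NormalDist

def percentile_to_nce(percentile):
    """Convert a percentile rank (0-100, exclusive) to a normal curve
    equivalent: NCE = 50 + 21.06 * z, where z is the standard normal
    deviate for that percentile."""
    z = NormalDist().inv_cdf(percentile / 100)
    return 50 + 21.06 * z

# By construction, the 50th percentile maps to an NCE of exactly 50,
# and the scale runs from about 1 (1st percentile) to 99 (99th).
midpoint = percentile_to_nce(50)
```

Unlike percentiles, NCEs form an equal-interval scale, which is why gains such as "five NCEs" can be averaged across students.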
Defining what will be evaluated
Developing the MCLA Innovation Configuration (IC) Map
• Involve diverse groups of stakeholders
– The development team
– The implementation team (MCS administrators & coaches)
– Experienced users
– Evaluators
• Identify major components of MCLA
• Provide observable descriptions of each component
• Describe a range of implementation levels
MCLA: The Conceptual Framework
Wheels Up: Resisting Premature Use of “Auto Pilot”
With the IC map guiding development, the following measures were designed to collect data about MCLA implementation:
• Surveys
– Teacher knowledge about & preparedness to use MCLA strategies
– Teacher demographic characteristics
– Teachers’ MCLA feedback
• Interviews
– Principals, coaches, development team, and MCS administrators
• Teacher focus group discussions
Operationally defining components: “Job Definition”
14.1) Role and responsibilities of the Teacher with respect to literacy instruction (implementation levels a–e)
Job definition: When asked, teachers define their job as providing literacy instruction along with their content instruction.
When asked, teachers define their job as covering required subject matter content.
Content of lesson plans: Teachers’ lesson plans show how they plan to integrate instruction on literacy strategies with their instruction on subject matter content.
Teacher lesson plans only show how they plan to teach specific subject matter content.
Implementation of lesson plans: Observation of teachers’ lessons shows that they integrate instruction on literacy strategies with their instruction on subject matter content.
Observation of teachers’ lessons shows that they teach only specific subject matter content.
Aligning the IC Map and Instrument Development: “Job Definition” – Teacher Survey
Thinking about MCLA classes THIS SEMESTER (Fall ’07):
Response options: Strongly Agree | Agree | Disagree | Strongly Disagree

7. It was hard to find the time to attend MCLA classes every week. ○ ○ ○ ○
8. I believe using the strategies I learned in MCLA class will improve students’ understanding of important content. ○ ○ ○ ○
9. The MCLA materials were linked to the district’s content standards. ○ ○ ○ ○
10. My (main) MCLA class instructor modeled how to implement each new strategy, at least once, from beginning to end. ○ ○ ○ ○
11. I used class handouts to plan classroom instruction. ○ ○ ○ ○
12. There is not enough time to add the use of literacy strategies to the existing curriculum. ○ ○ ○ ○
13. I was already familiar with much of the material covered in MCLA classes. ○ ○ ○ ○
14. I am satisfied with my MCLA class experience overall. ○ ○ ○ ○
15. I would be willing to contribute a videotape of my CAP implementation as a tool for use to train teachers. ○ ○ ○ ○
16. I appreciate the chance to collaborate with colleagues in MCLA. ○ ○ ○ ○
17. I found the Joint Productive Activities (JPA) to be useful. ○ ○ ○ ○
18. My students would benefit from my use of JPAs during instructional time. ○ ○ ○ ○
19. I am unsure how useful the MCLA literacy strategies will be for my students. ○ ○ ○ ○
20. It has been difficult to equip my classroom with leveled reading materials given all my other responsibilities. ○ ○ ○ ○
21. I would recommend MCLA classes to fellow teachers. ○ ○ ○ ○
22. MCLA supports achievement of other important goals in my school’s improvement plan. ○ ○ ○ ○
23. I look forward to resuming MCLA classes in the spring. ○ ○ ○ ○
“Job Definition” – Principal Interviews

3. In your view, which staff are responsible for literacy instruction? [Probe: To what extent do you expect content area teachers to address the literacy needs of struggling readers?]
4. What are your school’s main student achievement improvement goals?
9. To the extent that you are familiar with MCLA, how connected/disconnected is the Academy from your school’s current improvement plans? [Probe: Please describe specific links (or disconnects) between MCLA and current improvement plans.]
10. When MCLA is implemented at your school, do you think it will require teachers to do different things in addition to what is already expected of them? [Probe: If yes, please describe whether the additional demands support or conflict with achievement of other/more important priorities.]
11. Do you expect MCLA will have an effect on student learning?
14. What expectations, if any, do you have for teachers’ participation in MCLA next year? [Probe: What percent of eligible teachers do you expect to enroll? Will specific grade level, content area, or teams of teachers be encouraged to participate, or will the decision to enroll be left to the discretion of individual teachers?]
15. Thinking about next year, on a scale of 1 to 5, where “1” is not at all realistic and “5” is very realistic, how realistic is it to expect you to:
a. allow teachers to observe each other’s classes/share ideas? [Probe: If response is 3 or higher, ask “How often?”]
b. discuss MCLA at faculty meetings?
c. attend the annual MCLA kick-off event in August?
d. attend MCLA teacher evening course sessions? [Probe: How many?]
e. attend celebratory events (e.g., the Laureate ceremony)?
f. discuss MCLA with enrolled teachers?
g. recruit new teachers to enroll in MCLA?
h. observe teachers’ use of literacy strategies? [Probe: How often?]
i. provide feedback to teachers about their use of MCLA strategies?
j. participate in all Fellowship activities? [Probe: What level of participation do you think is realistic?]
Where the rubber hits the runway…
Classroom Implementation
Operationally defining components: Implementation of Lesson Plans
14.1) Role and responsibilities of the Teacher with respect to literacy instruction (implementation levels a–e)
Job definition: When asked, teachers define their job as providing literacy instruction along with their content instruction.
When asked, teachers define their job as covering required subject matter content.
Content of lesson plans: Teachers’ lesson plans show how they plan to integrate instruction on literacy strategies with their instruction on subject matter content.
Teacher lesson plans only show how they plan to teach specific subject matter content.
Implementation of lesson plans: Observation of teachers’ lessons shows that they integrate instruction on literacy strategies with their instruction on subject matter content.
Observation of teachers’ lessons shows that they teach only specific subject matter content.
Implementation of lesson plans: Collecting classroom observation data
MSR-COP Data Matrix
Record interval start & end times (__:__ – __:__) for each of Intervals 1–4.
For each interval, code:
Instructional Mode(s)
Literacy Strategy(ies)
Cognitive Demand
Level of Engagement
Instructional Mode Codes
AD Administrative tasks
A Assessment
CD Class discussion
DI Direct, explicit instruction related to a literacy strategy
DP Drill and practice (on paper, vocally, computer)
GO Graphic organizer
HOA Hands-on activity/materials
I Interruption
J Jigsaw
L Lecture
LC Learning center/station
LWD Lecture with discussion/whole-class instruction
OOC Out-of-class experience
RSW Reading seat work (if in groups, add SGD)
RT Reciprocal teaching
SGD Small-group discussion
SP Student presentation
TA Think-alouds
TIS Teacher/instructor interacting with student
TM Teacher modeling
TPS Think-Pair-Share
V Visualization (picturing in one’s mind)
WW Writing work (if in groups, add SGD)
Cognitive Demand Codes
1 = Remember: Retrieve relevant knowledge from long-term memory (recognize, identify, recall)
2 = Understand: Construct meaning from instructional messages, including oral, written, and graphic communication (interpret, exemplify, classify, summarize, infer, compare, explain)
3 = Apply: Carry out or use a procedure in a given situation (execute, implement, use)
4 = Analyze: Break material into its constituent parts and determine how the parts relate to one another and to an overall structure or purpose (differentiate, organize, attribute, outline)
5 = Evaluate: Make judgments based on criteria and standards (check, coordinate, monitor, test, critique, judge)
6 = Create: Put elements together to form a coherent or functional whole; reorganize elements into a new pattern or structure (generate, hypothesize, plan, design, produce, construct)
Level of Engagement Codes
LE = low engagement (≥ 80% of students off-task)
ME = mixed engagement
HE = high engagement (≥ 80% of students engaged)
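If observers log an estimated percent of students engaged per interval, the engagement codes above reduce to a simple rule. This sketch assumes the garbled symbol on the original slide was "≥"; it is an illustration, not the MSR-COP scoring software.

```python
def engagement_code(pct_engaged):
    """Map an observer's percent-engaged estimate (0-100) to the
    MSR-COP engagement codes as reconstructed above."""
    if pct_engaged >= 80:
        return "HE"   # high engagement: >= 80% of students engaged
    if pct_engaged <= 20:
        return "LE"   # low engagement: >= 80% of students off-task
    return "ME"       # mixed engagement: everything in between

codes = [engagement_code(p) for p in (95, 50, 10)]
```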
Implementation of lesson plans: Collecting classroom observation data
LITERACY ACTIVITY CODES
VOCABULARY STRATEGIES
B Bubble or double-bubble map
CC Context clue
E Etymology
G Glossary or dictionary use
IW Interactive word wall use
M Mnemonic strategies
PT Preteaching vocabulary
SFA Semantic feature analysis, maps, word grid
WS Word sorts
FLUENCY STRATEGIES
CR Choral reading/whole-group reading
LM Leveled content materials
PB Paired or buddy reading
RR Repeated oral reading
TRA Teacher models/reads aloud passage
COMPREHENSION STRATEGIES
APR Activate prior knowledge
CT Connecting text to students’ lives
GR Retelling with graphics
MU Monitoring understanding
OR Oral retelling
PV Previewing text (T.H.I.E.V.E.S., L.E.A.R.N., and S.E.A.R.C.H.)
Q Questioning for focus/purpose
QAR Question-answer relationships/ReQUEST
REF Reflection/metacognition
RT Retelling/summarizing with guidance
SGQ Students generating questions
WRITING STRATEGIES
JU Journal or blog use
SW Shared writing
WR Written retelling
Please remain seated with your seatbelts fastened…
• Timely turnaround of data summaries
• Team meetings to debrief/interpret findings
• Testing what you think you “know”:
– Productive (& challenging) conversations
– Data-driven decision making
– Taking action
– Following up (ongoing formative evaluation feedback)
• Elizabeth’s piece here
Complimentary Refreshments: CRC Materials
Category | Resources Used (N=235) | %
National Geographic - Life Science/Human Body | 45 | 19.1%
Social Studies - Various Materials | 25 | 10.6%
National Geographic - US History and Life | 30 | 12.8%
National Geographic - Earth Science | 19 | 8.1%
National Geographic - Life Science | 17 | 7.2%
Science - Various Materials | 15 | 6.4%
National Geographic - Math Behind the Science | 21 | 8.9%
Professional Library | 13 | 5.5%
National Geographic - Science Theme Sets | 13 | 5.5%
Mathematics - Various Materials | 10 | 4.3%
National Geographic - Social Studies Theme Sets | 6 | 2.6%
National Geographic - Ancient Civilizations | 4 | 1.7%
National Geographic - Physical Science | 4 | 1.7%
Professional Development | 3 | 1.3%
Science Matters/Visual Science Encyclopedia | 3 | 1.3%
Science Theme Sets | 5 | 2.1%
US Regions | 2 | 0.9%
Percentage Distribution of Planned Coaching Activities Logged in Year 1 (N=4,233 entries logged)
Activity | Frequency | Percentage
Coach’s administrative tasks | 1,358 | 32.2
Conferencing with teachers | 716 | 17.0
Observation | 698 | 16.5
School administrative tasks | 339 | 8.0
Collaborative teacher support | 330 | 7.8
Coach’s professional development | 303 | 7.2
Assisting teachers in class | 138 | 3.3
Striving Readers evaluation tasks | 138 | 3.3
Helping teachers prepare | 71 | 1.7
Modeling | 59 | 1.4
Videotaping/other | 73 | 1.7
Ground Transportation: The Coaching Role
Trust between coach and teacher(s) is critical:
• To provision of CAP implementation support
– Pre-conference meeting
– CAP observation
– Co-teaching; modeling
– Videotapes for use to train teachers, coaches, evaluators
– Post-observation conference
• To effective and strategic selection of CRC & supplemental resources
Avoiding Wind Shear…
Team’s unwavering commitment to helping teachers support the success of struggling adolescent readers
sum > individual parts
…and we have the data to prove it!
Across grade levels, the picture is the same…
8th Graders’ Reading Levels
School-wide comparisons with schools nationwide