

EXAMINING THE EFFECTS OF MATHEMATICS AND SCIENCE PROFESSIONAL DEVELOPMENT ON TEACHERS’ INSTRUCTIONAL PRACTICE:

USING PROFESSIONAL DEVELOPMENT ACTIVITY LOG

Longitudinal Study of the Effects of MSP-Supported Professional Development On Improving Mathematics and Science Instruction

National Science Foundation MSP-RETA grant

December 2006

Council of Chief State School Officers Washington, DC


COUNCIL OF CHIEF STATE SCHOOL OFFICERS

The Council is a strong advocate for improving the quality and comparability of assessments and data systems to produce accurate indicators of the progress of our elementary and secondary schools. The CCSSO education indicators project is providing leadership in developing a system of state-by-state indicators of the condition of K-12 education. Indicators activities include collecting and reporting statistical indicators by state, tracking state policy changes, assisting with accountability systems, and conducting analyses of trends in education.

STATE COLLABORATIVE ON ASSESSMENT AND STUDENT STANDARDS – SURVEYS OF ENACTED CURRICULUM

CCSSO leads a collaborative of state and district education agencies that pool resources to plan and implement projects applying the Surveys of Enacted Curriculum (SEC), to conduct alignment analyses, and to train leaders in using data with local educators. The Surveys of Enacted Curriculum (SEC) are a practical, reliable set of data collection tools used with teachers of mathematics, science, and English language arts (K-12) to collect and report consistent data on current instructional practices and the content being taught in classrooms. The resulting data provide an objective method for educators to analyze the degree of alignment between current instruction and state standards and assessments. For further information, go to http://www.SECsurvey.org.

This research paper summarizes findings from a three-year longitudinal study conducted by the Council of Chief State School Officers with subcontracts to the American Institutes for Research (Washington, DC) and the Wisconsin Center for Education Research (Madison, WI), supported by a grant from the National Science Foundation, Math Science Partnership Program, RETA grant (EHR-0233505).

Council of Chief State School Officers Elizabeth Burmaster (Wisconsin), President

Rick Melmer, President-Elect Gene Wilhoit, Executive Director

Rolf K. Blank, Director of Education Indicators

Copies of this report may be ordered for $10 per copy from:

Council of Chief State School Officers

Attn: Publications One Massachusetts Ave., NW, Suite 700

Washington, DC 20001 202-336-7016

Fax: 202-408-8072 http://www.ccsso.org/publications/


EXAMINING THE EFFECTS OF MATHEMATICS AND SCIENCE PROFESSIONAL DEVELOPMENT ON

TEACHERS’ INSTRUCTIONAL PRACTICE: USING PROFESSIONAL DEVELOPMENT ACTIVITY LOG

Kwang Suk Yoon Michael Garet

Beatrice Birman Reuben Jacobson

American Institutes for Research

December 2006

This research report summarizes study results as of Year 3 of a three-year longitudinal study conducted by the Council of Chief State School Officers with subcontracts to the American Institutes for Research (Washington, DC) and the Wisconsin Center for Education Research (Madison, WI), supported by a grant from the National Science Foundation, Mathematics Science Partnership (MSP) Program, RETA grant (EHR-0233505). We would like to thank the MSP project leaders, school districts, and teachers who were involved in this study and the following people for their collaborative work: Rolf Blank (CCSSO); John Smithson, Alissa Minor (WCER); Andrew Porter (Vanderbilt University); and Elham-Eid Alldredge, Craig Gilmore, and Joan Wang (REDA International Inc.).


TABLE OF CONTENTS

Overview
Goals of MSP-RETA Study of Professional Development
Study Design and Instruments
Measures of Teaching Practice and Professional Development
Descriptive Data on Teachers' Participation in Professional Development
Descriptive Data on Changes in Teachers' Instruction
Analysis Strategy: SEC and PDAL
Math and Science Analysis Results
Conclusion
Appendices
Appendix A. Response Options for Key Survey Items & Scales
Appendix B. SEC Mathematics and Science Scales
Appendix C. Analyses of Variance Tables
Appendix D. Longitudinal Analysis Graphs by MSP Sites
References


Overview

Over the past decade, a large body of literature has emerged on effective professional development, teacher learning, and teacher change (Carey & Frechtling, 1997; Cohen & Hill, 1998; Kennedy, 1998; Loucks-Horsley et al., 1998; Richardson & Placier, 2001; Supovitz, 2001). Despite the amount of literature, however, relatively little systematic research has been conducted on the effects of different professional development programs on improving teaching or improving student outcomes (see Kennedy, 1998; Loucks-Horsley & Matsumoto, 1999; Supovitz, 2001; for literature review). While a professional consensus has emerged suggesting that particular characteristics of professional development make it “high quality” or “effective,” there has been little direct evidence on the extent to which these characteristics are related to better teaching and increased student achievement (see Garet et al., 1999; Hiebert, 1999; and Loucks-Horsley et al., 1998; for exceptions).

As part of its National Evaluation of the Eisenhower Professional Development Program, American Institutes for Research (AIR) developed a model to analyze the relationship between features of professional development and teachers’ self-reported increases in knowledge and skills and changes in teaching practice. On the basis of national data, AIR concluded that six key features of professional development are effective in improving teaching practice. Three are characteristics of the structure of the activity: (1) the type of the activity—whether it is a reform type such as a study group or teacher network, in contrast to a traditional workshop or conference, (2) the duration of the activity, including the total number of contact hours and the span of time over which it extends, and (3) the extent to which the activity has collective participation of groups of teachers from the same school, department, or grade. The remaining three features are characteristics of the substance of the activity: (4) the degree to which the activity has active learning opportunities for teachers, (5) the extent to which the activity has a content focus on mathematics or science, and (6) the degree to which the activity promotes coherence in teachers’ professional development by incorporating experiences that are consistent with teachers’ goals and aligned with state standards and assessments (see Garet, et al., 1999; Garet et al., 2001).

A longitudinal study that AIR conducted as part of the Eisenhower evaluation lends support for the validity of the six quality features of professional development activities (Desimone, Porter, Garet, Yoon, & Birman, 2002). For example, the study found that professional development focused on specific, higher-order teaching strategies increased teachers’ use of those strategies in the classroom. Moreover, this effect was even stronger when the professional development activity had features of high quality as described above (e.g., active learning, coherence, and collective participation). These findings from the AIR study are consistent with the available recent studies that found a relationship between content focus in professional development and effectiveness in improving teaching (e.g., Cohen & Hill study of California’s professional development with the state mathematics curriculum framework, 2001; Kennedy, 1998; Brown, Smith & Stein, 1996; Zucker et al., 1998).

AIR’s longitudinal study has a number of strengths. First, the study design was rigorous in that it examined the effects of Year 2 professional development features on Year 3 instructional practices, controlling for Year 1 instructional practices and other variables such as characteristics of the teacher, target class, and school. Second, the features of professional development were measured in Year 2 on a separate survey, independent of the surveys of instructional practices in Years 1 and 3. Third, the study ensured that teachers reported on the same target class (course) at each wave of data collection, ruling out the possibility that any change in classroom instruction over time resulted from a change in the course teachers taught during the study period. Lastly, the analytic approach took into account the complex, longitudinal data involving multiple measures of instructional practices within years.


However, there is an important limitation in AIR’s longitudinal study of the effects of professional development on instructional practices. As acknowledged by Desimone et al. (2002), the study did not obtain a comprehensive and complete description of each respondent’s entire set of professional development experiences in mathematics and science. Instead, the study was based on a single professional development activity for each teacher. Each teacher was asked to report on one professional development activity he or she participated in over the prior year, choosing the professional development activity to report on based on its perceived helpfulness to the target class for which the teacher also provided data on instruction.

Focusing on a single professional development activity to estimate the effects of professional development on teaching practice may underestimate or overestimate the true effect. In the real world, teachers are exposed to a wide array of professional development activities that vary in their quality and effectiveness (Cohen & Hill, 1998; Kennedy, 1998; Loucks-Horsley et al., 1996). Teachers’ instructional practices are likely to be affected by the various professional development activities they experience, even though some activities are more useful or effective than others. Therefore, it is crucial to secure an accurate representation of teachers’ actual participation in all of their professional development activities in order to gauge the true impact of professional development on teachers’ instructional practice.

This study builds upon the AIR Eisenhower evaluation. We have adopted the six quality features of professional development activities as well as the measurement scheme of comparing Year 3 to Year 1 instructional practices, utilizing Year 2 professional development features as the linchpin for change. We go further, however, to obtain estimates of the effects of the full portfolio of teachers’ professional development on teaching practice by examining a comprehensive and complete description of each respondent’s set of professional development experiences in secondary mathematics and science. To these ends, we developed an innovative approach using teacher logs to capture the full scope of professional development activities that teachers experience over an extended period of time.

Organization of the Report

In this report, we first describe the goals and design of our study. Then, we give an account of the methods used for the study. Specifically, we describe the sample and instruments used in the study to collect data on the relevant measures. In doing so, a particular emphasis is placed on a new approach called the Professional Development Activity Log (PDAL) that AIR developed to capture the complete portfolio of teachers’ ongoing professional learning activities (Yoon et al., 2004). Next, we present descriptive data collected through the PDAL. This is followed by an explanation of the statistical model that we employed to analyze longitudinal data relating to the effects of professional development on teaching practice. Finally, we present the results of our longitudinal data analysis by illustrating a few key measures of teachers’ use of instructional strategies in the classroom.


Goals, Research Questions, and Study Design

In collaboration with the Council of Chief State School Officers (CCSSO) and the Wisconsin Center for Education Research (WCER), AIR conducted a three-year empirical study to assist the National Science Foundation (NSF) and its Mathematics-Science Partnership (MSP) Program with building capacity to improve teacher knowledge and skills through professional development. The CCSSO/AIR/WCER evaluation study was initiated through a Research, Evaluation, and Technical Assistance (RETA) grant to address the lack of large-scale, empirical evidence on the effects of professional development on mathematics and science instruction. Toward that goal, we developed and tested a new methodology to measure the quality of the complete set of teachers’ professional development activities over a 15-month period. In addition, we used the data obtained to examine the effects of professional development on improving the quality of instruction in mathematics and science education. The study has two main research questions:

• To what extent is the quality of the professional development supported by MSP activities consistent with research-based definitions of quality?

• What effects do teachers’ professional development experiences have on instructional practices and content taught in mathematics and science classes? For example, are high-quality professional development activities more likely than lower-quality activities to bring about changes in teachers’ classroom practices?

In order to address these research questions, the CCSSO/AIR/WCER team’s longitudinal study design involved three main steps. Exhibit 1 outlines the conceptual framework for the study and highlights key data collection activities and time frames.

(1) In Year 1 of the study (2002-03), using teacher surveys called the Survey of Enacted Curriculum (SEC), we collected baseline data on teachers’ instructional practice (e.g., the content coverage and the strategies that teachers employ in mathematics and science instruction) prior to the start of MSP professional development activities;

(2) In Year 2 (2003-04), using PDAL, we measured the characteristics and quality of the professional development activities in which teachers participated; and

(3) In Year 3 (2004-05), we re-examined instructional practice using the SEC to determine if teachers’ instruction had indeed changed after participation in MSP-supported activities, and analyzed the change in practice in relation to degree of alignment with state content standards.

As Exhibit 1 demonstrates, similar to AIR’s earlier longitudinal study (Desimone et al., 2002), the basic design of this study was to investigate the effects of Year 2 professional development features on Year 3 instructional practices, controlling for Year 1 instructional practices and other variables such as characteristics of the teacher, target class, and school. One of the goals of this study was to identify features of professional development that are likely to be related to changes in teachers’ instructional practice.
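As an illustration of this analytic design, the minimal sketch below regresses a Year 3 practice measure on Year 2 professional development features while controlling for Year 1 practice and covariates. This is not the study's actual analysis code; the variable names and the simulated data are assumptions introduced for illustration only.

```python
# Minimal sketch (assumed variable names, simulated data) of the basic design:
# Year 3 practice ~ Year 2 PD features + Year 1 practice + teacher covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "y1_practice": rng.normal(2.0, 0.5, n),       # baseline (Year 1) practice scale
    "pd_content_focus": rng.uniform(0, 1, n),     # Year 2 PD quality feature (hypothetical)
    "pd_active_learning": rng.uniform(0, 3, n),   # Year 2 PD quality feature (hypothetical)
    "treatment": rng.integers(0, 2, n),           # targeted for MSP participation
    "years_experience": rng.integers(1, 30, n),   # teacher covariate
})
# Simulated outcome: Year 3 practice depends on baseline practice and PD features plus noise.
df["y3_practice"] = (0.6 * df["y1_practice"] + 0.4 * df["pd_content_focus"]
                     + 0.1 * df["pd_active_learning"] + rng.normal(0, 0.3, n))

model = smf.ols(
    "y3_practice ~ y1_practice + pd_content_focus + pd_active_learning"
    " + treatment + years_experience",
    data=df,
).fit()
print(model.summary())
```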


Exhibit 1: Conceptual Framework

[Diagram: Year 1 (before MSP program): Survey of Enacted Curriculum, Wave 1, measuring instructional practice (content and activities/strategies). Year 2 (during MSP program): Professional Development Activity Log, measuring professional development experiences. Year 3 (after MSP program): Survey of Enacted Curriculum, Wave 2, measuring instructional practice (content and activities/strategies). Characteristics of teacher, target class, and school enter as controls.]

Sample

The study design included four selected MSP sites (or projects) from the first cohort of NSF MSP awards (Fall 2002):

• Brockport/Rochester, NY
• Cleveland, OH
• Corpus Christi area, TX
• El Paso area, TX

The projects were selected based on their MSP design, which was to provide professional development to middle school mathematics and science teachers during the second year, allowing for measurement of change over time within the three-year scope of the study.

Ideally, to obtain unbiased estimates of the effects of MSP professional development on instruction, we would have liked to assign teachers randomly to participate in MSP or to serve as a control group. Because the MSP projects were already underway when our study began, however, random assignment was not feasible. Thus, the study used a quasi-experimental design. Within each site, at the start of the baseline year (Year 1), we worked with site staff to identify a group of teachers targeted for participation in MSP activities and a matched set of comparison teachers not selected for participation.

The specific approach to matching the targeted and comparison teachers varied across the four sites, depending on the way each site organized its MSP efforts. In Brockport (site 1), the MSP project sought applicants for participation in MSP activities. We identified applicants who were selected for participation as the treatment group, and applicants not selected as the comparison group. In Cleveland (site 2), the MSP project selected teachers for participation in MSP activities, and then provided us with the names of other non-participating teachers to serve as a comparison. In the Corpus Christi area (site 3), all mathematics teachers participating in MSP professional development in Year 1 were selected as the treatment group, and other mathematics and science teachers in the target schools were assigned to the comparison group. Finally, in the El Paso area (site 4), the MSP project was using a school-based approach to the delivery of professional development. The project identified six middle schools for participation in MSP and identified five other schools located in the same districts as comparison schools.

For simplicity, in the text that follows, we refer to the teachers targeted for MSP participation in each site as the treatment group, and we refer to the teachers identified as comparisons as the comparison group. The nature of the professional development “treatment” received by teachers of course varied from site to site, and also within each site. Thus our use of the term “treatment” should not be interpreted to mean that there is a single, standardized “MSP treatment,” but as a short-hand for “teachers targeted for MSP participation.” Because our treatment and comparison teachers were not randomly assigned, we have measured a large number of teacher characteristics at baseline (Year 1), to use as potential control variables in our analyses of the effects of professional development, including teacher background, prior instruction, and participation in professional development. We believe these variables should help reduce potential biases in the analysis. But we cannot rule out the possibility that teachers in the treatment and comparison groups differ in unmeasured ways – for example, in motivation. Thus, conclusions about the effects of professional development should be interpreted as potential effects; a more rigorous design would be required to draw causal conclusions.

In total, 476 teachers from the four MSP sites made up the intended sample for the study. See column 1 of Exhibit 2 for the size of the intended teacher sample by MSP site and treatment status.

Exhibit 2: Study sample: By MSP site and treatment status

MSP Site          Treatment   Comparison   Total
Brockport              48          43         91
Cleveland             112          68        180
Corpus Christi         42          51         93
El Paso                61          51        112
Total                 263         213        476


Study Instruments

For this study, we developed and tested the use of two instruments: the Professional Development Activity Log (PDAL) and the Surveys of Enacted Curriculum (SEC). Using the SEC, we collected data on teachers’ instructional practice in Year 1 and Year 3 from teachers in the four MSP sites. In Year 2, we gathered information about teachers’ participation in professional development activities using the PDAL. We first present a detailed description of the PDAL, and then provide a brief description of the SEC.

PROFESSIONAL DEVELOPMENT ACTIVITY LOG (PDAL)

As mentioned earlier, in order to obtain unbiased estimates of the effects of professional development on teaching practice, it is essential to secure a full account of teachers’ professional development experiences. In collaboration with CCSSO and WCER, AIR developed the Professional Development Activity Log (PDAL) to examine the scope, nature, and quality of the wide array of professional development activities that teachers take part in over an extended period of time. This new tool was built on AIR’s prior work on the National Evaluation of the Eisenhower Professional Development Program (Garet et al., 2001). The PDAL is a web-based, self-administered, longitudinal data collection tool with which teachers record their professional development experiences in detail with the assistance of a series of structured prompts.

Teachers logged on to their password-protected web account and filled out their PDAL at least once a month, even if they did not participate in any professional development activities. In the PDAL, teachers were prompted each month to answer the following questions about each professional development activity in which they participated:

• Name of activity
• Number of hours spent on each activity and its duration
• Whether the activity is a one-time event or a continuous event (i.e., recurring over a number of months)
• Type of activity (e.g., workshop, summer institute, study group)
• Purpose of activity (e.g., strengthening subject matter knowledge)
• Professional development quality features (e.g., active learning, coherence, collective participation; see Garet et al., 1999)
• Content topics (e.g., algebraic concepts: absolute values, use of variables, etc.)
• Instructional practice – instructional strategy topics covered in each activity (e.g., use of calculators, computers, or other educational technology)
• Materials used during each activity

COMPLEX PATTERNS OF TEACHERS’ PARTICIPATION IN PROFESSIONAL DEVELOPMENT

The advantages of the PDAL come in large part from the nature of the data that it produces: the PDAL captures the complexity of teachers’ participation in a wide variety of professional development activities.

The pattern of participation in professional development may vary widely by individual teachers, by the month of the year, or by the type of activities. To illustrate a variety of patterns of professional development participation, Exhibit 3 displays hypothetical patterns of participation in 7 professional development activities (A through G) by 5 teachers over the period of 8 months (July 2003 through February 2004). The total number of hypothetical monthly logs constructed for the combination of the 7 professional development activities and 5 teachers is 20. Each “1” in the cell of Exhibit 3 represents a monthly log.


First, some teachers may be more active in professional development than others. As hypothetical data in Exhibit 3 illustrates, for example, Mr. Anderson participated in three activities (A, B, and C) during the 8-month period and completed 6 separate monthly logs for the activities. During the same period, however, Mrs. Smith kept only one monthly log for a single activity (G).

Second, the activity level may fluctuate by the month of the year. For example, four of five teachers were actively engaged in professional development in August 2003, while only one teacher was active in professional development in December 2003.

Third, the pattern of participation in professional development may be determined by the type of activity being offered. For instance, activities C and G are one-time events that occur within a single month. A typical district workshop and a national conference are examples of such one-time activities. Other activities, like A, B, and D, continue into following months. A study group or task force may be such a continuous activity requiring a teacher’s prolonged involvement. Some continuous activities, such as A and D, are conducted in consecutive months, while others, like B, occur in non-consecutive months (e.g., every other month). Further, some teachers may attend multiple activities within a month, as illustrated by the activities of Mr. Lee during September 2003. Still others may not have any activity to report for some of the months during the study period. For example, Mr. Anderson was inactive during December 2003 and February 2004.

Lastly, some activities (e.g., A and E) are commonly reported by more than one teacher, while others (e.g., B, C, and G) are reported by only a single teacher. For example, Mr. Anderson and Ms. Lopez filled out separate logs for activity A in which they shared common experience during July and August of 2003. Even for the common activity, two teachers may differ in their overall assessment of their professional development experience. In sum, the PDAL exposes these various patterns of activity participation through collecting disaggregated monthly log data about each and every professional development activity (or lack thereof) that individual teachers experience over the course of the study period.

Exhibit 3: Basic monthly activity log data: Log-level data disaggregated by teacher, by activity, and by month (hypothetical data)

Teacher        Activity   Jul-03  Aug-03  Sep-03  Oct-03  Nov-03  Dec-03  Jan-04  Feb-04   # of logs
Mr. Anderson      A          1       1                                                         2
                  B                          1               1               1                 3
                  C                                  1                                         1
Ms. Lopez         A          1       1                                                         2
                  D                                          1       1       1       1         4
Mrs. Kelly        E                  1       1                                                 2
Mr. Lee           E          1       1       1                                                 3
                  F                          1       1                                         2
Mrs. Smith        G                                          1                                 1
# of logs per month          3       4       4       2       3       1       2       1        20
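To make the structure of these log-level data concrete, the sketch below represents each monthly log as a (teacher, activity, month) record and tallies logs by teacher and by month. The records mirror the hypothetical data in Exhibit 3; the code itself is illustrative and not part of the PDAL system.

```python
# Hypothetical log-level records mirroring Exhibit 3: one record per teacher,
# per activity, per month in which the activity was reported.
from collections import Counter

logs = [
    ("Mr. Anderson", "A", "Jul-03"), ("Mr. Anderson", "A", "Aug-03"),
    ("Mr. Anderson", "B", "Sep-03"), ("Mr. Anderson", "B", "Nov-03"),
    ("Mr. Anderson", "B", "Jan-04"), ("Mr. Anderson", "C", "Oct-03"),
    ("Ms. Lopez", "A", "Jul-03"), ("Ms. Lopez", "A", "Aug-03"),
    ("Ms. Lopez", "D", "Nov-03"), ("Ms. Lopez", "D", "Dec-03"),
    ("Ms. Lopez", "D", "Jan-04"), ("Ms. Lopez", "D", "Feb-04"),
    ("Mrs. Kelly", "E", "Aug-03"), ("Mrs. Kelly", "E", "Sep-03"),
    ("Mr. Lee", "E", "Jul-03"), ("Mr. Lee", "E", "Aug-03"),
    ("Mr. Lee", "E", "Sep-03"), ("Mr. Lee", "F", "Sep-03"),
    ("Mr. Lee", "F", "Oct-03"), ("Mrs. Smith", "G", "Nov-03"),
]

logs_per_teacher = Counter(teacher for teacher, _, _ in logs)
logs_per_month = Counter(month for _, _, month in logs)
print(logs_per_teacher)   # e.g., Mr. Anderson: 6, Ms. Lopez: 6, Mrs. Smith: 1
print(logs_per_month)     # e.g., Aug-03: 4, Dec-03: 1
print(len(logs))          # 20 monthly logs in total
```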


STRUCTURE OF PDAL DATA

As Exhibit 3 illustrates, a monthly log is the basic level (or unit) of raw data collected with the PDAL; that is, a record for an activity for a month. It is the basic building block of a complex PDAL data structure. For instance, Mr. Anderson participated in three activities (A, B, and C) over a period of 8 months and completed 6 separate monthly logs: 2 for activity A, 3 for activity B, and 1 for activity C. More specifically, activity A may represent a summer institute that extended over a couple of months; activity B, which recurred every other month over a span of 5 months, may be a mentoring activity; and activity C may reflect a district-sponsored workshop. Since the PDAL asks teachers to report about their professional development on a monthly basis, we anticipate some month-to-month variation in the quantity and quality of the same professional development activity. For example, across the two monthly logs submitted by Mr. Anderson for activity A, he may report a different number of contact hours and a different amount of content focus depending on the month of reporting. In sum, as a disaggregate unit of observation, monthly logs for each activity allow us to document in detail the wide range of professional development activities, of varying quantity and quality, that teachers experience over an extended period of time.

Using disaggregate, monthly log-level data as basic building blocks, we can construct aggregate data at a number of different levels, including teacher-activity-level, month-level, activity-level, and teacher-level data. For this report, however, we aggregated PDAL data to the teacher level, since instructional practice variables were measured at the teacher level as well (for alternative methods of PDAL data aggregation, see Appendix A). We produced teacher-level data by aggregating the basic monthly log-level data across months and activities. As the last column of Exhibit 4 demonstrates, a total of 20 monthly logs were aggregated to produce 5 different teacher professional development profiles (or 5 sets of teacher-level data). For example, the 6 monthly logs created by Mr. Anderson during the 8-month period were combined to produce his professional development portfolio. With this teacher-level portfolio, we can describe each teacher’s overall and cumulative professional development experience, such as duration (e.g., total contact hours, mean contact hours, and span), cumulative content focus, and other quality features such as mean levels of active learning and coherence. For example, Ms. Lopez’s portfolio indicates that her engagement in professional development is more extended over time than that of Mr. Lee (i.e., a longer span).

Exhibit 4: Teacher-level data aggregated across months and activities (hypothetical data)

Teacher         Jul-03  Aug-03  Sep-03  Oct-03  Nov-03  Dec-03  Jan-04  Feb-04   # of logs aggregated
Mr. Anderson       1       1       1       1       1       0       1       0           6
Ms. Lopez          1       1       0       0       1       1       1       1           6
Mrs. Kelly         0       1       1       0       0       0       0       0           2
Mr. Lee            1       1       2       1       0       0       0       0           5
Mrs. Smith         0       0       0       0       1       0       0       0           1

It should be noted that we conceptualize the quality of a whole teacher-level professional development portfolio as a weighted average of the quality of the individual activities that make up the portfolio, with each activity weighted by the number of months in which it occurs. For example, the overall quality of Ms. Lopez’s professional development portfolio would weight Activity D more heavily than Activity A, since there are 4 logs for Activity D and only 2 for Activity A.
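The sketch below illustrates this weighting with hypothetical activity-level quality scores for Ms. Lopez's two activities; the scores themselves are invented for illustration and are not study data.

```python
# Hypothetical activity-level quality scores (e.g., mean active learning per activity)
# and the number of monthly logs for each activity in Ms. Lopez's portfolio.
activities = {
    "A": {"quality": 1.5, "n_logs": 2},  # e.g., summer institute reported in 2 months
    "D": {"quality": 2.5, "n_logs": 4},  # e.g., ongoing activity reported in 4 months
}

# Portfolio quality = sum(quality * n_logs) / sum(n_logs), so activities that recur
# over more months carry proportionally more weight in the teacher-level score.
total_logs = sum(a["n_logs"] for a in activities.values())
portfolio_quality = sum(a["quality"] * a["n_logs"] for a in activities.values()) / total_logs
print(round(portfolio_quality, 2))  # 2.17: closer to Activity D's score than Activity A's
```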


The PDAL data represent a complex dataset that is multilevel or hierarchical in nature: monthly logs are nested within activities and individual teachers.1 In the remainder of the report, we use teacher-level data as the basic unit of analysis to assess the effect of teachers’ professional development on their instructional practices. The teacher-level data are also used to form an MSP site-specific profile to allow for cross-site comparisons on key quality features.

ADVANTAGES OF PDAL

The PDAL has a number of advantages over existing tools (Yoon et al., 2004). First, monthly logs are likely to provide accurate, time-sensitive information about teachers’ professional development experiences. By design, the PDAL allows teachers to enter real-time data as their professional development activities occur. At a minimum, it collects information on teachers’ professional development activities in the immediate past month(s) and helps reduce the recall bias associated with retrospective data. Second, with the PDAL, we can avoid problems with data aggregation that arise when teachers report on professional development experiences that occurred over a period of time. Researchers can aggregate the basic data to the level(s) of aggregation appropriate for analysis (e.g., activity level or teacher level). Survey methods that ask teachers about their overall, cumulative professional development experience cannot capture such detailed activity-specific information. Third, because we take an inclusive approach to professional development activities (i.e., not limited to MSP-sponsored activities), we can examine the full spectrum of professional development activities that teachers experience in a naturalistic setting. Fourth, we emphasize behavioral indicators of teachers’ professional development experiences (e.g., frequency and contact time, opportunities for active learning, collective participation). Fifth, by taking advantage of structured prompts for skip patterns, the PDAL generates context-sensitive questions. For example, if teachers indicate they did not cover certain content areas, the PDAL skips questions regarding those areas. This feature reduces the burden on teachers. Lastly, because teachers’ log entries are automatically saved in a database as they respond, we reduce the chance of data entry errors.
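As a rough illustration of the skip-pattern logic described above, the sketch below asks follow-up questions about specific strategies only when the corresponding broad instructional activity was reported as covered. The topic names and the mapping are hypothetical; they are not the PDAL's actual item bank.

```python
# Sketch of skip-pattern logic (hypothetical topic names, not the actual PDAL items):
# follow-up strategy questions appear only for broad topics the teacher reported covering.
BROAD_TO_SPECIFIC = {
    "pairs_or_small_groups": [
        "Apply mathematical concepts to 'real-world' problems",
        "Solve novel or non-routine problems",
    ],
    "technology": [
        "Use calculators, computers, or other educational technology",
    ],
}

def follow_up_questions(covered_broad_topics: set) -> list:
    """Return only the specific-strategy questions whose broad topic was covered."""
    questions = []
    for broad, specifics in BROAD_TO_SPECIFIC.items():
        if broad in covered_broad_topics:
            questions.extend(specifics)
    return questions

print(follow_up_questions({"pairs_or_small_groups"}))  # only the pairs/small-groups items
```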

In sum, with the PDAL, we expect to obtain more valid and reliable data on the full array of teachers’ professional development experiences, despite the complex patterns of participation described above. The ultimate test of the PDAL is whether it helps researchers, evaluators, and practitioners obtain a complete and comprehensive description of teachers’ entire professional development experiences. To the extent that not all teachers participate in PDAL data collection, teachers fail to respond to the entire set of questions asked by the PDAL, teachers submit logs for only some of the months, or teachers drop out in the middle of the study period, the data collected with the PDAL on teachers’ professional development experiences will be less than complete.

ADMINISTRATION OF PDAL

The PDAL data collection was launched in July 2003 and ended on November 15, 2004. Teachers who had completed the Year 1 SEC and/or who taught in schools in the sample received an introduction packet in the mail. The packets included a letter introducing them to the PDAL, instructions on how to sign on to and complete the PDAL, and a glossary of terms. Further, each teacher was provided with an e-mail address and a toll-free phone number to contact for help with the PDAL.

1 To use a technical term, monthly logs are cross-classified (or cross-nested) within activities and individual teachers (Raudenbush and Bryk, 2002).


Teachers were instructed to begin creating log entries for July 2003. Each subsequent month’s logs were automatically generated for teachers to fill out starting from the first day of the month. Teachers were asked to complete reporting on a month by the 15th of the following month. Teachers created separate logs for each professional development activity they participated in each month. Teachers could revisit the PDAL (over multiple sessions) as necessary to complete their monthly activity logs. They could also modify their log entries until they chose to “finalize” the month. If an activity continued beyond the first month, they would continue to report on that same activity in following months. Teachers who did not have any professional development activities in a given month were asked to indicate their inactive status for the month by clicking an option stating, “I did not participate in any formal professional development activities this month.” Regardless of their active or inactive status for the month, if teachers had participated in any informal activities, they were asked to report how many hours they engaged in informal self-directed learning activities in the month and whether they used any of that informal self-directed learning in their classroom.2

We used several strategies to encourage teachers to respond to the PDAL. First, reminder e-mails were sent each month. At first, the e-mails simply reminded the users to fill out their PDAL. Over the course of the project, the e-mails included a list of all the activity logs the teachers had created for each month. This list also provided explicit instructions on what the teachers still had to do in order to complete their PDAL for a given month. Additionally, postcards were sent intermittently to teachers throughout the study (e.g., as a reminder at the beginning of the school year). Next, a subcontractor was hired to make monthly reminder phone calls to all in-scope teachers.3 The subcontractor reminded the teachers to complete their current and outstanding monthly PDAL logs, provided technical assistance, recorded inactive months, and updated contact information. We made an incentive payment of $50 to the PDAL participants when they filled out the first month’s log and another payment of $50 when they completed and finalized all their 15 months of the PDAL.

PDAL RESPONDENTS

A total of 273 mathematics and science teachers from the four MSP sites participated in the PDAL. Overall, 57% of the teachers in the intended sample of 476 completed the PDAL for at least one month. As Exhibit 5 shows, however, the rates vary by MSP site and treatment status. The Brockport, Cleveland, and Corpus Christi sites achieved about the same response rates: 60%, 62%, and 61%, respectively. But El Paso site participation reached only 44%. Treatment group teachers were more likely to complete the PDAL than their comparison group counterparts: 66% vs. 46%. The data set includes logs covering teachers’ professional development from July 2003 through September 2004. Over this 15-month period, 1,997 monthly logs were completed by the 273 teachers.

2 Informal professional development was defined in the PDAL instructions as an independent, self-directed learning opportunity that teachers select themselves, on the basis of their personal or professional interests. Examples are using Internet sites to plan lessons or do research; reading a specific journal to learn of the latest research in their field; and meeting their fellow teachers in informal ways to expand their interests and knowledge. 3 Those who retired, moved out of school district, or taught a subject other than mathematics or science were treated as out-of-scope for the PDAL data collection.


Exhibit 5: Rates of responses to Year 1 & 3 SEC and PDAL in relation to intended sample: By MSP site, treatment status, and subject

                                             Year 1 SEC                       PDAL (1)                      Year 3 SEC
                              Intended  Math  Science  Total  Resp.   Math  Science  Total  Resp.   Math  Science  Total  Resp.
                               Sample                          Rate                           Rate                           Rate
MSP Site
  Brockport                       91     47      31      78    86%     37      18      55    60%     27      11      38    42%
  Cleveland                      180     78      99     177    98%     49      63     112    62%     32      44      76    42%
  Corpus Christi                  93     50      28      78    84%     34      23      57    61%     20      10      30    32%
  El Paso                        112     34      22      56    50%     28      21      49    44%     14      12      26    23%
Treatment Status
  Comparison                     213     76      92     168    79%     45      54      99    46%     18      16      34    16%
  Treatment                      263    133      88     221    84%    103      71     174    66%     75      61     136    52%
MSP Site * Treatment Status
  Brockport-Comparison            43     19      17      36    84%     12       8      20    47%      9       3      12    28%
  Brockport-Treatment             48     28      14      42    88%     25      10      35    73%     18       8      26    54%
  Cleveland-Comparison            68     27      40      67    99%     14      15      29    43%      4       7      11    16%
  Cleveland-Treatment            112     51      59     110    98%     35      48      83    74%     28      37      65    58%
  Corpus Christi-Comparison       51     15      26      41    80%      6      22      28    55%      3       0       3     6%
  Corpus Christi-Treatment        42     35       2      37    88%     28       1      29    69%     17      10      27    64%
  El Paso-Comparison              51     15       9      24    47%     13       9      22    43%      2       6       8    16%
  El Paso-Treatment               61     19      13      32    52%     15      12      27    44%     12       6      18    30%
Total                            476    209     180     389    82%    148     125     273    57%     93      77     170    36%

Note (1): The PDAL response rate represents the number of teachers who completed usable PDAL data for at least one month among the intended sample.

Surveys of Enacted Curriculum (SEC)

The Surveys of Enacted Curriculum were used in this study to describe teachers’ instructional practice in 2002-03 and 2004-05. The Surveys were initially designed and tested through prior studies involving 11 states and 400 schools from 1998 to 2001, supported by NSF grants to CCSSO and WCER (Blank, 2004; Blank, Porter, & Smithson, 2001; Porter, 2002). The version for the present study was developed by the CCSSO/AIR/WCER research team with assistance from the advisory panel (see Blank et al. (2001) for a full description of the SEC; see also www.SECsurvey.org/Tools). The SEC is designed to provide information on classroom instructional practices, subject content (i.e., content topics and teachers’ expectations for student performance in those topics), and teacher opinions and beliefs. In the SEC, teachers report on the subject content and practices they used in a target class during a school year, the time allocated to different content topics (e.g., use of variables or patterns within the broad topic of algebraic concepts), and the instructional activities and strategies they employed.4 For the present study, we measured the extent to which the teacher used broad instructional activities (e.g., classroom activities when students work in pairs or small groups) and specific strategies within the broad instructional activities (e.g., applying mathematical concepts to "real-world" problems as an instructional strategy when students in the target class work in pairs or small groups).

The SEC was administered in two waves, in spring 2003 (Year 1) and spring 2005 (Year 3), with the assistance of local site coordinators. As shown in Exhibit 5, in Year 1 the Surveys were completed by a total of 389 teachers in grades 6-12 across the four MSP sites (209 mathematics and 180 science teachers). The overall response rate across the four sites was 82% in Year 1.

4 In the SEC, teachers receive instruction for selecting the target class. For mathematics instruction, for example: “For all questions about classroom practices please refer only to activities in the mathematics class that you teach. If you teach more than one mathematics class, select the first class that you teach each week. If you teach a split class (i.e. the class is split into more than one group for mathematics instruction), select only one group to describe as the target class.”


In Year 3, a total of 170 teachers completed the follow-up SEC (93 mathematics and 77 science teachers). As can be seen in Exhibit 5, there was substantial attrition among participating teachers over the course of the 3-year longitudinal study, and the response rate for the Year 3 SEC administration was only 36%.


Measures and Constructs

The data in this report are unique in that they provide consistent information on teaching practice and professional development over a three-year period for teachers of mathematics and science in schools participating in four MSP projects. These data enabled us to analyze relationships between teachers’ professional development experiences and classroom practice, while controlling for prior differences in their classroom practice.

In this section, we describe measures and constructs that were used to analyze the effects of professional development on instructional practice, which are presented in the next section. First, we start with measures of the quality of professional development activities that we collected with PDAL in 2003-04. Then, we describe measures of teachers’ instructional and assessment practices that we collected with SEC in spring 2003 and spring 2005.

Measures of the Quality of Professional Development Activities

Since the PDAL was specifically designed to describe teachers’ professional development activities as comprehensively and completely as possible using the full set of research-based criteria of quality (Garet et al., 1999), we created a number of measures to capture these quality features. As briefly mentioned earlier, there are three core features and three structural features associated with high-quality professional development:

Three core features
• Active learning
• Coherence
• Content focus

Three structural features
• Duration (e.g., contact hours and span)
• Type
• Collective participation

Of these six quality features of professional development, we focus on the five that were used in our analysis of professional development effectiveness: active learning, coherence, collective participation, content focus, and duration. With regard to content focus in particular, we examined two areas of teachers’ professional development activities: instructional activities and strategies, and assessment practice. In the previous Eisenhower longitudinal study (Desimone et al., 2002), all five of these measures of professional development quality were found to be linked to changes in self-reported measures of classroom teaching at two points in time.

ACTIVE LEARNING

Active learning concerns the opportunities provided for teachers to become actively engaged in meaningful discussion, planning, and practice as part of the professional development activity. To measure active learning, the PDAL asked the following 8 questions with a four-point response scale.

During this professional development activity this month, how often did you:

0=never, 1=rarely, 2=sometimes, and 3=often

1. Observe demonstrations of teaching techniques?
2. Lead group discussions?


3. Develop curricula or lesson plans, which other participants or the activity leader reviewed?
4. Review student work or score assessments?
5. Develop assessments or tasks as part of a formal professional development activity?
6. Practice what you learned and receive feedback as part of a professional development activity?
7. Receive coaching or mentoring in the classroom?
8. Give a lecture or presentation to colleagues?

The scale of active learning was created by combining these 8 items and computing a mean score. The reliability of the active learning scale, as indexed by Cronbach’s alpha, was .85. On the four-point scale (from 0 to 3), the average level of active learning was 1.14 (with a standard deviation of .60), which indicates that opportunities for active learning provided for teachers were quite low. Recall that the scale of active learning is a weighted average of active learning of individual activities that made up each teacher’s professional development portfolio, with each activity weighted by the number of months in which it occurred.
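For illustration, the sketch below shows how an item-mean scale score and its Cronbach's alpha can be computed. The item responses are simulated and the function is a generic implementation of the standard alpha formula; neither is the study's actual data or code.

```python
# Sketch of scale construction for eight 0-3 items: the scale score is the item mean,
# and reliability is indexed by Cronbach's alpha. Responses here are simulated only.
import numpy as np

rng = np.random.default_rng(1)
items = rng.integers(0, 4, size=(273, 8)).astype(float)  # respondents x 8 items

scale_scores = items.mean(axis=1)          # one active-learning score per respondent

def cronbach_alpha(x: np.ndarray) -> float:
    """Alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the total score)."""
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

print(scale_scores.mean())      # average scale score across respondents
print(cronbach_alpha(items))    # near 0 for random data; the study reports .85
```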

COHERENCE

Coherence represents the degree to which professional development activities are perceived by teachers to be a part of a coherent program of teacher learning. To gauge the level of coherence of professional development activities that teachers participated in, the PDAL asked the following 5 questions with a five-point response scale.

How often was this professional development activity:

0=never, 1=rarely, 2=sometimes, 3=often, and 9=N/A

1. Designed to support the school-wide improvement plan adopted by your school?
2. Consistent with your department or grade level plan to improve teaching?
3. Consistent with your own goals for your professional development?
4. Based explicitly on what you had learned in earlier professional development activities?
5. Followed up with related activities that built upon what you learned as part of the activity?

The scale of coherence was created by combining these 5 items and computing a mean score. The reliability of the coherence scale, as indexed by Cronbach’s alpha, was .83. On the four-point scale (from 0 to 3), the average level of coherence was 2.27 (with a standard deviation of .50), which indicates that teachers experienced quite coherent professional development across activities and over time.

COLLECTIVE PARTICIPATION

Collective participation relates to opportunities for teachers from the same educational setting to engage in joint professional development. To assess the level of collective professional learning opportunity, the PDAL asked the following two questions with a two-point response scale.

Teachers may participate in professional development activities alone or with groups of teachers from their school. For this professional development activity, with whom did you participate?

0=no and 1=yes

1. I participated with most or all of the teachers from my school.
2. I participated with most or all of the teachers from my department or grade level.


The scale of collective participation was created by combining these 2 items and computing a sum score. The reliability of the collective participation scale, as indexed by Cronbach’s alpha, was .54. On this scale (ranging from 0 to 2), the average level of collective participation was .72, which indicates that many teachers participated in professional development alone rather than collectively.

CONTENT FOCUS

Content focus concerns how much emphasis was given in a professional development activity on enhancing teachers’ content knowledge and skills, including instructional strategies. We took a couple of different approaches to collect information about content focus. First, we examined how many broad instructional activities were covered in each respondent’s professional development. For example, in science, the PDAL asked science teachers whether their professional development was focused on any of the following broad instructional activities for use in their classroom:

Were any of the following instructional topics covered in this professional development activity?
• Classroom activities when students do a laboratory activity, investigation, or experiment (yes/no)
• Classroom activities when students collect data (other than laboratory activities) (yes/no)
• Classroom activities when students work in pairs or small groups (other than laboratory activities) (yes/no)
• Classroom activities when students use computers, calculators, or other educational technology to learn science (yes/no)

We expected that teachers who had professional development that covered one of these four broad instructional domains would be likely to increase their emphasis on these domains in their classroom instruction.

Then, for each instructional activity that was marked as being covered in their professional development, we further asked teachers which specific instructional strategies were a primary focus in their professional development sessions.5 For example, for the instructional activity area of laboratory activities, investigations, or experiments, the PDAL asked science teachers the following questions:

Did the professional development focus on any of the following instructional strategies for use in your classroom relating to laboratory activities, investigations, or experiments?

• Make educated guesses, predictions, or hypotheses (yes/no)
• Follow step-by-step directions (yes/no)
• Use science equipment or measuring tools (yes/no)
• Collect data (yes/no)
• Change a variable in an experiment to test a hypothesis (yes/no)
• Organize and display information in tables or graphs (yes/no)
• Analyze and interpret science data (yes/no)
• Design their own investigation or experiment to solve a scientific question (yes/no)

5 A skip pattern was used in PDAL so that when teachers report that their PD did not cover a broad instructional activity (e.g., working in pairs), then they are not asked follow-up questions about specific instructional strategies relating to the broad category of instructional activity.


It was expected that professional development focused on a particular instructional strategy (such as making educated guesses, predictions, or hypotheses) would be likely to increase teachers’ use of that strategy in their classrooms.

Similarly, the PDAL asked mathematics teachers whether their professional development was focused on any of the following broad instructional activities for use in their classroom:

Were any of the following instructional topics covered in this professional development activity?
• Classroom activities when students work individually (yes/no)
• Classroom activities when students work in pairs or small groups (yes/no)
• Classroom activities when students use computers, calculators, or other technology to learn mathematics (yes/no)
• Classroom activities when students use hands-on materials (yes/no)

In addition, we measured the extent to which teachers’ professional development focused on particular types of assessment activities. For example, in science, the PDAL asked science teachers whether their professional development was focused on any of the following assessment practices for use in their classroom:

Did the professional development focus on any of the following uses of assessment in your classroom with students?

- Extended response item for which student must explain or justify solution (yes/no)
- Performance tasks or events (for example, hands-on activities) (yes/no)
- Individual or group demonstration, presentation (yes/no)
- Science projects (yes/no)
- Portfolios (yes/no)

We collected a parallel set of information on content focus for mathematics teachers as well.

For a complete list of broad instructional activities and specific instructional and assessment strategies covered in mathematics and science professional development, see Appendix B.

DURATION OF PROFESSIONAL DEVELOPMENT ACTIVITY: TOPIC INTENSITY

As stated in the No Child Left Behind (NCLB) Act, high-quality professional development activities are expected to provide teachers with, among other things, "intensive and sustained" learning experiences. To estimate how intensive the professional development activities were, we created a variable called topic intensity, operationally defined as contact hours divided by the number of broad instructional activity topics that were a primary focus of each respondent's professional development. If a professional development activity devotes more time to a limited number of instructional activity topics in a given session, its topic intensity increases. Topic intensity was expected to have a positive effect on teachers' instructional practice.
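As a rough illustration, the sketch below computes topic intensity as just defined; the function, inputs, and the handling of the zero-topic case are hypothetical stand-ins of ours, not part of the PDAL instrument or the study's analysis code.

```python
# Hypothetical helper illustrating the topic intensity measure defined above:
# contact hours divided by the number of broad instructional-activity topics
# that were a primary focus of the professional development activity.
def topic_intensity(contact_hours: float, n_focus_topics: int) -> float:
    """Return contact hours per focused topic (0.0 if no topic was a focus)."""
    if n_focus_topics == 0:
        return 0.0
    return contact_hours / n_focus_topics

# Example: 24 contact hours spread over 2 focused topics gives an intensity of
# 12.0; the same 24 hours spread over 4 topics gives 6.0.
print(topic_intensity(24.0, 2))  # 12.0
print(topic_intensity(24.0, 4))  # 6.0
```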

Measures of Teachers' Classroom Instruction

Using the SEC, we developed measures of teachers' classroom instruction during the 2002-03 school year and the 2004-05 school year. The items on the SEC matched those on the PDAL concerning broad instructional activities, specific instructional strategies, and assessment practices in the classroom (see Appendix C for the list of SEC questions examined in this study).

For each mathematics teacher, the SEC asked about the extent to which each of four broad instructional activities was used in the classroom, on a 6-point scale where 0=none, 1=little (about 10% or less of instructional time for the school year), 2=some (11-25%), 3=moderate (26-50%), 4=considerable (51-75%), and 5=almost all (more than 75% of instructional time for the school year)6:

How much of the total mathematics instructional time do students in the target class:
- Work individually on mathematics exercises, problems, investigations, or tasks?
- Work in pairs or small groups?
- Use computers, calculators, or other technology to learn mathematics?
- Use hands-on materials such as manipulatives (e.g., geometric shapes or algebraic tiles), measurement instruments (e.g., rulers or protractors), and data collection devices (e.g., surveys or probes)?

For each of the four broad instructional activities listed above, mathematics teachers were asked to estimate the amount of time they spent using specific instructional strategies in the classroom.7 For example, for the broad instructional activity involving hands-on materials, teachers were asked the following five questions regarding the use of specific higher-order instructional strategies, on the same 6-point scale8:

When students in the target class use hands-on materials, how much time do they:
- Work with manipulatives (for example, counting blocks, geometric shapes, or algebraic tiles) to understand concepts?
- Measure objects using tools such as rulers, scales, or protractors?
- Build models or charts?
- Collect data by counting, observing, or conducting surveys?
- Present information to others using manipulatives (for example, chalkboard, whiteboard, poster board, and projector)?

Thus, for each teacher, we collected data on the extent to which the teacher used each of the four broad instructional activities and specific instructional strategies in the classroom in Year 1, measured on a normalized scale ranging from 0 to 100% of instructional time for the school year (see footnote 6). We collected parallel data on the teacher's classroom use of each strategy in Year 3. We examined any changes in teachers' instruction over time between Years 1 and 3 as potential effects of their participation in professional development that took place in the intervening period (Year 2 and the summer).
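For illustration only, the sketch below mirrors the recoding described in footnote 6; collapsing the Year 1 codes 4 and 5 into a single "considerable" category is our reading of how the two scales were made comparable, and the authoritative steps are documented in Appendix D.

```python
# Illustrative recoding of SEC response codes to a normalized percentage of
# instructional time (see footnote 6 and Appendix D for the study's actual rules).
YEAR1_TO_COMMON = {0: 0, 1: 1, 2: 2, 3: 3, 4: 4, 5: 4}  # assumed collapse of codes 4 and 5
COMMON_TO_WEIGHT = {0: 0.0, 1: 0.5, 2: 1.0, 3: 2.5, 4: 4.0}

def normalized_percent(code: int, year: int) -> float:
    """Map an SEC scale code (Year 1: 0-5, Year 3: 0-4) to 0, 7.5, 15, 37.5, or 60."""
    common = YEAR1_TO_COMMON[code] if year == 1 else code
    return COMMON_TO_WEIGHT[common] * 15.0

print(normalized_percent(3, year=1))  # 37.5
print(normalized_percent(4, year=3))  # 60.0
```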

Additional Measures

In addition to the quality of professional development measures such as content focus, active learning, coherence, and duration, we also collected data about a few contextual variables.

6 In Year 3, the same set of questions was asked again on a new 5-point scale, where 0=none, 1=little (about 10% or less of instructional time for the school year), 2=some (11-25%), 3=moderate (26-50%), and 4=considerable (more than 50% of instructional time for the school year). Year 1 SEC data were recoded to make the longitudinal data comparable across Years 1 and 3. These longitudinal data were again recoded to 0, 0.5, 1, 2.5, and 4, respectively, and then multiplied by 15. The resulting values of 0, 7.5, 15, 37.5, and 60 represent a normalized percentage of instructional time that teachers reported spending on each of the instructional activities and strategies during the 2002-03 and 2004-05 school years. See Appendix D for the steps taken to normalize the relative instructional time spent on broad instructional activities and specific strategies within the activities.

7 In the SEC, there are 12 items at the broad instructional activity level. However, only four of them were used in this study because they were asked in the PDAL in a parallel fashion.

8 While the PDAL asked teachers follow-up questions about specific instructional strategies only if they reported positively on a broad instructional activity, the SEC did not employ such conditional skip patterns.


In particular, we collected information about each teacher's subject area (i.e., mathematics or science), school grade level (i.e., middle or high school), MSP program treatment status (i.e., treatment or control group), and MSP site (i.e., Brockport, Cleveland, Corpus Christi, or El Paso).

It was hypothesized that after two years of MSP program implementation, teachers in the treatment group would exhibit higher quality instruction than their counterparts in the comparison group, after controlling for any pre-existing differences.9 However, it must be stressed that any effect of the treatment would be mediated through the quality of professional development activities, since MSP professional development is expected to be of higher quality than other professional development. If we include both treatment status and the measures of professional development quality as predictors in our model at the same time, we expect the quality of professional development to account for most of the impact of the treatment on instruction. However, because our mediating variables are not perfect measures, some of the effect would be attributed to the treatment indicator and some to the quality measures. Hence, the presence of the treatment indicator may attenuate the estimated effect of professional development quality on instruction.

In addition, we anticipated variation across MSP sites in professional development, since each local university-school district partnership adopted and implemented different strategies to achieve their MSP program goals, based in part on local curricula, knowledge and skills.

Finally, we anticipated that there might be some subject-related and school-level differences in the quality of professional development (e.g., content focus, active learning opportunities) and in how teachers employ different instructional strategies in their classrooms (e.g., Kennedy, 1998).

9 To address potential pre-existing differences between the treatment and comparison groups, we showed, based on Year 1 SEC baseline data analyses, that teachers in the treatment group were quite similar to their peers in the comparison group, not only in the professional development they received prior to the MSP study but also in their instructional practices in the baseline year (see our interim report; Blank et al., 2005).


Descriptive Data on Teachers’ Participation in Professional Development

In this section, we examine the extent to which teachers participated in professional development activities in 2003-04 that focused on a set of broad instructional activities, specific instructional strategies, and assessment strategies, separately for mathematics and science. For each subject, we present basic descriptive data on the extent to which teachers' professional development focused on (1) broad instructional activities, (2) specific instructional strategies, and (3) assessment practice.

Mathematics

Some instructional activities or strategies were stressed more than others in the professional development activities that teachers participated in. For example, as Exhibit 6 shows, the professional development that teachers attended seems to have focused more on classroom activities involving student use of computers, calculators, and other technology than on classroom activities involving individualized student work (0.65 compared with 0.55).

Exhibit 6: Coverage of Broad Instructional Topics in Mathematics Professional Development

Year 2
Were any of the following instructional topics covered in this professional development activity? (Mean (SD))
1 Classroom activities when students work individually: 0.55 (0.33)
2 Classroom activities when students work in pairs or small groups: 0.63 (0.31)
3 Classroom activities when students use computers, calculators, or other technology: 0.65 (0.30)
4 Classroom activities when students use hands-on materials: 0.60 (0.31)
Note: Mean represents the extent to which a particular instructional topic was covered in professional development on a dichotomous scale of 0 (not covered) or 1 (covered).

A much wider gap in professional development focus was found among the specific instructional strategies used when students work individually. For example, strategies involving the application of mathematical concepts to "real-world" problems were a focus, on average, in two-thirds of teachers' professional development activities (0.67), whereas strategies stressing completing or conducting proofs or demonstrations of mathematical reasoning were reported to be a primary focus in only 3 out of 10 professional development activities (0.30).


Exhibit 7: Focus of Mathematics Professional Development on the Use of Specific Instructional Strategies in Class

Year 2
Did the professional development focus on any of the following instructional strategies for use in your classroom (when students in the target class work individually)? (Mean (SD))
1 Solve word problems from a textbook or worksheet: 0.50 (0.37)
2 Solve non-routine mathematical problems (for example, problems that require novel or non-formulaic thinking): 0.53 (0.38)
3 Explain their reasoning or thinking in solving a problem, using several sentences orally or in writing: 0.59 (0.37)
4 Apply mathematical concepts to "real-world" problems: 0.67 (0.37)
5 Make estimates, predictions or hypotheses: 0.52 (0.36)
6 Analyze data to make inferences or draw conclusions: 0.51 (0.38)
7 Work on a problem that takes at least 45 minutes to solve: 0.31 (0.35)
8 Complete or conduct proofs or demonstrations of their mathematical reasoning: 0.30 (0.34)

Did the professional development focus on any of the following instructional strategies for use in your classroom (when students in the target class work in pairs or small groups)? (Mean (SD))
1 Solve word problems from a textbook or worksheet: 0.49 (0.35)
2 Solve non-routine mathematical problems (for example, problems that require novel or non-formulaic thinking): 0.55 (0.35)
3 Talk about their reasoning or thinking in solving a problem: 0.70 (0.33)
4 Apply mathematical concepts to "real-world" problems: 0.71 (0.33)
5 Make estimates, predictions or hypotheses: 0.58 (0.34)
6 Analyze data to make inferences or draw conclusions: 0.58 (0.36)
7 Work on a problem that takes at least 45 minutes to solve: 0.44 (0.33)
8 Complete or conduct proofs or demonstrations of their mathematical reasoning: 0.46 (0.34)

Did the professional development focus on any of the following instructional strategies for using calculators, computers, or other educational technology in your classroom with your students? (Mean (SD))
1 Use sensors and probes: 0.36 (0.31)
2 Display and analyze data: 0.58 (0.34)
3 Develop geometric concepts (for example, using simulations): 0.38 (0.35)

Did the professional development focus on any of the following instructional strategies for using hands-on materials in your classroom with your students? (Mean (SD))
1 Work with manipulatives (for example, counting blocks, geometric shapes, or algebraic tiles) to understand concepts: 0.57 (0.37)
2 Measure objects using tools such as rulers, scales, or protractors: 0.48 (0.35)
3 Build models or charts: 0.60 (0.35)
4 Collect data by counting, observing, or conducting surveys: 0.47 (0.35)
5 Present information to others using manipulatives (for example, chalkboard, whiteboard, poster board, projector): 0.62 (0.34)

Note: Mean represents the extent to which a specific instructional strategy was focused on, on a dichotomous scale of 0 (not focused) or 1 (focused).


As Exhibit 8 displays, the extent to which professional development focused on assessment strategies ranged from a high of 0.63 for extended response items, through individual or group presentations (0.54) and mathematics projects (0.36), down to a low of 0.16 for portfolios.

Exhibit 8: Focus of Mathematics Professional Development on the Use of Specific Assessment Strategies in Class

Year 2
Did the professional development focus on any of the following uses of assessment in your classroom with students? (Mean (SD))
1 Extended response item for which student must explain or justify solution: 0.63 (0.36)
2 Performance tasks or events (for example, hands-on activities): 0.60 (0.36)
3 Individual or group demonstration, presentation: 0.54 (0.38)
4 Mathematics projects: 0.36 (0.36)
5 Portfolios: 0.16 (0.29)
Note: Mean represents the extent to which a specific assessment strategy was focused on, on a dichotomous scale of 0 (not focused) or 1 (focused).

Science

While classroom activities involving student use of computers, calculators, and other technology were most likely to be covered in mathematics professional development, they were least likely to be covered in science professional development. As Exhibit 9 indicates, the other instructional topics were covered about evenly in science professional development activities.

Exhibit 9: Coverage of Broad Instructional Topics in Science Professional Development

Year 2
Were any of the following instructional topics covered in this professional development activity? (Mean (SD))
1 Classroom activities when students do a laboratory activity, investigation, or experiment: 0.62 (0.31)
2 Classroom activities when students collect data (other than laboratory activities): 0.62 (0.31)
3 Classroom activities when students work in pairs or small groups (other than laboratory activities): 0.63 (0.31)
4 Classroom activities when students use computers, calculators or other educational technology to learn science: 0.53 (0.32)
Note: Mean represents the extent to which a particular instructional topic was covered in professional development on a dichotomous scale of 0 (not covered) or 1 (covered).

As Exhibit 10 shows, the instructional strategies of collecting data during laboratory activities or experiments and making observations/classifications were the most likely to be a focus of science professional development (0.80 and 0.78, respectively). In contrast, the teaching strategies of using sensors and probes, working in pairs or small groups on a writing project, and completing written assignments from the textbook or workbook were the least focused topics in science professional development (0.39, 0.40, and 0.41, respectively).


Exhibit 10: Focus of Science Professional Development on the Use of Specific Instructional Strategies in Class

Year 2
Did the professional development focus on any of the following instructional strategies for use in your classroom with your students (relating to laboratory activities, investigations, or experiments)? (Mean (SD))
1 Make educated guesses, predictions, or hypotheses: 0.70 (0.35)
2 Follow step-by-step directions: 0.69 (0.37)
3 Use science equipment or measuring tools: 0.75 (0.33)
4 Collect data: 0.80 (0.31)
5 Change a variable in an experiment to test a hypothesis: 0.56 (0.40)
6 Organize and display information in tables or graphs: 0.71 (0.33)
7 Analyze and interpret science data: 0.72 (0.35)
8 Design their own investigation or experiment to solve a scientific question: 0.63 (0.38)
9 Make observations/classifications: 0.78 (0.31)

Did the professional development focus on any of the following instructional strategies for use in your classroom with your students (relating to collecting science data or information)? (Mean (SD))
1 Have class discussions about the data: 0.71 (0.35)
2 Organize and display the information in tables or graphs: 0.75 (0.32)
3 Make a prediction based on the data: 0.67 (0.35)
4 Analyze and interpret the information or data, orally or in writing: 0.69 (0.35)
5 Make a presentation to the class on the data, analysis, or interpretation: 0.65 (0.35)

Did the professional development focus on any of the following instructional strategies for use in your classroom with your students (when students work in pairs or small groups)? (Mean (SD))
1 Talk about ways to solve science problems, such as investigations: 0.68 (0.34)
2 Complete written assignments from the textbook or workbook: 0.41 (0.39)
3 Write up results or prepare a presentation from a laboratory activity, investigation, experiment or a research project: 0.57 (0.37)
4 Work on an assignment, report or project over an extended period of time: 0.49 (0.38)
5 Work on a writing project or entries for portfolios seeking peer comments to improve work: 0.40 (0.38)
6 Review assignments or prepare for a quiz or test: 0.41 (0.40)

Did the professional development focus on any of the following instructional strategies for use in your classroom with your students (relating to calculators, computers, or other educational technology)? (Mean (SD))
1 Use sensors and probes (for example, CBL's): 0.39 (0.39)
2 Display and analyze data: 0.67 (0.37)
3 Solve problems using simulations: 0.45 (0.38)

Note: Mean represents the extent to which a specific instructional strategy was focused on, on a dichotomous scale of 0 (not focused) or 1 (focused).


As Exhibit 11 demonstrates, projects and portfolios were less likely than other assessment strategies to be a focus of science professional development. However, the overall level of content focus on these two strategies was higher in science than in mathematics: 0.43 and 0.31 in science, compared with 0.36 and 0.16 in mathematics.

Exhibit 11: Focus of Science Professional Development on the Use of Specific Assessment Strategies in Class

Year 2
Did the professional development focus on any of the following uses of assessment in your classroom with students? (Mean (SD))
1 Extended response item for which student must explain or justify solution: 0.46 (0.38)
2 Performance tasks or events (for example, hands-on activities): 0.63 (0.36)
3 Individual or group demonstration, presentation: 0.61 (0.39)
4 Science projects: 0.43 (0.41)
5 Portfolios: 0.31 (0.37)
Note: Mean represents the extent to which a specific assessment strategy was focused on, on a dichotomous scale of 0 (not focused) or 1 (focused).


Descriptive Data on Changes in Teachers' Instruction over Time

Based on the data that we collected with the SEC, we examined the extent to which changes were made in teaching practice between 2002-03 and 2004-05.10 Separately for mathematics and science, we present data on (1) broad instructional activities, (2) specific instructional strategies, and (3) assessment practice. We hypothesized that changes in teaching practice may have been affected by the quality of the professional development that teachers received (e.g., content focus, active learning, coherence, and topic intensity).

Mathematics

Generally, the proportion of time spent on each of these instructional or assessment strategies stayed about the same over the two-year period between 2002-03 and 2004-05. However, some of the strategies underwent a significant change over time. For example, as Exhibit 12 shows, the amount of time students worked in pairs or small groups on mathematics exercises, problems, and investigations increased significantly between 2002-03 and 2004-05 (from 10.3% to 12.1%), while the amount of time students spent on other types of activities remained about the same.11 In addition, some strategies were used relatively more than others. For example, students seem to have spent more time using computers and other technology (11.8% and 12.4% in 2002-03 and 2004-05, respectively) than using manipulatives and other hands-on materials (8.7% and 8.4% in 2002-03 and 2004-05, respectively).12

Generally, changes were made in the direction of spending less time on classroom activities in which students work individually; spending more time on classroom activities in which students work in pairs or small groups or engage in activities involving computers or other technology (see Exhibit 13); and spending far more time on assessment activities (see Exhibit 14).

10 For descriptive SEC data regarding other measures of professional development and instruction (e.g., alignment of instruction to standards), refer to Smithson and Blank (2006).

11 Assuming 180 school days per year and an hour of mathematics or science instruction per school day, there is a total of 180 instructional hours for the school year. Therefore, an increase of one percentage point of instructional time for the school year translates to an increase of 1.8 hours; the change in instructional time from 10.3% to 12.1% corresponds to an increase from about 18.5 to 21.8 hours.

12 The descriptive data reported in this section about changes in teachers' instruction over time are based on teachers who provided SEC data at both time points (2002-03 and 2004-05).
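The conversion in footnote 11 can be expressed as a one-line calculation; the 180-hour assumption is the footnote's, and the function name is ours.

```python
# Convert a normalized percentage of yearly instructional time to hours, assuming
# 180 school days and one hour of mathematics or science instruction per day.
HOURS_PER_YEAR = 180.0

def percent_to_hours(pct: float) -> float:
    """Return the number of instructional hours represented by pct percent."""
    return pct / 100.0 * HOURS_PER_YEAR

print(percent_to_hours(1.0))   # 1.8 hours per percentage point
print(percent_to_hours(10.3))  # about 18.5 hours
print(percent_to_hours(12.1))  # about 21.8 hours
```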


Exhibit 12: Change in Mathematics Teachers' Use of Broad Instructional Activities in the Classroom (in Percentage of Instructional Time): Between Years 1 and 3

How much of the total mathematics instructional time do students in the target class: (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Work individually on mathematics exercises, problems, investigations, or tasks: 11.1 (5.65); 10.1 (6.73); ns
2 Work in pairs or small groups on mathematics exercises, problems, investigations, or tasks: 10.3 (5.36); 12.1 (7.35); *
3 Use computers, calculators, or other technology to learn mathematics: 11.8 (6.93); 12.4 (8.91); ns
4 Use manipulatives (for example, geometric shapes or algebraic tiles), measurement instruments (for example, rulers or protractors), and data collection devices (for example, surveys or probes): 8.7 (5.28); 8.4 (6.01); ns
Notes: 1. Test of significance of mean difference between Years 1 and 3: 'ns' denotes non-significance; + p<.10; * p<.05; ** p<.01; and *** p<.001. 2. Mean represents the normalized percentage of instructional time on the activity spent by teachers for the school year (on the scale of 0 to 100%).


Exhibit 13: Change in Mathematics Teachers' Use of Specific Instructional Strategies in the Classroom (in Percentage of Instructional Time): Between Years 1 and 3

When students in the target class work individually on mathematics exercises, problems, investigations, or tasks, how much time do they: (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Solve word problems from a textbook or worksheet: 2.2 (1.82); 1.9 (2.06); ns
2 Solve non-routine mathematical problems (for example, problems that require novel or non-formulaic thinking): 1.4 (1.30); 1.4 (2.23); ns
3 Explain their reasoning or thinking in solving a problem, using several sentences orally or in writing: 1.3 (1.11); 1.1 (0.87); ns
4 Apply mathematical concepts to "real-world" problems: 2.1 (1.51); 2.1 (2.60); ns
5 Make estimates, predictions or hypotheses: 1.7 (1.67); 1.3 (1.07); *
6 Analyze data to make inferences or draw conclusions: 1.4 (1.01); 1.1 (0.86); *
7 Work on a problem that takes at least 45 minutes to solve: 0.4 (0.64); 0.7 (1.52); ns
8 Complete or conduct proofs or demonstrations of their mathematical reasoning: 0.6 (0.84); 0.5 (0.70); ns

When students in the target class work in pairs or small groups on mathematics exercises, problems, investigations, or tasks, how much time do they: (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Solve word problems from a textbook or worksheet: 1.7 (1.16); 2.0 (1.93); ns
2 Solve non-routine mathematical problems (for example, problems that require novel or non-formulaic thinking): 1.3 (1.00); 1.6 (1.12); *
3 Talk about their reasoning or thinking in solving a problem: 1.7 (1.07); 2.1 (1.90); ns
4 Apply mathematical concepts to "real-world" problems: 1.8 (1.19); 2.2 (1.38); +
5 Make estimates, predictions or hypotheses: 1.4 (1.00); 1.6 (1.20); ns
6 Analyze data to make inferences or draw conclusions: 1.5 (0.95); 1.5 (1.21); ns
7 Work on a problem that takes at least 45 minutes to solve: 0.8 (0.91); 0.8 (0.93); ns
8 Complete or conduct proofs or demonstrations of their mathematical reasoning: 0.8 (0.76); 0.8 (0.89); ns

When students in the target class are engaged in activities that involve the use of calculators, computers, or other educational technology as part of mathematics instruction, how much time do they: (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Use sensors and probes: 0.0 (0.01); 1.2 (1.62); ***
2 Display and analyze data: 2.5 (2.08); 2.7 (2.88); ns
3 Develop geometric concepts (for example, using simulations): 1.5 (2.04); 1.5 (1.77); ns

When students in the target class use hands-on materials, how much time do they: (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Work with manipulatives (for example, counting blocks, geometric shapes, or algebraic tiles) to understand concepts: 2.0 (1.73); 2.0 (1.75); ns
2 Measure objects using tools such as rulers, scales, or protractors: 2.0 (1.53); 1.8 (1.56); ns
3 Build models or charts: 1.6 (1.26); 1.7 (1.33); ns
4 Collect data by counting, observing, or conducting surveys: 1.5 (1.14); 1.4 (1.25); ns
5 Present information to others using manipulatives (for example, chalkboard, whiteboard, poster board, projector): 1.8 (1.41); 1.8 (1.63); ns

Notes: 1. Test of significance of mean difference between Years 1 and 3: 'ns' denotes non-significance; + p<.10; * p<.05; ** p<.01; and *** p<.001. 2. Mean represents the normalized percentage of instructional time on the activity spent by teachers for the school year (on the scale of 0 to 100%).


As Exhibit 14 demonstrates, time spent on assessment increased substantially between 2002-03 and 2004-05. In the milieu of intensified accountability and standards-based reform, it is not surprising to see such an overall rise in assessment activities, especially in mathematics, which is part of NCLB's annual assessment program. NCLB requires testing of children in reading and mathematics every year in grades 3-8 and in one high school grade.

Exhibit 14: Change in the Frequency of Mathematics Teachers' Use of Specific Assessment Strategies in the Classroom: Between Years 1 and 3

How often do you use each of the following when assessing students in the target mathematics class? (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Extended response item for which student must explain or justify solution: 2.2 (1.05); 2.2 (0.98); ns
2 Performance tasks or events (for example, hands-on activities): 1.9 (1.07); 2.2 (1.14); +
3 Individual or group demonstration, presentation: 1.3 (0.91); 1.7 (1.17); *
4 Mathematics projects: 1.0 (0.76); 1.3 (1.08); **
5 Portfolios: 0.8 (1.05); 1.1 (1.33); +
Notes: 1. Test of significance of mean difference between Years 1 and 3: 'ns' denotes non-significance; + p<.10; * p<.05; ** p<.01; and *** p<.001. 2. Mean represents the average activity frequency on the scale of 0=never, 1=1-4 times per year, 2=1-3 times per month, 3=1-3 times per week, and 4=4-5 times per week.

Science

In general, fewer significant changes were made in science teachers' instructional practices than in mathematics teachers' practices, and the few changes observed were only marginally significant. For example, students in the target class spent slightly less time reviewing assignments or preparing for a quiz or test in 2004-05 than in 2002-03 (from 2.6% to 1.9%). In addition, some strategies were used relatively more than others. For example, students seem to have spent more time doing a laboratory activity, investigation, or experiment (12.6% and 13.5% in 2002-03 and 2004-05, respectively) than using computers and other technology (8.3% and 9.1% in 2002-03 and 2004-05, respectively).

Exhibit 15: Change in Science Teachers' Use of Broad Instructional Activities in the Classroom (in Percentage of Instructional Time): Between Years 1 and 3

How much of the total science instructional time do students in the target class: (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Do a laboratory activity, investigation, or experiment: 12.6 (5.36); 13.5 (5.97); ns
2 Collect data (other than laboratory activities): 7.6 (3.79); 8.7 (4.08); ns
3 Work in pairs or small groups (other than laboratory activities): 13.3 (6.17); 12.4 (5.82); ns
4 Use computers, calculators or other educational technology to learn science: 8.3 (5.70); 9.1 (6.28); ns
Notes: 1. Test of significance of mean difference between Years 1 and 3: 'ns' denotes non-significance; + p<.10; * p<.05; ** p<.01; and *** p<.001. 2. Mean represents the average percentage of instructional time on the activity spent by teachers for the school year (on the scale of 0 to 100%).


Exhibit 16: Change in Science Teachers' Use of Specific Instructional Strategies in the Classroom (in Percentage of Instructional Time): Between Years 1 and 3

When students in the target class are engaged in laboratory activities, investigations, or experiments as part of science instruction, how much time do they: (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Make educated guesses, predictions, or hypotheses: 1.2 (0.75); 1.4 (0.87); ns
2 Follow step-by-step directions: 1.5 (0.86); 1.7 (1.21); ns
3 Use science equipment or measuring tools: 1.7 (0.92); 1.8 (1.33); ns
4 Collect data: 1.6 (0.93); 1.9 (1.24); ns
5 Change a variable in an experiment to test a hypothesis: 1.0 (0.59); 1.1 (0.62); ns
6 Organize and display information in tables or graphs: 1.5 (0.94); 1.5 (0.78); ns
7 Analyze and interpret science data: 1.5 (0.90); 1.6 (0.94); ns
8 Design their own investigation or experiment to solve a scientific question: 0.9 (0.68); 0.9 (0.63); ns
9 Make observations/classifications: 1.6 (0.88); 1.6 (1.08); ns

When students in the target class collect science data or information from books, magazines, computers, or other sources (other than laboratory activities), how much time do they: (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Have class discussions about the data: 1.6 (1.04); 1.6 (1.09); ns
2 Organize and display the information in tables or graphs: 1.6 (0.92); 2.0 (1.38); +
3 Make a prediction based on the data: 1.5 (0.93); 1.8 (1.03); ns
4 Analyze and interpret the information or data, orally or in writing: 1.6 (1.14); 1.8 (0.82); ns
5 Make a presentation to the class on the data, analysis, or interpretation: 1.3 (0.90); 1.6 (0.96); +

When students in the target class work in pairs or small groups (other than in the science laboratory), how much time do they: (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Talk about ways to solve science problems, such as investigations: 2.5 (2.15); 2.4 (2.03); ns
2 Complete written assignments from the textbook or workbook: 2.0 (1.58); 2.0 (1.45); ns
3 Write up results or prepare a presentation from a laboratory activity, investigation, experiment or a research project: 2.5 (1.80); 2.9 (2.28); ns
4 Work on an assignment, report or project over an extended period of time: 2.6 (2.89); 2.1 (1.53); ns
5 Work on a writing project or entries for portfolios seeking peer comments to improve work: 1.0 (1.02); 1.1 (1.03); ns
6 Review assignments or prepare for a quiz or test: 2.6 (2.70); 1.9 (1.21); +

When students in the target class are engaged in activities that involve the use of calculators, computers, or other educational technology as part of science instruction, how much time do they: (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Use sensors and probes (for example, CBL's): 0.6 (0.77); 1.2 (3.04); ns
2 Display and analyze data: 2.0 (1.93); 2.4 (1.98); ns
3 Solve problems using simulations: 1.2 (1.32); 1.3 (1.37); ns

Notes: 1. Test of significance of mean difference between Years 1 and 3: 'ns' denotes non-significance; + p<.10; * p<.05; ** p<.01; and *** p<.001. 2. Mean represents the average percentage of instructional time on the activity spent by teachers for the school year (on the scale of 0 to 100%).

As can be seen in Exhibit 17, the upward trend in assessment found in mathematics was not observed in science. Unlike mathematics, science is not part of the high-stakes annual assessment program linked to NCLB. In contrast, time spent on portfolios actually decreased slightly between 2002-03 and 2004-05.

Exhibit 17: Change in the Frequency of Science Teachers' Use of Specific Assessment Strategies in Class: Between Years 1 and 3

How often do you use each of the following when assessing students in the target science class? (Year 1 Mean (SD); Year 3 Mean (SD); Signif.)
1 Extended response item for which student must explain or justify solution: 2.3 (0.89); 2.5 (1.02); ns
2 Performance tasks or events (for example, hands-on activities): 2.4 (0.84); 2.5 (0.86); ns
3 Individual or group demonstration, presentation: 1.8 (0.90); 1.8 (1.02); ns
4 Science projects: 1.2 (0.65); 1.3 (0.81); ns
5 Portfolios: 1.3 (1.31); 1.0 (1.20); +
Notes: 1. Test of significance of mean difference between Years 1 and 3: 'ns' denotes non-significance; + p<.10; * p<.05; ** p<.01; and *** p<.001. 2. Mean represents the average activity frequency on the scale of 0=never, 1=1-4 times per year, 2=1-3 times per month, 3=1-3 times per week, and 4=4-5 times per week.


Analysis Strategy

We analyzed data from two waves of the Surveys of Enacted Curriculum (SEC), collected in spring 2003 and spring 2005, respectively. In addition, we analyzed the PDAL data, which were collected in 2003-04. We sought to explain teaching practice in 2004-05 (the second wave of the SEC) on the basis of teachers' professional development experiences during 2003-04, controlling for teachers' classroom teaching practice in 2002-03 (the first wave of the SEC). Given this analysis strategy, the sample for the analysis is restricted to teachers who returned both waves of the SEC, who participated in professional development in 2003-04 and submitted PDAL data, and who continued to teach the same subject over the three years of the study. Finally, the sample is further restricted to teachers who provided complete data on all necessary items. The number of teachers meeting these conditions ranges from 59 to 66 in mathematics and from 39 to 42 in science, depending on the specific analysis.

We conducted three parallel sets of analyses, each focusing on a different area of teaching or assessment practice, separately for mathematics and science. First, we examined the effects of professional development on teaching practice involving the use of broad instructional activities (e.g., small group activities, activities involving hands-on materials); then, we looked into four groups of specific instructional strategies for higher-order learning (e.g., building models or charts when students use hands-on materials); and finally, we examined assessment practice used in the classroom (e.g., science projects or portfolios).

This analysis addressed three main issues about the effects of professional development on instructional practice. First, we used the data to examine whether teachers who participated in professional development that focused on a particular teaching strategy (e.g., the use of hands-on materials) increased their classroom use of that strategy over the period from 2002-03 to 2004-05 more than did similar teachers who did not participate in professional development that focused on the strategy. Second, we used the data to examine whether teachers who participated in professional development that focused on several related strategies (e.g., the use of computers or calculators and the use of such hands-on materials as manipulatives and measurement instruments) increased their use of computers or calculators more than did teachers who focused only on that strategy during their professional development. Finally, we used the data to examine whether the benefits of participating in professional development that focused on particular teaching strategies were strengthened if a teacher's professional development had other features of high quality (i.e., active learning, coherence, collective participation, and topic intensity). The analytic methods used in these three analyses parallel those used in the earlier Eisenhower longitudinal study (Desimone et al., 2002).

To estimate the magnitude of the effect of participating in professional development focused on particular teaching strategies involving hands-on materials, we created two new variables to characterize each professional development activity: the mean focus the activity gave to the set of specific instructional strategies in a domain and the relative focus the activity gave to each of the strategies in the domain.

Mean focus. To assess the extent to which the professional development activity that a teacher attended focused on multiple, related strategies, we calculated the average or mean focus given to the teaching strategies we measured. For the use of instructional strategies involving hands-on materials, the mean focus is the average emphasis placed on the five hands-on strategies: (1) work with manipulatives to understand concepts; (2) measure objects using tools such as rulers; (3) build models or charts; (4) collect data by counting, observing, or conducting surveys; and (5) present information to others using manipulatives. Since each strategy is coded 1 if it was given attention as part of the teacher's professional development activity and 0 if it was not, the mean focus ranges from 0, if no hands-on strategies were covered in the activity, to 0.4 if two of the five strategies were covered, to 1 if all five strategies were covered. The more strategies the activity focused on, the higher the mean focus. (See Exhibit 18 for more information on the derivation of mean focus.)

Relative focus. To measure the effects of focusing on one strategy rather than another within a professional development activity, we used a measure of relative focus. For example, if an activity focused on two of the five instructional strategies involving hands-on materials (e.g., "working with manipulatives to understand concepts" and "measuring objects using tools such as rulers"), the relative focus for the "working with manipulatives to understand concepts" strategy would have a value of +0.6, calculated as the difference between the value of 1 for that strategy and the mean focus of 0.4. (See Exhibit 18 for more information on the derivation of relative focus.)

We chose to use mean focus and relative focus to characterize professional development activities because the variables clearly distinguish between the benefits of focusing on one strategy rather than another within a professional development activity (captured by the relative focus) and the benefits of professional development activities that focus on many or few strategies (captured by the mean focus).13 The effects of focusing on a set of strategies in a professional development activity can be examined by comparing the magnitude of the coefficients for mean focus and relative focus. If the coefficient for mean focus is higher than the coefficient for relative focus, there is a “spillover” effect in which focusing on a set of related strategies has an effect over and above the effect of focusing on an individual strategy alone. If the coefficients for the two variables are equal, focusing on multiple strategies neither helps nor hurts. If the coefficient for mean focus is lower than the coefficient for relative focus, it indicates that focusing on multiple strategies is harmful—that is, activities focusing on a single strategy are more effective in boosting the use of the strategy than are activities that focus on several related strategies.14

13 The approach we followed is similar to the approach used by Bryk and Raudenbush (1992) to distinguish individual and contextual effects in models involving students nested within schools. In such models, Bryk and Raudenbush propose centering measures of student background on the school mean and entering both the centered student values and the school means in the analysis.

14 The conclusions can be derived from the variable definitions in Exhibit 19. If t_pi is the focus given to strategy p by teacher i (coded 1/0), m_i is the mean focus on the five hands-on strategies for teacher i, d_pi = (t_pi - m_i) is the relative focus on strategy p for teacher i, b_m is the coefficient for mean focus, and b_r is the coefficient for relative focus, then the overall effect on the use of strategy p by teacher i can be written as follows:

b_m m_i + b_r d_pi = b_m m_i + b_r (t_pi - m_i) = (b_m - b_r) m_i + b_r t_pi.

Thus, mean focus (m_i) has a positive effect if b_m - b_r > 0, no effect if b_m = b_r, and a negative effect if b_m < b_r.
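A minimal sketch of the two derived variables, assuming each strategy's focus is coded 1 or 0 as described above (function and variable names are ours); the Exhibit 18 worked example below corresponds to the input [1, 1, 0, 0, 0].

```python
# Derive mean focus and relative focus from 0/1 focus codes for the strategies
# in one domain (here, the five hands-on strategies).
def mean_and_relative_focus(focus_codes):
    """Return (mean focus, list of relative focus values) for 0/1 focus codes."""
    mean_focus = sum(focus_codes) / len(focus_codes)
    relative_focus = [code - mean_focus for code in focus_codes]
    return mean_focus, relative_focus

mean_f, rel_f = mean_and_relative_focus([1, 1, 0, 0, 0])
print(mean_f)  # 0.4
print(rel_f)   # [0.6, 0.6, -0.4, -0.4, -0.4]
```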



EXHIBIT 18 Calculation of Mean Focus and Relative Focus

Suppose the professional development activity attended by a teacher focused on two hands-on instructional strategies (i.e., working with manipulatives to understand concepts and measuring objects using tools such as rulers), but it did not focus on the other three strategies (building models or charts; collecting data by counting, observing, or conducting surveys; and presenting information to others using manipulatives). As described in the text, the mean focus for the teacher is 2/5, or 0.4.

The relative focus for each of the five strategies is computed by subtracting the mean focus from the focus for each strategy. The results are shown in the table below.

Hands-on instructional strategies (Focus; Relative focus):
Strategy 1: Working with manipulatives to understand concepts. Focus 1; relative focus 1 - 0.4 = +0.6
Strategy 2: Measuring objects using tools such as rulers. Focus 1; relative focus 1 - 0.4 = +0.6
Strategy 3: Building models or charts. Focus 0; relative focus 0 - 0.4 = -0.4
Strategy 4: Collecting data by counting, observing, or conducting surveys. Focus 0; relative focus 0 - 0.4 = -0.4
Strategy 5: Presenting information to others using manipulatives. Focus 0; relative focus 0 - 0.4 = -0.4

As shown in the table, for this example, the relative focus for each strategy has a value of plus 0.6 or minus 0.4, depending on whether or not the activity focused on the strategy. The profile of values of relative focus for the five strategies (labeled 1, 2, 3, 4, and 5) is shown in the graph below.


Statistical Methods

Technically, our data have a two-level structure, with the five teaching strategies involving hands-on materials nested within teachers. In the discussion that follows, we refer to the two levels at which we have data as the "strategy" and the "teacher/activity" levels. We use the term teacher/activity for the teacher level because our data at that level include both teacher characteristics (e.g., treatment status) and characteristics of the quality of the professional development activity (e.g., content focus, active learning) the teacher attended in 2003-04.

Given the two-level structure of the data (i.e., strategy-level and teacher/activity-level), we estimated the effects of professional development by using a hierarchical linear model (Raudenbush & Bryk, 2002). (See Exhibit 19 for the model equations.) The model for the effects of professional development on the use of instructional strategies involving hands-on materials includes the following teacher/activity-level and strategy-level variables:

Teacher/activity-level variables. At the teacher/activity level, we included the following variables in the model: the mean focus given to the five hands-on strategies during the professional development activity the teacher attended in 2003-04, four quality measures of the professional development (i.e., active learning, coherence, collective participation, and topic intensity), and controls for the teacher's MSP site membership (Brockport, Cleveland, Corpus Christi, or El Paso), MSP program treatment status (treatment or comparison), and school level (middle or high school).15

Strategy-level variables. For each of the five teaching strategies involving hands-on materials, we included two variables in the model: the teacher’s 2002-03 use of the strategy and the relative focus given to the strategy during the professional development the teacher attended in 2003-04. We also included a set of indicator variables specifying the particular strategy. These variables represent the fact that on average, teachers may have increased their use of some strategies more than others over the period under study.

We assumed that two key parameters in the strategy-level model would vary among teachers: the strategy-level intercept, which represents the average use of the five hands-on teaching strategies in 2004-05, controlling for their use in 2002-03 and for the teacher's 2003-04 participation in professional development; and the strategy-level slope, which represents the effects of focusing on a particular instructional strategy during professional development on classroom use of the strategy in 2004-05.16 (See Exhibit 19 for the strategy-level and teacher/activity-level equations.) These assumptions reflect the idea that teachers may differ in the degree to which they changed their instructional practice over the period from 2002-03 through 2004-05 and in their responsiveness to professional development. One key analysis question concerns the extent to which a teacher's strategy-level slope and intercept are affected by characteristics of the activities in which the teacher participated, in particular, the mean focus on the set of hands-on strategies and the quality features of the activity (e.g., active learning). We conducted separate analyses for each of the three areas under study (broad instructional activities, specific instructional strategies, and assessment practice), first for mathematics and then for science.

15 Mean focus is a teacher/activity-level variable because it characterizes the activity the teacher attended as a whole (the average emphasis the professional development activity placed on the five hands-on strategies); it does not characterize each strategy separately.

16 In technical terms, we modeled these two parameters as random effects. We modeled all other parameters as fixed effects.
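As a rough illustration of this structure only (not the study's actual model, software, or data), a simplified version of the analysis could be specified with a mixed-effects package; all column names below are hypothetical placeholders for the variables described above, and interaction terms between quality features and relative focus are omitted.

```python
# Simplified two-level sketch: strategy-level observations nested within
# teachers, with a random intercept and a random slope on relative focus.
import pandas as pd
import statsmodels.formula.api as smf

def fit_strategy_model(df: pd.DataFrame):
    """Fit a mixed model approximating the strategy/teacher structure described above."""
    model = smf.mixedlm(
        "use_2005 ~ use_2003 + C(strategy) + relative_focus + mean_focus"
        " + active_learning + coherence + collective_participation"
        " + topic_intensity + C(msp_site) + treatment + high_school",
        data=df,
        groups=df["teacher_id"],       # teachers are the level-2 units
        re_formula="~relative_focus",  # random intercept plus random slope
    )
    return model.fit()
```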

Page 39: EXAMINING THE EFFECTS OF MATHEMATICS AND SCIENCE PROFESSIONAL DEVELOPMENT …programs.ccsso.org/content/pdfs/AIR MSP_report_Final... · 2010-07-16 · professional development activity

35 MSP PD Evaluation Study Report

EXHIBIT 19 HLM Model for Examining the Effects of Professional Development on the Use of Instructional Strategies
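A sketch of the two-level specification, reconstructed from the coefficient labels in Exhibit 20 and the description above (the notation and arrangement are ours, not necessarily the exhibit's exact layout):

```latex
% Strategy level (level 1): 2004-05 use of strategy p by teacher i
\[
Y_{pi} = \pi_{0i} + \pi_{1i}\, d_{pi} + \pi_{2}\, Y^{(03)}_{pi}
       + \sum_{s} \pi_{s+2}\, \mathrm{Strategy}_{s,pi} + e_{pi}
\]
% Teacher/activity level (level 2): intercept and relative-focus slope
\[
\begin{aligned}
\pi_{0i} ={}& \beta_{00} + \beta_{01}\,\mathrm{MeanFocus}_{i}
            + \beta_{02}\,\mathrm{HighSchool}_{i} + \beta_{03}\,\mathrm{Treatment}_{i}
            + \sum_{k=4}^{6}\beta_{0k}\,\mathrm{Site}_{k,i} \\
            &+ \beta_{07}\,\mathrm{ActiveLearning}_{i} + \beta_{08}\,\mathrm{Coherence}_{i}
            + \beta_{09}\,\mathrm{CollectiveParticipation}_{i}
            + \beta_{0,10}\,\mathrm{TopicIntensity}_{i} + r_{0i}, \\
\pi_{1i} ={}& \beta_{10} + r_{1i},
\end{aligned}
\]
```

Here d_pi is the relative focus on strategy p for teacher i, Y^(03)_pi is the 2002-03 use of the strategy, and e_pi, r_0i, and r_1i are strategy-level and teacher-level residuals.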


For each area, we estimated a model which includes not only the strategy variables (mean focus and relative focus) and controls (e.g., MSP site, grade level taught), but also four quality measures of professional development.17

Since, for each teacher, we have data on five instructional strategies involving hands-on materials, the sample size available to estimate the effects of professional development on classroom use is about 5 * 60 = 300. The sample of strategies for the analysis of broad instructional activities is about 4 * 60 = 240; and the sample for assessment practice is about 5 * 60 = 300.

Analysis Results

The results of our analysis for mathematics professional development effectiveness are presented in Exhibit 20, and the results of the analysis for science professional development effectiveness in Exhibit 21. Each exhibit contains the results for six different HLM models. Model 1 examines the effects of focusing on broad instructional activities during the 2003-04 professional development on use of such activities in 2004-05, including the effects of four different measures of professional development quality. Similarly, Models 2 through 5 examine the effects of focusing on specific instructional strategies during professional development on use of such strategies in 2004-05. Finally, Model 6 concerns the effects of focusing on specific assessment practice during professional development on use of such assessment practice in the classroom in 2004-05.

To illustrate the meaning of the parameters displayed in the tables, we discuss in detail one HLM model for mathematics presented in Exhibit 20: Model 1 (the model examining the effect of professional development on the use of broad instructional activities).

Exhibit 20: Model 1 (regarding the use of broad instructional strategies in mathematics class). The parameter estimates for Model 1 are presented in two main groups. The first group of parameters (Level-1) contains the estimates for the strategy-level parameters that do not vary among teachers, i.e., the parameter for the effects of 2002-03 use of each strategy on 2004-05 use, and parameters representing the average 2004-05 use for each specific strategy relative to the reference strategy (usually the last one on the list marked with “-“ in the exhibit), controlling for 2002-03 use. The subscripted Greek letters in parentheses on each row of the table refer to the coefficients in the equations in Exhibit 19.

The first strategy-level parameter shown in the exhibit (π2 = 0.44***) indicates that, as we would expect, 2002-03 use of each strategy has a positive, highly significant effect on 2004-05 use. The remaining strategy-level coefficients represent the average 2004-05 use for each strategy relative to the reference category marked with "-". For example, the coefficient π5 = 2.39* indicates that in 2004-05, teachers used computers, calculators, or other technology in instruction (the third of the four broad instructional activities) more often than they used the reference category (using manipulatives or measurement instruments, the last of the four categories), controlling for their 2002-03 level of use. Recall that use of broad instructional activities was measured on a normalized scale ranging from 0 to 100% of instructional time for the school year (see footnotes 6 and 11 for the meaning of the normalized percentage of time and Appendix D for the scaling method). Thus, the coefficient estimate of 2.39 indicates that this strategy was used significantly more than the reference strategy, by 2.39% of instructional time for the year (which translates to about 4.3 hours), controlling for prior use.

17 Despite the relatively small sample size, we included all four quality measures in a single HLM model.


EXHIBIT 20: Effects of Mathematics Professional Development on the Use of Instructional Strategies

Columns (coefficients are listed in this order for each row): Model 1 = Broad instructional activities; Models 2-5 = Specific instructional strategies (Model 2 = Students work individually, Model 3 = Students work in small groups, Model 4 = Use of technology, Model 5 = Use of hands-on materials); Model 6 = Assessment practice. A dash (-) marks the reference category (the last strategy in each model); n/a marks a strategy not included in that model.

Level 1 model: Use of instructional strategies
Extent of classroom use of strategy in 2003, π2: 0.44***, 0.29***, 0.42***, 0.11, 0.24***, 0.29***
Strategy 1 (0/1), π3: 1.06, 0.76**, 0.70***, -0.12, -0.06, 0.44*
Strategy 2 (0/1), π4: 2.13+, 0.48*, 0.45*, 0.49+, -0.15, 0.56**
Strategy 3 (0/1), π5: 2.39*, 0.16, 0.62**, -, -0.37*, 0.22
Strategy 4 (0/1), π6: -, 0.78**, 0.69***, n/a, -0.60***, 0.09
Strategy 5 (0/1), π7: n/a, 0.25, 0.45*, n/a, -, -
Strategy 6 (0/1), π8: n/a, 0.14, 0.28, n/a, n/a, n/a
Strategy 7 (0/1), π9: n/a, -0.01, -0.05, n/a, n/a, n/a
Strategy 8 (0/1), π10: n/a, -, -, n/a, n/a, n/a
Strategy 9 (0/1), π11: n/a (no mathematics model includes a ninth strategy)

Level 2 model: Teacher/activity-level effects
Effects on intercept in strategy model (π0i):
Intercept (Baseline), β00: 1.67, 0.74, 0.74, -0.69, 0.85, 0.01
School level -- High School (vs. Middle), β02: -1.48, 0.49+, -0.58+, -1.41*, -0.74+, -0.23
Treatment (vs. Control), β03: -0.71, -0.01, 0.18, -0.41, -0.91*, 0.43+
MSP site 1 -- Brockport, β04: 1.48, -0.70+, 0.22, 1.42+, 0.26, 0.89*
MSP site 2 -- Cleveland, β05: -0.11, -0.24, -0.10, -0.01, 0.04, 0.37
MSP site 3 -- Corpus Christi, β06: -0.18, -0.48, 0.08, -0.18, 0.05, 0.16
MSP site 4 -- El Paso: the reference category in all models
Active learning, β07: -0.11, -0.17, -0.15, 0.14, 0.61*, 0.61**
Coherence, β08: 1.31, -0.10, 0.02, 1.12*, 0.24, -0.24
Collective participation, β09: -0.75, 0.13, -0.08, -0.46, -0.42, 0.24
Topic intensity, β0.10: 0.06+, 0.01, -0.01, 0.02, 0.01, -0.01
Mean focus on set of strategies, β01: 1.05, 0.47, 0.08, 0.11, 0.87, 0.14
Effects on d_pi slope in strategy model (π1i):
Baseline (Relative focus on specific strategy), β10: 7.86***, 0.12, 0.03, 1.24+, 0.14, 0.39+

Variance components
Between-teacher variance in intercept: <.001, 0.64**, 0.36+, 1.34*, 1.66**, 0.66**
Covariation in intercept/slope: -3.74, -1.03*, -0.18, -0.89, -1.17+, -0.48+
Between-teacher variance in slope: 4.50, 3.34**, 0.84*, 6.05**, 1.82*, 0.64*
Residual: 42.43***, 1.89***, 0.88***, 1.66***, 0.89***, 0.60***

Degrees of freedom
Strategy level: 190, 446, 402, 120, 241, 258
Teacher/activity level: 54, 54, 49, 51, 51, 55

Note: + p<.10, * p<.05, ** p<.01, *** p<.001

The second group of parameters presented for Model 1 (Level 2) represents the effects of teacher/activity variables on each teacher's intercept and slope in the strategy-level model. The first coefficient shown (β00 = 1.67) represents the baseline level of use for the typical teacher in 2004-05, controlling for 2002-03 use. It indicates that a teacher who did not use broad instructional activities at all in 2002-03 would be expected to use them for 1.67% of instructional time in 2004-05.18
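As a minimal worked illustration of the additive interpretation described in footnote 18 (the arithmetic simply adds estimates reported in Exhibit 20; the labeling is ours):

\[
\widehat{\text{Use}}_{2004\text{-}05}(\text{technology}) = \beta_{00} + \pi_{5} = 1.67 + 2.39 = 4.06\%\ \text{of instructional time}
\]

for a control-group middle school mathematics teacher in the El Paso site who reported no use of the strategy in 2002-03.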

The five teacher/activity-level coefficients that follow (β02, β03, β04, β05, and β06) represent the effects of school level, treatment status, and MSP site membership. For example, β02 = -1.48 indicates that high school teachers spent 1.48% less instructional time than middle school teachers, on average, on broad instructional activities in 2004-05, controlling for prior use. However, none of the five teacher/activity-level coefficients in Model 1 was significant.

There are four teacher/activity-level coefficients concerning the effect of the quality of professional development on the strategy-level intercept: β07, β08, β09, and β0.10 for active learning, coherence, collective participation, and topic intensity, respectively. Among them, the coefficient for the effect of topic intensity on the strategy-level intercept (β0.10 = 0.06+) was positive and marginally significant. The coefficient represents the effect of topic intensity on the average use of broad instructional activities in 2004-05, controlling for prior use.19 For example, a teacher whose topic intensity averaged 20 hours per broad instructional activity would be expected to spend 1.2% more instructional time on these activities than a teacher with zero hours.
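The arithmetic behind that illustration is simply the coefficient multiplied by the hours of topic-specific professional development:

\[
0.06\ \text{(percentage points per hour)} \times 20\ \text{hours} = 1.2\%\ \text{of instructional time.}
\]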

The next coefficient, representing the effect of the mean focus given to broad instructional activities in professional development (β01 = 1.05), indicates that teachers who participated in professional development covering more of the strategies made neither significantly more nor significantly less use of each activity in their classroom practice in 2004-05, controlling for prior use. In particular, a teacher who was in an activity that focused on all four broad instructional activities (mean focus = 1) would have a predicted 2004-05 use 1.05% higher than a teacher whose professional development did not focus on any of the four broad instructional activities (mean focus = 0), a difference that is not statistically significant.

18 The expected use of 1.67 pertains to the use of the broad instructional activity involving hands-on materials by a control-group mathematics teacher at the middle school level in the El Paso MSP site (who falls into all reference categories). To determine the expected use of other strategies, the specific strategy coefficients must be added (e.g., 2.39 for strategies involving computers or other technology). To determine the expected use for a treatment teacher or a teacher at the high school level, the appropriate coefficients must be added.

19 The positive coefficient suggests that topic intensity may increase the use of broad instructional activities in the classroom, regardless of the focus of the activity. We did not hypothesize such an effect; we included the additive effect of active learning opportunities as a control. We also modeled potential interactions between a number of professional development quality features and relative focus. Each such coefficient represents the extent to which the effect of focusing on a particular strategy as part of a professional development activity is strengthened if the activity incorporates a particular quality feature, such as opportunities for active learning. However, we found very few such interaction effects.



The final teacher/activity-level coefficient represents the baseline effect of the relative focus on a particular broad instructional activity on the use of the activity in the classroom (i.e., the effect for a typical teacher).20

The estimated coefficient is positive and highly significant (β10 = 7.86***), which indicates that professional development that focuses on a particular broad instructional activity (relative to the reference activity) increases the use of that activity in the classroom by 7.86% of instructional time. This result supports the evidence from our previous studies that the content focus of professional development affects the instructional practices tied to that focus.

20 As we indicated above, we assumed that the strategy-level intercept might vary among teachers. Thus, the baseline coefficient shown is the average or typical value among teachers in the sample; values for individual teachers vary around this average. The variance components shown at the bottom of Exhibit 20 indicate the extent of this variation.



The coefficient for the mean focus on the set of four broad instructional activities (β01 = 1.05) was significantly smaller than the coefficient for the relative focus on a particular broad instructional activity (β10 = 7.86).21 As described above, when the coefficient for mean focus is lower than the coefficient for relative focus, it indicates that spreading the focus across multiple strategies dilutes the effect; that is, activities focusing on a single strategy are more effective in boosting the use of that strategy than are activities that focus on several related strategies.

The variance components shown near the bottom of the exhibit indicate that there is no significant between-teacher variation remaining in the strategy-level intercept and slope (<0.001 and 4.50, respectively), after controlling for the variables in the model.22 This result indicates that teachers do not differ significantly in their use of broad instructional activities in 2004-05, after controlling for the variables in the model, and that they also do not differ in their responsiveness to professional development. Characteristics beyond those included in the model would therefore be unlikely to explain much additional variation. This suggests that by including a number of professional development quality features in the model, we have explained a good deal of the variation among teachers in the effectiveness of the professional development they experienced.

In the remainder of the results section, we present results for the estimation of the effects of professional development on teachers' use of instructional and assessment strategies, highlighting in particular the effects of content focus (i.e., mean focus on the set of strategies and relative focus on a specific strategy), active learning, coherence, topic intensity, and collective participation.

Exhibit 20: Models 2-5 (regarding the use of specific instructional strategies in mathematics class). In Models 2 and 3, none of the selected quality features of professional development had a significant effect on the use of instructional strategies involving individual student work or students working in pairs or small groups. In Model 4, the coefficient for coherence (β08 = 1.12*) suggests that this quality feature significantly increases teachers' use of instructional strategies involving technology, regardless of the professional development's content focus. The coefficient for relative focus on a specific strategy (β10 = 1.24+) also suggests a marginally significant, positive effect on classroom use of the targeted strategy. In Model 5, active learning has a significantly positive effect (β07 = 0.61*) on teachers' use of hands-on materials in the classroom.

Exhibit 20: Model 6 (regarding the use of assessment practices in mathematics class). As in Model 5, active learning had a highly significant, positive effect (β07 = 0.61**) on teachers' use of reform-type assessment practices such as projects and portfolios. Relative focus on a specific strategy (β10 = 0.39+) shows a marginally significant, positive effect on the use of these assessment practices.

Exhibit 20: Models 1-6, summary. It should be pointed out that the coefficient for relative focus is positive in all six models shown in Exhibit 20 and significant, at least marginally, in three. This result suggests that when professional development focuses on an instructional strategy, teachers increase their use of that strategy.

21 The standard error for the difference in the coefficients (not shown in the exhibit) is 2.93, and the significance level is p<.001.

22 The other two variance components shown in the exhibit include the covariation between the intercept and slope and the residual variance. The first of these indicates the extent to which teachers who have unusually high intercepts also have unusually high slopes; the second indicates the remaining variance in the use of broad instructional activities in 2004-05, after all other measured variables and variance components are taken into account.


Three other aspects of the results in Exhibit 20 are worth noting. First, the coefficients for the treatment group show mixed results. One might wonder why teachers who were targeted for the MSP program treatment have not benefited from its intended effect. But as we mentioned in the previous section, the lack of a significant result is probably an artifact of our HLM model, which included both treatment status (as a main effect) and the quality of professional development (the mediating effect) in the same prediction equation.

Second, high school mathematics teachers used instructional strategies involving technology, such as computers and calculators, less often than their middle school counterparts in 2004-05. However, we suspect that this result is an artifact of our model specification (or misspecification); that is, because we control for the content focus of the professional development, and the MSP site variable has its effect by determining the content focus of that professional development, we may be over-controlling and thereby removing part of the treatment effect.

Finally, as expected, there was some variation among MSP program sites in 2004-05 instructional and assessment practice in the classroom, controlling for prior (2002-03) practices. For example, teachers in the Corpus Christi site used instructional strategies involving students in pairs or small groups more than their peers in El Paso did, controlling for prior use, and teachers in Brockport spent more of their instructional time on reform-type assessment practices (e.g., projects and portfolios) than their peers in El Paso did.

Exhibit 21: Models 1-6 (regarding the use of instructional and assessment practices in science class). In the HLM models examining the effectiveness of science professional development, mean focus on the set of strategies was not found to be effective. Relative focus on a specific strategy had a positive, albeit marginally significant, effect on the use of instructional strategies involving students working in pairs or small groups. As was the case in the mathematics models, the coefficient for relative focus is positive in all six models. In Model 2, coherence and topic intensity had significant, positive effects on the use of instructional strategies involving lab activities, investigations, or experiments; however, these positive effects were offset by an unexpected negative effect of active learning. Similarly, topic intensity had a positive impact on the use of technology in science class but, for reasons we cannot explain, had an adverse effect on the use of instructional strategies involving collecting data and information in science class.

In these HLM models for science professional development, treatment group status did not show any significant, positive effects, probably for the same reason discussed for the mathematics models. Further, as was the case with mathematics teachers, high school science teachers used instructional strategies involving technology significantly less in 2004-05 than their middle school counterparts. Lastly, as expected, there was some variation among MSP program sites in instructional and assessment practice in the science classroom. For example, teachers in the Corpus Christi site used instructional strategies involving lab activities or experiments more than their peers in El Paso did, but spent less of their instructional time using technology than their peers in El Paso did.


EXHIBIT 21: Effects of Science Professional Development on the Use of Instructional Strategies

(Column key: Model 1 = Broad instructional activities; Models 2-5 = Specific instructional strategies, with Model 2 = Lab activities, investigations, or experiments, Model 3 = Collect data or information, Model 4 = Students work in small groups, Model 5 = Use of technology; Model 6 = Assessment practice. "-" marks the reference category, i.e., the last strategy included in each model; "n/a" indicates that the model does not include that strategy.)

Level 1 model: Use of instructional strategies
Coefficient | Model 1 | Model 2 | Model 3 | Model 4 | Model 5 | Model 6
Extent of classroom use of strategy in 2003, π2 | 0.16+ | 0.30*** | 0.31*** | 0.34*** | 0.20* | 0.48***
Strategy 1 (0/1), π3 | 4.20** | -0.24 | -0.23 | 0.15 | -0.09 | 0.76***
Strategy 2 (0/1), π4 | 0.21 | 0.05 | 0.29+ | 0.16 | 0.52+ | 0.73***
Strategy 3 (0/1), π5 | 2.98* | 0.12 | -0.01 | 0.72* | - | 0.45*
Strategy 4 (0/1), π6 | - | 0.08 | 0.02 | 0.09 | n/a | 0.18
Strategy 5 (0/1), π7 | n/a | -0.38* | - | -0.52+ | n/a | -
Strategy 6 (0/1), π8 | n/a | -0.11 | n/a | - | n/a | n/a
Strategy 7 (0/1), π9 | n/a | 0.00 | n/a | n/a | n/a | n/a
Strategy 8 (0/1), π10 | n/a | -0.50** | n/a | n/a | n/a | n/a
Strategy 9 (0/1), π11 | n/a | - | n/a | n/a | n/a | n/a

Level 2 model: Teacher/activity-level effects
Effects on intercept in strategy model (π0i)
Intercept (Baseline), β00 | 9.26** | 0.31 | 2.27* | 1.86 | 2.54+ | 0.24
School level -- High School (vs. Middle), β02 | -1.48 | 0.07 | 0.00 | 0.20 | -1.24** | 0.36+
Treatment (vs. Control), β03 | -0.20 | -0.19 | 0.40 | 0.55 | -1.34* | -0.44+
MSP site 1 -- Brockport, β04 | 0.03 | 0.03 | -0.73 | -0.57 | 0.96 | -0.36
MSP site 2 -- Cleveland, β05 | -1.42 | 0.48 | -0.60 | -0.85 | 0.37 | -0.13
MSP site 3 -- Corpus Christi, β06 | -0.89 | 0.86* | -0.72 | 0.67 | -2.19** | -0.44
MSP site 4 -- El Paso (reference category) | - | - | - | - | - | -
Active learning, β07 | -0.71 | -0.56** | 0.37 | 0.46 | -0.37 | -0.05
Coherence, β08 | 0.76 | 0.73** | -0.18 | -0.38 | 0.17 | 0.18
Collective participation, β09 | -0.59 | 0.08 | -0.39 | -0.34 | -0.10 | 0.11
Topic intensity, β0.10 | -0.02 | 0.02* | -0.04* | -0.02 | 0.07** | -0.01
Mean focus on set of strategies, β01 | -0.19 | -0.76 | -0.26 | 0.24 | -0.45 | 0.65
Effects on d_pi slope in strategy model (π1i)
Baseline (Relative focus on specific strategy), β10 | 3.08 | 0.44 | 0.25 | 0.75+ | 1.23 | 0.27

Variance components
Between-teacher variance in intercept | 5.97 | 0.26+ | 0.76+ | 0.17 | 14.21*** | 0.21+
Covariation in intercept/slope | -5.12 | -0.26 | -0.27 | 0.50 | -15.87*** | -0.07
Between-teacher variance in slope | <.001 | 0.56+ | 0.30 | 2.03 | 17.46*** | <.001
Residual | 30.41*** | 0.59*** | 0.56*** | 1.75*** | 1.35*** | 0.53***

Degrees of freedom
Strategy level | 120 | 326 | 198 | 74 | 162 | 162
Teacher/activity level | 31 | 31 | 30 | 28 | 31 | 31

Note: + p<.10, * p<.05, ** p<.01, *** p<.001


Conclusion & Discussion

One of the goals of the MSP professional development evaluation study was to develop a new methodology to obtain a complete and comprehensive description of teachers’ entire professional development experiences as they relate to mathematics and science instruction. Another related goal was to obtain unbiased estimates of the effects of teachers’ professional development experiences on their instructional practice, on the basis of a full description of professional development activities gathered with PDAL.

With respect to the first goal, we made significant progress by developing and testing PDAL in this study. With the use of the web-based, monthly teacher logs collected over a 15-month period, we obtained a representation of teachers’ participation in professional development activities as they relate to mathematics and science. Thanks to its monthly data collection schedule, we believe PDAL was able to collect data on teachers’ learning activities that take place over time in various settings. We successfully combined each individual teacher’s total professional development experiences into teacher-level aggregate data with which changes in their instructional practices can be compared.

However, despite our attempt to collect complete and comprehensive data on teachers' professional development experiences, we were unable to reach a high response rate in the current implementation of PDAL. The PDAL response rate of 57% reflects the number of teachers who provided usable PDAL data for at least one month. Even though the majority of responding teachers completed logs for 12 months or more, not all of these teachers completed logs throughout the entire PDAL data collection period, which extended over 15 months. To the extent that not all teachers participated in the PDAL, not all participating teachers completed all monthly logs for the study period, and not all teachers responded to the complete set of questions asked in each monthly log, we are likely to obtain a less-than-complete description of teachers' entire professional development experiences. For example, we were not able to estimate the total contact hours or the entire duration of each professional development activity when teachers failed to complete their monthly logs for the entire 15-month period. To compensate for the missing total contact hours, for example, we computed the average contact hours per month using all the available monthly logs for each teacher. If teachers had submitted all of their monthly logs, we would have been able to obtain an accurate representation of their actual participation in professional learning activities. To the extent that teachers provided us with partial data, the PDAL data are biased to an unknown degree. We acknowledge this limitation of our PDAL data, and the results from the current PDAL data analyses should be interpreted with caution. For example, it is quite possible that teachers who responded to all the surveys and logs over the 3-year study period differ in some significant ways from those who did not (e.g., in their motivation to learn). In this case, our results might be positively biased to an unknown degree.
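As a minimal sketch of that compensation step (the pandas code and column names below are illustrative assumptions, not the study's actual processing scripts):

import pandas as pd

# Hypothetical monthly-log records; column names are illustrative only.
logs = pd.DataFrame({
    "teacher_id":    [101, 101, 101, 102, 102],
    "month":         ["2003-09", "2003-10", "2003-11", "2003-09", "2004-01"],
    "contact_hours": [3.0, 2.0, 4.0, 1.5, 2.5],
})

# Because not every teacher submitted logs for all 15 months, total contact
# hours cannot be summed directly; instead we average over the months that
# each teacher actually reported.
avg_hours_per_month = (
    logs.groupby("teacher_id")["contact_hours"]
        .mean()
        .rename("avg_contact_hours_per_month")
)
print(avg_hours_per_month)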

Even though we were not able to fully reach our goal of obtaining a complete and comprehensive account of each teacher’s entire professional development experiences, the PDAL’s potential is clear with respect to the collection, measurement, and analysis of full and rich data on teachers’ on-going learning experiences. If implemented under ideal conditions, PDAL is capable of examining the scope, nature, quality and quantity, and content of teachers’ professional development experiences with the aid of web-based technology. It would be useful to explore PDAL’s full potential with further studies conducted on a larger scale. Future studies should explore different ways of administering PDAL to boost response rates.

Page 51: EXAMINING THE EFFECTS OF MATHEMATICS AND SCIENCE PROFESSIONAL DEVELOPMENT …programs.ccsso.org/content/pdfs/AIR MSP_report_Final... · 2010-07-16 · professional development activity
Page 52: EXAMINING THE EFFECTS OF MATHEMATICS AND SCIENCE PROFESSIONAL DEVELOPMENT …programs.ccsso.org/content/pdfs/AIR MSP_report_Final... · 2010-07-16 · professional development activity

MSP PD Evaluation Study Report 44

To achieve the second goal of the study, we replicated AIR's earlier longitudinal study from the National Evaluation of the Eisenhower Professional Development Program (Desimone et al., 2002). Consistent with that study, the basic design was to investigate the effects of Year 2 professional development features on Year 3 instructional practices, controlling for Year 1 instructional practices and other variables such as teacher or school characteristics (e.g., MSP treatment status, grade level taught). One of the aims of this study was to identify features of professional development that are likely to be linked to changes in teachers' instructional practice. We achieved some measure of success by reaffirming the previous findings in the current study, and in general the effects that were found are in the hypothesized directions. In particular, our results clearly showed that the content focus of professional development is related to instructional practices tied to that focus; relative focus on a specific instructional strategy in professional development had a consistently positive effect on teachers' use of that strategy in the classroom, controlling for prior use. Of the other four a priori identified hypotheses about dimensions of quality professional development (i.e., active learning, coherence, collective participation, and duration), three were supported at least to some extent. In this study, however, we could not replicate the positive effect of collective participation that was supported in the Eisenhower professional development program evaluation (Desimone et al., 2002). To the extent that the findings from the earlier Eisenhower longitudinal study are replicated in this longitudinal study, they are all the more convincing.

As noted in section one, there are a number of strengths in the AIR’s previous longitudinal study that this study strove to emulate and replicate. They include a basic design of linking changes in teachers’ instructional practices over a 3-year period with professional development activities that teachers attended in the intervening year; independent measures of professional development; and a robust analytic approach. However, there are a number of important differences between this study and the previous longitudinal study that we replicated. In the final paragraphs, we discuss the implications of these differences.

First, as noted in section one, the previous longitudinal study was based on a single professional development activity that teachers themselves chose to report on, based on its perceived helpfulness to their target class. It seems plausible to expect a larger impact from the full collection of professional development activities than from a single activity alone. Thus, we developed a new instrument to gather information about the full spectrum of professional learning activities that teachers experienced. Even though we fell short of obtaining a complete description of teachers' professional development, the use of PDAL allowed us to capture a more comprehensive view of the professional development landscape than the prior study did.

However, it is important to take note of a potential disadvantage of using aggregate measures of the quality of professional development activities summarized in this comprehensive manner. For example, when we aggregate data using the average active learning or average coherence across activities, we may end up with a situation in which average active learning is high, but it is not necessarily high for the activity that focused on the content we are examining in our model. By contrast, because the Eisenhower-based longitudinal study concentrated on a single professional development activity, quality features and content were measured for the same activity. We suspect that this is why that study was able to find an interaction effect between the content focus and the quality features of the professional development activity analyzed. For example, it demonstrated that professional development focusing on a set of strategies for technology use increased teachers' use of those strategies in the classroom, especially when the professional development also included a quality feature such as active learning (Desimone et al., 2002).

Second, the previous longitudinal study was based on a sample of teachers who continued to teach the same course over all three waves of the Eisenhower teacher professional development activity survey.


Teachers were instructed to report on that particular course as the target class while responding to all waves of the survey. In the Eisenhower longitudinal study, whether teachers were still teaching the same course to the same target class was closely monitored and verified before they were included in the final analysis. The same criteria for selecting a target class were not employed in this study (see footnote 4), and there was no way to ensure that teachers were teaching the same course to the same target class over the study period. Consequently, some of the change in classroom instruction over time may be attributable to changes in the courses teachers taught during the study period.

Third, as was the case in the prior longitudinal study, this study had low statistical power due to its small sample size. The problem of sample attrition was substantially greater in the current study, possibly because of the high demands placed on participating teachers, who were asked to complete monthly logs for as long as 15 months.

Fourth, the results of the previous longitudinal study were based on an analytic approach in which the authors chose not to put all of their independent variables into one model. Despite a similar restriction imposed by small sample size, we chose to put all of the a priori identified dimensions of professional development in the same model (i.e., active learning, coherence, collective participation, and topic intensity, above and beyond the effect of content focus). This allowed us to compare the relative strength of each variable against the others in accounting for the effectiveness of professional development.

Fifth, while the previous longitudinal study employed a survey method to gather information about teachers' professional development experiences, the current longitudinal study used PDAL instead. The previous study noted that "our baseline-control variables, professional-development independent variables, and instructional-practices dependent variables are each measured through a different survey, at a different point in time. There are no spurious correlations among these three sets of variables due to [their] having been collected in the same instrument" (Desimone et al., 2002). Thanks to the adoption of PDAL, the current study has an additional advantage over the Eisenhower longitudinal study: it secured not only independent measures of professional development but also an independent method of data collection (i.e., the log method, which is distinct from the survey method). To the extent that the findings from the earlier study are replicated in this study using the independent method of teacher logs, they further affirm the results of the prior studies.

Sixth, even though the two longitudinal studies share some common variables concerning instructional practice, the Eisenhower-based longitudinal study focused on higher-order instructional strategies, while the current study was not limited to such strategies; instead, its measures of instructional activities and strategies are broad-based and represent the entire spectrum of instruction in mathematics and science. In addition, the previous longitudinal study used a somewhat different set of measures of instructional practices. For example, in the AIR longitudinal study, teachers answered a set of questions about technology use in the classroom on a raw response scale ranging from 0 = almost never, 1 = some lessons, 2 = most lessons, to 3 = every lesson. For the current study, the original raw scale ranged from 0 = none to 4 = considerable (more than 50% of instructional time for the school year); this scale was recoded and normalized to a new scale of 0% to 100%, representing the relative time spent on each of the instructional activities and strategies used in mathematics or science class.

Lastly, the two longitudinal studies vary slightly in terms of how each of the quality dimensions of professional development was operationalized. For example, in the previous longitudinal study, the active learning scale was computed by summing up all the opportunities provided to promote active engagement in learning in the professional development activity on which the teacher reported.


Hence, the resulting scale ranges from a minimum of 0 (no opportunities for active learning were provided in the activity) to a maximum of 20 (all types of active learning were provided). In this study, by contrast, active learning was scaled by averaging all applicable items reported to have been provided for active engagement, each of which was measured on a four-point scale of 0 = never, 1 = rarely, 2 = sometimes, and 3 = often. As a result, the average active learning score for each teacher runs from 0 to 3 and measures the frequency with which opportunities for active learning were provided, in addition to the number of different types of activities for which such opportunities were made available. In addition, topic intensity deviated from the previous measure of the duration of professional development. This new measure combines duration with content focus, in the sense that the duration of professional development is specifically tied to the content covered and emphasized in the professional development that teachers attend.
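The difference between the two scalings can be sketched as follows; the item values are hypothetical, and the exact handling of the item set is our simplifying assumption rather than the studies' documented procedure:

# Eisenhower-style active learning scale: a count of the active-learning
# opportunity types reported as provided (0/1 indicators, possible range 0-20).
eisenhower_items = [1, 0, 1, 1, 0]             # hypothetical subset of 0/1 indicators
eisenhower_score = sum(eisenhower_items)        # here: 3 opportunity types provided

# Scale used in this study: the mean of the applicable frequency items,
# each rated 0 = never, 1 = rarely, 2 = sometimes, 3 = often (range 0-3).
pdal_items = [3, 2, 0, 1]                       # hypothetical frequency ratings
pdal_score = sum(pdal_items) / len(pdal_items)  # here: 1.5

print(eisenhower_score, pdal_score)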

We conclude by emphasizing the need for further replication studies with more robust designs (e.g., randomized controlled studies), larger and more representative samples of teachers, and different analytic approaches. When studies of this sort are coupled with new methods like the PDAL for collecting data on professional development, we are more likely to achieve the goal of obtaining improved estimates of the effects of teachers' professional development on classroom practices.


Appendices


Appendix A: Alternative Methods of PDAL Data Aggregation

Using disaggregated, monthly log-level data as basic building blocks, we can construct a number of different aggregated data sets that are useful for describing teachers' professional development activity profiles.

First, we can produce teacher-activity level data by aggregating each teacher's basic monthly log-level data across months for each activity. As the last column of Exhibit 3 displays, there are nine sets of teacher-activity level data; they represent unique combinations of teachers and activities. For example, the first set of teacher-activity level data (shown in the first row of Exhibit 3) represents Mr. Anderson's participation in Activity A, which is produced by aggregating his monthly log records for July and August 2003. This teacher-activity combination is distinguished from the combination of Ms. Lopez and Activity A (shown in the fourth row of Exhibit 3).

Hence, Mr. Anderson’s professional development portfolio can be summarized by three teacher-activity level aggregates, which represent his participation in three separate activities. Since the quality of one professional development activity may differ from that of another, it is important to create separate measures of quality such as content focus or active learning for each activity. In the case of a one-time activity (e.g., activity C) which is bound to a month, there is no need to aggregate data across months. But, in the case of continuous activities such as A and B, we need to combine a series of monthly log records that Mr. Anderson filled out for the activities. For example, to obtain Mr. Anderson’s total contact hours for activity A, we need to sum up the contact hours reported for July and August 2003. In addition, we can compute mean contact hours for the same activity. In a similar manner, we can compute other aggregate measures of professional development quality such as coherence for each activity by averaging out corresponding quality indices across months.

Second, we can produce activity-level data by aggregating the basic monthly log level data across months and teachers. As the last column of Exhibit A-1 shows, a total of 20 monthly log records were aggregated to produce 7 different professional development activity profiles (or 7 sets of activity level data). For example, 4 monthly logs created by Mr. Anderson and Ms. Lopez during July and August 2003 were combined to produce an aggregate for professional development activity A. With this activity level aggregate, we can describe each activity’s duration (e.g., total contact hours, mean contact hours, and span), type, purpose, and other quality features such as a mean level of active learning.

Exhibit A-1: Activity-level data aggregated across months and teachers (hypothetical data)

Activity | Jul-03 | Aug-03 | Sep-03 | Oct-03 | Nov-03 | Dec-03 | Jan-04 | Feb-04 | # of logs aggregated
A | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 4
B | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 3
C | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1
D | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 4
E | 1 | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 5
F | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2
G | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1

Third, we can produce teacher-level data by aggregating the basic monthly log level data across months and activities. This method was described in the text.


Appendix B: List of PDAL Questions about Teachers’ Professional Development Activities

Mathematics

Mathematics Professional Development's Coverage on Broad Instructional Activities

Were any of the following instructional topics covered in this professional development activity?
Classroom activities when students work individually    yes  no
Classroom activities when students work in pairs or small groups    yes  no
Classroom activities when students use computers, calculators, or other technology    yes  no
Classroom activities when students use hands-on materials    yes  no

For each of the instructional topics that were marked as being covered in their professional development, teachers were asked which specific instructional strategies were primary focuses in their professional development activity.

Mathematics Professional Development’s Focus on Specific Instructional Strategies

Did the professional development focus on any of the following instructional strategies for use in your classroom when students in the target class work individually?

Solve word problems from a textbook or worksheet.    yes  no
Solve non-routine mathematical problems (for example, problems that require novel or non-formulaic thinking).    yes  no
Explain their reasoning or thinking in solving a problem, using several sentences orally or in writing.    yes  no
Apply mathematical concepts to "real-world" problems.    yes  no
Make estimates, predictions or hypotheses.    yes  no
Analyze data to make inferences or draw conclusions.    yes  no
Work on a problem that takes at least 45 minutes to solve.    yes  no
Complete or conduct proofs or demonstrations of their mathematical reasoning.    yes  no

Did the professional development focus on any of the following instructional strategies for use in your classroom when students in the target class work in pairs or small groups?

Solve word problems from a textbook or worksheet.    yes  no
Solve non-routine mathematical problems (for example, problems that require novel or non-formulaic thinking).    yes  no
Explain their reasoning or thinking in solving a problem, using several sentences orally or in writing.    yes  no
Apply mathematical concepts to "real-world" problems.    yes  no
Make estimates, predictions or hypotheses.    yes  no
Analyze data to make inferences or draw conclusions.    yes  no
Work on a problem that takes at least 45 minutes to solve.    yes  no
Complete or conduct proofs or demonstrations of their mathematical reasoning.    yes  no

Did the professional development focus on any of the following instructional strategies for using calculators, computers, or other educational technology in your classroom with your students?

Use sensors and probes. yes no


Display and analyze data.    yes  no
Develop geometric concepts (for example, using simulations).    yes  no

Did the professional development focus on any of the following instructional strategies for using hands-on materials in your classroom with your students?

Work with manipulatives (for example, counting blocks, geometric shapes, or algebraic tiles) to understand concepts. yes no

Measure objects using tools such as rulers, scales, or protractors.    yes  no
Build models or charts.    yes  no
Collect data by counting, observing, or conducting surveys.    yes  no
Present information to others using manipulatives (for example, chalkboard, whiteboard, poster board, and projector).    yes  no

Mathematics Professional Development’s Focus on Assessment Practice

Did the professional development focus on any of the following uses of assessment in your classroom with students?

Extended response item for which student must explain or justify solution.    yes  no
Performance tasks or events (for example, hands-on activities).    yes  no
Individual or group demonstration, presentation.    yes  no
Mathematics projects.    yes  no
Portfolios.    yes  no

Science

Science Professional Development's Coverage on Broad Instructional Activities

Were any of the following instructional topics covered in this professional development activity?
Classroom activities when students do a laboratory activity, investigation, or experiment.    yes  no
Classroom activities when students collect data (other than laboratory activities).    yes  no
Classroom activities when students work in pairs or small groups (other than laboratory activities).    yes  no
Classroom activities when students use computers, calculators, or other educational technology to learn science.    yes  no

For each of the instructional topics that were marked as being covered in their professional development, teachers were asked which specific instructional strategies were primary focuses in their professional development activity.

Science Professional Development’s Focus on Specific Instructional Strategies

Did the professional development focus on any of the following instructional strategies for use in your classroom relating to laboratory activities, investigations, or experiments?

Make educated guesses, predictions, or hypotheses.    yes  no
Follow step-by-step directions.    yes  no
Use science equipment or measuring tools.    yes  no
Collect data.    yes  no
Change a variable in an experiment to test a hypothesis.    yes  no


Organize and display information in tables or graphs.    yes  no
Analyze and interpret science data.    yes  no
Design their own investigation or experiment to solve a scientific question.    yes  no

Did the professional development focus on any of the following instructional strategies for use in your classroom with your students relating to collecting science data or information?

Solve word problems from a textbook or worksheet.    yes  no
Solve non-routine mathematical problems (for example, problems that require novel or non-formulaic thinking).    yes  no
Explain their reasoning or thinking in solving a problem, using several sentences orally or in writing.    yes  no
Apply mathematical concepts to "real-world" problems.    yes  no
Make estimates, predictions or hypotheses.    yes  no
Analyze data to make inferences or draw conclusions.    yes  no
Work on a problem that takes at least 45 minutes to solve.    yes  no
Complete or conduct proofs or demonstrations of their mathematical reasoning.    yes  no

Did the professional development focus on any of the following instructional strategies for use in your classroom when students in the target class work in pairs or small groups?

Talk about ways to solve science problems, such as investigations.    yes  no
Complete written assignments from the textbook or workbook.    yes  no
Write up results or prepare a presentation from a laboratory activity, investigation, experiment or a research project.    yes  no
Work on an assignment, report or project over an extended period of time.    yes  no
Work on a writing project or entries for portfolios seeking peer comments to improve work.    yes  no
Review assignments or prepare for a quiz or test.    yes  no

Did the professional development focus on any of the following instructional strategies for using calculators, computers, or other educational technology in your classroom with your students?

Use sensors and probes    yes  no
Display and analyze data    yes  no
Develop geometric concepts (for example, using simulations)    yes  no

Science Professional Development’s Focus on Assessment Practice

Did the professional development focus on any of the following uses of assessment in your classroom with students?

Extended response item for which student must explain or justify solution.    yes  no
Performance tasks or events (for example, hands-on activities).    yes  no
Individual or group demonstration, presentation.    yes  no
Science projects.    yes  no
Portfolios.    yes  no


Appendix C: List of SEC Questions about Classroom Instruction

Mathematics

INSTRUCTIONAL ACTIVITIES IN MATHEMATICS

Listed below are questions about the types of activities that students in the target class engage in during mathematics instruction. For each activity, you are asked to estimate the relative amount of time a typical student will spend engaged in that activity during classroom instruction over the course of a school year. The activities are not necessarily mutually exclusive; across activities, your answers will undoubtedly greatly exceed 100%. Consider each activity on its own, estimating the range that best indicates the relative amount of mathematics instructional time that a typical student spends over the course of a school year engaged in that activity.

AMOUNT OF INSTRUCTIONAL TIME (for the school year)
0 = None
1 = Little (10% or less of instructional time for the school year)
2 = Some (11-25% of instructional time for the school year)
3 = Moderate (26-50% of instructional time for the school year)
4 = Considerable (50% or more of instructional time for the school year)

Teachers' Use of Broad Instructional Activities in the Target Mathematics Class

How much of the total mathematics instructional time do students in the target class?

(4 broad instructional activities that were analyzed for this report)

Work individually on mathematics exercises, problems, investigations, or tasks
Work in pairs or small groups on mathematics exercises, problems, investigations, or tasks
Use computers, calculators, or other technology to learn mathematics
Use hands-on materials such as manipulatives (for example, geometric shapes or algebraic tiles), measurement instruments (for example, rulers or protractors), and data collection devices (for example, surveys or probes)

(Other 8 broad instructional activities that were not analyzed for this report)

Watch the teacher demonstrate how to do a procedure or solve a problem.
Read about mathematics in books, magazines, or articles (not textbooks).
Take notes from lectures or the textbook.
Complete computational exercises or procedures from a textbook or a worksheet.
Present or demonstrate solutions to a math problem to the whole class.
Do a mathematics activity with the class outside the classroom.
Maintain and reflect on a mathematics portfolio of their own work.
Take a quiz or test.


Teachers' Use of Specific Instructional Strategies in the Target Mathematics Class

AMOUNT OF INSTRUCTIONAL TIME (Working individually)
0 = None
1 = Little (10% or less of instructional time on mathematics exercises, problems or tasks)
2 = Some (11-25% of instructional time on mathematics exercises, problems or tasks)
3 = Moderate (26-50% of instructional time on mathematics exercises, problems or tasks)
4 = Considerable (50% or more of instructional time on mathematics exercises, problems or tasks)

When students in the target class work individually on mathematics exercises, problems, investigations, or tasks, how much time do they:

Solve word problems from a textbook or worksheet.
Solve non-routine mathematical problems (for example, problems that require novel or non-formulaic thinking).
Explain their reasoning or thinking in solving a problem, using several sentences orally or in writing.
Apply mathematical concepts to "real-world" problems.
Make estimates, predictions or hypotheses.
Analyze data to make inferences or draw conclusions.
Work on a problem that takes at least 45 minutes to solve.
Complete or conduct proofs or demonstrations of their mathematical reasoning.

AMOUNT OF INSTRUCTIONAL TIME (Working in pairs or small groups)

0 = None
1 = Little (10% or less of instructional time in pairs or small groups)
2 = Some (11-25% of instructional time in pairs or small groups)
3 = Moderate (26-50% of instructional time in pairs or small groups)
4 = Considerable (50% or more of instructional time in pairs or small groups)

When students in the target class work in pairs or small groups on mathematics exercises, problems, investigations, or tasks, how much time do they:

Solve word problems from a textbook or worksheet.
Solve non-routine mathematical problems (for example, problems that require novel or non-formulaic thinking).
Talk about their reasoning or thinking in solving a problem.
Apply mathematical concepts to "real-world" problems.
Make estimates, predictions or hypotheses.
Analyze data to make inferences or draw conclusions.
Work on a problem that takes at least 45 minutes to solve.
Complete or conduct proofs or demonstrations of their mathematical reasoning.


AMOUNT OF INSTRUCTIONAL TIME (Using calculators, computers, or other educational technology)
0 = None
1 = Little (10% or less of instructional time using calculators, computers, or other ed. technology)
2 = Some (11-25% of instructional time using calculators, computers, or other ed. technology)
3 = Moderate (26-50% of instructional time using calculators, computers, or other ed. technology)
4 = Considerable (50% or more of instructional time using calculators, computers, or other ed. technology)

When students in the target class are engaged in activities that involve the use of calculators, computers, or other educational technology as part of mathematics instruction, how much time do they:

Use sensors and probes
Display and analyze data
Develop geometric concepts (for example, using simulations)

AMOUNT OF INSTRUCTIONAL TIME (Using hands-on materials)

0 = None
1 = Little (10% or less of instructional time using hands-on materials)
2 = Some (11-25% of instructional time using hands-on materials)
3 = Moderate (26-50% of instructional time using hands-on materials)
4 = Considerable (50% or more of instructional time using hands-on materials)

When students in the target class use hands-on materials, how much time do they:

Work with manipulatives (for example, counting blocks, geometric shapes, or algebraic tiles) to understand concepts.

Measure objects using tools such as rulers, scales, or protractors.
Build models or charts.
Collect data by counting, observing, or conducting surveys.
Present information to others using manipulatives (for example, chalkboard, whiteboard, poster board, projector).

Teachers' Use of Assessment Strategies in the Target Mathematics Class

FREQUENCY
0 = Never
1 = 1-4 times per year
2 = 1-3 times per month
3 = 1-3 times per week
4 = 4-5 times per week

How often do you use each of the following when assessing students in the target mathematics class:

Extended response item for which student must explain or justify solution.
Performance tasks or events (for example, hands-on activities).
Individual or group demonstration, presentation.
Mathematics projects.
Portfolios.


Science

INSTRUCTIONAL ACTIVITIES IN SCIENCE

Listed below are questions about the types of activities that students in the target class engage in during science instruction. For each activity, you are asked to estimate the relative amount of time a typical student will spend engaged in that activity over the course of a school year. The activities are not necessarily mutually exclusive; across activities, your answers will undoubtedly greatly exceed 100%. Consider each activity on its own, estimating the range that best indicates the relative amount of science instructional time that a typical student spends over the course of a school year engaged in that activity.

AMOUNT OF INSTRUCTIONAL TIME (for the school year)
0 = None
1 = Little (10% or less of instructional time for the school year)
2 = Some (11-25% of instructional time for the school year)
3 = Moderate (26-50% of instructional time for the school year)
4 = Considerable (50% or more of instructional time for the school year)

Teachers' Use of Broad Instructional Activities in the Target Science Class

How much of the total science instructional time do students in the target class?

(4 broad instructional activities that were analyzed for this report)
Do a laboratory activity, investigation, or experiment.
Collect data (other than laboratory activities).
Work in pairs or small groups (other than laboratory activities).
Use computers, calculators or other educational technology to learn science.

(8 broad instructional activities that were not analyzed for this report)

Listen to the teacher explain something to the class as a whole about science.
Read about science in books, magazines, articles (not textbooks).
Work individually on science assignments.
Write about science in a report/paper on science topics.
Watch the teacher demonstrate a scientific phenomenon.
Do a science activity with the class outside the classroom or science laboratory (for example, field trips or research).
Maintain and reflect on a science portfolio of their own science work.
Take a quiz or test.


Teachers' Use of Specific Instructional Strategies in the Target Science Class

AMOUNT OF INSTRUCTIONAL TIME (Doing laboratory activities, investigations, or experiments)
0 = None
1 = Little (10% or less of instructional time in laboratory activities, investigations, or experiments)
2 = Some (11-25% of instructional time in laboratory activities, investigations, or experiments)
3 = Moderate (26-50% of instructional time in laboratory activities, investigations, or experiments)
4 = Considerable (50% or more of instructional time in laboratory activities, investigations, or experiments)

When students in the target class are engaged in laboratory activities, investigations, or experiments as part of science instruction, how much time do they:

Make educated guesses, predictions, or hypotheses.
Follow step-by-step directions.
Use science equipment or measuring tools.
Collect data.
Change a variable in an experiment to test a hypothesis.
Organize and display information in tables or graphs.
Analyze and interpret science data.
Design their own investigation or experiment to solve a scientific question.
Make observations/classifications.

AMOUNT OF INSTRUCTIONAL TIME (Collecting science data or information)

0 = None
1 = Little (10% or less of instructional time in collecting science data or information)
2 = Some (11-25% of instructional time in collecting science data or information)
3 = Moderate (26-50% of instructional time in collecting science data or information)
4 = Considerable (50% or more of instructional time in collecting science data or information)

When students in the target class collect science data or information from books, magazines, computers, or other sources (other than laboratory activities), how much time do they:

Have class discussions about the data.
Organize and display the information in tables or graphs.
Make a prediction based on the data.
Analyze and interpret the information or data, orally or in writing.
Make a presentation to the class on the data, analysis, or interpretation.

AMOUNT OF INSTRUCTIONAL TIME (Working in pairs or small groups)

0 = None
1 = Little (10% or less of instructional time in pairs or small groups)
2 = Some (11-25% of instructional time in pairs or small groups)
3 = Moderate (26-50% of instructional time in pairs or small groups)
4 = Considerable (50% or more of instructional time in pairs or small groups)


When students in the target class work in pairs or small groups (other than in the science laboratory), how much time do they:

Talk about ways to solve science problems, such as investigations.
Complete written assignments from the textbook or workbook.
Write up results or prepare a presentation from a laboratory activity, investigation, experiment or a research project.
Work on an assignment, report or project over an extended period of time.
Work on a writing project or entries for portfolios seeking peer comments to improve work.
Review assignments or prepare for a quiz or test.

AMOUNT OF INSTRUCTIONAL TIME (Using calculators, computers, or other educational technology)

0 = None
1 = Little (10% or less of instructional time using calculators, computers, or other educational technology)
2 = Some (11-25% of instructional time using calculators, computers, or other educational technology)
3 = Moderate (26-50% of instructional time using calculators, computers, or other educational technology)
4 = Considerable (more than 50% of instructional time using calculators, computers, or other educational technology)

When students in the target class are engaged in activities that involve the use of calculators, computers, or other educational technology as part of science instruction, how much time do they:

Use sensors and probes (for example, CBLs).
Display and analyze data.
Solve problems using simulations.

Teachers' Use of Assessment Strategies in the Target Science Class

FREQUENCY

0 = Never
1 = 1-4 times per year
2 = 1-3 times per month
3 = 1-3 times per week
4 = 4-5 times per week

How often do you use each of the following when assessing students in the target science class:

Extended-response items for which students must explain or justify their solutions.
Performance tasks or events (for example, hands-on activities).
Individual or group demonstrations or presentations.
Science projects.
Portfolios.


Appendix D: Computation of Relative Instructional Time

As noted in Appendix C, teachers were asked about the types of activities that students in their target class engage in during mathematics or science instruction. For each activity, teachers estimated the relative amount of time a typical student would spend engaged in that activity during classroom instruction over the course of a school year. Teachers were instructed that, because the activities are not necessarily mutually exclusive, their answers need not add up to 100%. They were advised to consider each activity on its own and to select the range that best indicates the relative amount of mathematics or science instructional time a typical student spends engaged in that activity over the course of a school year.

To compute the relative instructional time spent on the broad instructional activities, and on the specific strategies within each activity, we carried out the following steps (using mathematics as an example):

(1) We computed the distribution of time spent across all 12 broad instructional activities that make up the entire mathematics instruction for the school year. To do that, we first computed the sum of the percentages of time (recoded values) across the 12 activities. Then, we divided the percentage of time on each activity by that sum. For example, this yields a relative time of 12% spent on the instructional activity involving hands-on materials, relative to all 12 activities that make up the entire mathematics instruction for the school year. [Of the 12 activities, only four were used in the analysis, because parallel items were also asked in the PDAL.]
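
As an illustration, the Python sketch below shows one way this renormalization could be carried out. It is not the project's actual analysis code: the activity names and recoded percentage values are hypothetical placeholders, and in the study the sum would be taken across all 12 broad activities.

```python
# A minimal sketch of step (1), assuming the recoded percentage-of-time values
# for the broad activities are available as a dictionary. Activity names and
# values are hypothetical placeholders for the 12 broad mathematics activities.

def normalize(percentages):
    """Divide each recoded percentage by the sum across activities so the
    resulting relative shares add to 1.0 (i.e., 100% of instructional time)."""
    total = sum(percentages.values())
    return {activity: pct / total for activity, pct in percentages.items()}

recoded_pct = {
    "hands_on_materials": 15.0,
    "listen_to_lecture": 40.0,
    "individual_seatwork": 35.0,
    "use_technology": 35.0,
    # ...in the study, all 12 broad activities would appear here...
}

relative_time = normalize(recoded_pct)
# With these illustrative values, hands_on_materials -> 15 / 125 = 0.12,
# i.e., the 12% relative share used in the example above.
```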

(2) We then computed the proportion of instructional time spent on each of five specific instructional strategies (e.g., building models or charts) within the broad instructional activity involving hands-on materials. To compute the distribution of time spent on the hands-on activity across the five specific strategies, we first computed the sum of the percentage values across the strategies, and then divided the percentage of time spent on each strategy by that sum. For example, if 30% of the time on the instructional activity involving hands-on materials (which makes up 12% of instructional time for the school year) was spent on the strategy of building models or charts, then the normalized percentage of time spent on that specific strategy would be 3.6% (= 30% × 12%) of the entire school year.
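
The sketch below illustrates step (2) in the same hypothetical terms: it renormalizes the five strategy percentages within the hands-on activity and multiplies each resulting share by that activity's 12% share of the school year. Apart from building models or charts, the strategy names and values are placeholders chosen so the arithmetic matches the 3.6% example above.

```python
# A minimal sketch of step (2); names and values are hypothetical and chosen
# so the arithmetic reproduces the 3.6% example in the text.

# Relative share of the school year spent on the broad hands-on activity,
# as computed in step (1) (12% in the example above).
hands_on_share = 0.12

# Recoded percentages for the five specific strategies within the hands-on
# activity; "build_models_or_charts" comes from the text, the rest are placeholders.
strategy_pct = {
    "build_models_or_charts": 30.0,
    "strategy_2": 20.0,
    "strategy_3": 20.0,
    "strategy_4": 15.0,
    "strategy_5": 15.0,
}

# Normalize within the hands-on activity so the five strategy shares sum to 1.0.
total = sum(strategy_pct.values())
within_activity_share = {s: p / total for s, p in strategy_pct.items()}

# Share of the entire school year for each specific strategy:
# e.g., 0.30 * 0.12 = 0.036, i.e., 3.6% for building models or charts.
yearly_share = {s: share * hands_on_share
                for s, share in within_activity_share.items()}
```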

