Implementation of the Master Plan for Statewide Professional Staff Development for 2011-2012
An Evaluation Study
Office of Research
West Virginia Board of Education
2012-2013
L. Wade Linger Jr., President
Gayle C. Manchin, Vice President
Robert W. Dunlevy, Secretary
Michael I. Green, Member
Priscilla M. Haden, Member
Lloyd G. Jackson II, Member
Lowell E. Johnson, Member
Jenny N. Phillips, Member
William M. White, Member

Paul L. Hill, Ex Officio
Chancellor
West Virginia Higher Education Policy Commission

James L. Skidmore, Ex Officio
Chancellor
West Virginia Council for Community and Technical College Education

Jorea M. Marple, Ex Officio
State Superintendent of Schools
West Virginia Department of Education
Implementation of the Master Plan for Statewide Professional Staff Development for 2011-2012
An Evaluation Study
Patricia Cahape Hammer
West Virginia Department of Education Division of Teaching and Learning
Office of Research Building 6-Room 722 State Capitol Complex
1900 Kanawha Boulevard East Charleston, WV 25305
http://wvde.state.wv.us/ October 2012
Jorea M. Marple, State Superintendent of Schools, West Virginia Department of Education
Robert Hull, Associate Superintendent, West Virginia Department of Education
Larry J. White, Executive Director, Office of Research

Keywords
Evaluation, professional development, state goals, implementation, effectiveness, quality

Suggested Citation
Hammer, P. C. (2012). Implementation of the Master Plan for Statewide Professional Staff Development for 2011-2012: An evaluation study. Charleston, WV: West Virginia Department of Education, Division of Teaching and Learning, Office of Research.
Content Contact
Patricia Cahape Hammer, Coordinator, Office of Research
[email protected]
Executive Summary
West Virginia Code §18-2-23a1 requires the West Virginia Board of Education
(WVBE) to establish annual professional development goals for public schools; to coordinate
professional development programs; and to guide program development, approval and eval-
uation. The legislative intent of this section of state law is
(1) To provide for the coordination of professional development programs by the State
Board;
(2) To promote high-quality instructional delivery and management practices for a thor-
ough and efficient system of schools; and
(3) To ensure that the expertise and experience of state institutions of higher education
with teacher preparation programs are included in developing and implementing professional
development programs.
Toward these ends, the WVBE (2011) adopted the following goals for professional
development for the 2011–2012 school year:
As a result of professional development, participants will . . .
1. deliver standards-based instruction in classrooms to ultimately improve student learning. Such instruction will exhibit an understanding of the Common Core State Standards for English/Language Arts and Mathematics including how the new standards align to the West Virginia 21st Century Content Standards and Objectives.
2. apply their knowledge of the Common Core State Standards into professional practice with specific attention to: (1) addressing writing and text complexity, (2) designing school-wide efforts to improve literacy and numeracy, and (3) ensuring technology and science are integrated into improvement efforts.
3. effectively apply the West Virginia Professional Teaching Standards to ensure that all students in West Virginia are served by high quality educators.
4. exhibit increased leadership and collaboration to facilitate school improvement. (pp. 4-5)
West Virginia Code §18-2-23a further states that, each year, once the annual goals
are set, the state board is required to submit the goals to the major state agencies
responsible for providing professional development to teachers, administrators, and other
professional education staff statewide, including the West Virginia Department of Education
(WVDE), the West Virginia Center for Professional Development (CPD), the regional
education service agencies (RESAs), and the Higher Education Policy Commission. These
agencies then collaborate in the development of an annual master plan for professional
development (PD Master Plan) aligned with the goals. Lastly, the statute requires evaluation
of the effectiveness of the professional staff development programs. The WVBE has charged
the WVDE Office of Research to meet this requirement.
1 See West Virginia Code §18-2-23a, Annual professional staff development goals established
by State Board; coordination of professional development programs; program development, approval
and evaluation. Available at
http://www.legis.state.wv.us/wvcode/ChapterEntire.cfm?chap=18&art=2&section=23A#02.
This evaluation study provides summative information about the implementation of
the Master Plan for Professional Staff Development for 2011-2012 (PD Master Plan), which
was approved by the West Virginia Board of Education in May 2011. The study addresses the
following research questions:
RQ1. How comprehensively was the PD Master Plan implemented?
RQ2. What were participants’ views about the sessions’ adherence to research-based
practices for high quality professional development?
RQ3. What were participants’ views about the sessions’ success in addressing the
WVBE Goals for Professional Development?
RQ4. What were participants’ views about the impact of the professional development
on their knowledge, behaviors and skills, and attitudes and beliefs?
The study also raised questions about the formation of the plan itself, and how that
process may be affecting some of the findings in response to the research questions listed
above.
Methods
We examined the performance of professional development providers included in the
PD Master Plan, including the Marshall University June Harless Center, all eight RESAs,
CPD, and eight WVDE offices (Assessment and Accountability, Healthy Schools, Institution-
al Education Programs, Instruction, School Improvement, Special Programs, Title I, and Ti-
tle II, III, and System Support). These agencies—or PD providers—delivered professional
development from June 1, 2011 through May 31, 2012 in alignment with the WVBE-
approved PD Master Plan.2
There were two main data sources used in this study. The first was 572 reports submitted by the providers using an online reporting system. Providers reported which sessions were held, attendance, the county where each session was held, the duration and time span of each session, and the e-mail addresses of all attendees. The second data source was an online survey of 6,312 unduplicated, randomly selected participants, conducted in two waves—one in late fall and one in the spring—resulting in 4,281 usable responses, a 68% response rate. The survey collected participant perceptions about the quality, Board goal relevance, and effectiveness of a single professional development session they attended.
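The response-rate figure can be reproduced with simple arithmetic. The sketch below (not the study's actual code) shows how an unduplicated sample and the roughly 68% rate could be derived from provider-reported attendee e-mails; the file name, column name, and two-wave split are illustrative assumptions.

```python
# Illustrative sketch only; file name, column name, and wave split are assumptions.
import random
import pandas as pd

reports = pd.read_csv("provider_session_reports.csv")  # one row per attendee per session (assumed layout)
emails = reports["attendee_email"].str.lower().dropna().unique()  # unduplicated attendees

random.seed(2011)
sample = random.sample(list(emails), k=6312)  # randomly selected survey invitees

# Two survey waves (late fall and spring); the assignment rule here is assumed
fall_wave, spring_wave = sample[:3156], sample[3156:]

usable_responses = 4281  # usable responses reported in this study
print(f"Response rate: {usable_responses / len(sample):.1%}")  # about 68%
```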
Results
We set out to address the following aspects of the implementation of the 2011-2012 Master Plan for Professional Staff Development (PD Master Plan): (a) implementation of planned sessions; (b) participant perceptions about the sessions’ adherence to research-based practices for high quality professional development; (c) participant perceptions about the sessions’ helpfulness with regard to the specific goals of the PD Master Plan; and (d) participants’ perceived (self-reported) outcomes resulting from their involvement in professional development associated with the PD Master Plan. After the discussion of findings for each of these topics, we provide a few observations about the formation of the plan itself.

2 Many of these providers delivered additional technical assistance and professional development beyond the scope of the PD Master Plan. However, this evaluation examines only the professional development that was approved and included in the 2011-2012 PD Master Plan.
Implementation of Planned Sessions
Overall, implementation of planned sessions was down slightly from the level seen in
2010-2011—77.5% compared with 80.0% last year. The most prevalent reasons included a
lack of requests/registrations and scheduling issues. Five sessions were cancelled to avoid a
duplication of effort.
Attendance was down nearly 42% from last year, dropping from about 37,000 in 2010-2011 to under 22,000 in 2011-2012. Most of this drop in attendance—in fact, 83% of it—was attributable to the lower attendance numbers reported by the RESAs, which declined from 17,508 in 2010-2011 to 4,657 participants in 2011-2012. CPD and IHEs (Marshall University only in 2011-2012) also saw lower attendance, while WVDE providers’ attendance was slightly up (Figure A).

Figure A. Attendance Changes from 2010-2011 to 2011-2012, by Provider Group

Top providers in terms of attendance were all from WVDE, including the Office of Instruction (3,995), the Office of Special Programs (3,958), and the Office of Title I (2,700).
The WVBE’s Goals for Professional Development were all well covered, with a mini-
mum of about 6,900 participants attending sessions focused on each of the goals.
Face-to-face sessions far outnumbered other meeting formats at 90% of all sessions, followed by sessions that blended formats at 9%.
CPD had the highest average duration for its professional development sessions—45 hours, with a mean time span of 50 days. Six providers had average durations in hours indicating that they typically offer sustained professional development (i.e., 14 hours or more), which research shows is the minimum required to effect improvement in student achievement (Yoon et al., 2007).
Sessions offered in a blended format tended to have the longest duration (average
17.5 hours).
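As an illustration of how the duration findings above could be computed from the provider session reports, the sketch below groups sessions by provider and flags those whose average duration meets the 14-hour threshold for sustained professional development; the file and column names are assumptions, not the study's actual data schema.

```python
# Illustrative sketch; file and column names are assumed.
import pandas as pd

sessions = pd.read_csv("provider_session_reports.csv")

summary = (
    sessions.groupby("provider")
    .agg(mean_hours=("duration_hours", "mean"),
         mean_span_days=("timespan_days", "mean"))
    .sort_values("mean_hours", ascending=False)
)

# Flag providers whose typical offering meets the 14-hour threshold for
# sustained professional development (Yoon et al., 2007)
summary["sustained"] = summary["mean_hours"] >= 14
print(summary)
```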
There were five county locales where no professional development offered through
the PD Master Plan took place (Barbour, Monroe, Pleasants, Ritchie, and Taylor), although
educators from these counties did attend professional development offered in other locales.
This study estimates that the average travel time to professional development provided by the 18 offices and organizations covered in this report was about 61 minutes, or slightly over an hour. When this estimate is projected to the more than 20,000 attendees in sessions held during the 12-month period from June 2011 through May 2012, we estimate that more than 20,000 staff hours were spent just travelling. Further, the burden of travel is not equally shared. We estimate that educators in some counties travel well over 60 minutes each way to attend professional development; educators in Hampshire, Hardy, Mineral, and Monroe counties travelled at least half again the average, and twice as much as their counterparts in Cabell, Calhoun, Kanawha, and Monongalia. No doubt some of this travel is unavoidable, but perhaps not all of it. Reducing travel time by using online or other formats could allow educators to redirect time spent travelling, allowing more time for other activities that would benefit students.
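The projection of total travel time is straightforward arithmetic under the figures reported above; the attendee count of 20,000 is used here as an assumed lower bound, since only "more than 20,000" is stated.

```python
# Back-of-envelope projection using the figures reported above.
avg_travel_minutes = 61   # estimated average one-way travel time
attendees = 20_000        # assumed lower bound ("more than 20,000 attendees")

total_hours = attendees * avg_travel_minutes / 60
print(f"Estimated one-way travel: {total_hours:,.0f} staff hours")  # roughly 20,300 hours
```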
Use of Research-Based Practices
Overall, the strongest ratings in terms of the use of research-based practices were given to the relevance and specificity (content focus) of the professional development. The weakest ratings were for the two follow-up items—that is, follow-up discussion and collaboration, and related follow-up professional development. These two dimensions may warrant attention, and may well receive it with the focus on providing more sustained professional development in the current (2012-2013) PD Master Plan.

Results were similar when we disaggregated by professional role and by programmatic level; that is, there was very little variation among the role groups and programmatic levels with regard to the overall quality index rating, which ranged from 3.7 to 3.9 on a 5-point scale (1 [strongly disagree], 3 [neutral], and 5 [strongly agree])—indicating a moderate level of agreement that the professional development they attended adhered to research-based practices for high quality professional development.
There was slightly more variation (3.7 to 4.0) when we disaggregated by content area, with physical education and foreign language teachers expressing the highest level of agreement.

The greatest degree of variation in the mean quality index rating was among providers, although all 18 providers had ratings that fell into the general agreement range—that is, respondents tended to agree with statements that the professional development they attended adhered to research-based practices and was beneficial overall. However, there were six providers that scored at 4.0 or above, including CPD; RESAs 1 and 2; and WVDE’s Office of Instruction, Office of Special Programs, and Office of Title II, III, and System Support. The lowest scoring providers were RESA 7 (3.58), RESA 8 (3.51), and the WVDE Office of School Improvement (3.55) (Figure B).

Figure B. Mean Quality Index Rating for Adherence to Research-Based Practices for High Quality Professional Development, by Provider
CPD = WV Center for Professional Development; Marshall = Marshall University June Harless Center; RESA = regional education service agency (one each for eight regions in West Virginia); OAA = WVDE Office of Assessment and Accountability; OHS = WVDE Office of Healthy Schools; OIE = WVDE Office of Institutional Education Programs; OI = WVDE Office of Instruction; OSI = WVDE Office of School Improvement; OSP = WVDE Office of Special Programs; Title I = WVDE Office of Title I; Title II, III, SS = WVDE Office of Title II, III, and System Support.
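A minimal sketch of how a quality index of this kind can be computed from Likert-scale survey items is shown below; the seven item names and the 1-5 coding are assumptions based on the research-based practices listed in this report, not the study's actual instrument.

```python
# Illustrative only; item names and coding are assumed, not taken from the actual survey.
import pandas as pd

survey = pd.read_csv("participant_survey.csv")

quality_items = [
    "intensive", "content_focused", "relevant", "active_learning",
    "followup_collaboration", "followup_sessions", "beneficial",
]  # each item coded 1 (strongly disagree) to 5 (strongly agree)

survey["quality_index"] = survey[quality_items].mean(axis=1)  # per-respondent index
by_provider = survey.groupby("provider")["quality_index"].mean().round(2)
print(by_provider.sort_values(ascending=False))
```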
Perceived Effectiveness in Addressing the Board’s Goals
For this measure, we selected respondents who attended sessions that providers had indicated were aligned with particular Board goals for professional development, and checked to what extent these respondents agreed that the professional development had been helpful in meeting that goal. With few exceptions—that is, CPD, RESA 1, and the Office of Title I—there is much room for improvement when it comes to respondents’ perceptions about alignment of the professional development they received with the goal it was meant to address (Figure C). It is unknown why some providers had such consistently low alignment scores, with only about a quarter to just over a third of individual respondents agreeing that the session they attended addressed the goal it was intended to support. In the case of the RESAs, some of the lack of alignment may be due to the approach they used in submitting nonspecific session titles—each of which they designated as aligning with several goals—and then reporting multiple sessions under each, some of which may or may not have aligned well with the goals. However, RESA 1 did not seem to fall into that pattern, which could indicate that RESA 1 truly focused very sharply on the Board goals—especially goals related to English/language arts, writing, and literacy and numeracy skills, as well as on applying the WV Professional Teaching Standards and leadership skills to improve schools. Another possible explanation is that RESA 1 reported only their goal-aligned professional development through the PD Master Plan reporting system and refrained from reporting other nonaligned sessions.

Figure C. Percentage of Overall Agreement or Strong Agreement that PD was Helpful in Meeting Board Goals, by Provider
With only 51.2% of respondents, overall, in agreement that the sessions they attended aligned well with the Board goals they were intended to support, goal alignment is clearly an area that most providers could focus on improving. The 2012-2013 PD Master Plan evaluation may see some improvement in this measure, as providers were restricted to indicating only one primary goal for each of the sessions they included in the PD Master Plan, and they were required to submit specific titles indicating specific content.
Perceived Impacts on Knowledge, Behavior, and Attitudes/Beliefs
In three paired self-reported pre-/posttest items, participants indicated greater knowledge after having participated in professional development, reported engaging in more behavior related to the PD they attended, and reported holding attitudes and beliefs slightly more aligned to those supported by the professional development. T tests returned statistically significant differences for all three areas (p < .001). In nearly all disaggregations, professional development was perceived by participants to have had its greatest impact on their knowledge and least impact on their attitudes and beliefs. This pattern held true whether we disaggregated by programmatic level, professional role, content area, or provider.

Figure D. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by Provider
Light blue indicates small effects, medium blue indicates moderate effects, darker blue indicates large effects, and darkest blue indicates very large effects.

Overall, perceived impacts were
Highest for early childhood/elementary participants and lowest for respondents who indicated they were in the other programmatic group;
Highest for regular classroom teachers and lowest for respondents who indicated they were in the other role group;
Highest for educators involved in all subject areas (i.e., elementary education), followed by foreign language teachers; and lowest for those indicating they were not teaching in a content area (N/A) and those teaching English as a second language3; and

Highest for respondents who attended professional development offered by RESA 1, RESA 2, and WVDE Office of Instruction; and lowest for professional development offered by RESA 8, and WVDE’s Office of Assessment and Accountability, Office of Healthy Schools, and Office of School Improvement (Figure D).

3 The sample was very small for this group and this was the only group for which the t test returned insignificant results, so this result should be used with caution.
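For readers who want to reproduce this kind of pre/post analysis, the sketch below runs a paired-samples t test and an effect size calculation on retrospective pre- and post-session self-ratings; the column names and the effect size variant (mean difference divided by the standard deviation of the differences) are assumptions, since the study's exact computation is not shown here.

```python
# Illustrative sketch; column names and effect size variant are assumptions.
import pandas as pd
from scipy import stats

survey = pd.read_csv("participant_survey.csv")

for construct in ["knowledge", "behavior_skills", "attitudes_beliefs"]:
    pre = survey[f"{construct}_pre"]    # retrospective pre-session self-rating
    post = survey[f"{construct}_post"]  # post-session self-rating
    t, p = stats.ttest_rel(post, pre)   # paired-samples t test
    diff = post - pre
    d = diff.mean() / diff.std(ddof=1)  # effect size: mean difference / SD of differences
    print(f"{construct}: t = {t:.2f}, p = {p:.3g}, d = {d:.2f}")
```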
Formation of the 2011-2012 PD Master Plan
Participation of institutions of higher education
Language in West Virginia Code (§18-2-23a), which defines the Board of Education's
role in coordinating professional development, calls for the Board "To ensure that the exper-
tise and experience of state institutions of higher education with teacher preparation pro-
grams are included in developing and implementing professional development programs."
The reduction in participation by IHEs from two institutions (Fairmont State University and
Marshall University) in 2010-2011 to only one (Marshall University) in 2011-2012 is notable,
and indicates an area that needs attention if the statute is to be fully implemented. Ten pub-
lic IHEs in West Virginia with teacher preparation programs did not participate in the 2011-
2012 plan.
Participation of WV Department of Education offices
The lack of participation by several offices within the West Virginia Department of
Education, including two that participated in 2010-2011, was also notable for 2011-2012.
The Board and the WV Center for Professional Development (which is responsible for put-
ting together the plan) did address this issue during the formation of the 2012-2013 PD Mas-
ter Plan, and as a result, the new plan includes all offices that provide professional
development to teachers, administrators, and other school and district staff.
Participation of RESAs
For this PD Master Plan and the one preceding it, the RESAs submitted a common
set of nonspecific session titles, which served as categories—or placeholders—for the profes-
sional development they would offer throughout the following year. This approach has posed
a challenge for data collection from participants, and with one notable exception (RESA 1)
has resulted in what may be a skewed picture of a lack of alignment of RESA offerings with
the Board’s Goals for Professional Development. RESA representatives at the December
2011 WVBE PD Committee meeting and the February 2012 State Professional Development
Advisory Committee Meeting indicated that district strategic plans form the basis of the RE-
SAs’ annual strategic plans for professional development, which are required by WVBE Poli-
cy 3233 and are due on October 1 each fall. To further investigate this issue, we reviewed the
professional development portions of all eight RESA strategic plans submitted on October 1,
2011, and found most of them to be quite detailed in the area of professional development,
listing specific titles for workshops, seminar series, online courses, and so forth. In some
cases, there were rich descriptions of objectives and action plans from which professional development session titles could readily be developed. It seems that a large part of the chal-
lenge for RESAs in forming and evaluating the PD Master Plan has to do with timing. Their
planning cycle does not align with the Board’s planning cycle—both of which are driven by
state code. Yet, there may be a solution in state code. The statute that outlines the process
for developing the PD Master Plan includes the following language (§18-2-23a; see Appendix A):
The Master Plan shall serve as a guide for the delivery of coordinated professional
staff development programs by the State Department of Education, the Center for
Professional Development, the state institutions of higher education and the regional
educational service agencies beginning on the first day of June in the year in which
the Master Plan was approved through the thirtieth day of May in the following year.
This section does not prohibit changes in the Master Plan, subject to State Board
approval, to address staff development needs identified after the Master Plan was
approved. (Emphasis added.)
This language seems to leave open the possibility of amending the PD Master Plan to include
more detailed plans for professional development that could be submitted by the RESAs af-
ter they have had the opportunity to consult the strategic plans generated by the districts
they serve.
Participation of the West Virginia Center for Professional Development
The West Virginia Center for Professional Development (CPD) is at the center of the
process for developing the PD Master Plan. It convenes meetings and works with the other
providers to compile the plan and submits its own slate of planned professional development
sessions. In preparation for the formation of the 2012-2013 PD Master Plan, CPD obtained a
list of planned professional development compiled from school strategic plans, for the Board
PD Committee to use in its process of setting new goals for professional development. CPD
worked closely with the Higher Education Policy Commission, and as a result there were
representatives from Concord University, Marshall University, West Virginia State Universi-
ty, and West Virginia University at the PD Advisory Committee Meeting in February 2012.
CPD also worked with the State Superintendent to convey to offices in the WVDE that all
professional development they plan to provide must be part of the PD Master Plan, and
prepared an informational frequently-asked-questions document about the Master Plan,
which explained the process and included the Board’s Goals (both strategic and professional
development). Lastly, they posted an online tool for providers to use in submitting their ses-
sion titles. This additional work resulted in an increase in the number of WVDE offices in-
cluded in the plan from nine in 2011-2012 to 15 in 2012-2013. It did not result, however, in
greater participation of IHEs. Only Marshall University continues to participate in the PD
Master Plan.
Discussion of the process for developing the PD Master Plan
While the challenges facing RESAs were specifically described, other agencies may
also face some of the same planning schedule issues, and may be more able to provide a
comprehensive and realistic plan for their professional development if they could add or
subtract sessions early in the fall. If scheduling is determined to be at issue, it appears that
there is a potential solution to the problem—that is, to reopen the PD Master Plan for a re-
vised list of PD session titles with an early October deadline. This date is only 4 months into
the PD Master Plan reporting year, so it would allow providers the opportunity to update
plans for the remaining 8 months.
While the ability to update plans would be useful, it only affects logistical aspects of
the planning process. Other, more programmatic and substantive issues remain about how
to use the PD Master Plan as a stronger mechanism for coordinating professional develop-
ment. The extent to which this plan helps drive the agenda for professional development is
unknown, although our review of RESA strategic plans in the context of the Board’s Goals
for Professional Development provides some evidence that there may not be a strong con-
nection between what some providers deliver and what is envisioned by the Board through
its PD Master Plan.
Yet, the Board’s leadership in coordinating professional development was strongly called for in the recently released Education Efficiency Audit of West Virginia’s Primary and Secondary Education System by Public Works (2012), which asserted that “States cannot
improve the quality of professional development with a patchwork or series of improvement
strategies. Rather, improvements must be strategic, systemic, and use research to determine
the way professional development is selected, delivered, evaluated, and funded” (p. 62). The
authors endorsed the findings of the November 2006 RESA Task Force, which also called for
more focused leadership with the following claim:
. . . [T]he governance structure of the West Virginia professional development system
is too diffuse to assume that the entities responsible for professional development are
working in a synchronized way to meet state goals for professional development. The
professional development system needs to be driven by an agreed upon professional
development definition, vision, and standards (Public Works, 2012, p. 55).
Later in the report, they made the following recommendation:
Refine and use the Master Plan for Statewide Professional Staff Development as a
true strategic planning tool. In interviews conducted for this review, educators com-
mented that while the Master Plan articulates the state’s PD goals, it does not lay out
a larger strategy for how those goals will be achieved. Some interviewees described
the Plan as merely a “laundry list of state-approved PD courses.” At its inception, the
Master Plan was intended to serve as a tool to identify redundancies in PD offerings.
However, so far, there are no real examples of eliminating duplications.
During the course of the 2011-2012 year—before the Education Efficiency Audit was
released—the Board began moving in the direction articulated in the audit. It adopted stand-
ards for professional development that are based on the Learning Forward (formerly the Na-
tional Staff Development Council) standards. In December 2011, it developed a definition of
professional development and a new set of goals. The new goals are strongly aligned to the
Board’s Strategic Goals and the Superintendent’s priorities, forming a cohesive and coherent
vision of the role for professional development. The Board’s PD Committee also began ex-
ploring options for creating an online catalog of professional development offerings and cen-
tralized registration system. It soon realized that creating such a system would require
planning, resources, and time to do well. Yet developing such a system could help eliminate
duplications, provide needed oversight, and expedite the PD evaluation process.
Clearly, there is much to consider as the Board looks ahead to future PD Master
Plans. As a next step toward the goal of actively coordinating professional development as
envisioned in West Virginia Code (§18-2-23a), and as called for in the Education Audit, the
Board may need to know more about this very complex terrain, including a more compre-
hensive view of the professional development that is offered by the following groups:
IHEs—What sorts of partnerships exist between IHEs and RESAs, districts, and schools, and how well is what they are doing aligned with the Superintendent’s priorities and the Board’s strategic and professional development goals?

RESAs and WVDE—What is the complete picture of professional development that RESAs and WVDE offices provide during the course of a year, and how do they decide upon those particular offerings; that is, are there criteria they use for prioritizing what they do in response to requests from the field, or do they respond based mainly on an expressed need by a school or district? How closely aligned is their decision making about the slate of professional development they will offer with the Superintendent’s priorities and the Board’s strategic and professional development goals?

Districts and schools—What professional development do they provide? How do they prioritize their offerings? Who does the actual training/facilitation—vendors, IHEs, in-house staff, others?
Overall, a more comprehensive study of professional development could build on the work
done by the authors of the Education Efficiency Audit, but also investigate what takes place
at the school and district levels. We have heard from the RESAs and others that a large por-
tion of the professional development they offer falls outside of the sessions listed in the plan
due to shifting priorities and needs as they arise. The Board may wish to examine this phe-
nomenon, and consider whether professional development that providers offer outside of
the PD Master Plan aligns with the Board's strategic goals and priorities, and if not, deter-
mine if the Board should enlarge its vision or if such professional development efforts should
be abandoned or refocused.
Limitations of the Study
The participant survey conducted in November-December 2011 and April-May 2012
(with supplemental polling in August for CPD participants) asked respondents to recall PD
sessions they had participated in at some point in the past. In some cases, the sessions had
taken place up to five months prior to the survey. For this reason, there is a possibility of
temporal bias in survey participants’ responses.
Furthermore, the use of a retrospective pretest/posttest methodology to assess
changes in knowledge, behavior and skills, and attitudes and beliefs poses some concerns.
We used this methodology primarily because some researchers have argued that a phenom-
enon called response shift bias can occur when conducting traditional pretest/posttest de-
signs. Response-shift bias “occurs when a participant uses a different internal understanding
of the construct being measured to complete the pretest and posttest” (Moore & Tananis,
2009, p. 190). Consider this in the context of professional development. Some respondents
begin their involvement in professional development with a misconception that they are al-
ready well-versed in the content to be covered. When given a pretest, they rate their own
knowledge, behavior and skills, and attitudes and beliefs very positively. However, over the
course of the professional development, as they develop a deeper understanding of the con-
tent being covered, they realize they did not know as much as they originally thought. As
such, when presented with the posttest, their frame of reference has shifted and they could
potentially rate their knowledge, behavior and skills, and attitudes and beliefs lower than
they did on the pretest. This can lead to problems in analyzing the impact of the professional
development. For this reason, some researchers advocate for using retrospective pre-
test/posttest designs as we did in this study.
Despite this strength of the retrospective pretest/posttest design, a recent research
study conducted by Nimon, Zigarmi, and Allen (2011) found that using traditional pre-
test/posttest designs leads to less biased estimates of program effectiveness. The authors
present a compelling case that presenting both pre- and posttest items simultaneously on a
single survey is among the most biased design options available to researchers and can sig-
nificantly inflate effect size estimates. The authors recommend traditional pretest/posttest
designs when possible and advocate for the implementation of a separate retrospective pre-
test to allow researchers to determine the presence of any response-shift bias. This design
option, despite its strength, was not feasible in this study due to a mismatch between the
scale of professional development offerings in the state and available evaluation staffing re-
sources. Therefore, we recommend cautious interpretation of our own estimates of effect
size, as they may be somewhat inflated.
While a 68.1% response rate (or 74.8% for the sample adjusted for attrition) is high
for this type of survey, there remained a portion of the sample from whom we did not hear.
We can account for approximately 7% of the nonrespondents as individuals whose e-mail
addresses were invalid or obsolete, or who contacted us to report that they had not attended the session named in our survey participation request. But this leaves approximately 25% of the total sample whose perceptions about the professional development are unknown.
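The raw and attrition-adjusted response rates follow the usual form shown below; the exact counts behind the 68.1% and 74.8% figures are not reported in this excerpt, so the nonrespondent adjustment here is a placeholder that only illustrates the calculation.

```python
# Placeholder calculation; the exact adjustment behind the 74.8% figure is not given here.
sampled = 6_312   # invited participants
usable = 4_281    # usable responses

raw_rate = usable / sampled                       # about 68%
unreachable = int(0.07 * (sampled - usable))      # assumed: ~7% of nonrespondents had bad addresses
adjusted_rate = usable / (sampled - unreachable)  # attrition-adjusted denominator
print(f"raw: {raw_rate:.1%}, adjusted: {adjusted_rate:.1%}")
```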
Our literature review did not reveal any appropriately tested and validated measures
of professional development quality and/or impact that met our specific needs. Therefore,
we developed our own measures for this study. Due to time and resource constraints, these
measures were not field tested prior to operational use. Consequently, there is not adequate
validity evidence that the constructs we sought to measure are fully addressed by our survey
items. The measures used possess only face validity.
Issues for Consideration
The following considerations are based on findings from this study and are offered
for the purpose of improving the overall process of formulating, implementing, and evaluat-
ing the West Virginia Master Plan for Professional Staff Development (PD Master Plan). Re-
lated to development of future PD Master Plans, we offer the following suggestions:
With the exception of Marshall University’s June Harless Center, other institutions of
higher education (IHEs) with teacher preparation programs were absent from the
2011-2012 PD Master Plan, despite the WV Center for Professional Development’s
(CPD) efforts to include them. The Board may wish to consider if there are other
strategies that could be employed to bring this group into the Master Plan.
Similar to the approach they used in the 2010-2011 PD Master Plan, RESAs listed on-
ly seven session titles, and then reported multiple professional development sessions
they provided under one of those seven titles during the course of the year. Staff indi-
cate they have taken this approach because they cannot predict what professional de-
velopment districts will request before the districts put together their strategic plans.
The Board may wish to consider reopening the PD Master Plan in early October, to
allow the RESAs and other providers to revise their lists of planned professional de-
velopment sessions based on strategic needs of their target audiences.
Related to implementation of future PD Master Plans, we suggest the following:
There were some newcomers to the PD Master Plan this year, which may explain why
there was a slight drop in the fulfillment of sessions planned, from 80% last year to
77.5% in 2011-2012. A review of the reasons for not providing planned sessions re-
vealed that the most prevalent reason for cancelling sessions was lack of interest (not
enough people registered or districts did not request it). Five sessions were cancelled
to avoid a duplication of effort and another five due to changing priorities. Raising
the rate of fulfillment would be a good goal, which could be enhanced by allowing
providers to update the plan each October (as called for above).
Again this year, survey respondents indicated that providers have done well deliver-
ing professional development that is research-based in most of the seven dimensions
measured. Several of the providers could improve related to supporting extension of
the professional development to the workplace via discussions and collaboration,
and by providing follow-up sessions.
As discussed extensively in the previous section, about half of all respondents did not
agree that the professional development they attended was helpful in meeting the
Board goal that providers indicated the session was meant to support. Providers
should consider re-examining the alignment of the professional development they
have in the current plan (and future plans) to be sure that they are providing expe-
riences that truly are focused on the Board goals.
Travel time to professional development covered in the 2011-2012 PD Master Plan
was estimated to total more than 20,000 hours for the more than 20,000 attendees.
While some of this travel cannot be avoided, providers should consider looking for
ways to reduce it, especially by using formats other than face-to-face for their ses-
sions, which is the format currently used for 90% of all sessions.
Related to evaluation of future PD Master Plans, we suggest the following:
As noted in the Education Efficiency Audit, the evaluation of the PD Master Plan co-
vers only professional development delivered by providers included in the plan, and
only the subset of their offerings that were aligned with the Board’s goals for profes-
sional development and submitted as part of the plan. The drop off in attendance by
42% this year may be an indication that even among this group, less of what provid-
ers offered fell under the auspices of the PD Master Plan. Left out of the PD Master
Plan, and this study, is likely a large portion of the professional development that
takes place in West Virginia—including professional development delivered by dis-
tricts and schools. We know little about this professional development, including
whether it is aligned with goals and priorities of the Board and Superintendent. The
Board may wish to consider studying more comprehensively the professional de-
velopment that is offered by the four main groups of state and regional providers
(CPD, IHEs, RESAs, and WVDE), and by districts and—to the extent possible—by
schools. Conducting a 1-year study could provide essential background information
as the Board strives to fulfill the leadership and coordinating role laid out for it in
West Virginia Code (§18-2-23a) and urged upon it by the Education Efficiency Audit
(Public Works, 2012). The Board could require providers to report on all profes-
sional development they offer, and in so doing, indicate for each session they con-
duct and report, to which goal the PD is aligned—or provide a rationale for why the
professional development was offered. Part of the study could include an analysis of
the rationales provided, which could inform the Board as it enters a new cycle of
goal formation and planning.
Contents
Executive Summary ................................................................................................................. iii
Introduction ............................................................................................................................... 1
Goals of the Evaluation ....................................................................................................... 2
Methods .................................................................................................................................... 3
Population to be Studied..................................................................................................... 3
Sampling Procedures .......................................................................................................... 4
Sample Size, Power, and Precision ..................................................................................... 5
Measures and Covariates .................................................................................................... 6
Research Design ...................................................................................................................7
Description of professional development ......................................................................7
Description of participants ............................................................................................7
Participant perceptions of the extent to which the professional development used
research-based practices ..........................................................................................7
Participant perceptions of the extent to which the professional development met
the goals established by the WVBE as part of the 2011–2012 PD Master Plan ...... 8
Participant perceptions about the impact of the professional development on their
knowledge, behaviors, and attitudes/beliefs .......................................................... 8
Results ....................................................................................................................................... 9
Implementation of the PD Master Plan: Analysis of Provider Reports .............................. 9
Level of implementation ............................................................................................... 9
Attendance trends ........................................................................................................ 11
Duration and time span ............................................................................................... 15
Format, duration, and timespan .................................................................................. 16
Location and top providers .......................................................................................... 16
Use of Research-Based Practices, Alignment with Goals, and Perceptions of
Impact: Analysis of Participant Survey Responses ..................................................... 19
Demographic characteristics of survey respondents. .................................................. 19
Adherence to research-based practices ...................................................................... 27
Perceived effectiveness of professional development in meeting Board goals........... 33
Perceived impact of professional development .......................................................... 37
Discussion ............................................................................................................................... 47
Implementation of Planned Sessions ............................................................................... 47
Use of Research-Based Practices ...................................................................................... 48
Perceived Effectiveness in Addressing the Board’s Goals ................................................ 48
Perceived Impacts on Knowledge, Behavior, and Attitudes/Beliefs ................................ 49
Formation of the 2011-2012 PD Master Plan ................................................................... 50
Participation of institutions of higher education ........................................................ 50
Participation of WV Department of Education offices ............................................... 50
Participation of regional education service agencies .................................................. 50
Participation of the West Virginia Center for Professional Development .................. 53
Final thoughts on the process for developing the PD Master Plan ............................ 53
Limitations of the Study ................................................................................................... 55
Issues for Consideration .......................................................................................................... 57
References ............................................................................................................................... 59
Appendix A. West Virginia Code (§18-2-23a).......................................................................... 61
Appendix B. Providers’ Session Report Protocol .................................................................... 63
Appendix C. WV PD Master Plan: 2012 Participant Survey ................................................... 65
Appendix D. E-mail Survey Participation Requests ................................................................ 71
Appendix E. Providers’ Missing Sessions Report .................................................................... 77
Appendix F. Participant Survey Data Tables .......................................................................... 79
Adherence to Research-Based Practices Data Tables ........................................................ 81
Paired Samples T Tests of Perceived Impacts on Knowledge, Behaviors, and
Attitudes/Beliefs ......................................................................................................... 85
List of Figures
Figure 1. Percent of Sessions Listed in the PD Master Plan That Were Delivered, by
Provider..................................................................................................................10
Figure 2. Percent of Total Professional Development Attendance Delivered by Each
Provider.................................................................................................................. 12
Figure 3. Attendance Changes from 2010-2011 to 2011-2012, by Provider Group .............. 14
Figure 4. Attendance by Session Format .............................................................................. 14
Figure 5. Participants' Expected Professional Development Hours for the Year ................. 19
Figure 6. Participants' Highest Level of Educational Attainment ....................................... 20
Figure 7. Participants' Programmatic Level ........................................................................ 20
Figure 8. Participants' Years of Work Experience in Education ......................................... 20
Figure 9. Respondents' Professional Role ............................................................................. 21
Figure 10. Respondents’ Primary Content Area ..................................................................... 21
Figure 11. Percent Travel Time by Time Category, by Work Location .................................. 23
Figure 12. Estimated Average Respondent Travel Time by Work Location .......................... 24
Figure 13. Estimated Average Travel Time by Region and Professional Role, in Minutes ... 25
Figure 14. Number of Respondents, by Provider .................................................................. 26
Figure 15. Adherence to Research-Based Practices for High Quality Professional
Development, Overall ........................................................................................... 28
Figure 16. Mean Overall Quality Index Rating (i.e., Adherence to Research-Based Practices
for High Quality Professional Development), by Programmatic Level ................ 29
Figure 17. Mean Quality Index Rating for Adherence to Research-Based Practices for High
Quality Professional Development, by Professional Role .................................... 29
Figure 18. Mean Quality Index Rating for Adherence to Research-Based Practices for High
Quality Professional Development, by Content Area ........................................... 30
Figure 19. Mean Quality Index Rating for Adherence to Research-Based Practices for High
Quality Professional Development, by Provider .................................................... 31
Figure 20. Ratings for Adherence to Research-Based Practices for High Quality Professional
Development, by Provider Group ......................................................................... 32
Figure 21. Percentage of Overall Agreement or Strong Agreement that PD was Helpful in
Meeting Board Goals, by Provider ........................................................................ 34
Figure 22. Percentage of Agreement or Strong Agreement that PD was Helpful in Meeting
Board Goals, by Provider Group ........................................................................... 35
Figure 23. Mean Perceived Impact of Professional Development (Pre-/Postsession), Total
Sample .................................................................................................................. 39
Figure 24. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by
Programmatic Level .............................................................................................. 40
Figure 25. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by
Professional Role ................................................................................................... 41
Figure 26. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by
Content Area ......................................................................................................... 42
Figure 27. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by
Provider (RESAs only) ......................................................... 43
Figure 28. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by
Provider (All but RESAs) ...................................................................................... 44
Figure 29. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by
Provider Group ..................................................................................................... 45
List of Tables
Table 1. Sampling Overall and By Provider for Participant Survey ..................................... 5
Table 2. Provision of Professional Development Included in the PD Master Plan and
Attendance, by Provider ........................................................................................ 11
Table 3. Attendance Trends at PD Master Plan Sessions 2010-2011 to 2011-2012, by
Provider.................................................................................................................. 13
Table 4. Attendance at Professional Development by Goal Focus ...................................... 14
Table 5. Average Duration and Timespan of Professional Development Sessions, by
Provider.................................................................................................................. 15
Table 6. Top Providers and Format of Professional Development, by County ................... 17
Table 7. Number and Percent of Respondents by Employer .............................................. 21
Table 8. Comparison of the Percent of Total E-mail Addresses to Percent of Total
Respondents, by Provider ..................................................................................... 26
Table 9. Adherence to Research-Based Practices for High Quality Professional
Development, Overall ........................................................................................... 27
Table 10. Percent of Participants Who Agree or Strongly Agree That the Professional
Development was Helpful in Meeting the Board Goal Designated by the Provider
for the Session, by Provider .................................................................................. 36
Table 11. Interpretation of Effect Size Estimates Used in this Study .................................. 38
Table 12. Paired-Samples T Test of Perceived Impacts for Whole Sample ......................... 39
Table A 1. Estimated Travel Time to Participate in Professional Development ..................... 79
Table A 2. Adherence to Research-Based Practices for High Quality Professional
Development, by Programmatic Level .................................................................. 81
Table A 3. Adherence to Research-Based Practices for High Quality Professional
Development, by Professional Role ....................................................................... 81
Table A 4. Adherence to Research-Based Practices for High Quality Professional
Development, by Content Area ............................................................................. 82
Table A 5. Adherence to Research-Based Practices for High Quality Professional
Development, by Provider .................................................................................... 83
Table A 6. Paired-Samples T Test of Perceived Impact, by Programmatic Level .................. 85
Table A 7. Paired-Samples T Test of Perceived Impact, by Professional Role ..................... 85
Table A 8. Paired-Samples T Test of Perceived Impact, by Content Area ............................. 86
Table A 9. Paired-Samples T Test of Perceived Impact, by Provider .................................... 87
Introduction
West Virginia Code §18-2-23a4 requires the West Virginia Board of Education
(WVBE) to establish annual professional development goals for public schools; to coordinate
professional development programs; and to guide program development, approval and eval-
uation. The legislative intent of this section of state law is
(1) To provide for the coordination of professional development programs by the
State Board;
(2) To promote high-quality instructional delivery and management practices for a
thorough and efficient system of schools; and
(3) To ensure that the expertise and experience of state institutions of higher educa-
tion with teacher preparation programs are included in developing and implementing
professional development programs.
Toward these ends, the WVBE (2011) adopted the following goals for professional
development for the 2011–2012 school year:
As a result of professional development, participants will . . .
1. deliver standards-based instruction in classrooms to ultimately improve student learning. Such instruction will exhibit an understanding of the Common Core State Standards for English/Language Arts and Mathematics including how the new standards align to the West Virginia 21st Century Content Standards and Objectives.
2. apply their knowledge of the Common Core State Standards into professional practice with specific attention to: (1) addressing writing and text complexity, (2) designing school-wide efforts to improve literacy and numeracy, and (3) ensuring technology and science are integrated into improvement efforts.
3. effectively apply the West Virginia Professional Teaching Standards to ensure that all students in West Virginia are served by high quality educators.
4. exhibit increased leadership and collaboration to facilitate school improvement. (pp. 4-5)
West Virginia Code §18-2-23a further states that, each year, once the annual goals
are set, the state board is required to submit the goals to the major state agencies
responsible for providing professional development to teachers, administrators, and other
professional education staff statewide, including the West Virginia Department of Education
(WVDE), the West Virginia Center for Professional Development (CPD), the regional
education service agencies (RESAs), and the Higher Education Policy Commission (HEPC).
These agencies then collaborate in the development of an annual master plan for
professional development aligned with the goals. The law states,
4 See West Virginia Code §18-2-23a, Annual professional staff development goals established
by State Board; coordination of professional development programs; program development, approval
and evaluation. Available at
http://www.legis.state.wv.us/wvcode/ChapterEntire.cfm?chap=18&art=2§ion=23A#02.
The Master Plan shall serve as a guide for the delivery of coordinated professional
staff development programs by the State Department of Education, the Center for
Professional Development, the state institutions of higher education and the regional
educational service agencies beginning on the first day of June in the year in which
the Master Plan was approved through the thirtieth day of May in the following year.
This section does not prohibit changes in the Master Plan, subject to State Board ap-
proval, to address staff development needs identified after the Master Plan was ap-
proved.
Lastly, the statute requires evaluation of the effectiveness of the professional staff
development programs. The WVBE has charged the WVDE Office of Research to meet this
requirement.
Goals of the Evaluation
This evaluation study provides summative information about the implementation of
the Master Plan for Professional Staff Development for 2011-2012 as follows:
Implementation of planned sessions, including the number of teachers, administra-
tors, and others who participated in the professional development sessions targeted at each
of the goals listed in the PD Master Plan from June 1, 2011 through May 31, 2012; sessions
planned versus sessions delivered; duration of the sessions; location of the sessions; attend-
ance at sessions conducted by each of the providers; and the delivery mode (i.e., online, face-
to-face, blended, or other).
Participant perceptions about the sessions’ adherence to research-based practices
for high quality professional development, including whether sessions were (a) intensive in
nature; (b) specific and content-focused; (c) relevant to participants’ current needs and pro-
fessional circumstances; (d) hands-on with active learning opportunities; (e) supported by
follow-up discussion or collaboration at participants’ workplaces or online; (f) supported by
related follow-up PD sessions; and (g) beneficial and had a positive impact on participants’
students and/or schools.
Participant perceptions about the sessions’ helpfulness with regard to reaching the
specific goals of the PD Master Plan, including whether the professional development
helped participants to (a) deliver standards-based instruction in reading and mathematics
(and other content areas) to ultimately improve student learning; (b) apply their knowledge
of the Common Core State Standards in their professional practice with specific attention to
addressing writing and text complexity, designing school-wide efforts to improve literacy
and numeracy, and ensuring that technology and science are integrated into improvement
efforts; (c) apply the West Virginia Professional Teaching Standards to ensure that all stu-
dents in West Virginia are served by high quality educators; and/or (d) exercise increased
leadership and collaboration to facilitate school improvement.
Participants’ perceived (self-reported) outcomes resulting from their involvement in
professional development associated with the PD Master Plan—for example, changes in ed-
ucators’ (a) knowledge; (b) behaviors and skills; and (c) attitudes and beliefs.
Methods
Population to be Studied
This study examines the performance of professional development providers in im-
plementing the 2011-2012 Master Plan for Professional Staff Development (PD Master
Plan), which was approved by the West Virginia Board of Education in May 2011. The list of
providers includes the following:
1. Marshall University June Harless Center
2. Regional Education Service Agency (RESA) 1
3. RESA 2
4. RESA 3
5. RESA 4
6. RESA 5
7. RESA 6
8. RESA 7
9. RESA 8
10. West Virginia Center for Professional Development (CPD)
11. West Virginia Department of Education (WVDE) Office of Assessment and Account-
ability
12. WVDE Office of Healthy Schools
13. WVDE Office of Institutional Education Programs
14. WVDE Office of Instruction
15. WVDE Office of School Improvement
16. WVDE Office of Special Programs
17. WVDE Office of Title I
18. WVDE Office of Title II, III, and System Support (Office of International Schools, on-
ly)
Not present in this list are three providers that participated in the 2010-2011 PD
Master Plan: Fairmont State University, WVDE Office of Career and Technical Instruction,
and WVDE Office of Instructional Technology. Also not included in this list are several
WVDE offices that provide professional development to educators and administrators,
which did not participate in the formation of the PD Master Plan in either year, including the
Office of Adult Education and Workforce Development, Office of Career and Technical Ac-
countability and Support, Office of Career and Technical Innovations, Office of Child Nutri-
tion, and Office of Professional Preparation. Additionally, nine public institutions of higher
education with teacher preparation programs are absent from this list of providers.
Sampling Procedures
All 18 professional development providers in the PD Master Plan reported on all ses-
sions they conducted as part of the Plan, providing the (a) title of session, (b) beginning and
ending dates, (c) duration of the session in hours, (d) format of the sessions, (e) number of
participants, and (f) e-mail addresses for all participants (see WVBE 2011-2012 PD Master
Plan Session Report form in Appendix B, p. 63). Using aggregated data from items a-e, we
provide detailed information in this report on various aspects of the PD Master Plan imple-
mentation.
Using the e-mail addresses of participants reported by the providers as attending
sessions held from June 1, 2011 through March 31, 2012, we conducted two online surveys of
teachers, administrators, and others who attended the professional development to collect
their impressions of the quality of the professional development they received. For both the
first and second participant surveys (conducted in late fall, November-December, 2011 and
spring, April-May, 2012), we applied multistage sampling—systematic, stratified, and simple
random—to select participants for this study, using the following procedure:
We combined the e-mail addresses—each e-mail address with its associated PD
Master Plan session ID and provider—into one comprehensive Excel file (N =
9,686 for the first participant survey; N = 4,396 for the second).
Participants were sorted by e-mail address and assigned a random number. The
sample was then resorted by random number and the first occurrence of each in-
dividual’s e-mail was selected. For the spring survey, an extra step was involved
to avoid contacting any individual twice in one year. The sample was checked
against the sample from the fall, and any case that had been previously surveyed
was removed.
The sample was then stratified by provider and a simple random sample was
drawn for each provider.
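The procedure above can be sketched in a few lines of pandas; the data frame, column names, and per-provider target sizes below are illustrative assumptions, not the actual files or variable names used by the Office of Research.

```python
import pandas as pd

# Illustrative attendance records: one row per reported attendee e-mail address.
frame = pd.DataFrame({
    "email": ["a@k12.wv.us", "b@k12.wv.us", "a@k12.wv.us"],
    "provider": ["RESA 1", "WVDE Office of Instruction", "RESA 1"],
    "session_id": [101, 102, 103],
})

# Step 1: assign a random order, then keep the first occurrence of each e-mail address.
shuffled = frame.sample(frac=1, random_state=42)
unduplicated = shuffled.drop_duplicates(subset="email")

# Step 2 (spring survey only): drop anyone who was already surveyed in the fall.
fall_sample_emails = {"b@k12.wv.us"}                      # assumed list of prior sample members
eligible = unduplicated[~unduplicated["email"].isin(fall_sample_emails)]

# Step 3: stratify by provider and draw a simple random sample within each stratum.
targets = {"RESA 1": 1, "WVDE Office of Instruction": 1}  # assumed per-provider sample sizes
sample = (
    eligible.groupby("provider", group_keys=False)
            .apply(lambda g: g.sample(n=min(targets.get(g.name, 0), len(g)), random_state=42))
)
print(sample)
```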
The first sample (n = 4,332) was larger than the second sample (n = 1,980), due to
the greater number of participant e-mail addresses supplied by providers for the first report-
ing period, and also due to the elimination of participants who had been part of the first
sample. Overall, sampling for each provider was as shown in Table 1 below.
Table 1. Sampling Overall and By Provider for Participant Survey
Provider  Attendance Reported  E-mail addresses provided  Sample of June-March participants
Total 21,552 14,725 6,312
Center for Professional Development 1,109 1,086 558
Marshall University 1,181 793 320
RESA 1 389 315 162
RESA 2 556 305 198
RESA 3 272 186 128
RESA 4 1,086 747 400
RESA 5 980 477 291
RESA 6 319 196 114
RESA 7 653 536 302
RESA 8 402 386 237
WVDE Office of Assessment and Accountability 1,133 1,092 457
WVDE Office of Healthy Schools 795 845 356
WVDE Office of Institutional Education Programs 726 399 196
WVDE Office of Instruction 3,995 2,616 912
WVDE Office of School Improvement 1,150 1,586 589
WVDE Office of Special Programs 3,958 1,029 414
WVDE Office of Title I 2,700 2,021 605
WVDE Office of Title II, III, and System Support 148 110 73
It should be noted that participants in professional development scheduled during
the months of April and May, 2012 were not surveyed to avoid interfering with the
WESTEST 2 testing window and in recognition of the fact that teachers and others are
difficult to reach with the onset of the summer break.
It should also be noted that the online reporting system for providers—that is, the
WVBE 2011-2012 PD Master Plan Session Report—and the diligence of the providers in
using it resulted in a much higher percentage of reported participants being represented
with e-mail addresses, from which a sample could be drawn. This year, providers supplied e-
mail addresses for about three-quarters of reported participants, compared with less than
half last year.
Sample Size, Power, and Precision
Knowing the population of each provider, we used sample size calculation software5
to determine what sample size was needed to attain a 95% confidence level with a +/-3%
margin of error, and then drew a sample sufficient to achieve that level of confidence. The
sample amounted to about 43% of the e-mail addresses submitted by providers, or approxi-
mately 6,300 attendee e-mail addresses. This sample is not large enough, however, to pro-
vide reliable information about individual sessions or events.
5 MaCorr Research (n.d.) Sample Size Calculator. Available online at
http://www.macorr.com/sample-size-calculator.htm.
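The report cites the MaCorr online calculator for this step; as a point of reference, the sketch below implements the standard finite-population formula that such calculators typically use, assuming a 95% confidence level (z = 1.96), a 3% margin of error, and maximum variability (p = 0.5). The exact figures from the calculator the evaluators used may differ slightly.

```python
import math

def required_sample_size(population: int, margin: float = 0.03,
                         z: float = 1.96, p: float = 0.5) -> int:
    """Minimum sample size for estimating a proportion, with finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / population)          # finite-population correction
    return math.ceil(n)

# Example: a provider pool of 1,086 e-mail addresses (CPD in Table 1)
# yields a minimum of about 539; the 558 actually drawn exceeds that minimum.
print(required_sample_size(1086))
```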
Measures and Covariates
As mentioned above, providers used the WVBE 2011-2012 PD Master Plan Session
Report to report essential information about each professional development session they
conducted, including (a) name of provider, (b) contact information, (c) title of session, (d)
duration in hours, (e) beginning and ending dates, (f) county location, (g) format (face-to-
face, online, or blended), (h) number of participants, (i) e-mail addresses for all partici-
pants, and (j) comments (optional) (see Appendix B, p. 63).
Information collected using this session report was combined with information about
the planned sessions in the PD Master Plan, which allowed us to report on sessions held re-
lated to each of the four West Virginia Board of Education goals for professional develop-
ment, and other information about implementation of the plan.
To collect participants’ perceptions about the quality, relevance, and effectiveness of
the training, we used an online survey questionnaire, the WV PD Master Plan: 2012 Participant
Survey, administered via SurveyMonkey (see Appendix C, p. 65). Each participant in the survey
was contacted (up to five times) about only one PD session they attended between June 1, 2011
and March 31, 2012 (see e-mail survey participation requests in Appendix D, p. 71).
Responses about these individual provider offerings were then aggregated to provide overall
perceptions about various aspects of the training offered. The questionnaire included a sec-
tion on participant demographics and three sections on participant perceptions about the
PD they attended.
Independent variables related to participants included (a) role, (b) county, (c) level of
educational attainment, (d) years of experience in education, (e) estimated number of hours
spent in PD this year, (f) programmatic level, (g) professional role, and (h) main content ar-
ea taught, if any. Dependent variables were participant perceptions about various aspects of
the PD sessions, including (a) the sessions’ adherence to research-based practices for high
quality professional development; (b) the sessions’ helpfulness with regard to the specific
goals of the PD Master Plan; and (c) perceived (self-reported) outcomes of participants’ in-
volvement in the professional development.
Lastly, we surveyed providers in early June 2012, to discover the reasons why some
sessions listed in the PD Master Plan were not offered. To collect these data, we used an Ex-
cel spreadsheet (see Appendix E, p. 77) that listed, for each provider, the sessions in the PD
Master Plan for which we had not received any reports, and asked them to select one of the
following possible explanations:
Lack of applicants or registrations
Inclement weather
Canceled to protect instructional time with students
Anticipated funding not received
Priorities for professional development changed
Session was identified as a duplication of effort
Session did not receive the required approvals
Other
If providers selected Other, they were asked to provide an explanation.
Research Design
We used a multimethod research design for this project, relying on descriptive statis-
tics to explore five distinct areas: (a) implementation of the professional development ses-
sions listed in the PD Master Plan, (b) description of participants, (c) participants’
perceptions of the quality of professional development, (d) participant perceptions of the
extent to which professional development met the goals established as part of the PD Master
Plan, and (e) participant perceptions about the impact of the professional development on
their knowledge, behaviors, and attitudes/beliefs. We also conducted a variety of post-hoc
exploratory analyses to determine the relationship between variables included in the data
set. Each of these five areas involved a variety of analyses, as described below.
Description of professional development
We used descriptive analyses including frequency distributions and cross-tabulations
to provide an overview of the professional development offered by each provider during the
2011–2012 academic year and trends from 2010-2011 to 2011-2012. This analysis included a
description of which events were published in the Master Plan and then provided, as well as
those that were published, but never provided (e.g., canceled events). We analyzed results of
the missing sessions reports submitted by providers that did not deliver all of the sessions in
their plan.
Description of participants
We conducted additional frequency analyses to examine the composition of the par-
ticipant survey sample with respect to key demographic variables, including respondents’ (a)
county of employment, (b) education level, (c) professional role, (d) primary content area (if
any), (e) programmatic level, (f) total number of hours spent in professional development
during the current academic year, (g) overall years of experience in education, and (h) how
long participants had to travel to attend the session in question.
Participant perceptions about the extent to which the professional development used
research-based practices
We calculated average ratings, standard deviations, and frequency distributions to
describe the extent to which participants described the PD session as adhering to research-
based practices for high quality professional development, that is, whether the professional
development was (a) intensive in nature, (b) specific and content-focused, (c) relevant to
their needs as educators, (d) hands-on including active learning opportunities, (e) supported
by follow-up discussion or collaboration at their school, office, or online, (f) supported by
follow-up professional development sessions, and (g) beneficial and positive for students
and/or schools. These results are presented for the overall sample as well as disaggregated
by programmatic level, content area, professional role, provider, and provider group.
Participant perceptions about the extent to which the professional development met the
goals established by the WVBE as part of the 2011–2012 PD Master Plan
We used descriptive statistics (i.e., mean, standard deviation, frequency distribution)
to examine the extent to which the professional development provided in 2011–2012 met the
goals set forth by the WVBE as part of the 2011–2012 PD Master Plan. To accomplish this,
we conducted eight distinct analyses (i.e., three each for Goals 1 and 2, and one each for
Goals 3 and 4). In these analyses, we first selected all response records in the data set involv-
ing respondents who attended a professional development event that providers indicated
was aligned to, for example, Goal 1 as listed in the PD Master Plan. Then we determined re-
spondents’ ratings regarding the extent to which the event met this specific goal, and report-
ed the percentage of total respondents who indicated they agreed or strongly agreed that the
professional development was helpful in meeting Goal 1.
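A minimal sketch of one of these goal-level analyses, assuming a hypothetical data frame in which each response record carries the provider-designated goal for the session and the respondent's 1-5 agreement rating on the matching helpfulness item; the column names are illustrative.

```python
import pandas as pd

# Illustrative response records.
responses = pd.DataFrame({
    "session_goal": ["Goal 1", "Goal 1", "Goal 2", "Goal 1", "Goal 2"],
    "helpfulness":  [5, 4, 3, 2, 4],   # 1 = strongly disagree ... 5 = strongly agree
})

# Select records for sessions the provider aligned to Goal 1, then compute the
# percentage of respondents who agreed (4) or strongly agreed (5).
goal1 = responses[responses["session_goal"] == "Goal 1"]
pct_agree = (goal1["helpfulness"] >= 4).mean() * 100
print(f"Goal 1: {pct_agree:.1f}% agreed or strongly agreed")   # 66.7% in this toy data
```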
Participant perceptions about the impact of the professional development on their
knowledge, behaviors, and attitudes/beliefs
The participant survey includes three pairs of items designed to assess the impact of
the professional development experience upon participants’ knowledge, behaviors, and atti-
tudes/beliefs. Each pair consists of an item that asks respondents to rate their knowledge,
behaviors, or attitudes/beliefs before participating in the professional development and then
provide ratings for after having participated in professional development.
We used a retrospective pretest/posttest design to determine if respondents’ posttest
ratings are significantly different from pretest ratings (i.e., paired samples t tests). In addi-
tion, we conducted analyses of the effect size for the difference in respondents’ pre-/posttest
ratings to determine whether any statistically significant differences also have practical sig-
nificance. Results were examined for the entire sample and disaggregated by programmatic
level, content area, professional role, provider, and provider group.
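A minimal sketch of the retrospective pretest/posttest analysis, using hypothetical before/after ratings; it pairs scipy's paired-samples t test with a Cohen's d computed on the paired differences, which is one common effect size choice (the interpretation thresholds the study actually applied are given in Table 11).

```python
import numpy as np
from scipy import stats

# Hypothetical retrospective ratings of knowledge before and after the PD (1-5 scale).
before = np.array([2, 3, 2, 4, 3, 2, 3, 3])
after  = np.array([4, 4, 3, 5, 4, 3, 4, 4])

t_stat, p_value = stats.ttest_rel(after, before)   # paired-samples t test

diff = after - before
cohens_d = diff.mean() / diff.std(ddof=1)          # effect size of the paired differences
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```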
Results
Results in this section are presented in two major sections: one devoted to imple-
mentation of the plan, based on provider reports; and one focused on the quality, alignment
with Board goals, and perceived effectiveness of the professional development sessions,
based on a survey of participants.
Implementation of the PD Master Plan: Analysis of Provider Reports
The following results are based on 572 reports submitted by the PD providers using
the online WVBE 2011-2012 PD Master Plan Session Report, during three data collection
periods: November (covering June 1–October 31, 2011), April (covering November 1, 2011–
March 31, 2012), and June (covering April 1–May 31, 2012).
Level of implementation
Overall, 77.5% of the professional development sessions included in the 2011–2012
PD Master Plan were actually provided to educators across the state. PD providers were
asked to report dates, locations, and attendance figures, as well as attendee e-mail addresses
for all sessions they included in the PD Master Plan (Table 2). If we received none of this in-
formation for a particular session, we counted that session as not provided or reported. In
some cases, an individual session listed in the PD Master Plan was held several times with
different groups of educators in various locations during the course of the academic year. In
those cases, we aggregated the e-mail addresses and attendance numbers and reported them
here as one of the planned sessions listed in the PD Master Plan (PD provided column), and
also broke out the number of individual sessions (repetitions) held.
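A minimal sketch of how session reports could be rolled up against the planned sessions to produce the per-provider counts in Table 2; the data frames and column names are illustrative assumptions rather than the actual reporting files.

```python
import pandas as pd

# Illustrative inputs: sessions listed in the plan, and provider session reports
# (a planned session may appear in several reports if it was repeated in different locations).
planned = pd.DataFrame({
    "provider": ["RESA 2"] * 7 + ["WVDE Office of Title I"] * 8,
    "planned_session_id": list(range(1, 8)) + list(range(8, 16)),
})
reports = pd.DataFrame({
    "planned_session_id": [1, 1, 2, 3, 4, 5, 6, 7, 8, 9],
    "attendance": [20, 15, 30, 25, 40, 10, 35, 22, 50, 45],
})

# A planned session counts as provided if at least one report references it.
planned["provided"] = planned["planned_session_id"].isin(reports["planned_session_id"])

summary = planned.groupby("provider")["provided"].agg(pd_planned="count", pd_provided="sum")
summary["pct_provided"] = 100 * summary["pd_provided"] / summary["pd_planned"]
print(summary)
```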
The RESAs planned their sessions together, and submitted the same seven titles,
which functioned more as categories of professional development than as specific offerings
(e.g., “Support School Improvement Process [Curriculum, Instruction, Assessment, School
Culture & Climate, and Student/Parent/Community Support]”). They then held multiple
sessions under each of these titles.
Figure 1 and Table 2 show the level at which each of the providers followed the plan
they submitted and had approved as part of the PD Master Plan. Eight providers delivered
all of the professional development they planned—RESAs 2, 3, 4, 6, 7, and 8 (seven of seven
sessions planned); WVDE Office of Healthy Schools (three of three), and WVDE Office of
Title I (eight of eight). The providers with the lowest level of implementation included
WVDE Office of School Improvement (two of 13),6 WVDE Office of Assessment and Ac-
countability (eight of 14), and Marshall University (11 of 17).
6 The Office of School Improvement underwent a great deal of turnover during this time peri-
od. Staff indicated that sessions were cancelled to avoid duplications of effort or to protect instructional
time, or that content was delivered in other ways (by webinar or via technical assistance) or by other
agencies (i.e., Higher Education Policy Commission, Education Alliance).
An analysis of the data from providers regarding undelivered professional develop-
ment revealed the following most prevalent explanations (each followed by number of ses-
sions cancelled for this reason):
A lack of requests from counties/schools, or registrations by individuals (11)
Scheduling issues (10)—that is, sessions were delivered before June 1st, 2011 or
postponed until after June 1st, 2012
Sessions delivered but not reported by former staff members (6)
Changing priorities (5)
Avoidance of a duplication of effort (5)
Other, less prevalent issues included funding shortfalls (i.e., grants not received), the need
to protect students’ instructional time, and providers delivering content in other ways (e.g.,
technical assistance) or through other organizations.
Figure 1. Percent of Sessions Listed in the PD Master Plan That Were Delivered, by Provider
Attendance trends
Total attendance ranged from 148 (WVDE Office of Title II, III, and System Support)
to 3,995 (WVDE Office of Instruction) (Table 2). Figure 2 displays the percentage of the total
attendance (21,552) delivered by each of the providers. The top three providers of profes-
sional development associated with the PD Master Plan, in terms of attendance, were the
WVDE’s Office of Instruction (3,995), Office of Special Programs (3,958), and Office of Title
I (2,700).
Table 2. Provision of Professional Development Included in the PD Master Plan and Attendance, by Provider
Provider  PD planned  PD provided  PD not provided or not reported  Percent of planned PD provided  Number of individual sessions held  Attendance all sessions
Overall 218 169 48 77.5 572 21,552
Center for Professional Development 28 25 3 89.3 41 1,109
Marshall University 17 11 6 64.7 62 1,181
RESA 1 7 5 1 71.4 11 389
RESA 2 7 7 0 100.0 20 556
RESA 3 7 7 0 100.0 19 272
RESA 4 7 7 0 100.0 33 1,086
RESA 5 7 6 1 85.7 32 980
RESA 6 7 7 0 100.0 13 319
RESA 7 7 7 0 100.0 42 653
RESA 8 7 7 0 100.0 14 402
WVDE Office of Assessment and Accountability 14 8 6 57.1 38 1,133
WVDE Office of Healthy Schools 3 3 0 100.0 7 795
WVDE Office of Institutional Education Programs 21 14 7 66.7 40 726
WVDE Office of Instruction 32 24 8 75.0 102 3,995
WVDE Office of School Improvement 13 2 11 15.4 5 1,150
WVDE Office of Special Programs 23 19 4 82.6 62 3,958
WVDE Office of Title I 8 8 0 100.0 23 2,700
WVDE Office of Title II, III, and System Support 3 2 1 66.7 8 148
Attendance overall is down nearly 42% from last year, dropping from 37,062 in
2010-2011 to 21,552 in 2011-2012 (Table 3). In fact, there were fairly dramatic shifts in at-
tendance across the board during this period. Some of the variation was due to changes in
the list of organizations that participated in forming the PD Master Plan, especially among
WVDE offices; the Offices of Career and Technical Education, and Instructional Technology
dropped out of the PD Master Plan in 2011-2012, whereas the Offices of Healthy Schools,
School Improvement, and Special Programs joined for the first time that year. Still, these
changes accounted for only a net loss of 870 participants. Overall, the WVDE saw an in-
crease in attendance at their professional development sessions of 20.5%. Looking only at
WVDE offices that participated both years, there was a rise in attendance of nearly 23%. On
the other hand, in 2011–2012, the RESAs saw a loss in participation of 73.4%—dropping
from 17,508 participants to 4,657 participants—which accounts for about 83% of the loss in
attendance overall (about 13,000 of the roughly 15,500 fewer participants). Both CPD and the
IHEs also reported lower attendance in 2011-2012, dropping by 50.5% and 77.3%, respectively,
and accounting for the remaining loss in attendance between the two years. Figure 3 illustrates
these changes.

Figure 2. Percent of Total Professional Development Attendance Delivered by Each Provider
Table 3. Attendance Trends at PD Master Plan Sessions 2010-2011 to 2011-2012, by Provider
Provider  Attendance 2010-2011  Attendance 2011-2012  Percent increase/decrease
Total 37,062 21,552 -41.8
Center for Professional Development 2,239 1,109 -50.5
Fairmont State University 75 — *
Marshall University 5,119 1,181 -76.9
RESA 1 609 389 -36.1
RESA 2 6,164 556 -91.0
RESA 3 3,472 272 -92.2
RESA 4 2,547 1,086 -57.4
RESA 5 604 980 62.3
RESA 6 989 319 -67.7
RESA 7 1,703 653 -61.7
RESA 8 1,420 402 -71.7
WVDE Office of Assessment and Accountability 2,305 1,133 -50.8
WVDE Office of Career and Technical Education 1,992 — *
WVDE Office of Healthy Schools — 795 *
WVDE Office of Institutional Education Programs 75 726 868.0
WVDE Office of Instruction 2,904 3,995 37.6
WVDE Office of Instructional Technology 3,041 — *
WVDE Office of School Improvement — 1,150 *
WVDE Office of Special Programs — 3,958 *
WVDE Office of Title I 1,484 2,700 81.9
WVDE Office of Title II, III, and System Support 320 148 -53.8
— indicates nonparticipation in the PD Master Plan for that year. * indicates less than 2 years of attendance data were available.
Figure 3. Attendance Changes from 2010-2011 to 2011-2012, by Provider Group
By combining information about
the PD sessions included in the PD Master
Plan with information submitted by pro-
viders in the PD Master Plan Session Re-
ports, we computed attendance figures by
Board goal. This analysis revealed that the
goals were well covered, with a minimum
of about 6,900 participants attending ses-
sions focused on each of the goals and sub-
goals, as shown in Table 4.
Table 4. Attendance at Professional Development by Goal Focus
WVBE Goals for Professional Development Attendance
Goal 1. Deliver Common Core State Standards (CCSSs)-based instruction in English/Language Arts. 7,735
Goal 1. Deliver CCSS-based instruction in mathematics. 8,751
Goal 1. Deliver standards-based instruction in other content areas. 6,916
Goal 2. Apply CCSSs in addressing writing and text complexity. 7,282
Goal 2. Apply CCSSs in designing school-wide efforts to improve literacy and numeracy. 8,755
Goal 2. Apply CCSSs in ensuring technology and science are integrated into improvement efforts. 8,706
Goal 3. Effectively apply the West Virginia Professional Teaching Standards to ensure that all students are served by high quality educators. 11,370
Goal 4. Exhibit increased leadership and collaboration to facilitate school improvement. 11,242
As the WVDE moves away from
face-to-face meetings toward other for-
mats, 2011-2012 will likely serve as a base-
line year for tracking changes in
professional development delivery modes.
Figure 4 shows that 90% of participants attended professional development delivered
face-to-face, while only 1% attended online sessions and 9% attended sessions with a
combination of delivery modes. It should be noted, however, that the WVDE Office of
Instructional Technology, with its large slate of online professional development offerings,
did not participate in the 2011-2012 PD Master Plan; its participation would have altered
this picture considerably.

Figure 4. Attendance by Session Format
Duration and time span
As we write this report, the 2012-2013 PD Master Plan is in effect, which includes a
definition of professional development that features the notion of time as an essential ele-
ment:
Professional development includes sustained experiences that lead to the develop-
ment of knowledge, skills, practices, and dispositions educators need to help students
perform at higher levels and achieve college and career readiness. (WVBE, 2012, p. 5,
emphasis added)
To establish a baseline for later measurement of change in this aspect of professional devel-
opment—that is, if it was sustained over time—we calculated the average duration (in hours)
of professional development sessions, as well as the average timespan over which providers
conducted their sessions (in days) (Table 5). The following findings from the providers’ reports
do, however, bear some relevance to this year’s evaluation. These dimensions—
especially the notion of timespan—do provide some insight into the practice of providing fol-
low-up, which is one of the research-based practices examined in this report in a later sec-
tion. We do want to remind readers, however, that providers were not held accountable for
offering more sustained experiences in the 2011-2012 PD Master Plan.
Table 5. Average Duration and Timespan of Professional Development Sessions, by Provider
Provider  Duration (Hours)  Time Span (Days)
Mean Minimum Maximum Mean Minimum Maximum
Center for Professional Development 44.7 13 50 49.7 4 361
Marshall University 13.3 1.0 410.0 8.0 1 210
RESA 1 15.5 1.0 78.0 18.5 1 187
RESA 2 7.5 6.0 35.0 1.3 1 5
RESA 3 5.2 1.0 12.0 1.1 1 2
RESA 4 4.8 1.0 14.0 1.1 1 2
RESA 5 16.6 1.0 45.0 46.6 1 195
RESA 6 3.2 1.0 14.0 1.1 1 2
RESA 7 5.7 1.0 21.0 0.8 1 3
RESA 8 5.4 1.0 12.0 1.1 1 2
WVDE Office of Assessment and Accountability 6.2 1.0 32.0 1.1 1 4
WVDE Office of Healthy Schools 16.0 12.0 32.0 2.3 2 4
WVDE Office of Institutional Education Programs 9.7 3.0 24.0 19.8 1 363
WVDE Office of Instruction 9.9 2.0 50.0 5.9 1 160
WVDE Office of School Improvement 15.6 9.0 27.0 1.0 1 4
WVDE Office of Special Programs 14.5 1.0 64.0 2.7 1 45
WVDE Office of Title I 12.0 3.0 24.0 1.7 1 3
WVDE Office of Title II, III, and System Support 7.5 1.0 24.0 1.4 1 3
Seven providers conducted at least one professional development session that was
sustained over several weeks: Center for Professional Development, Marshall University,
RESA 1, RESA 5, WVDE Office of Institutional Education Programs, WVDE Office of In-
struction, WVDE Office of Special Programs.
Sixteen providers (excluding RESAs 3 and 8) conducted at least one professional de-
velopment session that would meet the WVBE’s current definition for sustained professional
development—that is “professional development lasting fourteen hours or more.” (WVBE,
2012, p. 7).
Six providers had average durations in hours for their professional development that
indicated they typically offer sustained professional development: Center for Professional
Development (44.7 hours), RESA 1 (15.5), RESA 5 (16.6), WVDE Office of Healthy Schools
(16.0), WVDE Office of School Improvement (15.6), and WVDE Office of Special Programs
(14.5).
Format, duration, and timespan
Combining two initiatives in the WVDE and WVBE—that is, an emphasis on reduc-
ing face-to-face meetings and on providing more sustained professional development—we
found that, for sessions reported by providers in 2011-2012, those using a blended format
tended to be more sustained than those using other formats: the average duration was 17.5
hours for blended sessions, 12.0 hours for face-to-face sessions, and 2.7 hours for online
sessions. It is important to
note here that there were very few online sessions reported (about 1% of the total), especially
with the absence of the WVDE Office of Instructional Technology in this year’s PD Master
Plan; consequently, this finding is of limited value.
Location and top providers
We asked providers to report the physical location where professional development
sessions took place, and we also looked at what format was used for professional develop-
ment in each of the counties (Table 6). Notably, there were five counties where no professional
development was delivered by the 18 providers in the PD Master Plan: Barbour, Monroe,
Pleasants, Ritchie, and Taylor. Two of those counties were located in Region 5 (Pleasants and
Ritchie) and two were in Region 7 (Barbour and Taylor). Outside of Kanawha County, RESAs
were the top providers of PD in the largest number of counties in terms of sessions offered,
followed by the Office of Instruction; RESAs were also the top providers in the largest number
of counties in terms of participants, followed by the Office of Title I and the Office of
Instruction.
Table 6. Top Providers and Format of Professional Development, by County
County  Total attendance  Top provider by number of sessions (Provider(s), N)  Top provider by total attendance (Provider(s), N)  Attendance by format (Blended, Face-to-face, Online)
All 21,552 CPD 25 OI 3,995 1,852 19,504 120
Barbour — — — — — — — —
Berkeley 453 OI 6 RESA 8 238 453
Boone 96 RESA 3 3 RESA 3 51 96
Braxton 238 CPD 5 Marshall 59 19 219
Brooke 71 RESA 6 2 RESA 6 39 71
Cabell 1,457 Marshall 16 Marshall 699 298 1,159
Calhoun 335 RESA 5 8 RESA 5 205 335
Clay 112 RESA 3 * 12 RESA 3 112 112
Doddridge 17 RESA 7 * 3 RESA 7 17 4 13
Fayette 478 RESA 4 5 RESA 4 416 13 465
Gilmer 84 RESA 7 4 RESA 7 54 6 78
Grant 86 OI 4 OI 61 86
Greenbrier 156 RESA 4 3 RESA 4 109 156
Hampshire 390 RESA 8; OI; OSP 3 OSP 281 390
Hancock 50 RESA 6* 2 RESA 6 50 50
Hardy 34 RESA 8* 2 RESA 8 34 34
Harrison 2,565 RESA 7 23 Title I 1,236 283 2,282
Jackson 5 RESA 5* 1 RESA 5 5 5
Jefferson 88 RESA 8, OAA, OSP 1 OAA 48 88
Kanawha 6,728 CPD 20 OSP 2,538 839 5,889
Lewis 663 OI 5 Title I 254 18 645
Lincoln 77 RESA 2* 3 RESA 2 77 77
Logan 223 Marshall 9 OSP 187 27 196
Marion 148 RESA 7, OAA, OHS, OI 1 OI 61 20 128
Marshall 102 RESA 6* 2 RESA 6 102 102
Mason 195 Marshall 20 Marshall 117 195
McDowell 42 OI 2 RESA 1 22 42
Mercer 126 OAA 4 OAA 100 126
Mineral 15 OIE* 1 OIE 15 15
Mingo 186 RESA 2 4 RESA 2 78 186
Monongalia 2,058 CPD 15 OI 965 89 1,969
Monroe — — — — — — — —
Morgan 41 OSP* 1 OSP 41 41 120
Nicholas* 782 RESA 4 18 RESA 4 447 37 669
Ohio 82 OI 3 OI 68 3 79
PD online 120 Title II, III, SS 4 OI 66
Pendleton 92 RESA 8 2 OAA 65 92
Pleasants — — — — — — — —
Pocahontas 179 RESA 4 5 RESA 4 90 179
Preston 7 OAA* 1 OAA 7 7
Putnam 48 OI 3 OI 36 48
Raleigh 1,060 RESA 1, WVDE -OSP 8 RESA 1 317 30 1030
Randolph 79 Marshall 3 Marshall 61 79
Ritchie — — — — — — — —
Roane 96 RESA 5 2 RESA 5 88 96
Summers 6 OAA* 1 OAA 6 6
Taylor — — — — — — — —
Tucker 91 RESA 7* 3 RESA 7 91 91
Tyler 6 OI* 1 OI 6 6
Upshur 52 RESA 7* 2 RESA 7 52 12 40
Wayne 144 OI 3 OI 73 144
Webster 17 RESA 4* 1 RESA 4 17 17
Wetzel 150 RESA 6 6 RESA 6 120 8 142 120
Wirt 25 OI* 1 OI 25 25
Wood 951 RESA 5 20 RESA 5 650 55 896
Wyoming 246 OAA* 4 OAA 246 246
* In Nicholas County, there was one session for which the provider indicated the format was “Other”; attendance was 76. NOTE: CPD = WV Center for Professional Development; Marshall = Marshall University June Harless Center; OAA = WVDE Office of Assessment and Accountability; OHS = WVDE Office of Healthy Schools; OIE = WVDE Office of Institutional Education Programs; OI = WVDE Office of Instruction; OSI = WVDE Office of School Improvement; OSP = WVDE Office of Special Programs; RESA = regional education service agency (one each for eight regions in West Virginia); Title I = WVDE Office of Title I; Title II, III, SS = WVDE Office of Title II, III, and System Support.
Use of Research-Based Practices, Alignment with Goals, and Perceptions of
Impact: Analysis of Participant Survey Responses
The remainder of the Results section is based on data collected via an online survey
of PD participants who attended professional development sessions held from June 1, 2011
to March 31, 2012. The survey was conducted in two phases: November 16–December 30,
2011, to cover professional development provided during the summer and early fall months,
and April 16–May 18, 2012 to cover professional development offered during late fall
through March.7 Results here were aggregated from both data collection periods.
The survey random sample was made up of an unduplicated list of 6,312 participants,
who were asked about one professional development event they attended. Of this sample,
559 were eliminated due to attrition, including 413 bad e-mail addresses, and 146 individu-
als who contacted us to report that they did not attend the event we asked them about, or
they attended as a facilitator or in some other nonparticipant capacity. After adjusting for
attrition, the sample was reduced to 5,753; of these, we received responses from 4,571. After
removing unusable responses, the dataset was reduced to 4,281 responses. This represents a
response rate of 67.8% for the full sample, or 74.4% for the sample adjusted for attrition.
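The two response rates follow directly from the figures just reported; a small worked check:

```python
initial_sample = 6312
attrition = 559                                # 413 bad e-mail addresses + 146 non-attendees/facilitators
adjusted_sample = initial_sample - attrition   # 5,753
usable_responses = 4281

print(f"Full-sample response rate:        {100 * usable_responses / initial_sample:.1f}%")   # 67.8%
print(f"Attrition-adjusted response rate: {100 * usable_responses / adjusted_sample:.1f}%")  # 74.4%
```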
Demographic characteristics of survey respondents.
Frequency analyses revealed characteristics of the respondents with respect to key
demographic variables.
Respondents’ estimated hours of professional development
We asked respondents to estimate how
many hours of professional development they
expected to complete by the end of the 2011–
2012 school year. It should be noted that we
did not define the nature of professional devel-
opment in the survey, so respondents could be
responding about anything they perceived to be
professional development. This could include
initiatives of varying scopes sponsored by a va-
riety of groups. Further, the survey was admin-
istered during the fall and spring; therefore, we
had to ask respondents to estimate how much
time they believed they would spend by the end
of the year. Consequently, in addition to self-
report bias, there is likely some error in asking
them to estimate a total before year’s end.
7 An additional data collection period took place from August 1–31 for 249 participants in
yearlong professional development offered by the Center for Professional Development (CPD) when it
was discovered that CPD was left out of the second data collection period sampling frame.
4,281 respondents
Figure 5. Participants' Expected Professional Development Hours for the Year
Figure 6. Participants' Highest Level of Educational Attainment
4,281 respondents
A large majority (85.6%) anticipated that they would attend 16 or more hours, with 18.6%
indicating they would attend more than 60 hours of professional development. Only 14.3%
stated they would attend between 0 and 15 hours of PD during the year. This indicates that
our sample tends to participate in a great deal of professional development (Figure 5).
Respondents’ highest degree attained
The respondents were well-educated—
nearly all had achieved credit beyond a bache-
lor’s degree, and two thirds had at least a
master’s degree (Figure 6).
Respondents’ programmatic level
The largest number of respondents
worked at the early childhood/elementary
level (38%), followed by high school (28%)
and middle school (21%); in addition, a small portion of the respondent pool consisted of WVDE
or district central office staff working at all grade levels, or respondents who indicated their
programmatic level was Other (Figure 7).
Respondents’ years of experience
About half of respondents (49%) had
more than 15 years of experience. In fact, 81% of the sample had 6 or more years of experi-
ence. The remaining 19% had 5 or fewer years of experience. Once again, due to the size of
the sample, it is reasonable to expect these results are reflective of the larger population in
the field (Figure 8).
4,281 respondents
Figure 8. Participants' Years of Work Experience in Education
4,864 respondents (multiple options could be chosen)
Figure 7. Participants' Programmatic Level
Respondents’ professional role
Just over half of respondents were teachers: 42% regular classroom teachers and
10% special education teachers. Another 20% were administrators: district central office
staff (10%) or principals/assistant principals (10%). The remainder, a fairly sizable group
(28%), indicated their role was Other (Figure 9).
Respondents’ primary content area
All of the content areas were covered,
although certain content areas had small
numbers of respondents—most notably Eng-
lish-as-a-second-language (15, 0.4%) and
foreign language (39, 0.9%; Figure 10).
Respondents’ school districts or other employer
There were respondents from all 55
county districts, the two special districts, the
WVDE, and a small number that indicated
Other or Out of state (Table 7). Of course, be-
cause of the random nature of the sampling
process, larger districts had a greater chance
of being represented in the respondent pool.
Figure 9. Respondents' Professional Role
4,281 respondents
Figure 10. Respondents’ Primary Content Area
4,281 respondents

Table 7. Number and Percent of Respondents by Employer
Employer  N  Percent
Total 4,281 100.0
Barbour County Schools 38 .9
Berkeley County Schools 133 3.1
Boone County Schools 104 2.4
Braxton County Schools 43 1.0
Brooke County Schools 30 .7
Cabell County Schools 155 3.6
Calhoun County Schools 43 1.0
Clay County Schools 47 1.1
Doddridge County Schools 25 .6
Fayette County Schools 153 3.6
Gilmer County Schools 54 1.3
Grant County Schools 39 .9
Greenbrier County Schools 99 2.3
Hampshire County Schools 69 1.6
Hancock County Schools 48 1.1
Hardy County Schools 46 1.1
Harrison County Schools 90 2.1
Institutional 79 1.8
Jackson County Schools 49 1.1
Jefferson County Schools 79 1.8
Kanawha County Schools 303 7.1
Lewis County Schools 26 .6
Lincoln County Schools 95 2.2
Logan County Schools 50 1.2
Marion County Schools 94 2.2
Marshall County Schools 58 1.4
Mason County Schools 80 1.9
McDowell County Schools 87 2.0
Mercer County Schools 93 2.2
Mingo County Schools 104 2.4
Monongalia County Schools 136 3.2
Monroe County Schools 31 .7
Morgan County Schools 36 .8
Nicholas County Schools 91 2.1
Ohio County Schools 64 1.5
Other 113 2.6
Out of state 28 .7
Pendleton County Schools 61 1.4
Pleasants County Schools 28 .7
Pocahontas County Schools 47 1.1
Preston County Schools 29 .7
Putnam County Schools 100 2.3
Raleigh County Schools 107 2.5
Randolph County Schools 73 1.7
Ritchie County Schools 9 .2
Roane County Schools 70 1.6
Summers County Schools 37 .9
Taylor County Schools 37 .9
Tucker County Schools 43 1.0
Tyler County Schools 19 .4
Upshur County Schools 67 1.6
Wayne County Schools 95 2.2
Webster County Schools 45 1.1
Wetzel County Schools 69 1.6
Wirt County Schools 40 .9
Wood County Schools 188 4.4
WV Schools for the Deaf & Blind 15 .4
WV Department of Education 67 1.6
Wyoming County Schools 83 1.9
Respondents’ travel time to the professional development they attended
To get a sense about how much time individuals in different parts of the state must
travel to attend professional development, we included the following item: “How long did
you have to travel to participate in this professional development activity?” Response op-
tions included (a) Less than 30 minutes, (b) 31-60 minutes, (c) 61-90 minutes, (d) 91
minutes to 2 hours, or (e) More than 2 hours (see Table A 1 in Appendix F, p. 79). Figure 11
illustrates the breakdown of responses, with bars for each district showing the aggregated
percentages of travel time. Portions of the bars in pink or red indicate travel time of more
than an hour. From this illustration, we can see that educators in our representative sample
in several counties had long commutes to participate in the professional development they
attended. At least half of respondents in 23 counties (i.e., in descending percentages, Miner-
al, Hampshire, Ohio, Monroe, McDowell, Marshall, Hardy, Randolph, Preston, Greenbrier,
Summers, Morgan, Logan, Tyler, Wirt, Pleasants, Ritchie, Webster, Mingo, Hancock,
Brooke, Barbour, Wood) and educators from the Institutional Education Programs and WV
Schools for the Deaf and Blind had to travel more than an hour.

Figure 11. Percent Travel Time by Time Category, by Work Location
Based on 4,126 respondents. See Table A 1 in the Appendix for data upon which this figure was based.
Looked at another way, we computed estimated travel times for each of the school
districts by assigning each travel time category a midpoint value and computing mean travel
times for respondents in the sample. For example, the category Less than 30 minutes, was
assigned the value of 15.5 minutes. One limitation in this approach is that the midpoint for
the highest value, More than 2 hours, is unknown, so we followed the pattern established in
the other categories and assigned it a value equaling the low-point in the category plus 15.5
minutes, or 135.5 minutes. This method likely produced conservative estimates of the aver-
age time travelled for some counties. Nevertheless, the results in Figure 12 illustrate the un-
equal amounts of time educators from some counties must travel for professional
development. Educators in Hampshire, Hardy, Mineral, and Monroe counties travelled more
than an hour and a half to attend professional development, compared with educators in
Cabell, Calhoun, Kanawha, and Monongalia counties, who travelled about half that length of time.

Figure 12. Estimated Average Respondent Travel Time by Work Location
Based on 4,126 respondents. See Table A 1 in the Appendix for data upon which this figure was based.
We also took a regional look, illustrated in Figure 13. Here it is evident that educators
in the northern and eastern panhandles (Regions 6 and 8 respectively) travel about 24%
more than average. The rest of the regions travel slightly more or less than the average. As
for professional role, district central office staff spent the most time travelling (Figure 13).
The average travel time for the whole sample (nearly 4,000 respondents) was just
over an hour (Figure 12). That translates to more than 4,000 person hours just for this sam-
ple—which represents only about a fifth of the participants in professional development of-
fered by providers in the PD Master Plan during the 12-month period (June 2011 through
May 2012). Extending the average travel time to the whole population of more than 20,000
reported attendees for the whole year (Table 1) accounts for more than 20,000 staff hours.
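A minimal sketch of the midpoint-based estimate and the person-hour extrapolation described above; the per-category counts are hypothetical, and the intermediate midpoints (the text specifies only the 15.5- and 135.5-minute endpoints explicitly) are assumed to be the true midpoints of each range.

```python
# Category midpoints in minutes; the first and last values follow the rule stated in the text,
# the middle three are assumed midpoints of their ranges.
midpoints = {
    "Less than 30 minutes": 15.5,
    "31-60 minutes": 45.5,
    "61-90 minutes": 75.5,
    "91 minutes to 2 hours": 105.5,
    "More than 2 hours": 135.5,
}

# Hypothetical response counts for one county.
counts = {
    "Less than 30 minutes": 10,
    "31-60 minutes": 12,
    "61-90 minutes": 8,
    "91 minutes to 2 hours": 5,
    "More than 2 hours": 3,
}

n = sum(counts.values())
mean_minutes = sum(midpoints[c] * k for c, k in counts.items()) / n
print(f"Estimated mean travel time: {mean_minutes:.1f} minutes")

# Extrapolation as in the text: average travel time applied to total reported attendance.
reported_attendance = 21_552                      # from Table 1
est_person_hours = reported_attendance * mean_minutes / 60
print(f"Estimated travel burden: {est_person_hours:,.0f} person hours")
```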
Respondents by provider
The respondent pool included individuals who were responding to professional de-
velopment provided by all 18 providers (Figure 14). The reader is reminded that each partic-
ipant was selected to respond about a single professional development session provided by a
single provider. The number of respondents for each provider is fairly well aligned with the
proportion of e-mail addresses each one submitted through the Provider PD Session Re-
ports, as shown in Table 8.
Figure 13. Estimated Average Travel Time by Region and Professional Role, in Minutes
3,925 respondents for regional analysis (left); 4,220 for role group analysis (right)
Figure 14. Number of Respondents, by Provider
4,272 total respondents, including 7 respondents with missing provider data

Table 8. Comparison of the Percent of Total E-mail Addresses to Percent of Total Respondents, by Provider
Provider  Number of e-mail addresses provided  Number of respondents  Percent of total e-mail addresses  Percent of total respondents
Total* 14,725 4,272* 100.0 100.0*
Center for Professional Development 1,086 389 7.4 9.1
Marshall University 793 191 5.4 4.5
RESA 1 315 111 2.1 2.6
RESA 2 305 117 2.1 2.7
RESA 3 186 83 1.3 1.9
RESA 4 747 244 5.1 5.7
RESA 5 477 201 3.2 4.7
RESA 6 196 81 1.3 1.9
RESA 7 536 210 3.6 4.9
RESA 8 386 162 2.6 3.8
WVDE Office of Assessment and Accountability 1,092 275 7.4 6.4
WVDE Office of Healthy Schools 845 209 5.7 4.9
WVDE Office of Institutional Education Programs 399 142 2.7 3.3
WVDE Office of Instruction 2,616 653 17.8 15.3
WVDE Office of School Improvement 1,586 442 10.8 10.3
WVDE Office of Special Programs 1,029 299 7.0 7.0
WVDE Office of Title I 2,021 420 13.7 9.8
WVDE Office of Title II, III, and System Support 110 43 0.7 1.0
*Total includes seven cases with missing provider data.
Adherence to research-based practices
Survey respondents were asked to respond to seven items about the extent to which
the professional development event they attended adhered to research-based practices for
high quality professional development. Respondents were instructed to respond to each
statement using a 5-point Likert-type response format, that is, 1 (strongly disagree), 2 (dis-
agree), 3 (neutral), 4 (agree), 5 (strongly agree). Thus, higher average scores indicate that
professional development was perceived to be of higher quality. In addition to examining
responses to each item individually, we also calculated an overall quality index score by
summing each respondent’s ratings for all quality items and dividing the resulting value by
the total number of items. This allowed us to examine quality in a more holistic manner
across a variety of participant groups.
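A minimal sketch of the overall quality index computation, assuming a hypothetical responses data frame with one column per quality item on the 1-5 scale; the item column names are illustrative.

```python
import pandas as pd

quality_items = [
    "intensive", "content_focused", "relevant", "hands_on",
    "followup_discussion", "followup_pd", "beneficial",
]

# Hypothetical ratings (1 = strongly disagree ... 5 = strongly agree) for three respondents.
responses = pd.DataFrame(
    [[4, 5, 4, 4, 3, 3, 4],
     [3, 4, 5, 4, 4, 2, 4],
     [5, 5, 4, 3, 3, 3, 5]],
    columns=quality_items,
)

# Overall quality index: the mean of each respondent's ratings across the seven items.
responses["quality_index"] = responses[quality_items].mean(axis=1)
print(responses["quality_index"])          # one index score per respondent
print(responses["quality_index"].mean())   # sample-level mean, analogous to Table 9
```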
Prior to examining the results, it should be noted that the response format used for
these items is most easily interpreted by examining average responses in comparison to
some reference point upon the scale. For the purposes of this report, evaluators settled upon
the midpoint of the scale (i.e., 3.00) as that point of reference. This is because a rating of
3.00 indicated a neutral
response. Therefore, any
mean score below 3.00
would indicate general
disagreement with the
item and any mean score
above 3.00 would indi-
cate some agreement with
the item. It is also useful to conceptualize a mean rating of 4.00 or above as evidence that, overall, participants were in general agreement about a given statement.
Table 9. Adherence to Research-Based Practices for High Quality Professional Development, Overall
The professional development was . . . N Mean SD
Beneficial, had a positive impact 4253 3.89 .909
Hands-on with active learning 4237 3.87 .945
Intensive in nature 4263 3.73 .904
Relevant to current needs 4251 4.08 .868
Specific and content-focused 4247 4.17 .769
Supported by follow-up discussion 4242 3.59 1.052
Supported by follow-up PD sessions 4251 3.40 1.047
Overall Quality Index* 4135 3.82 .711
* This is a computed value based on the average of the other seven measures of quality.
The average quality ratings for each item in the whole sample are presented in Table
9 (above) and Figure 15. The two most highly rated dimensions of quality were Specific and
content focused and Relevant to my current needs and circumstances as an educator. Mean
ratings for both of these dimensions fell above a 4, indicating quite strong agreement. The
average rating for the extent to which professional development was Supported by related
follow-up PD sessions was the lowest of all seven dimensions. The rating was slightly above
the midpoint of the scale, but indicated only weak agreement. Participants provided a simi-
lar average rating regarding the extent to which their professional development experience
was Supported by follow-up discussion or collaboration at our school or office or online.
When we disaggregated the sample by programmatic level, we found very little varia-
tion among the different levels as shown in Figure 16 and in Table A 2 (Appendix F, p. 81).
Further, the breakdown for the seven indicators closely followed the breakdown for the total
sample shown in Figure 15.
Results were similar when we disaggregated by professional role, that is, there was
very little variation among the role groups with regard to the overall quality index rating
(Figure 17), although regular classroom teachers and special education teachers reported
higher agreement that the professional development they attended included opportunities
for hands-on and active learning (Table A 3, Appendix F, p. 81), with ratings of 4.01 and
3.99, respectively.
Figure 15. Adherence to Research-Based Practices for High Quality Professional Development, Overall
* This is a computed value based on the average of the other seven measures of quality.
There was slightly more variation when we disaggregated by content area (Figure 18,
p. 30 and Table A 4, Appendix F, p. 82), with physical education and foreign language teach-
ers expressing the highest level of agreement with the seven quality dimensions, overall.
Practices that received higher than average ratings, reaching or exceeding agreement (>=
4.00), included the following:
Relevant and Content-focused achieved that level across all content areas.
Figure 16. Mean Overall Quality Index Rating (i.e., Adherence to Research-Based Practices for High Quality Professional Development), by Programmatic Level
Figure 17. Mean Quality Index Rating for Adherence to Research-Based Practices for High Quality Professional Development, by Professional Role
Intensive in nature received higher than average ratings from teachers of English
as a second language.
Hands-on with active learning received higher than average ratings from educa-
tors involved in all content areas (e.g., elementary grade teachers), and arts, for-
eign language, physical education, and science teachers.
Practices that received ratings below the mean, falling into the neutral range (i.e., ratings at about 3.4 or below), may require extra attention from providers. They included the following:
Supported by follow-up related professional development received lower ratings
from educators involved in arts, mathematics, reading/language arts, and social
studies.
Supported by follow-up discussion or collaboration received lower ratings from
mathematics teachers.
Figure 18. Mean Quality Index Rating for Adherence to Research-Based Practices for High Quality Professional Development, by Content Area
ALL = All subject areas (e.g., elementary teacher, support personnel); ARTS = Arts (visual, music, dance, theater, other); CTE = Career/technical education; ESL = English as a second language; FL = Foreign language; MATH = Mathematics; PE = Physical education; RLA = Reading/language arts; SCI = Science; SOCS = Social studies; SPED = Special education; N/A = Not applicable (e.g., administrator or county staff)
The analysis that revealed the greatest degree of variation in the mean overall quality
index rating was when we disaggregated the data by provider, although all 18 providers had
overall quality index ratings that fell into the general agreement range—that is, respondents
tended to agree with statements that the professional development they attended adhered to
research-based practices and was beneficial overall (Figure 19). To see individual quality rat-
ings for each provider, see Table A 5, p. 83 in Appendix F.
The four providers with the highest ratings were RESA 1 (4.15), the WVDE Office of Title II, III, and System Support (4.02), the WVDE Office of Instruction (4.00), and the WVDE Office of Special Programs (4.00).
Figure 19. Mean Quality Index Rating for Adherence to Research-Based Practices for High Quality Professional Development, by Provider
CPD = WV Center for Professional Development; Marshall = Marshall University June Harless Center; RESA = regional education service agency (one each for eight regions in West Virginia); OAA = WVDE Office of Assessment and Accountability; OHS = WVDE Office of Healthy Schools; OIE = WVDE Office of Institutional Education Programs; OI = WVDE Office of Instruction; OSI = WVDE Office of School Improvement; OSP = WVDE Office of Special Programs; Title I = WVDE Office of Title I; Title II, III, SS = WVDE Office of Title II, III, and System Support.
The providers with the lowest scores were the WVDE Office of School Improvement
(3.55), RESA 7 (3.58), and RESA 8 (3.51). These providers may want to focus on the following
research-based practices in their future professional development offerings:
WVDE Office of School Improvement—Supported by follow-up PD sessions (2.93),
and Supported by follow-up discussion and collaboration (3.18)
RESA 7—Supported by follow-up PD sessions (3.08), Supported by follow-up discus-
sion and collaboration (3.35), and Intensive in nature (3.32)
RESA 8—Supported by follow-up PD sessions (3.00), and Supported by follow-up
discussion and collaboration (3.33)
These ratings all fell within the neutral range, with participants, overall, neither agreeing nor
disagreeing that the professional development they attended adhered to these practices.
We also analyzed providers by groups—that is, we grouped the RESAs and WVDE of-
fices to see how well these larger groups of providers were perceived by participants as hav-
ing adhered to research-based practices for professional development. Marshall University
and the WV Center for Professional Development did not belong to either group, so their mean ratings are included as individual providers in Figure 20. For six of the seven measures, the WV Center for Professional Development had the highest mean ratings, while for an equal number, the RESAs, as a group, held the lowest mean rating.
Figure 20. Ratings for Adherence to Research-Based Practices for High Quality Professional Development, by Provider Group
Perceived effectiveness of professional development in meeting Board goals
As noted previously, West Virginia state law §18-2-23a requires the WVBE to estab-
lish annual professional development goals for public schools; to coordinate professional
development programs; and to guide program development, approval and evaluation. As a
reminder, the PD Master Plan for school year 2011–2012 included four goals:
As a result of professional development, participants will . . .
1. deliver standards-based instruction in classrooms to ultimately improve student learning.
Such instruction will exhibit an understanding of the Common Core State Standards for
English/Language Arts and Mathematics including how the new standards align to the
West Virginia 21st Century Content Standards and Objectives.
2. apply their knowledge of the Common Core State Standards into professional practice
with specific attention to: (1) addressing writing and text complexity, (2) designing
school-wide efforts to improve literacy and numeracy, and (3) ensuring technology and
science are integrated into improvement efforts.
3. effectively apply the West Virginia Professional Teaching Standards to ensure that all
students in West Virginia are served by high quality educators.
4. exhibit increased leadership and collaboration to facilitate school improvement. (WVBE,
2011, pp. 4-5)
Each professional development session included in the 2011–2012 PD Master Plan
was determined by PD providers to be aligned to one or more of the above goals; therefore,
as part of the evaluation of the PD Master Plan, we sought to determine the extent to which
each PD participant’s professional development experience had helped to meaningfully con-
tribute toward the goal area(s) aligned with the session each one attended.
We used descriptive statistical analyses to examine responses for all events associat-
ed with each goal area. First we disaggregated responses into eight datasets:
three associated with professional development sessions aligned to Goal 1 (stand-
ards-based instruction in [a] English/language arts, [b] mathematics, and [c]
other content areas);
three associated with Goal 2 ([a] addressing writing and text complexity, [b]
school-wide efforts to improve literacy and numeracy, and [c] integration of
technology and science);
one associated with Goal 3 (applying West Virginia Professional Teaching Stand-
ards [WVPTSs]; and
one associated with Goal 4 (increased leadership and collaboration to facilitate
school improvement).
We then analyzed participants’ responses for each goal area independently. Respondents were instructed to respond to statements about the professional development using a 5-point Likert-type response format as follows: 1 (strongly disagree), 2 (disagree), 3 (neutral), 4 (agree), 5 (strongly agree). A sixth category, not applicable, was included, and tallied along with the other responses as an indication of the lack of alignment with the goal in question—that is, if the respondent considered the goal in question as not applicable to the session he or she attended, we counted this response as a lack of agreement that the session was helpful in meeting the goal. The full results for each of the providers appear in Table 10, organized by goal. Overall results for each of the providers are found in Figure 21, and a breakdown by provider group is found in Figure 22.
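The tallying rule described above can be sketched as follows (the coding scheme and column layout are assumptions for illustration, not the study's actual data structure); responses of 4 or 5 count toward agreement, and not-applicable responses remain in the denominator:

```python
import pandas as pd

def percent_agreement(ratings: pd.Series) -> float:
    """Percent of respondents who answered agree (4) or strongly agree (5).

    Assumes `ratings` holds the values 1-5 plus a string sentinel such as
    "NA" for not applicable. Not-applicable responses stay in the
    denominator, so they count against agreement; genuinely missing answers
    are dropped first.
    """
    ratings = ratings.dropna()
    numeric = pd.to_numeric(ratings, errors="coerce")  # "NA" becomes NaN but is still counted below
    return 100.0 * (numeric >= 4).sum() / len(ratings)

# Example with hypothetical data: 4 of 6 respondents agree; the "NA" counts as non-agreement.
# percent_agreement(pd.Series([5, 4, 4, "NA", 3, 5]))  -> 66.7
```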
Overall, about half (51.2%) of all respondents agreed that the professional develop-
ment had been helpful in meeting the provider-designated Board goal or goals for that ses-
sion. Items and the overall percent of respondents in agreement or strong agreement were as
follows (Table 10):
The professional development was helpful to me in—
Delivering standards-based instruction in English/language arts.
(Goal 1, apply ELA standards), 45.1%
Delivering standards-based instruction in mathematics.
(Goal 1, apply math standards), 45.0%
Delivering standards-based instruction in other content areas.
(Goal 1, other content areas), 50.1%
Improving students' writing and text complexity.
(Goal 2, writing/text complexity), 43.5%
Improving students' literacy and/or numeracy.
(Goal 2, literacy/numeracy), 53.7%
Integrating technology and/or science into improvement efforts.
(Goal 2, science/technology), 49.6%
Understanding and applying the WV Professional Teaching Standards.
(Goal 3, apply WVPTSs), 50.0%
Improving leadership and collaboration to facilitate school improvement.
(Goal 4, lead/collaborate), 67.5%
Figure 21. Percentage of Overall Agreement or Strong Agreement that PD was Helpful in Meeting Board Goals, by Provider
About two-thirds of the time, respondents agreed or strongly agreed that sessions the pro-
viders designated as supporting Goal 4 were, in practice, helpful in meeting that goal; none
of the sessions focused on the other goals approached this level of agreement, overall, which
could be interpreted to mean that their content did not align well with the Board goals they
were intended to address. Another possible interpretation is that some providers did not re-
alistically categorize their planned sessions by Board goal during the formulation of the PD
Master Plan.
Figure 22. Percentage of Agreement or Strong Agreement that PD was Helpful in Meeting Board Goals, by Provider Group
In any case, there was considerable variance among provider groups and individual
providers. The large majority of participants who attended professional development offered
by some providers agreed or strongly agreed that the session helped them address the pro-
vider-designated Board goal, especially the WV Center for Professional Development (71%),
RESA 1 (73%), the WVDE Office of Instruction (70%), and the WVDE Office of Title I (76%).
On the other hand, other providers’ participants did not share that level of agreement that
the session had been helpful in addressing the intended Board goal, especially RESA 5
(38%), RESA 7 (26%), RESA 8 (36%), and the WVDE Office of School Improvement (27%).
Scores in Table 10 that fell more than 20 percentage points above the mean for that item are
marked in green, indicating very strong alignment to goals; those that fell more than 20 per-
centage points below the mean for that item are marked in red, indicating very weak align-
ment to goals the professional development was intended to address.
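The flagging rule can be sketched as follows (the table layout is hypothetical, and the item mean is taken here as the simple mean across providers, which may differ slightly from the reference value used in Table 10):

```python
import numpy as np
import pandas as pd

def flag_alignment(pct_table: pd.DataFrame) -> pd.DataFrame:
    """Label each provider-by-goal agreement rate relative to its item mean.

    `pct_table` is assumed to hold percent-agreement values with one row per
    provider and one column per goal item. Cells more than 20 percentage
    points above the item mean are flagged "strong" (green in Table 10),
    cells more than 20 points below are flagged "weak" (red), and the rest
    are labeled "typical". Missing cells (goals a provider did not address)
    fall through as "typical".
    """
    diff = (pct_table - pct_table.mean(axis=0)).to_numpy()
    labels = np.select([diff > 20, diff < -20], ["strong", "weak"], default="typical")
    return pd.DataFrame(labels, index=pct_table.index, columns=pct_table.columns)
```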
When we aggregated the RESAs and the WVDE offices, both groups included provid-
ers that were among the highest and lowest scoring, in terms of aligning their professional
development with the Board goals. Overall though, as shown in Figure 22 (p. 35), the RESAs
scored the lowest in seven of the eight measures for goal alignment; for six of the eight
measures, well under half of participants agreed or strongly agreed that the professional de-
velopment they attended had been helpful in meeting the goal it was meant to address. The
WVDE group’s performance was much more varied. For two of the measures, they showed
the strongest goal alignment; for four of the measures they were second only to CPD; and for
the remaining two, they traded places with the RESAs for weakest and second weakest goal
alignment.
Table 10. Percent of Participants Who Agree or Strongly Agree That the Professional Development was Helpful in Meeting the Board Goal Designated by the Provider for the Session, by Provider
Each cell gives the percent who agree or strongly agree, with n in parentheses. Columns, in order: overall goal helpfulness; Goal 1, apply ELA standards; Goal 1, apply math standards; Goal 1, other content areas; Goal 2, writing/text complexity; Goal 2, literacy/numeracy; Goal 2, science/technology; Goal 3, apply WVPTSs; Goal 4, lead/collaborate.
All Providers: 51.2 (16,424); 45.1 (1,725); 45.0 (1,954); 50.1 (1,628); 43.5 (1,733); 53.7 (2,049); 49.6 (2,006); 50.0 (2,859); 67.5 (2,477)
CPD: 70.9 (812); 67.6 (71); 61.8 (55); 64.7 (136); 74.8 (115); 64.0 (114); 68.7 (67); 72.3 (148); 88.7 (106)
Marshall: 57.6 (1,327); 46.1 (102); 49.5 (188); 63.2 (133); 40.6 (155); 58.1 (186); 58.5 (188); 66.5 (188); 72.2 (187)
RESA 1: 73.5 (777); 77.4 (106); 56.1 (107); 54.8 (31); 77.6 (107); 77.1 (105); 69.2 (107); 78.5 (107); 84.1 (107)
RESA 2: 60.6 (864); 56.0 (116); 54.8 (115); 48.1 (54); 52.6 (116); 61.7 (115); 53.4 (116); 71.6 (116); 80.2 (116)
RESA 3: 51.8 (628); 47.6 (82); 45.1 (82); 55.9 (59); 44.4 (81); 52.5 (80); 56.3 (80); 50.0 (82); 63.4 (82)
RESA 4: 43.7 (1,851); 38.3 (240); 38.5 (239); 41.0 (173); 34.2 (240); 47.7 (237); 36.9 (241); 52.1 (240); 59.8 (241)
RESA 5: 38.3 (1,481); 30.7 (189); 29.5 (190); 39.2 (143); 30.7 (192); 37.9 (190); 33.7 (190); 38.5 (192); 65.6 (195)
RESA 6: 46.9 (603); 38.0 (79); 34.2 (79); 58.3 (48); 41.8 (79); 49.4 (79); 36.7 (79); 62.5 (80); 58.8 (80)
RESA 7: 26.4 (1,638); 17.1 (210); 15.7 (210); 20.0 (175); 18.7 (209); 20.1 (209); 32.5 (206); 37.6 (210); 48.3 (209)
RESA 8: 36.3 (1,150); 26.9 (156); 24.5 (155); 31.7 (60); 27.7 (155); 33.5 (155); 32.3 (155); 43.9 (157); 66.2 (157)
OAA: 59.6 (535); —; —; —; —; —; —; 54.9 (268); 64.4 (267)
OHS: 45.9 (748); —; —; 39.2 (166); —; —; 38.0 (166); 35.1 (208); 68.3 (208)
OIE: 49.5 (554); 49.4 (81); 42.0 (69); 47.5 (61); 43.1 (72); 50.8 (59); 59.0 (61); 43.1 (58); 58.1 (93)
OI: 70.3 (2,218); 65.7 (248); 73.0 (429); 69.0 (348); 64.5 (211); 72.4 (519); 75.2 (306); 50.9 (53); 71.2 (104)
OSI: 26.6 (429); —; —; —; —; —; —; 22.8 (373); 51.8 (56)
OSP: 57.1 (294); 79.5 (44); 8.6 (35); —; —; —; 100.0 (11); 50.0 (102); 66.7 (102)
Title I: 75.5 (400); —; —; —; —; —; —; 67.1 (234); 87.3 (166)
Title II, III, SS: 60.0 (115); —; —; 61.0 (41); —; —; 53.1 (32); 64.3 (42); —
CPD = WV Center for Professional Development; Marshall = Marshall University June Harless Center; RESA = regional education service agency (one each for eight regions in West Virginia); OAA = WVDE Office of Assessment and Accountability; OHS = WVDE Office of Healthy Schools; OIE = WVDE Office of Institutional Education Programs; OI = WVDE Office of Instruction; OSI = WVDE Office of School Improvement; OSP = WVDE Office of Special Programs; Title I = WVDE Office of Title I; Title II, III, SS = WVDE Office of Title II, III, and System Support.
NOTE: Scores that fell more than 20 percentage points above the mean for that item are marked in green, indicating very strong alignment to goals; those that fell more than 20 percentage points below the mean for that item are marked in red, indicating very weak alignment to goals.
Perceived impact of professional development
The survey contained three pairs of items that asked respondents to use a 4-point
Likert-type scale (1 [not at all], 2 [to a small extent], 3 [to a moderate extent], 4 [to a great
extent]), to rate the extent to which they agreed with statements about themselves both be-
fore and after having participated in the professional development session they attended, as
follows:
Table 11. Interpretation of Effect Size Estimates Used in this Study
Value for Cohen’s d Interpretation
Less than .4 Small effect
.4 to .7 Moderate effect
.8 to 1.1 Large effect
1.2 and above Very large effect
Pair 1. Before participating in this PD, to what extent were you knowledgea-
ble about the topic it covered?
After participating in this PD, to what extent are you knowledgeable
about the topic it covered?
Pair 2. Before participating in this PD, to what extent did you practice behav-
iors or skills it taught?
After participating in this PD, to what extent do you practice behaviors
or skills it taught?
Pair 3. Before participating in this PD, to what extent did you hold atti-
tudes/beliefs it encouraged?
After participating in this PD, to what extent do you hold atti-
tudes/beliefs it encouraged?
A fifth response category was included, but only used to allow respondents to indicate the
item was not applicable to them. These responses were not used when calculating mean
scores.
We used a retrospective pretest/posttest design to assess the extent to which survey
respondents perceived a change in their own knowledge, behaviors, and beliefs and attitudes
as a result of participating in professional development. A series of paired-samples t tests
were conducted using respondents’ pre- and post-ratings. These analyses tested for statisti-
cally significant differences between respondents’ pre- and post-ratings, with time as the in-
dependent variable. When statistically significant differences were found (i.e., p <.05), it is
reasonable to say that the difference observed between participants’ pre- and posttest results
are not likely to be due to chance. That is, there is some systematic reason underlying the
difference. This analysis does not allow one to infer a cause for the difference. It merely de-
scribes the presence of a significant difference.
One limitation of significance testing is that it tells us very little about the magnitude
of any observed differences. We detect a difference, but cannot tell from the t test if the dif-
ference is meaningful in a practical sense. Calculating an effect size is one way to explain the
magnitude of any statistically significant differences. In this study, we used Cohen’s d as a
measure of effect size. This statistic is commonly used in simple pretest/posttest designs,
although its interpretation is often debated in social sciences (see the Limitations of the
Study section, p. 50, for more about this debate).
The guidelines we used for interpreting the
meaning of the effect sizes in this study are
found in Table 11. Paired-samples t tests were
conducted for three impact items: (1) knowledge
about the topic of professional development, (2)
use of behaviors and skills related to the topic,
and (3) presence of attitudes/beliefs advocated
by the professional development.
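A sketch of the general shape of these analyses appears below. The report does not name its software or the exact Cohen's d variant it used, so the formulation shown (mean pre-post change divided by the average of the pre and post standard deviations) is only one common choice, not necessarily the study's:

```python
import numpy as np
from scipy import stats

def paired_impact(pre: np.ndarray, post: np.ndarray) -> dict:
    """Paired-samples t test plus an effect size estimate for one impact item.

    `pre` and `post` are matched retrospective pre/post ratings on the
    4-point scale, with not-applicable responses already removed (the study
    excluded them from mean-score calculations).
    """
    t_stat, p_value = stats.ttest_rel(post, pre)
    mean_change = float(np.mean(post - pre))
    # Assumption: d = mean change / average of the pre and post standard deviations.
    d = mean_change / float(np.mean([pre.std(ddof=1), post.std(ddof=1)]))
    return {"t": float(t_stat), "p": float(p_value),
            "mean_change": mean_change, "cohens_d": d}

# Interpretation bands from Table 11: < .4 small; .4 to .7 moderate;
# .8 to 1.1 large; 1.2 and above very large.
```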
Results for total sample
The tests returned statistically significant differences for all three areas (p < .001). In all cases, survey respondents rated themselves higher after participating in professional development than before participating (Figure 23). Effect size estimates revealed overall large effect sizes for knowledge (d = 1.00), moderate effect sizes for behavior (d = .65), and a weak effect for attitudes (d = .32). In other words, participants indicated greater knowledge after having participated in professional development, reported engaging in more behavior related to the PD they attended, and reported holding slightly different attitudes and beliefs about the topic areas addressed by the professional development (Table 12).
Results by programmatic level
We disaggregated results for the perceived impact of the professional development
by programmatic level—that is, for early childhood/elementary school (1,663 to 1,763 re-
sponses, depending on the item), middle school (949 to 1,004 responses), high school (1,218
to 1,304 responses), a category for those in WVDE or district central offices (not applicable,
225 to 262 responses), and an other category (264 to 295 responses). Tests returned statisti-
cally significant differences for all programmatic levels, for all three self-assessments (p ≤
.001). In all cases, participants rated themselves higher after participating in professional
development than before participating. For knowledge, effect size estimates revealed moder-
ate effects for the other and not applicable categories and large effect sizes for the three pro-
grammatic levels. For behavior, moderate effects were observed across all categories and
programmatic levels. For attitudes/beliefs, weak effects were observed for all but early childhood/elementary respondents, who exhibited moderate effects (Table A 6 in Appendix F, p. 85; Figure 24).
Table 12. Paired-Samples T Test of Perceived Impacts for Whole Sample
Dimension, Mean ∆ (pre-post), Standard deviation, Standard error of mean, t, Significance, n, Effect size (Cohen's d)
Knowledge .709 .747 .012 58.995 .000 4077 1.00
Behavior .546 .704 .011 46.524 .000 3858 .65
Attitudes/beliefs .362 .638 .010 34.195 .000 3808 .32
Figure 23. Mean Perceived Impact of Professional Development (Pre-/Postsession), Total Sample
Results by professional role
We disaggregated results for the perceived impact of the professional development
by professional role—that is, for district central office staff (356 to 407 responses), princi-
pal/assistant principal (374 to 405 responses), regular classroom teachers (1,683 to 1,752
responses), special education teachers (392 to 413 responses), and an other category (1,003
to 1,100 responses). Tests returned statistically significant differences for all professional
role groups, for all three areas (p < .001). In all cases, participants rated themselves higher
after participating in professional development than before participating. For knowledge,
effect size estimates were large for all role groups except regular classroom teachers, which
showed a very large effect, and other, which showed a moderate effect. For behavior, moder-
ate effects were observed across all role groups except regular classroom teachers, which had
a large effect size; and for attitudes/beliefs, small effects were realized for administrators
and the other category, while the two teacher categories exhibited moderate effects (Table A
7 in Appendix F, p. 85, Figure 25).
Figure 24. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by Programmatic Level
Light blue indicates small effects, medium blue indicates moderate effects, darker blue indicates large effects, darkest blue indicates very large effects.
[Figure 24 data (Cohen's d for attitudes/beliefs, behavior, and knowledge, respectively): Early Childhood/Elementary .46, .76, 1.14; Middle school .34, .64, 1.04; High school .31, .60, .89; Other .20, .41, .67; Not applicable .20, .47, .75. Horizontal axis: effect size, Cohen's d, .00 to 1.20.]
Results by content area
We disaggregated results for the perceived impact of the professional development
by content area (see Table A 8 in Appendix F, p. 86 for number of responses and other sta-
tistics). Tests returned statistically significant differences for all content areas, for all three
measures (p ≤ .05) with the exception of English as a second language (ESL), which had a
small number of responses (14 to 15) and did not achieve statistical significance for atti-
tudes/beliefs or behavior (these results must be treated cautiously). In all cases, participants
rated themselves higher after participating in professional development than before partici-
pating. For knowledge, effect size estimates were large or very large for all content areas except ESL, not applicable, and physical education, which showed moderate effects. For behavior, large effect sizes were observed for all subject areas (i.e., elementary school teaching), science, and social studies, while all other content areas saw moderate effects for behavior, except ESL, which saw small effects. Seven of the 12 content areas saw small effects for
attitudes/beliefs (ESL, mathematics, not applicable, reading/language arts, science, social
studies, and special education), while the other five content areas saw moderate effects
(Figure 26).
Figure 25. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by Professional Role
Light blue indicates small effects, medium blue indicates moderate effects, darker blue indicates large effects,
darkest blue indicates very large effects.
Figure 26. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by Content Area
Light blue indicates small effects, medium blue indicates moderate effects, darker blue indicates large effects, darkest blue indicates very large effects.
Figure 27. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by Provider (RESAS only)
RESA = regional education service agency (one each for eight regions in West Virginia)
Light blue indicates small effects, medium blue indicates moderate effects, darker blue indicates large effects, darkest blue indicates very large effects.
Results by provider
Lastly, we disaggregated results for the perceived impact of the professional devel-
opment by provider (see Table A 9 in Appendix F, p. 87). Tests returned statistically signifi-
cant differences for all providers, for all three areas (p < .001). In all cases, participants rated
themselves higher after participating in professional development than before participating.
Figure 27 and Figure 28 show that for knowledge, effect size estimates were very
large for RESAs 1 and 2, the West Virginia Center for Professional Development, the Mar-
shall University June Harless Center, the WVDE Office of Instruction, and the Office of Title
II, III, and System Support. Effect sizes for knowledge were large for RESAs 3, 4, 5, 6, and 7,
and the WVDE Offices of Institutional Education, Special Programs, and Title I. Effect sizes
for knowledge were moderate for RESA 8 and the WVDE Offices of Assessment and Ac-
countability, Healthy Schools, and School Improvement.
Figure 28. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by Provider (All but RESAs)
CPD = WV Center for Professional Development; Marshall = Marshall University June Harless Center; OAA = WVDE Office of Assessment and Accountability; OHS = WVDE Office of Healthy Schools; OIE = WVDE Office of Institutional Education Programs; OI = WVDE Office of Instruction; OSI = WVDE Office of School Improvement; OSP = WVDE Office of Special Programs; Title I = WVDE Office of Title I; Title II, III, SS = WVDE Office of Title II, III, and System Support.
Some providers also achieved large effects in behavior, including the Center for Professional Development, the Marshall University June Harless Center, RESAs 1, 2, and 3, and the WVDE Office of Instruction. All the rest achieved moderate effects in behavior except the WVDE Office of School Improvement, which saw a small effect. As in the other disaggrega-
tions and the overall sample, attitudes/beliefs saw the smallest effects. Yet, all but four pro-
viders (RESA 7, WVDE Office of Assessment and Accountability, Office of Healthy Schools,
and Office of School Improvement) achieved moderate effects in this domain, while the four
remaining achieved a small effect.
When aggregating by group, both the RESAs and the WVDE offices provided profes-
sional development that participants, overall, perceived as having large effects on their knowledge, while both groups obtained moderate effects for behavior and attitudes/beliefs. There was, as described
above, quite a lot of variance within those groups, however.
Figure 29. Perceived Impact of Professional Development (Pre-/Postsession), Effect Size, by Provider Group
Light blue indicates small effects, medium blue indicates moderate effects, darker blue indicates large effects, darkest blue indicates very large effects.
Discussion
We set out to address the following aspects of the implementation of the 2011-2012
Master Plan for Professional Staff Development (PD Master Plan): (a) implementation of
planned sessions; (b) participant perceptions about the sessions’ adherence to research-
based practices for high quality professional development; (c) participant perceptions about
the sessions’ helpfulness with regard to the specific goals of the PD Master Plan; and (d) par-
ticipants' perceived (self-reported) outcomes resulting from their involvement in professional development associated with the PD Master Plan. After the discussion of findings for each of these topics, we provide observations about the formation of the plan itself.
Implementation of Planned Sessions
Overall, implementation of planned sessions was down slightly from the level seen in
2010-2011—77.5% compared with 80.0% last year. The most prevalent reasons were a lack
of requests/registrations and scheduling issues. Five sessions were cancelled to avoid a du-
plication of effort.
Attendance was down nearly 42% from last year. Most of this drop in attendance—in
fact, 83% of it—was attributable to the lower attendance numbers reported by the RESAs,
which declined from 17,508 in 2010-2011 to 4,657 participants in 2011-2012. CPD and IHEs
also saw lower attendance, while WVDE providers’ attendance was slightly up.
Top providers in terms of attendance were all from WVDE, including the Office of In-
struction (3,995), Office of Special Programs (3,958), and the Office of Title I (2,700).
The WVBE’s Goals for Professional Development were all well covered, with a mini-
mum of about 6,900 participants attending sessions focused on each of the goals.
Face-to-face sessions far outnumbered other meeting formats at 90%, followed by sessions that blended formats at 9%.
CPD had the highest average duration for its professional development sessions—45
hours, with a mean time span of 50 days. Six providers had average durations in hours for
their professional development that indicated they typically offer sustained professional de-
velopment (i.e., 14 hours or more), which research shows is the minimum required to effect
improvement in student achievement (Yoon et al., 2007).
Sessions offered in a blended format tended to have the longest duration (average
17.5 hours).
There were five county locales where no professional development offered through
the PD Master Plan took place (Barbour, Monroe, Pleasants, Ritchie, and Taylor), although
educators from these counties did attend professional development offered in other locales.
This study estimates that the average travel time to professional development pro-
vided by the 18 offices and organizations covered in this report was about 61 minutes, or
slightly over an hour. When this estimate is projected to the more than 20,000 attendees in
sessions held during the 12-month period from June 2011 through May 2012, we estimate
that more than 20,000 staff hours were spent just travelling. Further, the burden of travel is
not equally shared. We estimate that educators in some counties travel well over 60 minutes
each way to attend professional development; educators in Hampshire, Hardy, Mineral, and
Monroe counties travelled at least half again the average, and twice as much as their coun-
terparts in Cabell, Calhoun, Kanawha, and Monongalia. No doubt some of this travel is una-
voidable, but perhaps not all of it. Reducing travel time by using online or other formats
could allow educators to redirect time spent travelling, allowing more time for other activi-
ties that would benefit students.
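As a quick check of the arithmetic behind this projection (counting one-way trips only and using the rounded figures reported above):

```python
# Back-of-the-envelope check of the travel estimate (figures as reported above).
attendees = 20_000           # "more than 20,000 attendees"
avg_travel_minutes = 61      # estimated average travel time to a session
staff_hours = attendees * avg_travel_minutes / 60
print(f"{staff_hours:,.0f} staff hours")  # about 20,333, i.e., "more than 20,000"
```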
Use of Research-Based Practices
Overall, the strongest ratings in terms of the use of research-based practices were
given to the relevance and specificity (content-focus) of the professional development. The
weakest ratings were for the two follow-up items—that is, follow-up discussion and collabo-
ration, and related follow-up professional development. These two dimensions may warrant
attention, and may well receive it with the focus on providing more sustained professional
development in the current (2012-2013) PD Master Plan.
Results were similar when we disaggregated by professional role and by program-
matic level, that is, there was very little variation among the role groups and programmatic
levels with regard to the overall quality index rating, which ranged from 3.7 to 3.9 on a 5-
point scale (1 [strongly disagree], 3 [neutral], and 5 [strongly agree])—indicating a moderate
level of agreement that the professional development they attended adhered to research-
based practices for high quality professional development.
There was slightly more variation (3.7 to 4.0) when we disaggregated by content area,
with physical education and foreign language teachers expressing the highest level of agree-
ment.
The greatest degree of variation in the mean quality index rating was among provid-
ers, although all 18 providers had ratings that fell into the general agreement range—that is,
respondents tended to agree with statements that the professional development they attend-
ed adhered to research-based practices and was beneficial overall. However, there were six
providers that scored at 4.0 or above, including CPD; RESAs 1 and 2; and WVDE’s Office of
Instruction, Office of Special Programs, and Office of Title II, III, System Support. The low-
est scoring providers were RESA 7 (3.58), RESA 8 (3.51), and the WVDE Office of School
Improvement (3.55).
Perceived Effectiveness in Addressing the Board’s Goals
For this measure, we selected respondents who attended sessions that providers had
indicated were aligned with particular Board goals for professional development, and
checked to what extent these respondents agreed that the professional development had
been helpful in meeting that goal. With few exceptions—that is, CPD, RESA 1, and the Office
of Title I—there is much room for improvement when it comes to respondents’ perceptions
about alignment of the professional development they received with the goal it was meant to
address. It is unknown why some providers had such consistently low alignment scores, with
only about a quarter to just over a third of individual respondents agreeing that the session
they attended addressed the goal it was intended to support. In the case of the RESAs, some
of the lack of alignment may be due to the approach they used in submitting nonspecific ses-
sion titles—each of which they designated as aligning with several goals—and then reporting
multiple sessions under each, some of which may or may not have aligned well with the
goals. However, RESA 1 did not seem to fall into that pattern, which could indicate that RE-
SA 1 truly focused very sharply on the Board goals—especially goals related to Eng-
lish/language arts, writing, and literacy and numeracy skills, as well as on applying WV
Professional Teaching Standards, and leadership skills to improve schools. Another possible
explanation is that RESA 1 reported only their goal-aligned professional development
through the PD Master Plan reporting system and refrained from reporting other nonaligned
sessions.
With only 51.2% of respondents, overall, in agreement that the sessions they attend-
ed aligned well with the Board goals they were intended to support, goal alignment is clearly
an area that most providers could focus on improving. The 2012-2013 PD Master Plan eval-
uation may see some improvement in this measure, as providers were restricted to indicat-
ing only one primary goal for each of the sessions they included in the PD Master Plan, and
they were required to submit specific titles indicating specific content.
Perceived Impacts on Knowledge, Behavior, and Attitudes/Beliefs
In three paired self-reported pre-/posttest items, participants indicated greater
knowledge after having participated in professional development, reported engaging in more
behavior related to the PD they attended, and reported holding attitudes and beliefs slightly more aligned to those supported by the professional development. T tests returned statistically significant differences for all three areas (p < .001). In nearly all disaggregations, profession-
al development was perceived by participants to have had its greatest impact on their
knowledge and least impact on their attitudes and beliefs. This pattern held true whether we
disaggregated by programmatic level, professional role, content area, or provider. Overall,
perceived impacts were
Highest for early childhood/elementary participants and lowest for respondents who indicated they were in the other programmatic group;
Highest for regular classroom teachers and lowest for respondents who indicated they were in the other role group;
Highest for educators involved in all subject areas (i.e., elementary education), fol-lowed by foreign language teachers; and lowest for those indicating they were not teaching in a content area (N/A) and those teaching English as a second language8; and
Highest for respondents who attended professional development offered by RESA 1, RESA 2, and WVDE Office of Instruction; and lowest for professional development offered by RESA 8, and WVDE’s Office of Assessment and Accountability, Office of Healthy Schools, and Office of School Improvement.
8 The sample was very small for this group and this was the only group for which the t test re-
turned insignificant results, so this result should be used with caution.
Formation of the 2011-2012 PD Master Plan
Participation of institutions of higher education
Language in West Virginia Code (§18-2-23a), which defines the Board of Education's
role in coordinating professional development, calls for the Board "To ensure that the exper-
tise and experience of state institutions of higher education with teacher preparation pro-
grams are included in developing and implementing professional development programs."
The reduction in participation by IHEs from two institutions (Fairmont State University and
Marshall University) in 2010-2011 to only one (Marshall University) in 2011-2012 is notable,
and indicates an area that needs attention if the statute is to be fully implemented. Ten pub-
lic IHEs in West Virginia with teacher preparation programs did not participate in the 2011-
2012 plan, including Bluefield State College, Concord University, Fairmont State University,
Glenville State College, Salem University, Shepherd University, West Liberty State College,
West Virginia State University, West Virginia University, and West Virginia University-
Parkersburg. Participation by Marshall University was limited to staff from the June Harless
Center.
Participation of WV Department of Education offices
The lack of participation by several offices within the West Virginia Department of
Education, including two that participated in 2010-2011, was also notable for 2011-2012.
The Board and the WV Center for Professional Development (which is responsible for put-
ting together the plan) did address this issue during the formation of the 2012-2013 PD Mas-
ter Plan, and as a result, the new plan includes all offices that provide professional
development to teachers, administrators, and other school and district staff.
Participation of regional education service agencies
The 2010-2011 PD Master Plan was the first one for which the Office of Research
provided the evaluation. For that Plan, the RESAs submitted the same eight broad session
descriptions, under which they individually submitted a large number and wide variety of
professional development sessions. Subsequently, when we conducted the participant sur-
vey, we asked participants to respond to questions about a specific professional development
event using the original nonspecific session descriptions. This led to some confusion. Be-
cause the cycle for putting together the PD Master Plan requires formation and approval of
the new plan before the evaluation of the previous plan is available, we were not able to relay
this dilemma to the Center for Professional Development and WVBE PD committee, which
would have allowed them to make adjustments to RESA input into the 2011-2012 PD Master
Plan. Consequently, we were faced with another PD Master Plan to evaluate with nonspecific
RESA session titles during 2011-2012. We dealt with the issue by allowing RESAs to submit
more specific session titles along with the e-mail addresses of participants in their provider
reports throughout the year. While this procedure reduced the confusion among participants
regarding which specific PD sessions we were asking them about when we surveyed them, it
also introduced bias into the study. In essence, RESAs were allowed to submit placeholder
titles, and then develop actual titles later in the year, unlike the other 10 providers, who had
to submit a specific plan at the outset.
During the formation of the 2012-2013 plan the RESAs, once again, submitted a
small set of nonspecific session titles. Their argument in support of this approach was based
on their need to be responsive to the districts in each of their regions, and their inability to
predict what those districts would include in their strategic plans. RESA representatives at
the December 2011 WVBE PD Committee meeting and the February 2012 State Professional
Development Advisory Committee Meeting indicated that district strategic plans form the
basis of the RESAs’ annual strategic plans for professional development, which are due on
October 1 each fall. WVBE Policy 3233 outlines criteria for developing those strategic
plans, including
(1) direction from the State Superintendent; (2) findings from five-year strategic
plans of low-performing schools in member county systems; (3) findings for member
districts from reviews of accountability reports from the Office of Education Perfor-
mance Audits (hereinafter OEPA); (4) requests from superintendents of low-
performing schools; and (5) any other findings considered appropriate by the RESA
executive director for planning programs and services that address the needs of
member county systems and that are consistent with and support WVBE initiatives
(p. 8).
To further investigate this issue, we reviewed the professional development portions of all
eight RESA strategic plans submitted on October 1, 2011, and found most of them to be quite
detailed in the area of professional development, listing specific titles for workshops, semi-
nar series, online courses, and so forth. In some cases, there were rich descriptions of objec-
tives and action plans from which professional development session titles could readily be
developed.9 There were other notable patterns, as well:
There was diversity among the plans, even while some plans shared commonalities.
This diversity in professional development plans will continue to be difficult to cap-
ture and evaluate if RESAs participate in the PD Master Plan as a single entity or
submit a single slate of professional development offerings.
Many of the plans showed some alignment with the Board goals, offering profession-al development in
o 21st Century Content Standards and Objectives, although not necessarily men-tioning the new Common Core State Standards for English/language arts and mathematics (Goal 1);
o Technology integration, such as the use of whiteboards, e-instruction, College Foundation Web Portal, and so forth (Goal 2); and
o Leadership training, especially in the use of various assessment strategies (e.g., Instructional Practices Inventory (IPI), progress monitoring, school profile de-velopment) and new administrator mentoring and other new administrator sem-inars and professional learning series.
9 In only one case did the strategic plan lack information about professional development
geared toward K-12 teachers, administrators, and counselors, focusing only on pre-K and public ser-
vice training.
Although there were sessions in several of the plans featuring research-based teach-
ing and learning practices, there was no mention of applying the West Virginia Pro-
fessional Teaching Standards (Goal 3).
The strategic plans included a much broader spectrum of professional development
offerings than what could be included under the umbrella of the 2011-2012 Board
goals for professional development. For example, many RESAs included specific
plans for professional development in adult basic education core workshops, public
service training, GEAR-UP, Teaching American History, WVEIS on the Web (espe-
cially the student discipline module), preK standards and assessments, and various
health education initiatives (e.g., Fitness Gram, Health Education Assessment Pro-
ject [HEAP], Let’s Move).
Many of the plans included nonspecific language, such as this from RESA 4, “Provide
professional development consistent with county and/or RESA 4 goals as stated in
strategic plans” (RESA 4, 2011, p. 13), allowing for a highly responsive mode of ser-
vice delivery—with no expressed alignment or commitment to the Board’s goals for
professional development.
The RESA strategic plans all featured the Board’s strategic goals for education, but
not its goals for professional development.
It seems that a large part of the challenge for RESAs in forming and evaluating the
PD Master Plan has to do with timing. Their planning cycle does not align with the Board’s
planning cycle—both of which are driven by state code. In the face of this challenge, the
RESAs each submitted only three specific professional development titles—one focused on
each of the three Board goals for professional development in 2012-2013. While this
approach will address the problem of the lack of specificity in previous PD Master Plans, and
will likely allow respondents in the current year’s participant survey to more readily
recognize the relationship of the professional development they engaged in with the goals it
was designed to address, it poses another problem. Taking this approach will dramatically
reduce the number of professional development sessions offered by the RESAs that are included and
approved in the PD Master Plan. Unless the process is altered, the consequence of this
strategy for the RESAs will be a further steep decline—following the steep decline seen this
year—in the number of participants who attend sessions approved as part of the PD Master
Plan. Such an outcome is unfair to the RESAs, making it appear that their level of service to
their regions is decreasing over time. This outcome also does not serve the larger goals of the
Board as delineated in state code, which calls for the coordination and evaluation of
professional development through the formation of a statewide master plan. Nor does it
align with recommendations in the Education Efficiency Audit (Public Works, 2012)
commissioned by the governor (more discussion about implications of the audit appears
later in this section).
Yet, there may also be a solution in state code. The statute that outlines the process
for developing the PD Master Plan includes the following language (§18-2-23a., see Appen-
dix A):
The Master Plan shall serve as a guide for the delivery of coordinated professional
staff development programs by the State Department of Education, the Center for
Professional Development, the state institutions of higher education and the regional
educational service agencies beginning on the first day of June in the year in which
the Master Plan was approved through the thirtieth day of May in the following year.
This section does not prohibit changes in the Master Plan, subject to State Board
approval, to address staff development needs identified after the Master Plan was
approved. (Emphasis added.)
This language seems to leave open the possibility of amending the PD Master Plan to include
more detailed plans for professional development that could be submitted by the RESAs af-
ter they have had the opportunity to consult the strategic plans generated by the districts
they serve.
Participation of the West Virginia Center for Professional Development
The West Virginia Center for Professional Development (CPD) is at the center of the
process for developing the PD Master Plan. It convenes meetings and works with the other
providers to compile the plan and submits its own slate of planned professional development
sessions.
In preparation for the formation of the 2012-2013 PD Master Plan, CPD obtained a
list of planned professional development compiled from school strategic plans, for the Board
PD Committee to use in its process of setting new goals for professional development. The
list was massive, however, including tens of thousands of session titles that were not catego-
rized in any way, so it proved to be of limited value in the goal-setting process. CPD also
worked more closely with the Higher Education Policy Commission, and as a result there
were representatives from Concord University, Marshall University, West Virginia State
University, and West Virginia University at the PD Advisory Committee Meeting in February
2012. CPD also worked with the State Superintendent to convey to offices in the WVDE that
all professional development they plan to provide must be part of the PD Master Plan, and
prepared an informational frequently-asked-questions document about the Master Plan,
which explained the process and included the Board’s Goals (both strategic and professional
development). Lastly, they posted an online tool for providers to use in submitting their ses-
sion titles. This additional work resulted in an increase in the number of WVDE offices in-
cluded in the plan from eight in 2011-2012 to 19 in 2012-2013. It did not result, however, in
greater participation of IHEs. Only Marshall University continues to participate in the PD
Master Plan.
Lastly, it should be noted that CPD has a sophisticated online registration system for managing its professional development registrations—something most of the other providers lack and for which the Board has expressed a need.
Final thoughts on the process for developing the PD Master Plan
While a disproportionate part of the discussion thus far has focused on the RESAs,
other agencies may also face some of the same planning schedule issues, and would be more
able to provide a comprehensive and realistic plan for their professional development if they
could add or subtract sessions early in the fall. If scheduling is determined to be at issue, it
appears that there is a potential solution to the problem—that is, to reopen the PD Master
Plan for a revised list of PD session titles with a deadline of October 1, which is the deadline
for RESA strategic plans and the earliest point at which they have their final list of profes-
sional development sessions compiled. This date is only 4 months into the PD Master Plan
reporting year, so it would allow providers the opportunity to update plans for the remaining
8 months.
While the ability to update plans would be useful, it only affects logistical aspects of
the planning process. Other, more programmatic and substantive issues remain about how
to use the PD Master Plan as a stronger mechanism for coordinating professional develop-
ment. The extent to which this plan helps drive the agenda for professional development is
unknown, although the review (above) of RESA strategic plans in the context of the Board’s
Goals for Professional Development provides some evidence that there may not be a strong
connection between what some providers deliver and what is envisioned by the Board
through its PD Master Plan.
Yet, the Board’s leadership in coordinating professional development was strongly
called for in the recently released Education Efficiency Audit of West Virginia's Primary
and Secondary Education System by Public Works (2012). The authors of the report made
the following assertion:
States cannot improve the quality of professional development with a patchwork or
series of improvement strategies. Rather, improvements must be strategic, systemic,
and use research to determine the way professional development is selected, deliv-
ered, evaluated, and funded. (p. 62)
The authors endorsed the findings of the November 2006 RESA Task Force, which also
called for more focused leadership with the following claim:
. . . [T]he governance structure of the West Virginia professional development system
is too diffuse to assume that the entities responsible for professional development are
working in a synchronized way to meet state goals for professional development. The
professional development system needs to be driven by an agreed upon professional
development definition, vision, and standards (Public Works, 2012, p. 55).
Later in the report, they made the following recommendation:
Refine and use the Master Plan for Statewide Professional Staff Development as a
true strategic planning tool. In interviews conducted for this review, educators com-
mented that while the Master Plan articulates the state’s PD goals, it does not lay out
a larger strategy for how those goals will be achieved. Some interviewees described
the Plan as merely a “laundry list of state-approved PD courses.” At its inception, the
Master Plan was intended to serve as a tool to identify redundancies in PD offerings.
However, so far, there are no real examples of eliminating duplications (p. 63).
During the course of the 2011-2012 year—before the Education Efficiency Audit was
released—the Board began moving in the direction articulated in the audit. It adopted stand-
ards for professional development that are based on the Learning Forward (formerly the Na-
tional Staff Development Council) standards. In December 2011, it developed a definition of
professional development and a new set of goals. The new goals are strongly aligned to the
Board’s Strategic Goals and the Superintendent’s priorities, forming a cohesive and coherent
vision of the role for professional development. The Board’s PD Committee also began ex-
ploring options for creating an online catalog of professional development offerings and a cen-
tralized registration system. It soon realized that creating such a system would require plan-
ning, resources, and time to do well. Yet developing such a system could help eliminate du-
plications, provide needed oversight, and expedite the PD evaluation process.
Clearly, there is much to consider as the Board looks ahead to future PD Master
Plans. As a next step toward the goal of actively coordinating professional development as
envisioned in West Virginia Code (§18-2-23a), and as called for in the Education Efficiency Audit, the
Board may need to know more about this very complex terrain, including a more compre-
hensive view of the professional development that is offered by the following groups:
IHEs—What sorts of partnerships exist between IHEs and RESAs, districts, and schools, and how well is what they are doing aligned with the Superintendent's priorities and the Board's strategic and professional development goals?
RESAs and WVDE—What is the complete picture of professional development that RESAs and WVDE offices provide during the course of a year, and how do they decide upon those particular offerings; that is, are there criteria they use for prioritizing what they do in response to requests from the field, or do they respond based mainly on an expressed need by a school or district? How closely aligned is their decision making about the slate of professional development they will offer with the Superintendent's priorities and the Board's strategic and professional development goals?
Districts and schools—What professional development do they provide? How do they prioritize their offerings? Who does the actual training/facilitation—vendors, IHEs, in-house staff, others?
Overall, a more comprehensive study of professional development could build on the work
done by the authors of the Education Efficiency Audit, but also investigate what takes place
at the school and district levels. We have heard from the RESAs and others that a large por-
tion of the professional development they offer falls outside of the sessions listed in the plan
due to shifting priorities and needs that arise during the year. The Board may wish to examine this phe-
nomenon, and consider whether professional development that providers offer outside of
the PD Master Plan aligns with the Board's strategic goals and priorities, and if not, deter-
mine if the Board should enlarge its vision or if such professional development efforts should
be abandoned or refocused.
Limitations of the Study
The participant survey conducted in November-December 2011 and April-May 2012
(with supplemental polling in August for CPD participants) asked respondents to recall PD
sessions they had participated in at some point in the past. In some cases, the sessions had
taken place up to five months prior to the survey. For this reason, there is a possibility of
temporal bias in survey participants’ responses.
Furthermore, the use of a retrospective pretest/posttest methodology to assess
changes in knowledge, behavior and skills, and attitudes and beliefs poses some concerns.
We used this methodology primarily because some researchers have argued that a phenom-
enon called response shift bias can occur when conducting traditional pretest/posttest de-
signs. Response-shift bias “occurs when a participant uses a different internal understanding
of the construct being measured to complete the pretest and posttest” (Moore & Tananis,
2009, p. 190). Consider this in the context of professional development. Some respondents
begin their involvement in professional development with a misconception that they are al-
ready well-versed in the content to be covered. When given a pretest, they rate their own
knowledge, behavior and skills, and attitudes and beliefs very positively. However, over the
course of the professional development, as they develop a deeper understanding of the con-
tent being covered, they realize they did not know as much as they originally thought. As
such, when presented with the posttest, their frame of reference has shifted and they could
potentially rate their knowledge, behavior and skills, and attitudes and beliefs lower than
they did on the pretest. This can lead to problems in analyzing the impact of the professional
development. For this reason, some researchers advocate for using retrospective pre-
test/posttest designs as we did in this study.
Despite this strength of the retrospective pretest/posttest design, a recent research
study conducted by Nimon, Zigarmi, and Allen (2011) found that using traditional pre-
test/posttest designs leads to less biased estimates of program effectiveness. The authors
present a compelling case that presenting both pre- and posttest items simultaneously on a
single survey is among the most biased design options available to researchers and can sig-
nificantly inflate effect size estimates. The authors recommend traditional pretest/posttest
designs when possible and advocate for the implementation of a separate retrospective pre-
test to allow researchers to determine the presence of any response-shift bias. This design
option, despite its strength, was not feasible in this study due to a mismatch between the
scale of professional development offerings in the state and available evaluation staffing re-
sources. Therefore, we recommend cautious interpretation of our own estimates of effect
size, as they may be somewhat inflated.
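To make the retrospective pretest/posttest analysis concrete, the following is a minimal sketch of one common way to compute a paired-samples t test and a Cohen's d effect size from such ratings. The data are hypothetical, and the effect size convention shown (mean change divided by the standard deviation of the change scores) is only one of several in use; the report does not restate its exact formula here, so this should be read as an illustration rather than a reproduction of the study's computation.

    # Illustrative only: hypothetical retrospective pre/post ratings on a 1-5 scale.
    from statistics import mean, stdev
    from scipy import stats

    pre  = [2, 3, 3, 2, 4, 3, 2, 3]   # "Before this session, my knowledge was..."
    post = [3, 4, 3, 3, 4, 4, 3, 3]   # "After this session, my knowledge is..."

    diffs = [b - a for a, b in zip(pre, post)]

    # Paired-samples t test on the same respondents' pre and post ratings.
    t_stat, p_value = stats.ttest_rel(post, pre)

    # One common paired effect size: mean change / SD of the change scores.
    cohens_d = mean(diffs) / stdev(diffs)

    print(f"mean pre-post change = {mean(diffs):.3f}")
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}, d = {cohens_d:.2f}")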
While a 68.1% response rate (or 74.8% for the sample adjusted for attrition) is high
for this type of survey, there remained a portion of the sample from whom we did not hear.
We can account for approximately 7% of the total sample as individuals whose e-mail
addresses were broken or obsolete, or who contacted us to report that they had not attended
the session named in our survey participation request. This still leaves approximately 25% of
the total sample whose perceptions about the professional development are unknown.
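The arithmetic behind these rates is straightforward; the sketch below uses placeholder counts chosen only to be consistent with the percentages reported above, since the underlying sample sizes are not restated in this section.

    # Placeholder counts, consistent with the reported 68.1% and 74.8% rates.
    sampled     = 1000   # survey invitations sent (hypothetical)
    responded   = 681    # completed questionnaires (hypothetical)
    unreachable = 90     # broken/obsolete addresses or confirmed non-attendees (hypothetical)

    raw_rate      = responded / sampled
    adjusted_rate = responded / (sampled - unreachable)   # denominator adjusted for attrition

    print(f"raw response rate:      {raw_rate:.1%}")       # 68.1%
    print(f"adjusted response rate: {adjusted_rate:.1%}")  # 74.8%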
Our literature review did not reveal any appropriately tested and validated measures
of professional development quality and/or impact that met our specific needs. Therefore,
we developed our own measures for this study. Due to time and resource constraints, these
measures were not field tested prior to operational use. Consequently, there is not adequate
validity evidence that the constructs we sought to measure are fully addressed by our survey
items. The measures used possess only face validity.
Issues for Consideration
The following considerations are based on findings from this study and are offered
for the purpose of improving the overall process of formulating, implementing, and evaluat-
ing the West Virginia Master Plan for Professional Staff Development (PD Master Plan). Re-
lated to development of future PD Master Plans, we offer the following suggestions:
With the exception of Marshall University's June Harless Center, institutions of
higher education (IHEs) with teacher preparation programs were absent from the
2011-2012 PD Master Plan, despite the WV Center for Professional Development’s
(CPD) efforts to include them. The Board may wish to consider if there are other
strategies that could be employed to bring this group into the Master Plan.
Similar to the approach they used in the 2010-2011 PD Master Plan, RESAs listed on-
ly seven session titles, and then reported multiple professional development sessions
they provided under one of those seven titles during the course of the year. Staff indi-
cate they have taken this approach because they cannot predict what professional de-
velopment districts will request before the districts put together their strategic plans.
The Board may wish to consider reopening the PD Master Plan in early October, to
allow the RESAs and other providers to revise their lists of planned professional de-
velopment sessions based on strategic needs of their target audiences.
Related to implementation of future PD Master Plans, we suggest the following:
There were some newcomers to the PD Master Plan this year, which may explain why
there was a slight drop in the fulfillment of sessions planned, from 80% last year to
77.5% in 2011-2012. A review of the reasons for not providing planned sessions re-
vealed that the most prevalent reason for cancelling sessions was lack of interest (not
enough people registered or districts did not request it). Five sessions were cancelled
to avoid a duplication of effort and another five due to changing priorities. Raising
the fulfillment rate would be a worthwhile goal, and allowing providers to update the
plan each October (as called for above) could help achieve it.
Again this year, survey respondents indicated that providers have done well deliver-
ing professional development that is research-based in most of the seven dimensions
measured. Several of the providers could improve in supporting the extension of the
professional development to the workplace through discussions and collaboration,
and by providing follow-up sessions.
As discussed extensively in the previous section, about half of all respondents did not
agree that the professional development they attended was helpful in meeting the
Board goal that providers indicated the session was meant to support. Providers
should consider re-examining the alignment of the professional development they
have in the current plan (and future plans) to be sure that they are providing expe-
riences that truly are focused on the Board goals.
Travel time to and from professional development covered in the 2011-2012 PD Master
Plan was estimated to total more than 20,000 hours for the more than 20,000 attendees
(an illustrative calculation follows this item). While some of this travel cannot be
avoided, providers should consider looking for ways to reduce it, especially by using
formats other than face-to-face, which is currently the format used for 90% of all sessions.
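The travel-time figure cited in the item above can be approximated from the participant survey data; the sketch below multiplies the statewide mean travel time reported in Table A 1 by an approximate attendee count. This is a back-of-the-envelope reconstruction, not necessarily the method used to produce the report's estimate.

    # Back-of-the-envelope travel-time estimate using figures from this report.
    mean_travel_minutes = 61.9    # statewide mean from the participant survey (Table A 1)
    attendees           = 20000   # approximate attendee count cited above

    total_hours = mean_travel_minutes * attendees / 60
    print(f"estimated travel time: {total_hours:,.0f} hours")   # just over 20,000 hours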
Related to evaluation of future PD Master Plans, we suggest the following:
As noted in the Education Efficiency Audit, the evaluation of the PD Master Plan co-
vers only professional development delivered by providers included in the plan, and
only the subset of their offerings that were aligned with the Board’s goals for profes-
sional development and submitted as part of the plan. The 42% drop in attendance this
year may be an indication that even among this group, less of what provid-
ers offered fell under the auspices of the PD Master Plan. Left out of the PD Master
Plan, and this study, is likely a large portion of the professional development that
takes place in West Virginia—including professional development delivered by dis-
tricts and schools. We know little about this professional development, including
whether it is aligned with goals and priorities of the Board and Superintendent. The
Board may wish to consider studying more comprehensively the professional de-
velopment that is offered by the four main groups of state and regional providers
(CPD, IHEs, RESAs, and WVDE), and by districts and—to the extent possible—by
schools. Conducting a 1-year study could provide essential background information
as the Board strives to fulfill the leadership and coordinating role laid out for it in
West Virginia Code (§18-2-23a) and urged upon it by the Education Efficiency Audit
(Public Works, 2012). The Board could require providers to report on all professional
development they offer and, for each session they conduct and report, indicate which
goal the PD is aligned with or provide a rationale for why the professional development
was offered (one hypothetical record layout is sketched at the end of this section).
Part of the study could include an analysis of
the rationales provided, which could inform the Board as it enters a new cycle of
goal formation and planning.
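For the reporting requirement suggested above, one possible shape for a per-session record is sketched below. The field names are hypothetical and draw on the session report protocol in Appendix B; the board_goal and rationale fields are the additions this suggestion would require, not elements of any adopted format.

    # Hypothetical per-session report record; field names are illustrative only.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SessionReport:
        provider: str              # e.g., "RESA 4" or "WVDE Office of Instruction"
        title: str                 # session title
        begin_date: str            # MM/DD/YYYY, as in the Appendix B protocol
        hours: float               # total duration of the session
        format: str                # "Face-to-face", "Online", or "Blended"
        attendance: int            # number of participants
        board_goal: Optional[int]  # 1-4 when aligned with a Board PD goal
        rationale: Optional[str]   # required when board_goal is None

    example = SessionReport(
        provider="RESA 4", title="Formative assessment practices",
        begin_date="10/15/2012", hours=6.0, format="Face-to-face",
        attendance=32, board_goal=1, rationale=None)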
References
Annual professional staff development goals established by State Board; coordination of pro-
fessional development programs; program development, approval and evaluation,
West Virginia Code §18-2-23a (updated 2011, Fourth Special Session). Retrieved
from
http://www.legis.state.wv.us/wvcode/ChapterEntire.cfm?chap=18&art=2&section=
23A#02.
Establishment and Operation of Regional Education Service Agencies (RESAs). West Virgin-
ia Board of Education Policy 3233 (updated July 1, 2010). Retrieved from
http://wvde.state.wv.us/policies/p3233_ne.pdf.
Moore, D., & Tananis, C. A. (2009). Measuring change in a short-term educational program
using a retrospective pretest design. American Journal of Evaluation, 30(2), 189-202.
Nimon, K., Zigarmi, D., & Allen, J. (2011). Measures of program effectiveness based on ret-
rospective pretest data: Are all created equal? American Journal of Evaluation,
32(1), 8-28.
Public Works. (2012). Education efficiency audit of West Virginia’s primary and secondary
education system. West Chester, PA: Author.
West Virginia Board of Education. (2011). Master Plan for Statewide Professional Devel-
opment: 2011-2012. Charleston, WV: West Virginia Center for Professional Devel-
opment. Retrieved from
http://www.legis.state.wv.us/legisdocs/reports/agency/E01_FY_2012_1196.pdf.
West Virginia Board of Education. (2012). Master Plan for Statewide Professional Devel-
opment: 2012-2013. Charleston, WV: West Virginia Center for Professional Devel-
opment. Retrieved from http://www.wvcpd.org/images/MasterPlan2012.pdf.
Yoon, K. S., Duncan, T., Lee, S. W.-Y., Scarloss, B., & Shapley, K. (2007). Reviewing the evi-
dence on how teacher professional development affects student achievement (Issues
& Answers Report, REL 2007–No. 033). Washington, DC: U.S. Department of Edu-
cation, Institute of Education Sciences, National Center for Education Evaluation
and Regional Assistance, Regional Educational Laboratory Southwest. Retrieved
from http://ies.ed.gov/ncee/edlabs.
Appendix A. West Virginia Code (§18-2-23a)
Retrieved from the West Virginia Legislature website at the following URL: http://www.legis.state.wv.us/
wvcode/code.cfm?chap=18&art=2.
§18-2-23a. Annual professional staff development goals established by State Board; coordina-
tion of professional development programs; program development, approval and evaluation.
(a) Legislative intent. -- The intent of this section is:
(1) To provide for the coordination of professional development programs by the State Board;
(2) To promote high-quality instructional delivery and management practices for a thorough and efficient
system of schools; and
(3) To ensure that the expertise and experience of state institutions of higher education with teacher
preparation programs are included in developing and implementing professional development programs.
(b) Goals. -- The State Board annually shall establish goals for professional staff development in the
public schools of the state. As a first priority, the State Board shall require adequate and appropriate professional
staff development to ensure high quality teaching that will enable students to achieve the content standards es-
tablished for the required curriculum in the public schools.
The State Board shall submit the goals to the State Department of Education, the Center for Profes-
sional Development, the regional educational service agencies, the Higher Education Policy Commission and the
Legislative Oversight Commission on Education Accountability on or before the fifteenth day of January each
year.
The goals shall include measures by which the effectiveness of the professional staff development pro-
grams will be evaluated. The professional staff development goals shall include separate goals for teachers,
principals and paraprofessional service personnel and may include separate goals for classroom aides and oth-
ers in the public schools.
In establishing the goals, the State Board shall review reports that may indicate a need for professional
staff development including, but not limited to, the report of the Center for Professional Development created in
article three-a, chapter eighteen-a of this code, student test scores on the statewide student assessment pro-
gram, the measures of student and school performance for accreditation purposes, school and school district
report cards and its plans for the use of funds in the strategic staff development fund pursuant to section thirty-
two, article two, chapter eighteen of this code.
(c) The Center for Professional Development shall design a proposed professional staff development
program plan to achieve the goals of the State Board and shall submit the proposed plan to the State Board for
approval as soon as possible following receipt of the State Board goals each year. In developing and implement-
ing this plan, the Center first shall rely upon the available expertise and experience of state institutions of higher
education before procuring advice, technical assistance or consulting services from sources outside the state.
The proposed plan shall include a strategy for evaluating the effectiveness of the professional staff de-
velopment programs delivered under the plan and a cost estimate. The State Board shall review the proposed
plan and return it to the Center for Professional Development noting whether the proposed plan is approved or is
not approved, in whole or in part. If a proposed plan is not approved in whole, the State Board shall note its ob-
jections to the proposed plan or to the parts of the proposed plan not approved and may suggest improvements
or specific modifications, additions or deletions to address more fully the goals or eliminate duplication. If the
proposed plan is not wholly approved, the Center for Professional Development shall revise the plan to satisfy
the objections of the State Board. State board approval is required prior to implementation of the professional
staff development plan.
(d) The State Board approval of the proposed professional staff development plan shall establish a
Master Plan for Professional Staff Development which shall be submitted by the State Board to the affected
agencies and to the Legislative Oversight Commission on Education Accountability. The Master Plan shall in-
clude the State Board-approved plans for professional staff development by the State Department of Education,
the Center for Professional Development, the state institutions of higher education and the regional educational
service agencies to meet the professional staff development goals of the State Board. The Master Plan also shall
include a plan for evaluating the effectiveness of the professional staff development delivered through the pro-
grams and a cost estimate.
The Master Plan shall serve as a guide for the delivery of coordinated professional staff development
programs by the State Department of Education, the Center for Professional Development, the state institutions
of higher education and the regional educational service agencies beginning on the first day of June in the year
in which the Master Plan was approved through the thirtieth day of May in the following year. This section does
not prohibit changes in the Master Plan, subject to State Board approval, to address staff development needs
identified after the Master Plan was approved.
Appendix B. Providers’ Session Report Protocol
WVBE 2011-2012 PD Master Plan Session Report
[posted online in SurveyMonkey]
Contact information
Name:
E-mail Address:
Organization: [Drop-down menu listing providers]
Select which session from the 2011-2012 PD Master Plan you are reporting. [Drop-down
menu listing session titles for the organization selected in item above]
What was the duration of this session? Please indicate the total number of hours:
Please indicate the dates for this session.
Beginning date (MM/DD/YYYY):
Ending date (MM/DD/YYYY):
In which county was the training held? [Drop-down menu listing counties]
What was the format of the training? [Drop-down menu with the following options: Face-to-
face, Online, Blended]
What was the attendance for this session? Please indicate the number of participants:
Please paste or type participant e-mail addresses in the box below. If you have more than
250 addresses, use the second box for additional addresses, beginning with the 251st ad-
dress. E-mail addresses can be submitted as a list with one address per line, or separated by
commas, semicolons, or spaces. NOTE: E-mail addresses are NOT required for sessions
held from April 1 through May 31, 2012.
Is there anything else we need to know about this PD session? (Limit 50 words):
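Because the protocol above accepts addresses one per line or separated by commas, semicolons, or spaces, the submitted lists need to be normalized before survey invitations can be generated. The snippet below is a minimal sketch of one way to do that; it is illustrative and not the procedure actually used for this study.

    import re

    def parse_addresses(raw: str) -> list[str]:
        """Split a pasted block of addresses on newlines, commas, semicolons,
        or spaces; lowercase and de-duplicate while preserving order."""
        seen, addresses = set(), []
        for token in re.split(r"[\s,;]+", raw):
            addr = token.strip().lower()
            if addr and "@" in addr and addr not in seen:
                seen.add(addr)
                addresses.append(addr)
        return addresses

    # Hypothetical addresses, mixing the separator styles the protocol allows.
    print(parse_addresses("a.teacher@example.org; b.principal@example.org\na.teacher@example.org"))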
Appendix C. WV PD Master Plan: 2012 Participant Survey
Appendix D. E-mail Survey Participation Requests
SURVEY ANNOUNCEMENT
Dear Educator,
In a few days, you will receive an important message about a brief survey that the West Vir-
ginia Board of Education has asked us to conduct.
You were selected from the thousands of state educators who participated in professional
development from November 1, 2011 through March 31, 2012, offered by Marshall University (June
Harless Center), the West Virginia Center for Professional Development, the Regional Education Ser-
vice Agencies (RESAs), and the West Virginia Department of Education.
The questionnaire has been designed so you can fill it out very quickly and easily. You need
only check off your answers. It will only take 2 to 5 minutes to fill out online.
We urge you to watch for this invitation and to take a few minutes to respond as soon as
you receive the message.
Your honest impressions, whether favorable or unfavorable, will be greatly appreciated.
Best regards,
Pat Hammer
Patricia Cahape Hammer
Coordinator
Office of Research
Building 6, Room 722
1900 Kanawha Boulevard East
Charleston, WV 25305-0330
304.558.2546 P
304.558.1613 F
E-mail: [email protected]
wvde.state.wv.us
FIRST REQUEST
Dear Educator,
The West Virginia Board of Education has asked us to conduct a brief survey of a sample of educators who participated in professional development offered by various statewide providers from November 2011 through March 2012.
You were selected from educators who participated in a professional development session provided by «Provider», beginning on «begindate», addressing the following topic: «resasession».
It is very important that we learn of your opinions about this session because you represent others who shared your experience. The questionnaire has been designed so you can fill it out very quickly and easily. You need only check off your answers. It will only take 2 to 5 minutes to fill out online.
You can be absolutely sure that all of the information you provide is strictly confidential, and no individual respondent will be identified. Your answers will be combined with others and used only for statistical analysis.
Your honest impressions about the professional development, whether favorable or unfavorable, are very necessary to be sure we are able to provide an accurate evaluation for the Board.
Please follow these easy steps:
Click on this link: https://www.surveymonkey.com/s/WVBE2012PDParticipantSurvey
Type in the e-mail address we used to send you this message.
Copy the following Survey Code and paste it into the appropriate box: «sessionid».
We genuinely appreciate your participation. Thank you!
Best regards,
Pat Hammer
Patricia Cahape Hammer
Coordinator
Office of Research
West Virginia Department of Education
Building 6, Room 722
1900 Kanawha Boulevard East
Charleston, WV 25305-0330
304.558.2546 P
304.558.1613 F
E-mail: [email protected]
wvde.state.wv.us
SECOND REQUEST
Dear Educator,
Last Thursday, we contacted you about a brief survey that the West Virginia Board of Educa-
tion has asked us to conduct. In response to their request we are contacting a sample of educators
who participated in professional development offered by various statewide providers from Novem-
ber 1, 2011 through March 31, 2012.
You were selected from educators who participated in a professional development session
provided by XX beginning on XX, addressing the following topic: XX
It is very important that we learn of your opinions about this session because you repre-
sent others who shared your experience. The questionnaire has been designed so you can fill it out
very quickly and easily. You need only check off your answers. It will only take 2 to 5 minutes to fill
out online.
You can be absolutely sure that all of the information you provide is strictly confidential, and
no individual respondent will be identified. Your answers will be combined with others and used
only for statistical analysis.
Your honest impressions about the professional development, whether favorable or unfa-
vorable, are very necessary to be sure we are able to provide an accurate evaluation for the Board.
Please follow these easy steps:
Click on this link: https://www.surveymonkey.com/s/WVBE2012PDParticipantSurvey .
Type in the e-mail address we used to send you this message.
Copy the following Survey Code and paste it into the appropriate box: XX.
We genuinely appreciate your participation. Thank you!
Best regards,
Pat Hammer
Patricia Cahape Hammer
Coordinator
Office of Research
West Virginia Department of Education
Building 6, Room 722
1900 Kanawha Boulevard East
Charleston, WV 25305-0330
304.558.2546 P
304.558.1613 F
E-mail: [email protected]
wvde.state.wv.us
THIRD REQUEST
Dear Educator,
Once again, we are contacting you about a brief survey the West Virginia Board of Education
has asked us to conduct of a sample of educators who participated in professional development of-
fered by statewide providers from November 2011 through March 2012.
You were selected from educators who participated in a professional development session
provided by «Provider» beginning on «begindate», on the following topic: «resasession».
It is very important that we learn of your opinions about this professional development
because you represent others who shared your experience. The questionnaire has been designed
so you can fill it out very quickly and easily. You need only check off your answers. It will only take 2
to 5 minutes to fill out online.
You can be absolutely sure that all of the information you provide is strictly confidential, and
no individual respondent will be identified. Your answers will be combined with others and used
only for statistical analysis.
Your honest impressions about the professional development, whether favorable or unfa-
vorable, are very necessary to be sure we are able to provide an accurate evaluation for the Board.
Please follow these easy steps:
Click on this link: https://www.surveymonkey.com/s/WVBE2012PDParticipantSurvey .
Type in the e-mail address we used to send you this message.
Copy the following Survey Code and paste it into the appropriate box: «sessionid».
We genuinely appreciate your participation. Thank you!
Best regards,
Pat Hammer
Patricia Cahape Hammer
Coordinator
Office of Research
West Virginia Department of Education
Building 6, Room 722
1900 Kanawha Boulevard East
Charleston, WV 25305-0330
304.558.2546 P
304.558.1613 F
E-mail: [email protected]
wvde.state.wv.us
FOURTH (FINAL) REQUEST
Dear Educator,
Please pardon our persistence, but we have not yet heard from you, so we must contact you again about a brief survey the West Virginia Board of Education has asked us to conduct of a sample of educators who participated in professional development offered by various statewide providers from November 2011 through March 2012.
You were selected from educators who participated in a professional development session provided by XX beginning on XX, on the following topic: XX
It is very important that we learn of your opinions about this session because you represent others who shared your experience. The questionnaire has been designed so you can fill it out very quickly and easily. You need only check off your answers. It will only take 2 to 5 minutes to fill out online.
You can be absolutely sure that all of the information you provide is strictly confidential, and no individual respondent will be identified. Your answers will be combined with others and used only for statistical analysis.
Your honest impressions about the professional development, whether favorable or unfavorable, are very necessary to be sure we are able to provide an accurate evaluation for the Board.
Please follow these easy steps:
Click on this link: https://www.surveymonkey.com/s/WVBE2012PDParticipantSurvey.
Type in your e-mail address that we used to send you this message.
Copy the following Survey Code and paste it into the appropriate box: XX
We genuinely appreciate your participation. Thank you!
Best regards,
Pat Hammer
Patricia Cahape Hammer
Coordinator
Office of Research
West Virginia Department of Education
Building 6, Room 722
1900 Kanawha Boulevard East
Charleston, WV 25305-0330
304.558.2546 P
304.558.1613 F
E-mail: [email protected]
wvde.state.wv.us
Appendix E. Providers’ Missing Sessions Report
Appendix F. Participant Survey Data Tables
Table A 1. Estimated Travel Time to Participate in Professional Development
School district or worksite   Estimated mean minutes of travel   Number of respondents   30 or less   31-60   61-90   91-120   More than 120
(The last five columns give counts of respondents by travel time in minutes.)
All 61.9 4,126 1,417 915 615 477 702
Barbour 73.1 37 6 12 6 5 8
Berkeley 73.7 132 60 7 3 5 57
Boone 57.9 104 29 43 7 10 15
Braxton 53.9 43 14 13 9 4 3
Brooke 67.2 29 10 4 4 6 5
Cabell 43.5 152 86 33 9 5 19
Calhoun 46.2 43 22 7 6 7 1
Clay 50.1 46 16 17 7 2 4
Doddridge 49.1 25 7 10 6 2 0
Fayette 48.3 152 67 45 15 9 16
Gilmer 47.2 53 25 9 12 5 2
Grant 70.8 38 17 3 1 3 14
Greenbrier 75.8 98 12 26 26 17 17
Hampshire 105.5 67 3 7 11 12 34
Hancock 75.5 48 11 12 5 6 14
Hardy 91.5 43 9 3 7 4 20
Harrison 55.2 90 25 40 5 11 9
Institutional 73.1 76 17 19 10 13 17
Jackson 70.1 50 2 26 9 5 8
Jefferson 58.6 80 43 9 1 4 23
Kanawha 45.6 301 177 47 14 25 38
Lewis 51.3 26 11 6 3 5 1
Lincoln 64.1 95 17 38 20 4 16
Logan 69.9 48 8 12 16 5 7
Marion 54.7 95 35 29 9 11 11
Marshall 80.8 57 10 5 17 15 10
Mason 53.9 79 26 25 16 4 8
McDowell 87.2 87 9 13 22 21 22
Mercer 64.0 91 32 19 11 10 19
Mineral 98.8 36 2 1 12 9 12
Mingo 71.4 102 23 24 18 18 19
Monongalia 45.5 136 72 32 7 10 15
Monroe 96.5 30 2 5 5 6 12
Morgan 80.6 35 8 6 3 8 10
Nicholas 56.2 90 42 8 14 18 8
Ohio 88.3 61 10 3 15 17 16
Other 63.0 110 42 19 16 9 24
Pendleton 65.2 61 30 1 5 10 15
Pleasants 66.9 28 6 6 8 6 2
Pocahontas 54.8 45 20 7 7 6 5
Preston 88.4 28 1 9 5 3 10
Putnam 50.7 98 41 32 8 1 16
Raleigh 49.2 106 44 30 17 5 10
Randolph 76.3 72 13 9 26 11 13
Ritchie 85.5 9 0 4 0 3 2
Roane 59.4 69 14 26 18 5 6
Summers 82.2 36 4 10 6 6 10
Taylor 61.3 36 8 16 1 7 4
Tucker 58.4 42 18 6 6 6 6
Tyler 78.7 19 0 8 3 6 2
Upshur 53.7 66 24 14 16 10 2
Wayne 61.8 92 29 31 8 1 23
Webster 70.7 44 7 13 11 6 7
Wetzel 65.1 69 17 18 15 10 9
Wirt 62.0 40 13 4 14 6 3
Wood 55.8 187 80 12 57 27 11
WV Deaf & Blind 83.5
15 5 0 0 6 4
Wyoming 48.2 79 36 22 7 6 8
Note: Eliminated from this analysis were respondents from out-of-state and from WVDE.
Adherence to Research-Based Practices Data Tables
Table A 2. Adherence to Research-Based Practices for High Quality Professional Development, by Programmatic Level
Research-Based Practice              Total         Early child./elem.  Middle school  High school   Other         N/A
(Each cell shows Mean, with SD in parentheses.)
Overall Quality Index*               3.82 (.711)   3.88 (.720)         3.78 (.718)    3.79 (.678)   3.80 (.745)   3.78 (.664)
Supported by follow-up PD sessions   3.40 (1.047)  3.46 (1.061)        3.39 (1.031)   3.33 (1.043)  3.38 (1.027)  3.38 (.971)
Supported by follow-up discussion    3.59 (1.052)  3.69 (1.031)        3.60 (1.036)   3.46 (1.095)  3.59 (1.043)  3.58 (.942)
Specific and content-focused         4.17 (.769)   4.23 (.756)         4.11 (.774)    4.17 (.765)   4.13 (.782)   4.17 (.685)
Relevant to current needs            4.08 (.868)   4.13 (.868)         4.02 (.893)    4.08 (.861)   4.08 (.861)   4.09 (.800)
Intensive in nature                  3.73 (.904)   3.77 (.912)         3.66 (.898)    3.74 (.887)   3.71 (.880)   3.61 (.893)
Hands-on with active learning        3.87 (.945)   3.93 (.928)         3.81 (.934)    3.85 (.952)   3.74 (.996)   3.73 (.976)
Beneficial, had a positive impact    3.89 (.909)   3.94 (.925)         3.87 (.911)    3.86 (.891)   3.85 (.884)   3.88 (.814)
* This is a computed value based on the average of the other seven measures of quality. NOTE: Frequencies ranged from 4135 to 4263 (Total), 1783 to 1834 (Early Child./elem.), 1013 to 1040 (Middle school), 1315 to 1356 (High school), 308 to 321 (Other), and 282 to 295 (N/A).
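As the note above indicates, the Overall Quality Index is the average of the other seven practice ratings. A minimal sketch of that computation for a single respondent follows; the item ratings are hypothetical, and skipping unanswered items is an assumption rather than a documented rule.

    # Illustrative Overall Quality Index: mean of the seven practice ratings (1-5 scale).
    ratings = {
        "Supported by follow-up PD sessions": 4,
        "Supported by follow-up discussion": 3,
        "Specific and content-focused": 5,
        "Relevant to current needs": 4,
        "Intensive in nature": None,          # item left unanswered (assumption: skip it)
        "Hands-on with active learning": 4,
        "Beneficial, had a positive impact": 5,
    }
    answered = [value for value in ratings.values() if value is not None]
    overall_quality_index = sum(answered) / len(answered)
    print(f"Overall Quality Index = {overall_quality_index:.2f}")   # 4.17 for these ratings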
Table A 3. Adherence to Research-Based Practices for High Quality Professional Development, by Professional Role
Research-Based Practice              Total         District central office staff  Principal/assistant principal  Regular classroom teacher  Special education teacher  Other professional role
(Each cell shows Mean, with SD in parentheses; cells follow the column order above.)
Overall Quality Index*               3.82 (.711)   3.88 (.680)    3.83 (.675)    3.87 (.727)    3.88 (.670)    3.71 (.710)
Supported by follow-up PD sessions   3.40 (1.047)  3.54 (.975)    3.44 (1.014)   3.46 (1.075)   3.53 (1.010)   3.20 (1.027)
Supported by follow-up discussion    3.59 (1.052)  3.72 (.940)    3.73 (.915)    3.59 (1.093)   3.70 (1.020)   3.44 (1.064)
Specific and content-focused         4.17 (.769)   4.21 (.757)    4.14 (.745)    4.19 (.810)    4.19 (.694)    4.12 (.743)
Relevant to current needs            4.08 (.868)   4.25 (.751)    4.15 (.789)    4.08 (.925)    4.03 (.847)    4.02 (.845)
Intensive in nature                  3.73 (.904)   3.70 (.862)    3.65 (.872)    3.83 (.910)    3.78 (.867)    3.59 (.914)
Hands-on with active learning        3.87 (.945)   3.74 (.990)    3.75 (.934)    4.01 (.919)    3.99 (.850)    3.70 (.964)
Beneficial, had a positive impact    3.89 (.909)   3.97 (.810)    3.93 (.823)    3.89 (.973)    3.92 (.874)    3.83 (.879)
* This is a computed value based on the average of the other seven measures of quality. NOTE: Frequencies ranged from 3988 to 4114 (Total), 418 to 432 (District central office staff), 409 to 417 (Principal/assistant principal), 1752 to 1802 (Regular classroom teacher), 415 to 430 (Special education teacher), and 1141 to 1182 (Other).
Table A 4. Adherence to Research-Based Practices for High Quality Professional Development, by Content Area
Content area:   ALL   ARTS   CTE   ESL   FL   MATH   PE   RLA   SCI   SOCS   SPED   N/A
(For each research-based practice below, the first row of values gives Means by content area and the second row gives SDs.)
Overall Quality Index* 3.94 3.76 3.78 3.88 4.00 3.72 4.03 3.78 3.88 3.78 3.86 3.75
.720 .827 .653 .616 .639 .694 .673 .806 .661 .746 .712 .668
Supported by follow-up PD sessions
3.53 3.26 3.41 3.60 3.49 3.24 3.79 3.38 3.55 3.29 3.51 3.30
1.072 1.210 .949 .986 1.227 1.020 .876 1.095 1.081 1.083 1.051 .995
Supported by follow-up discussion
3.71 3.40 3.56 3.93 3.63 3.40 3.92 3.51 3.51 3.42 3.66 3.57
1.050 1.216 1.001 .594 1.260 1.058 .922 1.131 1.196 1.113 1.057 .980
Specific and content-focused
4.27 4.13 4.07 4.07 4.43 4.11 4.21 4.11 4.25 4.12 4.17 4.13
.734 .949 .787 .799 .675 .822 .718 .910 .713 .897 .706 .725
Relevant to current needs
4.16 3.84 4.02 4.00 4.10 4.00 4.14 4.00 4.11 4.03 4.03 4.10
.868 1.182 .857 1.038 .800 .903 .869 .977 .829 .952 .839 .798
Intensive in nature 3.86 3.76 3.68 4.00 3.93 3.69 3.96 3.73 3.89 3.79 3.76 3.59
.881 1.065 .852 .655 1.058 .925 .832 .968 .840 .943 .885 .878
Hands-on with active learning
4.01 4.00 3.91 3.60 4.10 3.91 4.05 3.90 4.07 3.87 3.93 3.68
.930 1.010 .849 1.056 .917 .887 .830 1.007 .866 .951 .868 .957
Beneficial, had a positive impact
4.00 3.81 3.82 3.73 4.02 3.73 4.07 3.75 3.90 3.94 3.91 3.87
.933 1.184 .906 .884 .880 .907 .848 1.026 .965 .972 .876 .820
* This is a computed value based on the average of the other seven measures of quality.
NOTES: Frequencies ranged from 929-958 (ALL), 93-98 (ARTS), 122-123 (CTE), 14-15 (ESL), 40-41 (FL), 331-341 (MATH),
97-99 (PE), 402-412(RLA), 180-188 (SCI), 168-173 (SOCS), 370-381 (SPED), 1389-1437 (N/A).
ALL = All subject areas (e.g., elementary teacher, support personnel); ARTS = Arts (visual, music, dance, theater, other); CTE = Career/technical education; ESL = English as a second language; FL = Foreign language; MATH = Mathematics; PE = Physical education; RLA = Reading/language arts; SCI = Science; SOCS = Social studies; SPED = Special education; N/A = Not applicable (e.g., administrator or county staff)
Table A 5. Adherence to Research-Based Practices for High Quality Professional Development, by Provider
Provider:   Total   CPD   RESA 1   RESA 2   RESA 3   RESA 4   RESA 5   RESA 6   RESA 7   RESA 8
(For each practice below, the first 10 values are Means and the last 10 are SDs, in the provider order above.)
Overall Quality Index*
3.82 3.95 4.15 3.97 3.91 3.74 3.74 3.68 3.58 3.51 .711 .686 .712 .674 .702 .681 .786 .773 .699 .637
Supported by follow-up PD sessions
3.40 3.41 3.77 3.78 3.64 3.40 3.20 3.47 3.08 3.00 1.047 1.125 1.050 .996 1.077 1.011 1.080 1.108 1.094 .962
Supported by follow-up discussion
3.59 3.26 3.95 3.82 3.73 3.44 3.40 3.64 3.35 3.33 1.052 1.217 .990 .962 1.037 1.048 1.132 1.052 1.023 .936
Specific and content-focused
4.17 4.39 4.50 4.27 4.17 4.00 4.03 4.02 4.03 3.91 .769 .778 .761 .626 .730 .748 .925 .724 .750 .740
Relevant to current needs
4.08 4.33 4.38 4.17 4.14 3.98 3.92 3.72 3.95 3.84 .868 .880 .778 .726 .813 .799 .974 1.015 .861 .887
Intensive in nature
3.73 3.99 4.09 3.80 3.81 3.58 3.71 3.59 3.32 3.43 .904 .908 .793 .786 .876 .894 .999 .818 .943 .873
Hands-on with active learning
3.87 4.17 4.06 3.99 3.93 3.89 3.90 3.80 3.72 3.44 .945 .899 .907 .881 .894 .886 .982 1.048 1.025 .916
Beneficial, had a positive impact
3.89 4.11 4.24 4.03 3.96 3.84 3.72 3.59 3.63 3.66 .909 .892 .789 .864 .981 .817 1.038 1.010 .959 .792
* This is a computed value based on the average of the other seven measures of quality.
NOTES: Frequencies ranged from 4,135 to 4,263 (Total), 377 to 388 (CPD), 109 to 111 (RESA 1), 110 to 116 (RESA 2), 83 (RESA 3), 233-243 (RESA 4), 187 to 198 (RESA 6), 200 to 209 (RESA 7), and 155 to 161 (RESA 8). CPD = WV Center for Professional Development; RESA = regional education service agency (one each for eight regions in West Virginia).
Table A 5. Adherence to Research-Based Practices for High Quality Professional Development, by Provider, CONTINUED
Provider:   Total   Marshall   OAA   OHS   OI   OIE   OSI   OSP   Title I   Title II, III, SS
(For each practice below, the first line of values gives Means and the second line gives SDs, in the provider order above.)
Overall Quality Index*
3.82 3.94 3.74 3.79 4.00 3.78 3.55 4.00 3.78 4.02
.711 .823 .739 .704 .693 .607 .626 .680 .633 .675
Supported by follow-up PD sessions
3.40 3.81 3.33 3.44 3.51 3.26 2.93 3.62 3.56 3.84
1.047 1.083 1.028 .973 1.070 .865 .887 1.027 .900 .998
Supported by follow-up discussion
3.59 3.91 3.57 3.64 3.78 3.66 3.18 3.81 3.81 4.09
1.052 1.075 1.017 1.019 1.026 .888 1.016 1.011 .853 .811
Specific and content-focused
4.17 4.10 4.12 4.12 4.33 4.18 4.05 4.30 4.04 4.40
.769 .921 .772 .733 .742 .700 .691 .733 .722 .760
Relevant to current needs
4.08 3.95 3.98 4.08 4.26 4.11 3.99 4.19 3.96 3.84
.868 .994 .951 .833 .828 .803 .782 .774 .846 1.045
Intensive in nature
3.73 3.72 3.61 3.58 3.95 3.64 3.48 3.90 3.73 4.16
.904 .914 .910 .978 .918 .737 .810 .887 .788 .843
Hands-on with active learning
3.87 4.05 3.67 3.74 4.16 3.74 3.49 4.04 3.68 3.98
.945 .909 .962 .984 .868 .907 .919 .901 .860 .886
Beneficial, had a positive impact
3.89 3.94 3.82 3.97 4.02 3.94 3.71 4.10 3.70 3.81
.909 1.047 .953 .826 .888 .830 .814 .825 .911 1.006
* This is a computed value based on the average of the other seven measures of quality. NOTES: Frequencies ranged from 4,135 to 4,263 (Total), 184 to 191 (Marshall), 266 to 275 (OAA), 203 to 209 (OHS), 138 to 142 (OIE), 640 to 652 (OI), 424 to 441 (OSI), 285 to 298 (OSP), 412 to 420 (Title I), and 43 (Title II, III, SS). Marshall = Marshall University June Harless Center; OAA = WVDE Office of Assessment and Accountability; OHS = WVDE Office of Healthy Schools; OIE = WVDE Office of Institutional Education Programs; OI = WVDE Office of Instruction; OSI = WVDE Office of School Improvement; OSP = WVDE Office of Special Programs; Title I = WVDE Office of Title I; Title II, III, SS = WVDE Office of Title II, III, and System Support.
Paired Samples T Tests of Perceived Impacts on Knowledge, Behaviors, and
Attitudes/Beliefs
Table A 6. Paired-Samples T Test of Perceived Impact, by Programmatic Level
Programmatic level   Dimension   Mean ∆ pre-post   Standard deviation   Standard error of mean   t   Significance   n   Effect size (Cohen's d)
Early childhood/ elementary
Knowledge .768 .747 .018 43.193 .000 1763 1.14
Behavior .598 .717 .017 34.294 .000 1686 .76
Attitudes/beliefs .313 .751 .018 16.979 .000 1663 .46
Middle school Knowledge .720 .778 .025 29.347 .000 1004 1.04
Behavior .536 .689 .022 24.145 .000 965 .64
Attitudes/beliefs .246 .764 .025 9.896 .000 949 .34
High school Knowledge .661 .749 .021 31.888 .000 1304 .89
Behavior .501 .694 .020 25.372 .000 1235 .60
Attitudes/beliefs .227 .721 .021 11.014 .000 1218 .31
Other Knowledge .488 .627 .037 13.364 .000 295 .67
Behavior .393 .684 .041 9.481 .000 272 .41
Attitudes/beliefs .167 .705 .043 3.842 .000 264 .20
Not applicable Knowledge .531 .670 .041 12.812 .000 262 .75
Behavior .463 .673 .045 10.352 .000 227 .47
Attitudes/beliefs .156 .680 .045 3.432 .001 225 .20
Table A 7. Paired-Samples T Test of Perceived Impact, by Professional Role
Professional Role   Dimension   Mean ∆ pre-post   Standard deviation   Standard error of mean   t   Significance   n   Effect size (Cohen's d)
District central office staff
Knowledge .553 .689 .034 16.192 .000 407 .95
Behavior .454 .727 .038 11.840 .000 359 .57
Attitudes/beliefs .149 .702 .037 4.000 .000 356 .23
Principal/ assistant principal
Knowledge .686 .699 .035 19.773 .000 405 1.05
Behavior .556 .670 .034 16.117 .000 378 .69
Attitudes/beliefs .225 .708 .037 6.132 .000 374 .30
Regular classroom teacher
Knowledge .865 .772 .018 46.899 .000 1752 1.21
Behavior .641 .732 .018 36.003 .000 1693 .81
Attitudes/beliefs .342 .771 .019 18.201 .000 1683 .49
Special education teacher
Knowledge .763 .758 .037 20.448 .000 413 1.11
Behavior .580 .660 .033 17.544 .000 398 .73
Attitudes/beliefs .293 .707 .036 8.219 .000 392 .42
Other Knowledge .507 .681 .021 24.723 .000 1100 .69
Behavior .406 .648 .020 20.093 .000 1030 .44
Attitudes/beliefs .158 .680 .021 7.338 .000 1003 .20
Table A 8. Paired-Samples T Test of Perceived Impact, by Content Area
Content area   Dimension   Mean ∆ pre-post   Standard deviation   Standard error of mean   t   Significance   n   Effect size (Cohen's d)
All subject areas Knowledge .885 .747 .025 35.841 .000 914 1.32
Behavior .667 .746 .025 26.638 .000 888 .88
Attitudes/beliefs .377 .790 .027 14.127 .000 879 .59
Arts Knowledge .871 .875 .091 9.598 .000 93 1.26
Behavior .522 .640 .067 7.741 .000 90 .69
Attitudes/beliefs .352 .766 .080 4.382 .000 91 .48
Career/technical education
Knowledge .620 .698 .063 9.764 .000 121 1.01
Behavior .535 .731 .068 7.817 .000 114 .72
Attitudes/beliefs .272 .768 .072 3.783 .000 114 .41
English as a second language
Knowledge .533 .743 .192 2.779 .015 15 .62
Behavior .286 .611 .163 1.749 .104 14 .14
Attitudes/beliefs .286 .611 .163 1.749 .104 14 .17
Foreign language Knowledge 1.077 .774 .124 8.688 .000 39 1.32
Behavior .769 .706 .113 6.807 .000 39 .76
Attitudes/beliefs .333 .806 .129 2.584 .014 39 .45
Mathematics Knowledge .839 .756 .041 20.357 .000 336 1.09
Behavior .522 .673 .038 13.784 .000 316 .57
Attitudes/beliefs .226 .760 .043 5.270 .000 314 .30
N/A Knowledge .539 .678 .018 29.221 .000 1352 .81
Behavior .443 .683 .019 22.831 .000 1239 .52
Attitudes/beliefs .161 .698 .020 8.013 .000 1214 .22
Physical education Knowledge .600 .659 .068 8.877 .000 95 .65
Behavior .500 .655 .068 7.326 .000 92 .50
Attitudes/beliefs .391 .695 .072 5.403 .000 92 .40
Reading/language arts
Knowledge .746 .812 .041 18.377 .000 401 1.08
Behavior .529 .656 .034 15.684 .000 378 .61
Attitudes/beliefs .269 .717 .037 7.303 .000 379 .35
Science Knowledge .834 .820 .061 13.688 .000 181 1.15
Behavior .653 .740 .056 11.712 .000 176 .84
Attitudes/beliefs .218 .720 .055 4.001 .000 174 .32
Social studies Knowledge .728 .722 .056 13.112 .000 169 .89
Behavior .685 .705 .055 12.470 .000 165 .81
Attitudes/beliefs .288 .655 .051 5.623 .000 163 .38
Special education Knowledge .651 .734 .039 16.848 .000 361 .91
Behavior .533 .685 .037 14.498 .000 347 .67
Attitudes/beliefs .257 .679 .037 6.921 .000 335 .35
Table A 9. Paired-Samples T Test of Perceived Impact, by Provider
Provider   Dimension   Mean ∆ pre-post   Standard deviation   Standard error of mean   t   Significance   n   Effect size (Cohen's d)
Center for Professional Development
Knowledge .754 .730 .037 20.194 .000 382 1.24
Behavior .678 .674 .035 19.225 .000 366 .98
Attitudes/beliefs .398 .606 .032 12.545 .000 364 .51
Marshall University June Harless Center
Knowledge .792 .793 .059 13.327 .000 178 1.39
Behavior .731 .810 .062 11.797 .000 171 1.02
Attitudes/beliefs .506 .713 .054 9.301 .000 172 .66
RESA 1 Knowledge .915 .745 .072 12.651 .000 106 1.40
Behavior .735 .688 .068 10.789 .000 102 .87
Attitudes/beliefs .485 .739 .073 6.665 .000 103 .72
RESA 2 Knowledge .948 .759 .071 13.390 .000 115 1.48
Behavior .673 .718 .068 9.824 .000 110 .89
Attitudes/beliefs .556 .660 .064 8.742 .000 108 .79
RESA 3 Knowledge .588 .630 .070 8.337 .000 80 .84
Behavior .600 .637 .074 8.161 .000 75 .82
Attitudes/beliefs .513 .663 .076 6.746 .000 76 .67
RESA 4 Knowledge .680 .781 .051 13.226 .000 231 1.05
Behavior .445 .670 .045 9.860 .000 220 .59
Attitudes/beliefs .326 .596 .040 8.184 .000 224 .47
RESA 5 Knowledge .849 .726 .052 16.207 .000 192 1.02
Behavior .698 .736 .054 13.048 .000 189 .73
Attitudes/beliefs .458 .820 .060 7.695 .000 190 .53
RESA 6 Knowledge .654 .735 .083 7.851 .000 78 .90
Behavior .366 .660 .078 4.676 .000 71 .36
Attitudes/beliefs .303 .674 .077 3.916 .000 76 .40
RESA 7 Knowledge .688 .808 .057 12.098 .000 202 .90
Behavior .555 .804 .060 9.314 .000 182 .57
Attitudes/beliefs .321 .677 .052 6.150 .000 168 .30
RESA 8 Knowledge .553 .585 .048 11.575 .000 150 .74
Behavior .400 .506 .043 9.352 .000 140 .45
Attitudes/beliefs .333 .528 .044 7.658 .000 147 .41
WVDE Office of Assessment & Accountability
Knowledge .513 .642 .040 12.917 .000 261 .74
Behavior .372 .589 .037 9.985 .000 250 .39
Attitudes/beliefs .306 .526 .033 9.220 .000 252 .34
WVDE Office of Healthy Schools
Knowledge .429 .599 .043 10.014 .000 196 .65
Behavior .328 .657 .047 6.925 .000 192 .51
Attitudes/beliefs .254 .515 .037 6.783 .000 189 .30
WVDE Office of Instruction
Knowledge .960 .748 .030 32.221 .000 630 1.48
Behavior .702 .735 .030 23.180 .000 588 .89
Attitudes/beliefs .423 .694 .028 14.942 .000 600 .64
WVDE Office of Institutional Education
Knowledge .619 .744 .064 9.639 .000 134 .88
Behavior .537 .731 .066 8.086 .000 121 .54
Attitudes/beliefs .311 .722 .066 4.698 .000 119 .41
WVDE Office of School Improvement
Knowledge .374 .627 .031 11.977 .000 404 .52
Behavior .289 .565 .029 10.058 .000 388 .31
Attitudes/beliefs .185 .501 .026 7.243 .000 383 .21
WVDE Office of Special Programs
Knowledge .623 .671 .040 15.562 .000 281 .93
Behavior .588 .701 .043 13.708 .000 267 .74
Attitudes/beliefs .338 .619 .038 8.914 .000 266 .42
WVDE Office of Title I
Knowledge .817 .821 .041 20.107 .000 409 1.02
Behavior .503 .728 .037 13.454 .000 380 .64
Attitudes/beliefs .343 .608 .031 11.149 .000 391 .41
WVDE Office of Title II, III, and System Support
Knowledge 1.171 .803 .125 9.333 .000 41 1.31
Behavior .775 .698 .110 7.027 .000 40 .60
Attitudes/beliefs .525 .716 .113 4.640 .000 40 .51
Jorea M. Marple, Ed.D.
State Superintendent of Schools