TRANSCRIPT
DEVELOPING AND IMPLEMENTING STATE-LEVEL EVALUATION SYSTEMS
BOB ALGOZZINE, HEATHER REYNOLDS, AND STEVE GOODMAN
National PBIS Leadership Forum
Hyatt Regency O’Hare, Rosemont, Illinois
October 27, 2011
Objectives
- Describe core features of an effective evaluation system:
  - Evidence to document a program, initiative, or intervention
  - Evidence to improve and support continuation
  - Evidence to direct policies and practices
- Share ongoing and exemplary state-level evaluations
- Provide an opportunity for question-and-answer collaboration
Program Evaluation Simplified
A simple cycle: Design/Plan [Redesign/Re-Plan] → Implement Intentionally and Document Fidelity → Assess Continuously and Document Intended and Unintended Outcomes → (repeat).
Core Features of an Effective Evaluation System
An effective evaluation has a clearly defined purpose that tells a story that helps to…
- document the program, initiative, or intervention (context, input, fidelity, and impact evidence)
- improve and support continuation (stages-of-innovation and continuous-improvement evidence)
- direct policies and practices (efficient and effective reporting and dissemination of evidence)
Document Program, Initiative, or Intervention
A simple plan? Organize evidence around what you need to know and the questions you can answer.
- Why (i.e., under what circumstances, conditions, or events) was the program implemented? [Statement of the problem and data on which to build the evaluation…]
- What program was implemented? [Program description including key features…]
- What other programs were considered? Why was this program selected over the others?
- How was the program implemented? [Pilot sites, administrative dictum, widespread panic, quiet riot, volunteers…]
- Was the program implemented with fidelity sufficient to produce change? [Statement of the problem and data on which to build the evaluation…]
- What short-, intermediate-, and long-term changes resulted from implementing the program? Improvements in school and classroom ecology? Improvements in academic and social behavior? [Statement of the problem and data on which to build the evaluation…]
- Did implementation improve the capacity of the state/district to continue the program? [Statement of the problem and data on which to build the evaluation…]
An important reminder: What you need to know and the questions you can answer will depend on where you are in the implementation process.
[Figure: the implementation stages (Exploration, Installation, Implementation, Continuation, Innovation) mapped against the four types of evidence (context, input, fidelity, and impact).]
Documenting Program Context and Input
What to collect and report?
- Information about need and intervention
- Information about national, state, and local education agency leadership personnel and program providers
- Information about program participants
- Information about the program: focus, critical features, and content; type and amount of support; perceptions and other indicators of appropriateness; expectations for change
Documenting Program Fidelity
What to collect and report? Fidelity of implementation at each intervention level, using self-assessment, progress-monitoring, and research measures:
- Universal: Self-Assessment Survey (SAS), Benchmarks of Quality (BoQ), Team Implementation Checklist (TIC), School-wide Evaluation Tool (SET)
- Secondary and Tertiary: Benchmarks of Advanced Tiers (BAT), Individual Student Systems Evaluation Tool (I-SSET)
- Overall: Implementation Phases Inventory (IPI), Phases of Implementation (POI)
Forms are available at www.pbisassessment.org.
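Most of these instruments yield a percentage-of-features-implemented score that teams compare against a criterion; for the SET, total and Expectations-Taught scores of at least 80% (the "80/80" rule) are a commonly cited benchmark. As a minimal illustrative sketch only (not part of the presentation; the school names, scores, and helper function are hypothetical), flagging schools at criterion might look like this in Python:

# Illustrative sketch (not from the presentation): flagging schools that
# meet the commonly cited "80/80" SET criterion, i.e., a total
# implementation score of at least 80% AND an Expectations-Taught
# subscale score of at least 80%. All names and numbers are hypothetical.

def meets_set_criterion(total_pct: float, taught_pct: float,
                        cutoff: float = 80.0) -> bool:
    """Return True when both SET scores reach the assumed 80% cutoff."""
    return total_pct >= cutoff and taught_pct >= cutoff

# Hypothetical records: (school, SET total %, Expectations-Taught %)
set_scores = [
    ("Adams Elementary", 91.0, 100.0),
    ("Baker Middle", 78.5, 90.0),
    ("Carver High", 85.0, 70.0),
]

at_criterion = [name for name, total, taught in set_scores
                if meets_set_criterion(total, taught)]
print(f"{len(at_criterion)} of {len(set_scores)} schools at criterion:",
      at_criterion)

Reporting the share of schools at criterion, rather than raw scores alone, is one way an evaluation can document whether fidelity is sufficient to expect change.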
Documenting Program Impact
What to collect and report?
Social behavior benefits:
- Fidelity indicators
- School and classroom climate
- Attitudes
- Attendance
- Office discipline referrals (ODRs)
- Individual student points/behavior records
- Proportion of time in typical educational contexts
- Referrals to special education
Academic behavior benefits:
- Fidelity indicators
- Instructional climate
- Attitudes
- Universal screening and progress monitoring (vocabulary, oral reading fluency)
- Standardized test scores
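Because raw ODR counts are hard to compare across schools of different sizes and reporting periods, they are commonly normalized as ODRs per 100 students per school day. The sketch below shows that arithmetic; it is illustrative only (not part of the presentation), and the function name and numbers are hypothetical:

# Illustrative sketch (not from the presentation): normalizing office
# discipline referrals (ODRs) as ODRs per 100 students per school day,
# a common way to compare schools of different sizes and reporting
# periods. All numbers are hypothetical.

def odr_rate_per_100(total_odrs: int, enrollment: int,
                     school_days: int) -> float:
    """Return ODRs per 100 students per school day."""
    return total_odrs / enrollment / school_days * 100

# Hypothetical school: 450 referrals, 600 students, 180 instructional days.
rate = odr_rate_per_100(450, 600, 180)
print(f"{rate:.2f} ODRs per 100 students per day")  # prints 0.42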
Core Features of an Effective Evaluation System
An effective evaluation has a clearly defined purpose that tells a story:
- Evidence to Document the Program, Initiative, or Intervention: context, input, fidelity, and impact
- Evidence to Improve and Support Continuation: stages of innovation/continuous-improvement cycles
- Evidence to Direct Policies and Practices: efficient and effective annual reports
Evidence to Improve and Support Continuation
Stages of Implementation: Exploration → Installation → Initial Implementation → Full Implementation → Innovation → Sustainability, a process that typically takes 2–4 years (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005).
What to collect and report?
Continuous Improvement Process: Design/Plan [Redesign/Re-Plan] → Implement Intentionally and Document Fidelity → Assess Continuously and Document Intended and Unintended Outcomes → (repeat).
Core Features of an Effective Evaluation System
An effective evaluation has a clearly defined purpose that tells a story:
- Evidence to Document the Program, Initiative, or Intervention: context, input, fidelity, and impact
- Evidence to Improve and Support Continuation: stages of innovation/continuous-improvement cycles
- Evidence to Direct Policies and Practices: efficient and effective annual reports, with external support from www.pbisassessment.org and www.pbseval.org
Evidence to Direct, Support, and Revise Policy Decisions
Evaluation Blueprint
The OSEP Technical Assistance Center on Positive Behavioral Interventions and Supports has developed a document for individuals who are implementing School-wide Positive Behavior Intervention and Support (SWPBIS) in districts, regions, or states. The purpose of the “blueprint” is to provide a formal structure for evaluating whether implementation efforts are (a) occurring as planned, (b) resulting in change in schools, and (c) producing improvement in student outcomes.
Evidence to Direct, Support, and Revise Policy Decisions
North Carolina Annual Performance Report
Annual reports highlight the development and continued growth of PBIS in North Carolina, as well as indicators of fidelity of implementation and the impact PBIS is having on participating schools across the state. In addition, the reports include information about plans for sustainability through training, coaching, and partnerships with other initiatives, in particular Responsiveness to Instruction (RtI).
Michigan’s Integrated Behavior and Learning Support Initiative (MiBLSi) News: http://miblsi.cenmi.org/News.aspx
Illinois Evaluation Reports: http://pbisillinois.org/
Florida’s Positive Behavior Support Project
Childs, K. E., Kincaid, D., & George, H. P. (2010). A model for statewide evaluation of a universal positive behavior support initiative. Journal of Positive Behavior Interventions, 12, 198–210.
Evidence from Exemplary State-Level Evaluations
North Carolina
North Carolina has been implementing a statewide Positive Behavior Intervention and Support (PBIS) initiative for 10 years. Heather Reynolds is the State PBIS Consultant.
Michigan
Michigan’s Integrated Behavior and Learning Support Initiative (MiBLSi) works with schools to develop a multi-tiered system of support for both reading and behavior; PBIS is a key part of the Initiative’s process for creating and sustaining safe and effective schools. Steve Goodman is Director of the Michigan Integrated Behavior and Learning Support Initiative and PBIS Coordinator.
Presentation Questions and Answers
Bibliography and Selected Resources
Abma, T. A., & Stake, R. E. (2001). Stake’s responsive evaluation: Core ideas and evolution. In J. C. Greene & T. A. Abma (Eds.), New directions for evaluation: No. 92. Responsive evaluation (pp. 7–21). San Francisco: Jossey-Bass.
Algozzine, B., Horner, R. H., Sugai, G., Barrett, S., Dickey, S. R., Eber, L., Kincaid, D., et al. (2011). Evaluation blueprint for school-wide positive behavior support. Eugene, OR: National Technical Assistance Center on Positive Behavior Interventions and Support. Retrieved from www.pbis.org
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).
Ruhe, V., & Zumbo, B. D. (2009). Evaluation in distance education and e-learning. New York: Guilford.
Scriven, M., & Coryn, C. L. S. (2008). The logic of research evaluation. In C. L. S. Coryn & M. Scriven (Eds.), Reforming the evaluation of research (New Directions for Evaluation, No. 118, pp. 89–106). San Francisco, CA: Jossey-Bass.
Stufflebeam, D. L. (2001). Evaluation models. New directions for evaluation: No. 89 (pp. 7–98). San Francisco: Jossey-Bass.
Stufflebeam, D. L., & Shinkfield, A. J. (2007). Evaluation theory, models, and applications. San Francisco: Jossey-Bass/Pfeiffer.
The Evaluation Center. (2011). Evaluation checklists. Kalamazoo, MI: Western Michigan University. Retrieved from http://www.wmich.edu/evalctr/checklists/
The Joint Committee on Standards for Educational Evaluation (1994). The program evaluation standards. Thousand Oaks, CA: Sage Publications, Inc.
Evaluation Action Plan