m3-reviewing the slo-sso-demosite

Student Learning/Support Objectives (SLO/SSO) - Reviewing Module

Uploaded by: research-in-action-inc

Posted on 21-Feb-2017



TRANSCRIPT

Page 1: M3-Reviewing the SLO-SSO-DemoSite

Student Learning/Support Objectives (SLO/SSO) - Reviewing Module

Page 2: M3-Reviewing the SLO-SSO-DemoSite

Student Learning/Support Objectives

Module 3: Reviewing the SLO/SSO

Page 3: M3-Reviewing the SLO-SSO-DemoSite

Review

Goal
• Review technically rigorous student learning/support objectives (SLO/SSO) for use in guiding instruction (and/or support services) while determining student mastery or growth as part of an educator effectiveness system.

Objectives
• Participants will:
1. Conduct a multi-faceted, quality assurance review of the student learning/support objectives for:
A. Completeness
B. Comprehensiveness (see Quick Start Training Module*)
C. Coherency

Page 4: M3-Reviewing the SLO-SSO-DemoSite

Final Outcome

A refined SLO/SSO Form with:
• Rigorous, high-quality performance measures designed to measure the targeted content/professional standards.
• Areas identified for further improvement/consideration (as documented within the Coherency Rubric).

Page 5: M3-Reviewing the SLO-SSO-DemoSite

Helpful Resources

Participants should consult the following:

Training
• Handout #3: QA Checklist
• Step 5: Quality Assurance of the SLO/SSO Form

Templates
• Template #3: Coherency Rubric
• Template #3a: Performance Measure Rubric

Resources
• Help Desk
• Models: Scored Examples

Page 6: M3-Reviewing the SLO-SSO-DemoSite

Helpful Resources (cont.)

Participants should consult the following:

Completion of the SLO Form
• Handout #3: Quality Assurance Checklist

Comprehensiveness of the Performance Measures
• Template #3a: Performance Measure Rubric

Coherency of the SLO Design
• Template #3: Coherency Rubric

Page 7: M3-Reviewing the SLO-SSO-DemoSite

Process Components

STEP #1: Goal
STEP #2: Standards
STEP #3: Blueprint
STEP #4: Form
STEP #5: QA

Page 8: M3-Reviewing the SLO-SSO-DemoSite

REVIEW Phase

• Check the drafted SLO/SSO (including the performance measures) for quality.
• Refine the measures and targets.
• Edit the text and prepare discussion points and highlights for the principal/supervisor.
• Update the completed SLO/SSO form with performance data.

Page 9: M3-Reviewing the SLO-SSO-DemoSite

Review Phase Components

Preview / Review

Checklist & Rubrics
• Completion: QA Checklist
• Comprehensiveness: Performance Measure Rubric
• Coherency: Coherency Rubric

Page 10: M3-Reviewing the SLO-SSO-DemoSite

Task Structure

QA
1. Completeness: Is the SLO/SSO Form completed correctly?
2. Comprehensiveness: Are the assessments of high technical quality?
3. Coherency: Are the SLO/SSO components aligned to each other?

Page 11: M3-Reviewing the SLO-SSO-DemoSite


STEP 5

Quality Assurance

Page 12: M3-Reviewing the SLO-SSO-DemoSite

Checklist & Rubric Preview

1. Apply the QA Checklist (Handout #3) to a completed SLO or SSO Form.
A. What information is needed?
B. Who is the SLO/SSO focused on?

2. Preview the three (3) strands within the Performance Measure Rubric (Template #3a) for each assessment identified within the SLO/SSO.
A. What is the purpose of the assessment/performance measure?
B. What standards does it purport to measure?
C. What technical data is provided/known?

Page 13: M3-Reviewing the SLO-SSO-DemoSite

Checklist & Rubric Preview (cont.)

3. Preview the three (3) phases within the Coherency Rubric for the SLO or SSO Form.
A. How well are the SLO/SSO components aligned to each other?
B. How well do the identified assessments/performance measures align with the SLO's/SSO's stated goal?

Page 14: M3-Reviewing the SLO-SSO-DemoSite


Quality Assurance Checklist

Page 15: M3-Reviewing the SLO-SSO-DemoSite

SLO/SSO Completeness

Quality Assurance Checklist
• The checklist is designed to verify each element within the four (4) sections of the SLO/SSO.
• The checklist applies the business rules to each element. The Help Desk document provides examples for each element.

Page 16: M3-Reviewing the SLO-SSO-DemoSite

Quality Assurance Checklist

SLO Section I: Context

1.1 Content Area: Name of the content area upon which the SLO is based.
1.2 Course: Name of the specific course/subject upon which the SLO is based.
1.3 Grade Level: Grade levels for students included in the course/subject in Element 1.2.
1.4 Total Students: Aggregate number of students (estimated, across multiple sections) for whom data will be collected.
1.5 Average Class Size: The average number of students in a single session of the course/subject identified in Element 1.2.
1.6 Class Frequency: The frequency (within the given timeframe) of the course/subject identified in Element 1.2.
1.7 Instructional Setting: The location or setting where the course/subject instruction is provided.
1.8 Instructional Interval: The time frame of the course/subject identified in Element 1.2.

Page 17: M3-Reviewing the SLO-SSO-DemoSite

Quality Assurance Checklist (cont.)

SSO Section I: Setting

1.1 Service Area: Name of the primary service area (e.g., speech) upon which the SSO is based.
1.2 Service Location: Name of the location(s) where services are provided.
1.3 Grade Level: Grade level(s) of the students and/or educator types to whom services are provided.
1.4 Total Recipients: Aggregate number of students and/or educators for whom data will be collected.
1.5 Average Case Size: The average number of recipients of the services identified in Element 1.4.
1.6 Service Frequency: The typical frequency (within the identified service interval, Element 1.8) with which services are provided to the recipients identified in Element 1.4.
1.7 Service Setting: The contextual setting (e.g., school library, student's home) in which services are provided.
1.8 Service Interval: The typical time frame of the service model.

Page 18: M3-Reviewing the SLO-SSO-DemoSite

Quality Assurance Checklist (cont.)

SLO Section II: Goal

2.1 Goal Statement: A narrative that articulates a key concept upon which the SLO is based. The statement addresses What, Why, and How.
2.2 Content Standards: Targeted Content Standards, which are the foundation of performance measures, used to develop the SLO.
2.3 Instructional Strategy: The approach used to facilitate learning the key concept articulated in the Goal Statement and delineated among the Targeted Content Standards.

SLO Section III: Objective

3.1 Starting Point (Baseline): The baseline data used for comparing student results at the end of the instructional interval.
3.2 Objectives (Whole Class): The expected level of achievement for the entire student learning objective (SLO) population (as defined in Element 1.4).
3.3 Objectives (Focused Students): The expected level of achievement for a subset of the SLO population (as defined in Element 1.4).
3.4 End Point (Combined): At the end of the instructional interval, the aggregate performance classification as delineated by four empirical ranges (i.e., Unsatisfactory, Emerging, Effective, and Distinguished).

Page 19: M3-Reviewing the SLO-SSO-DemoSite

Quality Assurance Checklist (cont.)

SSO Section II: Goal

2.1 Goal Statement: A narrative that articulates a key concept upon which the SSO is based. The statement addresses What, Why, and How.
2.2 Targeted Professional Standards: Targeted Professional Standards outline the requirements an organization must fulfill to ensure that products and services consistently meet customers' requirements. Content standards may also be identified for those individuals providing instructional services.
2.3 Implementation Strategy: The approach used to attain the primary service goal articulated in the Goal Statement and delineated among the Targeted Professional Standards.

SSO Section III: Objective

3.1 Starting Point (Baseline): The baseline data used for comparing client results at the end of the service interval.
3.2 Objectives (All Clients): The expected level of performance for the entire client population (as defined in Element 1.4).
3.3 Objectives (Focused Clients): The expected level of performance for a subset of the client population (as defined in Element 1.4).
3.4 End Point (Combined): At the end of the service interval, the aggregate performance classification as delineated by four empirical ranges (i.e., Unsatisfactory, Emerging, Effective, and Distinguished).

Page 20: M3-Reviewing the SLO-SSO-DemoSite

Quality Assurance Checklist (cont.)

SLO Section IV: Performance Measure

4.1 Name: The name of each performance measure for which an objective is established in Element 3.2.
4.2 Purpose: The purpose statement for each performance measure that outlines: (a) what the assessment measures, (b) how to use the scores, and (c) why the assessment was developed.
4.3 Content Standards: The Targeted Content Standards (the foundation of performance measures) used to develop SLOs. The content standards are those aligned with each performance measure.
4.4 Performance Targets: Using the scoring tools for each performance measure (as listed in Element 4.1), the expected level of achievement for each student in the SLO population (as defined in Element 1.4).
4.5 Metric: The metric by which the performance measure evaluates the performance target.
4.6 Administration: The administrative steps before, during, and after the assessment window, as well as the step-by-step procedures during each phase of administration, including: (a) the requirements for completing the performance measure, including accommodations, equipment, and materials; (b) standard time allotments to complete the overall performance measure; and (c) standard scripts that educators read to give directions for completing the performance measure.
4.7 Scoring Tools: Scoring keys (objective measures); scoring rubrics (subjective measures).
4.8 Results: The number of students participating in the performance measure; the number of students who met the target as stated in Element 4.4; and the percentage of students who met the target as stated in Element 4.4.
SLO Rating: One of four performance levels that the principal (or the evaluator) identifies after noting the actual performance with respect to each objective stated in the SLO.
Notes and Explanation: Space for the educator to note influences, factors, and other conditions associated with the SLO Rating, as well as to reflect on a purposeful review of the data.
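The Element 4.8 arithmetic (participation, count meeting the target, percentage meeting the target) can be sketched in a few lines. The scores, the target value, and the function name are hypothetical, and "met the target" is assumed here to mean score at or above the target:

```python
def summarize_results(scores, target):
    """Element 4.8 sketch: participation, number met, and percent met.

    scores -- one score per participating student (hypothetical data)
    target -- the performance target from Element 4.4
    """
    participating = len(scores)
    met = sum(1 for s in scores if s >= target)  # assumes "met" = at/above target
    pct = 100.0 * met / participating if participating else 0.0
    return {"participating": participating, "met": met, "percent_met": round(pct, 1)}

print(summarize_results([78, 85, 62, 91, 70], target=75))
# → {'participating': 5, 'met': 3, 'percent_met': 60.0}
```

The same summary would be computed once per performance measure listed in Element 4.1.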

Page 21: M3-Reviewing the SLO-SSO-DemoSite

Quality Assurance Checklist (cont.)

SSO Section IV: Performance Measure

4.1 Name: The name of each performance measure for which an objective is established in Element 3.2.
4.2 Purpose: The purpose statement for each performance measure that outlines: (a) what the measure is evaluating, (b) how to use the scores, and (c) why the measure was developed.
4.3 Professional Standards: The Professional Standards (the foundation of measures) used to develop SSOs. The professional standards are those aligned with each identified measure.
4.4 Performance Targets: Using the scoring tools for each performance measure (as listed in Element 4.1), the expected level of attainment for each client in the SSO population (as defined in Element 1.4).
4.5 Metric: The metric by which the performance measure evaluates the performance target.
4.6 Administration: The administrative steps before, during, and after the evaluation window, as well as the step-by-step procedures during each phase of administration; the requirements for completing the performance measure, including accommodations, equipment, and materials; and the standard time to complete the overall evaluation.
4.7 Scoring Tools: Scoring keys (objective measures); scoring rubrics (subjective measures); data collection mechanisms.
4.8 Results: The number of clients participating in the performance measure; the number of clients who met the target as stated in Element 4.4; and the percentage of clients who met the target as stated in Element 4.4.
SSO Rating: One of four performance levels that the supervisor identifies after noting the actual performance with respect to each objective stated in the SSO.
Notes and Explanations: Space for the professional to note influences, factors, and other conditions associated with the SSO Rating, as well as to reflect on a purposeful review of the data.

Page 22: M3-Reviewing the SLO-SSO-DemoSite

Procedural Steps

Step 1. Select the drafted SLO/SSO, including applicable performance measures.
Step 2. Beginning with Section I, use Handout #3 (Quality Assurance Checklist) and evaluate each element.
Step 3. Identify any element with missing or incorrect information (i.e., a statement or data placed in the wrong element of the SLO/SSO Form).
Step 4. Flag any element needing refinement or further discussion with other educators/professionals and/or the principal/supervisor.
Step 5. Repeat Steps 2 through 4 with the other sections of the SLO/SSO Form.
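The completeness check in Steps 2 and 3 can be sketched as a simple validation pass. The element names mirror the Section I table, but the form data structure and the function are invented here for illustration; the actual business rules live in Handout #3 and the Help Desk document:

```python
# Section I element IDs, as listed in the Quality Assurance Checklist.
SECTION_I_ELEMENTS = [
    "1.1 Content Area", "1.2 Course", "1.3 Grade Level", "1.4 Total Students",
    "1.5 Average Class Size", "1.6 Class Frequency",
    "1.7 Instructional Setting", "1.8 Instructional Interval",
]

def check_completeness(form, elements=SECTION_I_ELEMENTS):
    """Return the element IDs that are missing or blank on the form (a dict)."""
    return [e for e in elements if not str(form.get(e, "")).strip()]

# A partially completed form flags the six untouched Section I elements.
flags = check_completeness({"1.1 Content Area": "Mathematics",
                            "1.2 Course": "Algebra I"})
```

Step 5 would repeat the same pass with the element lists for Sections II through IV.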

Page 23: M3-Reviewing the SLO-SSO-DemoSite


Performance Measure Rubric

Page 24: M3-Reviewing the SLO-SSO-DemoSite

SLO/SSO Comprehensiveness

Performance Measure Rubric
• The rubric is designed to examine the quality characteristics of teacher-made performance measures. It comprises 18 technical descriptors organized into three strands.
• The rubric's purpose is to provide teachers with a self-assessment tool that assists in building high-quality measures of student achievement.

Page 25: M3-Reviewing the SLO-SSO-DemoSite

Performance Measure Rubric

• For vendor-developed assessments, examine the technical evidence to determine whether the tool is designed to measure the Targeted Content/Professional Standards.
• For locally developed assessments, follow the guidelines in the Quick Start training to create high-quality performance measures, and then apply Template #3a.

Page 26: M3-Reviewing the SLO-SSO-DemoSite

Performance Measure Rubric (cont.)

Strand I: Design

1.1 The purpose of the performance measure is explicitly stated (who, what, why).
1.2 The performance measure has targeted content standards representing a range of knowledge and skills students are expected to know and demonstrate.
1.3 The performance measure's design is appropriate for the intended audience and reflects challenging material needed to develop higher-order thinking skills.
1.4 Specification tables articulate the number of items/tasks, item/task types, passage readability, and other information about the performance measure, OR blueprints are used to align items/tasks to targeted content standards.
1.5 Items/tasks are rigorous (designed to measure a range of cognitive demands/higher-order thinking skills at developmentally appropriate levels) and of sufficient quantity to measure the depth and breadth of the targeted content standards.

Page 27: M3-Reviewing the SLO-SSO-DemoSite

Performance Measure Rubric (cont.)

Strand II: Build

2.1 Items/tasks and score keys are developed using standardized procedures, including scoring rubrics for human-scored, open-ended questions (e.g., short constructed response, writing prompts, performance tasks, etc.).
2.2 Items/tasks are created and reviewed in terms of: (a) alignment to the targeted content standards, (b) content accuracy, (c) developmental appropriateness, (d) cognitive demand, and (e) bias, sensitivity, and fairness.
2.3 Administrative guidelines are developed that contain the step-by-step procedures used to administer the performance measure in a consistent manner, including scripts to orally communicate directions to students, day and time constraints, and allowable accommodations/adaptations.
2.4 Scoring guidelines are developed for human-scored items/tasks to promote score consistency across items/tasks and among different scorers. These guidelines articulate point values for each item/task used to combine results into an overall score.
2.5 Summary scores are reported using both raw score points and performance level. Performance levels reflect the range of scores possible on the assessment and use terms or symbols to denote each level.
2.6 The total time to administer the performance measure is developmentally appropriate for the test-taker. Generally, this is 30 minutes or less for young students and up to 60 minutes per session for older students (high school).

Page 28: M3-Reviewing the SLO-SSO-DemoSite

Performance Measure Rubric (cont.)

Strand III: Review

3.1 The performance measures are reviewed in terms of design fidelity: items/tasks are distributed based upon the design properties found within the specification or blueprint documents; item/task and form statistics are used to examine levels of difficulty, complexity, distractor quality, and other properties; and items/tasks and forms are rigorous and free of bias, sensitivity, and fairness concerns.
3.2 The performance measure was reviewed in terms of editorial soundness, ensuring consistency and accuracy of all documents (e.g., administration guide): the review identifies words, text, reading passages, and/or graphics that require copyright permission or acknowledgements; applies Universal Design principles; and ensures linguistic demands and readability are developmentally appropriate.
3.3 The performance measure was reviewed in terms of alignment characteristics: pattern consistency (within specifications and/or blueprints); targeted content standards match; cognitive demand; and developmental appropriateness.
3.4 Cut scores are established for each performance level. Performance level descriptors describe the achievement continuum using content-based competencies for each assessed content area.

Page 29: M3-Reviewing the SLO-SSO-DemoSite

Performance Measure Rubric (cont.)

Strand III: Review
Note: The following indicators are evaluated after students/clients have taken the assessment (i.e., post-administration).

3.5 As part of the assessment cycle, post-administration analyses are conducted to examine aspects such as item/task performance, scale functioning, overall score distribution, rater drift, content alignment, etc.
3.6 The performance measure has score validity evidence that demonstrates item responses were consistent with content specifications. Data suggest that the scores represent the intended construct by using an adequate sample of items/tasks within the targeted content standards. Other sources of validity evidence, such as the interrelationship of items/tasks and alignment characteristics of the performance measure, are collected.
3.7 Reliability coefficients are reported for the performance measure, which includes estimating internal consistency. Standard errors are reported for summary scores. When applicable, other reliability statistics, such as classification accuracy and rater reliability, are calculated and reviewed.
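The internal-consistency estimate named in descriptor 3.7 is commonly computed as Cronbach's alpha. A minimal sketch, assuming a small rectangular score matrix (one row per student, one column per item/task) and population variances; the data and function names are illustrative, not part of the rubric:

```python
def variance(values):
    """Population variance of a list of scores."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def cronbach_alpha(scores):
    """Cronbach's alpha for a students-by-items score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(scores[0])                     # number of items/tasks
    items = list(zip(*scores))             # transpose: one tuple per item
    item_var = sum(variance(item) for item in items)
    totals = [sum(row) for row in scores]  # each student's total score
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Perfectly parallel items yield alpha = 1.0.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # → 1.0
```

In practice the standard errors and rater-reliability statistics mentioned in 3.7 would come from a psychometric package rather than a hand-rolled formula.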

Page 30: M3-Reviewing the SLO-SSO-DemoSite

Procedural Steps

Step 1. Identify the performance measures being used within the SLO/SSO.
Step 2. Examine the alignment characteristics of the performance measure to the standards identified in Section II.
Step 3. Determine the most applicable metric (e.g., growth) based on the stated objectives in Section III.
Step 4. Evaluate the technical evidence provided by either the vendor or the assessment's developer.
Step 5. Repeat Steps 2 through 4 with the other performance measures identified in Section IV.

Page 31: M3-Reviewing the SLO-SSO-DemoSite


Coherency Rubric

Page 32: M3-Reviewing the SLO-SSO-DemoSite

SLO/SSO Coherency

[Diagram: coherency cycle linking the Goal Statement, Objectives, Performance Measures & Targets, and the SLO/SSO Rating]

Page 33: M3-Reviewing the SLO-SSO-DemoSite

Coherency Rubric
• The Coherency Rubric, which examines the alignment characteristics of each student learning/support objective (SLO/SSO), serves as the measurement tool to ensure that each SLO/SSO meets the coherency criteria.
• The rubric evaluates each of the three (3) phases used to organize this training module. Each aspect of the SLO/SSO should meet a specific descriptor in the Coherency Rubric.

*Note* Template #3: Coherency Rubric is found in the Review module.

Page 34: M3-Reviewing the SLO-SSO-DemoSite

Coherency Rubric

Phase I: Design
(Rating: Meets Criteria / Needs Refinement)

1.1 The goal statement articulates the “big idea” under which targeted content/professional standards are directly aligned. The statement is concise and free of technical jargon.
1.2 Targeted content/professional standards have a direct influence on student performance outcomes and are viewed as “central” to the subject/service area.
1.3 The course/subject (service) area associated with the SLO/SSO is logically linked to the central theme and targeted content/professional standards.

Page 35: M3-Reviewing the SLO-SSO-DemoSite

Coherency Rubric (cont.)

Phase I: Design
(Rating: Meets Criteria / Needs Refinement)

1.4 A blueprint or other design document illustrates relationships among key components (i.e., Goal Statement, Targeted Content/Professional Standards, Objectives, Performance Measures, and Overall Rating).
1.5 Performance measures are designed to evaluate the targeted content/professional standards (as demonstrated by the performance measure’s alignment characteristics).

Page 36: M3-Reviewing the SLO-SSO-DemoSite

Coherency Rubric (cont.)

Phase II: Build
(Rating: Meets Criteria / Needs Refinement)

2.1 The goal statement represents a central concept that is enduring, has leverage, and is foundational to further, more complex learning outcomes.
2.2 The SLO/SSO is supported by a representative sample of the educator’s/professional’s students, with a sample size that is sufficient to make valid inferences about student achievement and/or outcomes.
2.3 Targeted content/professional standards are selected using a valid and reliable approach that is fair and unbiased.

Page 37: M3-Reviewing the SLO-SSO-DemoSite

Coherency Rubric (cont.)

Phase II: Build
(Rating: Meets Criteria / Needs Refinement)

2.4 Objectives are specific, criteria-focused, attainable (yet challenging), and directly linked to the performance measures.
2.5 Performance measures have benchmarks for two or more points in time within a given school year [Growth]. In addition or alternatively, performance measures have a clear, date-specific target for an on-demand demonstration of skill and knowledge attainment [Mastery].
2.6 The overall rating is directly linked to a performance continuum based on the percentage of students meeting expectations across all objectives.
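The performance continuum in descriptor 2.6 can be sketched as a lookup over percentage ranges. The cut points below are purely illustrative assumptions; the actual ranges are set by the local educator effectiveness system:

```python
# Hypothetical rating bands: (lowest percentage in band, rating label),
# ordered from highest band to lowest. Real cut points are locally defined.
RATING_BANDS = [
    (90, "Distinguished"),
    (70, "Effective"),
    (50, "Emerging"),
    (0,  "Unsatisfactory"),
]

def overall_rating(percent_meeting):
    """Map the percentage of students meeting expectations to a rating."""
    for floor, label in RATING_BANDS:
        if percent_meeting >= floor:
            return label
    return "Unsatisfactory"

print(overall_rating(82))  # → Effective
```

The same lookup shape applies to the four-range End Point classification in Element 3.4 of the QA Checklist.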

Page 38: M3-Reviewing the SLO-SSO-DemoSite

Coherency Rubric (cont.)

Phase III: Review
(Rating: Meets Criteria / Needs Refinement)

3.1 The SLO/SSO is based on performance measures that are technically sound (i.e., reliable, valid, and fair) and fully align to the targeted content standards.
3.2 The SLO/SSO form reviews mitigate unintentional consequences and/or potential threats to inferences made about meeting performance expectations.
3.3 The SLO/SSO has data and/or evidence to support the assignment of an overall teacher rating (i.e., Unsatisfactory, Emerging, Effective, and Distinguished).

Page 39: M3-Reviewing the SLO-SSO-DemoSite

Coherency Rubric (cont.)

Phase III: Review
(Rating: Meets Criteria / Needs Refinement)

3.4 The SLO/SSO form has been examined to ensure that it is complete; that is, all applicable elements within the SLO Form (Template #2a or Template #2c) have been addressed according to the prescribed business rules.
3.5 The SLO/SSO form has been reviewed to ensure it includes “comprehensive” performance measures; that is, all performance measures have been examined to determine that they are appropriate for use in the process AND are of high technical quality.

Page 40: M3-Reviewing the SLO-SSO-DemoSite

Procedural Steps

Step 1. Select the drafted SLO/SSO, including applicable performance measures.
Step 2. Beginning with Phase I, use the Coherency Rubric to evaluate the alignment of the components to the overall SLO/SSO design.
Step 3. Identify any component that has weak alignment and/or does not meet the rubric’s criteria.
Step 4. Flag any element needing refinement or further discussion with other educators/professionals and/or the principal/supervisor.
Step 5. Repeat Steps 2 through 4 with the additional two sections of the Coherency Rubric.

Page 41: M3-Reviewing the SLO-SSO-DemoSite

Step 5

Reflection

Quality Assurance
• The SLO/SSO Form is complete.
• The SLO’s/SSO’s assessments are comprehensive.
• The SLO’s/SSO’s design is coherent.

Page 42: M3-Reviewing the SLO-SSO-DemoSite

Summary

This SLO/SSO Review Phase:
• Applied a set of quality assurance criteria to ensure that the student learning/support objective, along with its applicable performance measures, was complete, comprehensive, and coherent (i.e., “high quality”).
• Identified areas of improvement for subsequent SLO/SSO refinement.