
Page 1: Interpreting data for program evaluation and planning

Interpreting data for program evaluation and planning

Page 2: Interpreting data for program evaluation and planning

Literacy Coach’s Focus in Data Analysis

Regrouping
• Form needs-based groups for classroom instruction
• Assign children to interventions
• Choose instructional emphasis

Program Evaluation
• To what extent is my program keeping Benchmark children at benchmark?
• To what extent is small-group work moving Strategic children to benchmark?
• To what extent is my program moving Intensive children to benchmark?
• To what extent are classroom effects apparent?

Page 3: Interpreting data for program evaluation and planning

Literacy Coach’s Focus in Data Analysis

Regrouping
• Form needs-based groups for classroom instruction
• Assign children to interventions
• Choose instructional emphasis

Which DIBELS reports should I use?

Do you have curriculum materials to accomplish this?

Page 4: Interpreting data for program evaluation and planning

Literacy Coach’s Focus in Data Analysis

Program Evaluation
• To what extent is my program keeping Benchmark children at benchmark?
• To what extent is small-group work moving Strategic children to benchmark?
• To what extent is my program moving Intensive children to benchmark?
• To what extent are classroom effects apparent?

Page 5: Interpreting data for program evaluation and planning

[Chart: State-Level Year-End Data, 2004-05. Stacked bars by grade (K, 1, 2, 3) showing the percentage of children At Risk, at Some Risk, and at Benchmark; y-axis 0-100%.]

Page 6: Interpreting data for program evaluation and planning

General Impressions

1. We are increasingly successful in prevention-based instruction in Kindergarten

2. We need to continue to experiment in intervention, particularly for second and third grade

Consider time, focus, and explicitness for needs-based work?
Consider additional intervention programs?

Page 7: Interpreting data for program evaluation and planning

Cross-Sectional Analysis

How well are this year’s kindergarten children doing compared to last year’s?
– Did they start out stronger or weaker?
– Did they make more or less progress between fall and winter?

And yes, these are different children, but the teachers are the same and the program is the same.
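As a rough illustration (not part of the original slides), the following Python sketch compares the status distribution of this year’s kindergartners with last year’s at the same assessment windows. The file names and column layout (one row per child per window, with "window" and "status" columns) are hypothetical placeholders, not an actual DIBELS export format.

```python
# Sketch of a cross-sectional comparison: this year's kindergartners vs.
# last year's at the same windows. Data layout is assumed, not prescribed.
import pandas as pd

def status_distribution(df: pd.DataFrame, window: str) -> pd.Series:
    """Percent of children at each status (Intensive/Strategic/Benchmark)
    for one assessment window, e.g. 'Fall' or 'Winter'."""
    counts = df.loc[df["window"] == window, "status"].value_counts()
    return (counts / counts.sum() * 100).round(1)

# Hypothetical files: one row per child per window.
last_year = pd.read_csv("k_last_year.csv")
this_year = pd.read_csv("k_this_year.csv")

for window in ["Fall", "Winter"]:
    print(f"--- {window} ---")
    print(pd.DataFrame({
        "last year": status_distribution(last_year, window),
        "this year": status_distribution(this_year, window),
    }))
```

Side-by-side distributions like these answer the two sub-questions on this slide: whether the new cohort started stronger or weaker, and whether it gained more ground between fall and winter.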

Page 8: Interpreting data for program evaluation and planning

For Kindergarten

Beginning-of-kindergarten status includes weighted combinations of measures

Middle of kindergarten directs attention to initial sound fluency (ISF)

End of kindergarten directs attention to phoneme segmentation fluency (PSF)

*You have to look at your own data, considering all measures, to really evaluate your program
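To make the idea of status categories concrete, here is a minimal sketch of how a three-way status might be derived from a single measure’s score using two cut points. The cut scores below are invented placeholders, not the official DIBELS benchmark goals, and the real system combines multiple measures as the slide above notes.

```python
# Illustrative only: assigning Intensive/Strategic/Benchmark from one score.
# The cut points are hypothetical, NOT the official DIBELS goals.
def status_from_score(score: float, strategic_cut: float, benchmark_cut: float) -> str:
    """Return 'Intensive', 'Strategic', or 'Benchmark' for one score."""
    if score >= benchmark_cut:
        return "Benchmark"
    if score >= strategic_cut:
        return "Strategic"
    return "Intensive"

# Example: middle-of-kindergarten ISF with made-up cuts of 10 and 25.
print(status_from_score(8, strategic_cut=10, benchmark_cut=25))   # Intensive
print(status_from_score(18, strategic_cut=10, benchmark_cut=25))  # Strategic
print(status_from_score(30, strategic_cut=10, benchmark_cut=25))  # Benchmark
```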

Page 9: Interpreting data for program evaluation and planning

State K Cross-Section

Percent of children at each status:

                     Intensive   Strategic   Benchmark
Fall 2003               31%         43%         26%
Winter 2004             30%         40%         30%
Spring 2004             24%         34%         42%
Fall 2004               30%         42%         27%
Winter 2005 (ISF)       15%         46%         38%
Spring 2005 (PSF)        8%         18%         74%

Page 10: Interpreting data for program evaluation and planning

New Directions

What did you decide to do differently next year when you saw these data for your school?

Page 11: Interpreting data for program evaluation and planning

For first grade

Beginning-of-first-grade status includes weighted combinations of measures

Middle of first grade directs attention to nonsense word fluency (NWF)

End of first grade directs attention to oral reading fluency (ORF)

*You have to look at your own data, considering all measures, to really evaluate your program

Page 12: Interpreting data for program evaluation and planning

State 1 Cross-Section

Percent of children at each status:

                     Intensive   Strategic   Benchmark
Fall 2003               33%         32%         36%
Winter 2004             32%         30%         39%
Spring 2004             25%         30%         45%
Fall 2004               19%         29%         53%
Winter 2005 (NWF)       13%         40%         48%
Spring 2005 (ORF)       16%         26%         58%

Page 13: Interpreting data for program evaluation and planning

New Directions

What did you decide to do differently next year when you saw these data for your school?

Page 14: Interpreting data for program evaluation and planning

For Second Grade

Beginning of second grade status includes weighted combinations of measures

Middle of second grade directs attention to oral reading fluency (ORF)

End of second grade directs attention to oral reading fluency (ORF)

*You have to use the cognitive model of assessment to interpret these data

Page 15: Interpreting data for program evaluation and planning

State 2 Cross-Section

Percent of children at each status:

                     Intensive   Strategic   Benchmark
Fall 2003               27%         32%         42%
Winter 2004             32%         18%         50%
Spring 2004             38%         22%         40%
Fall 2004               21%         32%         47%
Winter 2005             22%         19%         59%
Spring 2005             26%         20%         54%

Page 16: Interpreting data for program evaluation and planning

New Directions

What did you decide to do differently next year when you saw these data for your school?

Page 17: Interpreting data for program evaluation and planning

For Third Grade

Third grade data include only oral reading fluency (ORF)

*You have to use the cognitive model of assessment to interpret these data

Page 18: Interpreting data for program evaluation and planning

State 3 Cross-Section

Percent of children at each status:

                     Intensive   Strategic   Benchmark
Fall 2003               25%         31%         44%
Winter 2004             33%         34%         33%
Spring 2004             28%         40%         33%
Fall 2004               26%         35%         40%
Winter 2005             27%         31%         42%
Spring 2005             20%         38%         41%

Page 19: Interpreting data for program evaluation and planning

New Directions

What did you decide to do differently next year when you saw these data for your school?

Page 20: Interpreting data for program evaluation and planning

Cohort Analysis

Given children’s experience at your school, to what extent is your instructional program actually accelerating literacy growth over time?

(and you are right when you say it’s not EXACTLY the same children if your population is highly transient)
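As a sketch only (the data layout is assumed, not from the presentation), a cohort analysis follows the same student IDs across assessment windows and tracks the percent at Benchmark over time. Restricting to children present at both the first and last window is one rough way to limit the effect of transiency on the trend.

```python
# Cohort analysis sketch: follow the same student IDs across windows.
# 'cohort.csv' and its columns (student_id, window, status) are hypothetical.
import pandas as pd

df = pd.read_csv("cohort.csv")

# Keep only students present in both the first and last window.
first, last = "F03", "S05"
stayers = set(df.loc[df["window"] == first, "student_id"]) & \
          set(df.loc[df["window"] == last, "student_id"])
cohort = df[df["student_id"].isin(stayers)]

# Percent at Benchmark for each window, in calendar order.
benchmark_pct = (
    cohort.assign(at_benchmark=cohort["status"].eq("Benchmark"))
          .groupby("window")["at_benchmark"]
          .mean()
          .mul(100)
          .round(1)
          .reindex(["F03", "W04", "S04", "F04", "W05", "S05"])
)
print(benchmark_pct)
```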

Page 21: Interpreting data for program evaluation and planning

State Cohort K-1

Percent of children at each status (kindergarten in 2003-04, first grade in 2004-05):

                     Intensive   Strategic   Benchmark
Fall 2003               31%         43%         26%
Winter 2004             30%         40%         30%
Spring 2004             24%         34%         42%
Fall 2004               19%         29%         53%
Winter 2005 (NWF)       13%         40%         48%
Spring 2005             16%         26%         58%

Page 22: Interpreting data for program evaluation and planning

[Chart: K-1 Benchmark % across F03, W04, S04, F04, W05, and S05; y-axis 0-60.]
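The benchmark percentages plotted here come straight from the State Cohort K-1 table above. A minimal matplotlib sketch (assuming matplotlib is available) that reproduces the trend line:

```python
# Reproducing the K-1 Benchmark % trend from the cohort table above.
import matplotlib.pyplot as plt

windows = ["F03", "W04", "S04", "F04", "W05", "S05"]
benchmark_pct = [26, 30, 42, 53, 48, 58]  # from the State Cohort K-1 table

plt.plot(windows, benchmark_pct, marker="o")
plt.ylim(0, 60)
plt.ylabel("Percent at Benchmark")
plt.title("K-1 Benchmark %")
plt.show()
```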

Page 23: Interpreting data for program evaluation and planning

State Cohort 1-2

Percent of children at each status (first grade in 2003-04, second grade in 2004-05):

                     Intensive   Strategic   Benchmark
Fall 2003               33%         32%         36%
Winter 2004             32%         30%         39%
Spring 2004             25%         30%         45%
Fall 2004               21%         32%         47%
Winter 2005 (ORF)       22%         19%         59%
Spring 2005             26%         20%         54%

Page 24: Interpreting data for program evaluation and planning

[Chart: 1-2 Benchmark % across F03, W04, S04, F04, W05, and S05; y-axis 0-60.]

Page 25: Interpreting data for program evaluation and planning

State Cohort 2-3

Percent of children at each status (second grade in 2003-04, third grade in 2004-05):

                     Intensive   Strategic   Benchmark
Fall 2003               27%         32%         42%
Winter 2004             32%         18%         50%
Spring 2004             38%         22%         40%
Fall 2004               26%         35%         39%
Winter 2005             27%         31%         42%
Spring 2005             20%         38%         41%

Page 26: Interpreting data for program evaluation and planning

[Chart: 2-3 Benchmark % across F03, W04, S04, F04, W05, and S05; y-axis 0-50.]

Page 27: Interpreting data for program evaluation and planning

Interpretation

• To what extent have you set and communicated the plan?

• To what extent are teachers understanding and implementing the curriculum?

• How are they using time?
• How are they monitoring progress and adjusting their instruction and groupings?
• How well are they using intervention options?

Page 28: Interpreting data for program evaluation and planning

If you’re not getting the results you want, you have to do something different.

Start with yourself:

Work more closely with administration.

Spend more time in classrooms.

Focus your PD time on differentiation.

Page 29: Interpreting data for program evaluation and planning

Next Steps

• Reflect on your own data; check on individual indicators in K and 1 to see if there are particular areas that are troublesome

• Find your most and least successful classrooms and observe them, so that you can learn about the curriculum and evaluate the effectiveness of your own professional support system

• Try something different