RE-THINKING HOW SCHOOLS IMPROVE: A Team Dialogue Model for Data-Based Instructional Decision Making

Dr. Michael E. Hickey, [email protected]
Dr. Ronald S. Thomas, [email protected]
Center for Leadership in Education at Towson University
CCSSO Education Leaders Conference, September 12, 2007


TRANSCRIPT

Page 1:

RE-THINKING HOW SCHOOLS IMPROVE:

A Team Dialogue Model for Data-Based Instructional Decision Making

Dr. Michael E. Hickey, [email protected]

Dr. Ronald S. Thomas, [email protected]

Center for Leadership in Education at Towson University

CCSSO Education Leaders Conference, September 12, 2007

Page 2:

The Big Picture

In today's session, we are going to:

1. Re-think our understanding of how schools improve—moving from the dysfunction of the old model to the requirements for what a “new model” might look like.

2. Focus on a "new model" for improving performance that enables content, vertical, or departmental teams to use data more effectively for classroom instructional improvement and increased student learning.

Page 3:

Part 1: What are we trying to do and why?

Page 4:

“Every organization is perfectly designed to get the results it achieves.”

--W. Edwards Deming

Page 5:

Think about how long you have been engaged in the school improvement process.

Has the school gotten better each year?

Has the performance of each student improved as a result of each year he/she spends in the school?

If your answer to one or both questions is no, what will it take to change it to yes?

Page 6:

What are data?

Data are observations, facts, or numbers which, when collected, organized, and analyzed, become information and, when used productively in context, become knowledge.

Page 7:

The DRIP Syndrome

Page 8:

Your school may suffer from being data rich but information poor (the DRIP syndrome). You may need ways to organize the data.

Page 9:

Sources of Student Achievement Data

• External assessment data

• Benchmark or course-wide assessment data

• Individual teacher assessment data

--Supovitz and Klein (2003)

Page 10:

Data-driven schools and school districts use data for two major purposes:

• Accountability (to prove)
• Instructional decision making (to improve)

Page 11:

The Hierarchy of Data for Accountability Purposes

External (State & National) Assessments

System Benchmark Assessments

Common School or Course Assessments

Classroom Assessments of Student Work

Page 12:

The Hierarchy of Data for Instructional Decision Making

Classroom Assessments of Student Work

Common School or Course Assessments

System Benchmark Assessments

External (State & National) Assessments

Page 13:

Think about it . . .

Do you have a school improvement plan?

Or a school accountability plan?

A SIP ? Or a SAP?

Have a three-minute conversation with someone sitting near you about what you think most schools currently have.

Page 14:

The Old Model

The School Improvement Team, a Data Committee, or one person analyzes data, using primarily state test data. These data are mined for every possible nuance.

Page 15:

The Old Model

The data are presented at a faculty, SIT, or department meeting, and faculty members brainstorm ideas for what to do to increase student performance.

Page 16:

The Old Model

Faculty or team members “average opinions” and put forth the solutions that are acceptable to the largest majority of people.

Page 17:

The Old Model

This results in school-wide or department-wide initiatives that may or may not be implemented.

Data expert Mike Schmoker has estimated that about 10% of what is planned in SIPs actually is implemented at a high level of quality.

Page 18:

Results of the Old Model

Page 19:

Why is the old model not working anymore?

Page 20:

Why? Wrong Data

We have been using the wrong data. State test data are:
• Way too general
• Instructionally insensitive – not designed for instructional improvement

Page 21:

Why? Wrong Time

The data come at the wrong time. State test data are:
• Out of date when they arrive
• For students we no longer have

The results of the changes that are implemented will not be known for a year.

Page 22:

Why? Wrong Team

The SIT, a full department, or a Data Committee is the wrong team to do the analysis:
• Membership is too diverse (often including parents)
• Meets too infrequently
• Not connected to immediate classroom needs

Page 23:

Why? Wrong Plan

The initiatives that are put in place are:
• Too global to address the diversity of students
• Aimed at performance increases of groups on average
• Looking for the "silver bullet" that will have a schoolwide impact

Page 24:

We need a new model.

• Real time
• Specific to each grade and subject
• Addresses individual students' needs
• Results in instructional improvements that will actually occur at a high level of quality
• Can be re-directed frequently
• Has meaning for teachers (seen by teachers as a worthwhile use of their time)

THREE MINUTE CONVERSATION: How do the data conversations in schools that you know of rate against these criteria?

Page 25:

What should that new model look like?

“School improvement is most surely and thoroughly achieved when teachers engage in frequent, continuous, and increasingly concrete and precise talk about teaching practice . . . adequate to the complexities of teaching, capable of distinguishing one practice and its virtue from another.”

--Judith Warren Little

Page 26:

In other words . . .

A Classroom-Focused Improvement Process (CFIP)

Page 27:

Education After Standards

Page 28:

The Classroom-Focused Improvement Process is the work that professional learning communities do.

A professional learning community is not an organizational structure. It is a WAY OF DOING BUSINESS.

Page 29:

CFIP: A WAY TO MOVE SCHOOLS

From → To
• Focus on teaching → Focus on learning
• Emphasis on what was taught → Fixation on what students learned
• Coverage of content → Demonstration of proficiency
• Curriculum planned in isolation → Shared knowledge of essential curriculum
• Infrequent summative assessments → Frequent common formative assessments
• Focus on average scores → Monitoring individual proficiency on every essential skill

Page 30:

CFIP: A WAY TO MOVE SCHOOLS

From → To
• Remediation → Intervention
• One opportunity to demonstrate learning → Multiple opportunities
• Isolation → Collaboration
• Each teacher assigning priority to different learning standards → Teams determining priority of learning standards
• Privatization of practice → Sharing of practice
• Focus on inputs → Focus on results

Page 31:

Fundamental Concepts of Collaborative Learning Communities

• Teachers establish a common, concise set of essential curricular standards and teach to them on a roughly common schedule.

• Teachers meet regularly as a team for purposes of talking in “. . . concrete and precise terms” about instruction with a concentration on “thoughtful, explicit examination of practices and their consequences.”

• Teachers make frequent use of common assessments.

Continued on next slide

Page 32:

“These elements, so rarely emphasized in school . . . improvement plans, deserve our attention more than anything else we do in the name of school improvement.”

--Mike Schmoker (2006)

Page 33:

Our Goal in the Data Dialogues:

Frequent, continuous, and increasingly concrete and precise dialogue by school teams, informed by data

Page 34:

IS IT WORTH THE EFFORT?

Take a look at the following results. Then you tell us.

Page 35:

Maryland School Assessment: 5th Grade Reading Disaggregated Data

[Bar chart: % Proficient by subgroup]

Year   Target  BBES   FARMS  SPED   AF. AMER
2004   49.9    67.6   33.3   16.7   46.2
2005   57.1    85.6   71.4   75     61.5

Page 36:

Maryland School Assessment: 5th Grade Math Disaggregated Data

[Bar chart: % Proficient by subgroup]

Year   Target  BBES   FARMS  SPED   AF. AMER
2004   38.3    60.8   25     22.2   30.8
2005   47.2    83.8   75     60     61.5

Page 37:

Reading: 2004 3rd Graders / 2005 4th Graders

[Bar chart: % Proficient by subgroup]

Year   BBES   FARMS  SPED   AF. AMER
2004   68.7   52.7   21.1   41.7
2005   81.3   66.7   73.1   73.3

Page 38:

Math: 2004 3rd Graders / 2005 4th Graders

[Bar chart: % Proficient by subgroup]

Year   BBES   FARMS  SPED   AF. AMER
2004   65.2   36.8   26.3   41.6
2005   75.6   52.4   53.8   60

Page 39:

Grasonville Elementary School, Maryland School Assessment - Reading

[Line chart: MSA Percent at Proficient and Advanced - Reading, Grades 3-5, 2003-2006]

Page 40:

Grasonville Elementary School, Maryland School Assessment - Mathematics

[Line chart: MSA Percent at Proficient and Advanced - Mathematics, Grades 3-5, 2003-2006]

Page 41:

THE CLASSROOM-FOCUSED IMPROVEMENT PROCESS (CFIP): A Team Data Dialogue Protocol

Part 2: Components of THE NEW MODEL

Page 42:

What are the right teams to conduct data dialogues?

• Grade-level
• Vertical
• Content

Page 43:

When is the right time to conduct data dialogues?

• At a minimum, devote one hour to data dialogues every two weeks.

• According to several studies, schools that realized the greatest results from a shift to a data culture scheduled data dialogues at least once a week.

Page 44:

Frequency of Data Dialogues

[Bar chart comparing the percentage of "gap closer" and "non-gap closer" schools that hold data dialogues a few times a year, a few times a month, and a few times a week]

Source: Stanford University, Stanford Research Institute, Education Week, January 24, 2004

Page 45:

What are the right data to use in the data dialogues?

Triangulate three types of data:
• External Assessment Data
• Course-wide Benchmark Assessment Data
• Classroom Assessment Data

--Supovitz & Klein (2003)

Page 46:

THE GPS ANALOGY

Page 47:

What is the right plan in which the results of the data dialogues should be used?

• Conclusions are specific to students in the class.
• Conclusions are used to plan upcoming daily instruction.
• The plans are implemented.

Page 48:

What is the right way to use the results of the data dialogues?

• Conclusions are used to identify enrichments and interventions for the students in the class.
• Conclusions are used to plan upcoming daily instruction.

Page 49:

The new process needs to be built on:

1. Dialogue
2. Protocols
3. Triangulation of Data

Page 50:

Why True Dialogue?

“In dialogue, a group accesses a larger ‘pool of common meaning,’ which cannot be accessed individually.

People are no longer primarily in opposition, rather they are participating in generating this pool of common meaning….

We are not trying to win in a dialogue. We all win if we are doing it right.”

- Senge, The Fifth Discipline (2006)

Page 51:

Team Learning

Team learning is the process of aligning and developing the capacities of a team to create the results its members truly desire.

The discipline of team learning starts with “dialogue,” the capacity of members of a team to suspend assumptions and enter into a genuine “thinking together.” It also involves learning how to recognize the patterns of interaction in teams that undermine learning.

--Peter Senge (2006)

Page 52:

What Is a Data Protocol?

A protocol consists of guidelines for dialogue – which everyone understands and has agreed to – that permit a certain kind of conversation to occur, often a kind of conversation which people are not in the habit of having.

Protocols build the skills and culture necessary for collaborative work. Protocols often allow groups to build trust by doing substantive work together.

Page 53:

Using a Data Protocol

Protocols can help us to navigate difficult and uncomfortable conversations by:
• Making it safe to ask challenging questions
• Making the most of scarce time
• Providing an opportunity for all to be involved
• Resulting in an analysis that will lead to positive action

Page 54:

Using a Data Protocol

The point is not to do the protocol well, but to have team dialogue that is:
• In-depth
• Insightful
• Concrete
• Precise

Page 55:

The Big Six of Data Analysis

1. Begin with a question.
2. Understand the data source.
3. Look for the big picture.
4. Look for patterns in the data.
5. Identify and act on the implications of the patterns for your students.
6. Identify and act on the implications of the patterns for your instruction.

Page 56:

CFIP DATA DIALOGUE PROTOCOL FORMATS

• One-page overview of the model, page 15
• CFIP model with reflection questions, pages 17-18
• CFIP model worksheets, pages 19-22
• Reflection Guide to Instructional Changes, pages 23-24
• Examples of the CFIP model as completed by school teams, pages 25-38

Take a few minutes to preview these pages.

Page 57:

SIX-STEP PROCESS - TEAM DATA DIALOGUE PROTOCOL: MOVING FROM DATA TO INCREASED STUDENT LEARNING

DATA SOURCE(S): __________________________________________________________________________

Step 1: Identify the questions to answer in the data dialogue.
Step 2: Build assessment literacy. Define terms (if needed).
Step 3: Identify the "big picture" conclusions from the data.
Step 4: Identify the patterns of class strengths and weaknesses (using more than one data source, if possible).
   [Worksheet columns: STUDENT STRENGTHS | STUDENT WEAKNESSES]
Step 5: Drill down in the data to individual students. Identify and implement needed enrichments and interventions.
   [Worksheet columns: STUDENTS WHO EXCELLED | ENRICHMENTS TO BE PUT IN PLACE | STUDENTS NEEDING FURTHER WORK | INTERVENTIONS TO BE PUT IN PLACE]
Step 6: Reflect on the reasons for student performance. Identify and implement needed instructional changes for the next unit.

Page 58:

CFIP Step 1: When analyzing data, begin with a question.

All data analyses should be designed to answer a question.

Unless there is an important question to answer, there is no need for a data analysis.

Page 59:

CFIP Step 2: Understand the data source.

Build ASSESSMENT LITERACY with questions like these:
• What assessment is being described in this data report? What were the characteristics of the assessment?
• Who participated in the assessment? Who did not? Why?
• Why was the assessment given? When?
• What do the terms in the data report mean?

Be sure you have clear and complete answers to these questions before you proceed any further.

Page 60:

CFIP Step 3: Look for the "big picture" views in the data.

Identify:
• What do we "see" in the data?
• What "pops out" at us from the data?
• What questions do the data raise?

Page 61:

CFIP Step 4A: Look for data patterns in a single data source.

What do you see over and over again in the data?

What are the students’ strengths? What knowledge and skills do the students have?

What are their weaknesses? What knowledge and skills do the students lack?

Page 62:

CFIP Step 4B: Identify Patterns of Class Strengths and Weaknesses from Multiple Data Sources.

TRIANGULATION
• In what ways are the results similar among data sources? For example, how do benchmark test results compare with ongoing classroom assessment data?
• In what ways do the results among data sources differ?
• Why might these differences occur?

Page 63:

Power When Multiple Types of Data Are Used

• Reduces the anxiety and the mistakes of relying on a single measure as the only definition of student success
• Provides more frequent evidence on which to act
• Develops and sustains a culture of inquiry in the school based on data

Page 64:

CFIP Step 5: Drill Down to Individual Students. Identify and Implement Needed Enrichments and Interventions.

• What are the implications for enrichments and interventions from what you learn from the data?
• Which students need enrichments and interventions?
• What should enrichments and interventions focus on?

Page 65:

CFIP Step 6: Reflect on the reasons for student performance -- What in our teaching might be preventing all students from being successful?

To what extent did we implement research-based instructional practices as we:
• Planned instruction?
• Introduced instruction?
• Taught the unit?
• Brought closure to instruction?
• Assessed formatively?

Page 66:

CFIP Step 6: Reflect on the reasons for student performance. Identify and implement instructional changes in the next unit.

How will we change instruction in our next unit?
• Content focus
• Pacing
• Teaching methods
• Assignments

Page 67:

CFIP Step 6: Reflect on the reasons for student performance. Identify and implement instructional changes in the next unit.

• When will we review the data again to determine the success of the enrichments, interventions, and instructional changes?
• What do the data not tell us? What questions about student achievement do we still need to answer? How will we attempt to answer these questions?

Page 68:

The Next Steps

1. Unless teams emerge from the data analysis process with a clear plan of action for their classroom, they have wasted their time.

2. Implement the plan of interventions, enrichments, and changes in instruction.

3. Collect the next set of data.

Page 69:

Where does a school go from here in becoming more data-driven?

The Drivers | The Barriers

DISCUSSION: What drivers and barriers would you see schools facing in implementing the CFIP model?

Page 70:

Typical School Improvement Plan (SIP) → Classroom-Focused Improvement Process (CFIP)

• Process established at district level → Process designed at team level
• Linear and prescriptive → Non-linear/non-prescriptive
• Annual strategic plan → Short-cycle operational plan
• Impact: total school → Impact: students in class
• SIT develops → Classroom-level team develops
• Purpose: meet AYP → Purpose: adjust practice
• Results determined at end of year → Results determined when unit is taught

Page 71:

So what about the School Improvement Team?

The School Improvement Team (SIT) as typically constituted is designed to do exactly what its name implies: IMPROVE THE SCHOOL. It is not designed to improve teaching and learning at the classroom level. That is the focus of the content or grade-level team or the department.

Page 72:

Core Functions of the SIT

• Keep the vision alive.
• Develop and monitor a school-wide plan for meeting state accountability standards.
• Build a data-driven culture.
• Establish priority focus on instruction.
• Provide a safe and supportive environment for all students.
• Connect the school with parents and stakeholders.
• Provide needed resources.

Page 73:

Caveats about CFIP

• It is a paradigm shift from the traditional lesson-planning format.
• It is not easy, especially at first.
• Follow the steps faithfully until they become second nature.
• The CFIP is a guide until you make the process your own.
• Expect mistakes and imprecision in the data.
• The results are worth the effort.

Page 74:

Coming together is a beginning,
staying together is progress,
and working together is success.

- Henry Ford