
SMITH, HIAWATHA D., Ph.D. Digging Deeper: Understanding Non-Proficient Students Through an Understanding of Reading and Motivational Profiles. (2017). Directed by Dr. Samuel Miller. 138 pp.

With the continued emphasis on accountability for students, schools are working to increase the reading performance of their non-proficient students. Many remedial approaches fail to identify individual students’ strengths and weaknesses and instead treat these students with a singular remedial focus on word identification (Allington, 2001). In this quantitative study, I explore the reading and motivational patterns present among elementary non-proficient readers representing marginalized groups. The results suggest that non-proficient readers do not need remediation with a singular focus; rather, they have distinct needs that must be taken into account when planning remediation. This study provided distinctive findings by examining the reading and motivational profiles of this sample of students. Six profiles were identified that represented strengths and weaknesses within reading, as well as preferred and less preferred motivators. The study also supported the idea that motivation is multi-dimensional and should be considered when providing support to struggling readers.


DIGGING DEEPER: UNDERSTANDING NON-PROFICIENT STUDENTS

THROUGH AN UNDERSTANDING OF READING

AND MOTIVATIONAL PROFILES

by

Hiawatha D. Smith

A Dissertation Submitted to

the Faculty of The Graduate School at

The University of North Carolina at Greensboro

in Partial Fulfillment

of the Requirements for the Degree

Doctor of Philosophy

Greensboro

2017

Approved by

______________________

Committee Chair


APPROVAL PAGE

This dissertation written by HIAWATHA D. SMITH has been approved by the following committee of the Faculty of the Graduate School at The University of North Carolina at Greensboro.

Committee Chair ____________________________________

Samuel Miller

Committee Members ____________________________________

Terry Ackerman

____________________________________

Jewell Cooper

____________________________________

Melody Zoch

____________________________

Date of Acceptance by Committee

_________________________

Date of Final Oral Examination


TABLE OF CONTENTS

Page

LIST OF TABLES ...............................................................................................................v

LIST OF FIGURES ......................................................................................................... vii

CHAPTER

I. INTRODUCTION ................................................................................................1

Problem Statement ......................................................................................4

Purpose of the Study ...................................................................................7

II. REVIEW OF LITERATURE .............................................................................10

Review of Reading Profiles Literature .....................................................12

Review of Motivation Profiles Literature .................................................24

Need for Marginalized Groups .................................................................31

Need for Economically Disadvantaged Students ......................................34

Recommendations .....................................................................................40

III. METHODOLOGY .............................................................................................44

Research Design ........................................................................................44

Data Analysis ............................................................................................57

IV. RESULTS ...........................................................................................................64

Reading Trends .........................................................................................65

Reading Factors ........................................................................................72

Reading Profiles ........................................................................................78

Motivation for Reading Trends .................................................................82

Motivation Factors ....................................................................................85

Motivation Clusters ...................................................................................95

Motivation and Reading Clusters ............................................................100

V. DISCUSSION ...................................................................................................106

Reading Profiles ......................................................................................107

Motivation Profiles .................................................................................108

Motivation and Reading Profiles ............................................................109


Implications .............................................................................................114

Future Work ............................................................................................120

Limitations ..............................................................................................121

Conclusions .............................................................................................122

REFERENCES ...............................................................................................................124

APPENDIX A. MOTIVATION FOR READING QUESTIONNAIRE ITEMS ...........135


LIST OF TABLES

Page

Table 1. Comparison of Areas Assessed in Reading Literature .......................................22

Table 2. Instruments for Data Collection ..........................................................................56

Table 3. Descriptive Statistics for the Reading Data by School .......................................67

Table 4. MANOVA Test Results for Reading Differences Between Schools .................69

Table 5. Post Hoc Test Comparison Between Schools .....................................................69

Table 6. Descriptive Statistics for the Reading Data by Grade Level ..............................70

Table 7. Total Variance of Reading Factors .....................................................................74

Table 8. Reading Factor Communalities ...........................................................................75

Table 9. Reading Variables Factor Loadings ....................................................................76

Table 10. Reading Cluster History ....................................................................................80

Table 11. Reading Clusters Summary Data ......................................................................81

Table 12. Motivation Item Descriptive Statistics .............................................................84

Table 13. Initial MRQ Factor Structure by Instrument Authors .......................................87

Table 14. Revised Final MRQ Factor Structure ................................................................91

Table 15. Descriptive Statistics for the Motivation Data by School..................................93

Table 16. MANOVA Test Results for Motivation Differences Between Schools ...........94

Table 17. Descriptive Statistics for the Motivation Data by Grade Level ........................96

Table 18. MANOVA Test Results for Motivation Differences Between Grades .............97

Table 19. Motivation Cluster History ...............................................................................98

Table 20. Motivation Clusters Summary Data .................................................................99


Table 21. Combined Cluster History ..............................................................................102

Table 22. Combined Clusters Summary Data .................................................................103


LIST OF FIGURES

Page

Figure 1. NAEP Trends Ethnicity Comparisons ................................................................34

Figure 2. NAEP Trends Economic Comparisons ..............................................................35

Figure 3. Scree Plot of the Reading Factors ......................................................................74


CHAPTER I

INTRODUCTION

Students’ academic performance continues to be a primary concern for our nation’s schools and teachers. As teachers attempt to educate students, they face the problem of ensuring that all students, including those from marginalized populations (race, gender, religion, cultural group, or socioeconomic status), demonstrate proficiency (Brown-Jeffy & Cooper, 2011; NCLB, 2001). As policy makers continue to implement policies to address this goal, a focus is placed particularly on the performance of non-proficient readers because all students are expected to read at grade level by the end of the third grade. This focus is a result of federal accountability policies that evaluate students’ performance on various measures and place emphasis on all students being successful with reading.

The National Assessment of Educational Progress (NAEP) is a nationally administered, biennial survey of achievement regarded as a source of information for state-to-state comparisons because it assesses students’ ability to read within three different contexts: reading for literary purposes, reading for informational purposes, and reading to perform a specific task (schedules, directions, maps). NAEP scores have played a major role in the development of educational policy over the last decade (Swanson & Barlage, 2006). The most recent NAEP results (National Center for Education Statistics, 2015) showed that 64% of fourth-grade students and 66% of eighth-grade students failed to meet its proficiency level. These results represented a 1 percentage point decrease (fourth grade) and a 2 percentage point increase (eighth grade) in the percentage of students scoring below proficient on this assessment. Despite continued efforts at school reform via mandates, over half of our students are still unsuccessful on reading assessments.

As a result of nationwide testing via NAEP and other state-level assessments, additional mandates resulted in the creation of No Child Left Behind (NCLB), which originally promised to have all students reading proficiently by the year 2014 (NCLB, 2001). The goals of NCLB emphasized proficiency of marginalized groups, including economically disadvantaged, minority, and English Language Learner students. This legislation required accountability systems for each state to annually monitor the academic performance of students in grades 3-12. The primary signature of NCLB, thus far, has been the imposition of high-stakes testing of students, even though policy makers know that children of poverty are more likely to fail than children of other socioeconomic groups (McCaslin, 2009).

The Reading First initiative was developed as a part of NCLB and provided funding for the identification of scientifically-based research to improve reading instruction, with the hope of reducing the number of non-proficient students (Learning Point Associates, 2004). It provided four pillars to guide reading programs: valid and reliable assessments, effective instructional programs and aligned materials, professional development, and dynamic instructional leadership. The main result of this initiative was the teaching of reading through phonics-driven instruction. This focus led to programs, professional development, and materials with this isolated remedial purpose.


Several problems arose as a result of accountability legislation and subsequent initiatives. The underlying problem is that the legislation’s emphasis on non-proficient students resulted in their being treated as a homogeneous group because their reading scores fell within a similar performance range (Afflerbach, 2004). If these students are not homogeneous, as Buly and Valencia (2002) claimed, then a one-size-fits-all emphasis on remediation via the teaching of isolated word identification skills, and a failure to acknowledge other reading skills, is misplaced. Previous researchers have warned not to rely on a singular remedial focus via single assessments to make important instructional decisions (Elmore, 2002; Shepard, 2000). These findings support the idea that additional information about non-proficient students’ academic performance is necessary to help them achieve success.

Despite these warnings, schools regularly address the needs of non-proficient readers by providing intensive remedial instruction in various components of word recognition (Allington, 2001). These remedial approaches are aligned with the common assumption that phonics and phonemic awareness skills need to be mastered before students can learn to comprehend texts (National Reading Panel, 2000). This response is problematic in two ways. First, as stated earlier, non-proficient readers may not be a homogeneous group: individuals within this grouping may have different instructional needs than others. Second, this intervention only considers students’ reading abilities without attending to their motivations for becoming engaged in reading. This second consideration is particularly important because of recent studies showing the importance of students’ motivation for reading as a key component of their later reading success (Guthrie, Hoa, Wigfield, Tonks, Humenick, & Littles, 2007; Unrau & Schlackman, 2006; Wang & Guthrie, 2004).

Problem Statement

Schools’ reliance on a one-size-fits-all approach to assisting struggling readers increased dramatically in recent years due to an intensified emphasis on test-driven accountability (Allington & McGill-Franzen, 1992; No Child Left Behind, 2001; Slavin, Cheung, Holmes, Madden, & Chamberlain, 2013). As stated earlier, remedial interventions mainly focused on improving non-proficient readers’ word recognition abilities (Allington, 2009; Fisher & Ivey, 2006). Questions have been raised about whether such a one-dimensional focus adequately represents non-proficient students’ literacy needs (Afflerbach, 2004; Allington, 2001; Buly & Valencia, 2002; Lin, 2000; Shepard, 2000; Valencia & Buly, 2004). Such questions are important because struggling students may need a variety of remedial approaches if they are to become successful readers (Moje, 2004). As stated by Buly and Valencia (2002), one-size-fits-all reasoning tends to treat the symptoms of not being a proficient reader without understanding the many possible underlying causes for this lack of proficiency.

Buly and Valencia (2002) designed a study to identify these underlying causes, inclusive of the reading skills and strategies that characterize non-proficient readers’ strengths and weaknesses (Valencia, 2011). They examined the reading profiles of upper-grade elementary students deemed non-proficient on a state-mandated reading assessment because they failed to achieve at grade level. Instead of finding a common reason for their struggles, the authors discovered ten profiles based on patterns of students’ ability to identify words quickly (word identification), read fluently (fluency), and make meaning from the text read (meaning). Their findings underscored the need to develop different remedial programs based on students’ individual strengths and weaknesses. Their research led to other studies where researchers identified multiple models of reading profiles for struggling readers across grade levels (Dennis, 2013; Leach, Scarborough, & Rescorla, 2003; Lesaux & Kieffer, 2010; Meyer et al., 2013; Rupp & Lesaux, 2006; Pierce, Katzir, Wold, & Noam, 2007).

The work of Buly and Valencia (2002, 2004) challenges the assumption of a one-size-fits-all model for addressing the needs of struggling readers, and other researchers have joined in the challenge with the creation of different profiles to describe non-proficient students. While the identification of multiple profiles supports the need for various remedial approaches, it also raises questions about the extent to which there are consistent underlying structures represented in profiles across different studies. Without an understanding of this underlying structure, a one-size-fits-all approach may be as inappropriate for helping non-proficient readers as an any-profile-model approach. If we are to support struggling readers, particularly those from marginalized student populations, it is important to gain this understanding of the underlying constructs that determine the successful engagement of non-proficient readers in their academic studies.

Reading researchers have shown that low-performing students are not a homogeneous group (Dennis, 2013; Valencia & Buly, 2002); however, their profiles are limited to cognitive domains. The continued barrage of assessments being forced on students, as early as kindergarten, and focusing solely on isolated reading skills, continues to impact their beliefs about reading and their reading performance. Previous researchers have warned not to rely on a singular assessment to make important decisions about children (Elmore, 2002; Shepard, 2000), which supports the idea that additional information about students’ values, beliefs, and perceptions about reading is essential to providing effective remedial support to assist them with achieving success.

Baker and Wigfield (1999) designed a study to identify motivational profiles. They generated profiles of 5th and 6th grade regular education students. Through their profiles, they found that children’s motivations for reading are multidimensional, originating from varied backgrounds and experiences. They identified seven profiles inclusive of different patterns of motivation from the areas of efficacy, importance, curiosity, involvement, challenge, recognition, grades, competition, social, compliance, and work avoidance. Their profiles suggest that students differ in motivation and that these differences occur in varied ways and for varied purposes. Therefore, no student should be classified as simply motivated or unmotivated; rather, students should be assessed to determine their motivational preferences for reading activities.

Though their work addresses the need for educators to understand the motivational constructs represented by students, it has three limitations. First, despite the intense accountability focus placed on non-proficient students from marginalized populations, their study did not specifically address these students. Second, while they addressed motivation, they lacked an explicit connection to students’ reading performances to better understand them holistically. Finally, the study examined ethnic differences, but it did not include a specific focus on minority students because 52% of their participants were Caucasian. This supports a need to specifically address marginalized groups.

Purpose of the Study

Existing studies have addressed either the reading profiles of non-proficient students or the motivational profiles of students in general, but the two have not been bridged. In addition, these studies have lacked a focus on understanding marginalized student populations, including those from schools with high populations of minority students, particularly African American and Hispanic students. The purpose of this study is to evaluate the reading and motivation profiles of elementary students who have failed to pass their state’s mandated reading assessment and who attend schools with similar student demographics, including a large percentage of minority students.

Research Questions

This study examined the reading and motivational profiles of non-proficient students across the upper elementary grades. The participating sample came from schools with a high percentage of students representing marginalized groups, mainly African-American, Hispanic, and those whose families lack economic resources.

1. What trends of reading and motivation for reading are represented in 3rd, 4th and 5th grade non-proficient students?

2. What underlying motivational and reading constructs represent non-proficient 3rd, 4th and 5th grade students?

3. What are the reading profiles of 3rd, 4th and 5th grade non-proficient students?

4. What are the motivation profiles of 3rd, 4th and 5th grade non-proficient students?

5. What are the reading and motivation profiles of 3rd, 4th and 5th grade non-proficient students?

6. What do these profiles tell us about the instructional needs of 3rd, 4th and 5th grade non-proficient students?

Significance of the Study

Existing studies have addressed the reading profiles of non-proficient students as well as the motivational profiles of a generic sample of students, but the two have not been connected. They have failed to place an emphasis on understanding marginalized students, including students of color, especially those from low-income families, who struggle with reading in the elementary grades. Studies have shown that struggling readers can display a combination of both word recognition and meaning difficulties (Buly & Valencia, 2002; Dennis, 2012) and that students may have varying motivational characteristics (Baker & Wigfield, 1999). When teachers provide instruction and remediation to non-proficient readers, the work must include multiple areas of reading as well as motivation. Motivation is a key factor in students’ choice to read, their beliefs about their reading, and the value they place upon the act of reading (Cambria & Guthrie, 2010; Schiefele, 1999). Through the use of motivation as a component in reading profiles that directly address marginalized student groups, teachers might have a better chance to increase the academic achievement of non-proficient minority students. In addition, the profiles from this research can add to the existing literature by offering a sample that comes directly from low-income schools that are not identified as high performing. Through this work, researchers and teacher educators can begin to think about the specific reading factors represented as well as the role that motivation plays specifically for non-proficient, low-income, minority students.


CHAPTER II

REVIEW OF LITERATURE

The primary signature of NCLB, thus far, has been the imposition of high-stakes testing of students to monitor academic performance; this monitoring has an equal goal for all students, 100% proficiency, which has had negative implications for students’ performance and motivation throughout all schools (Au, 2009; McCaslin, 2009). Increased accountability via testing was initially thought to be a positive achievement by measuring and comparing standardized assessment results within schools and states, and by motivating the unmotivated to learn (Orfield & Kornhaber, 2001); however, testing has negatively impacted the motivation of students and placed limits on their opportunities to reach academic expectations (Amrein & Berliner, 2003; Au, 2009).

An abundance of federal funds has been devoted to increasing achievement for students. Unfortunately, despite the appearance of vast amounts of federal funding, accountability-driven reforms with continued and frequent high-stakes testing only confirm what we already knew regarding student reading achievement: the same populations continue to fail (National Center for Education Statistics, 2015). Two possible reasons for continued failure include the following. First, there is a lack of any emphasis on understanding students’ motivation. This includes acknowledging why students become engaged in some activities and not others (Guthrie, Coddington, & Wigfield, 2009) and using this information to modify instructional practices for students. Second, the majority of the studies failed to include students from marginalized groups, such as African American and Hispanic students, particularly those who are economically disadvantaged, the very students whom this legislation was intended to help. This lack of inclusion limits our understanding of why these students have not improved their achievement (Unrau & Schlackman, 2006).

The purpose of this section is to discuss and evaluate existing research related to reading profiles, and the need to address motivation profiles as well, in order to more fully understand why low achievers continue to fail to perform at grade level. It will also address the reasons for and implications of addressing marginalized groups when attempting to understand non-proficient readers. The section is organized in three major parts. In the first part, I will discuss studies where researchers have identified profiles of readers (non-proficient readers in isolation or alongside proficient students) to see if a common set of profiles exists beneath different findings. I will do this by examining patterns of performance of struggling students along with highlighting the samples included. I will start this evaluation with a more detailed explanation of the Buly and Valencia (2002) study, followed by a review of subsequent studies of reading profiles, and then I will evaluate the extent to which a common set of profiles or an underlying structure exists to help educators understand the unique needs of struggling readers.

In the second part, I will evaluate whether the motivational profiles of non-proficient readers represent the same variability as their reading profiles by examining the patterns and constructs used in different studies. In this section, I will highlight existing research that assessed the motivational patterns of students. This second part is important because researchers have identified motivation as a critical factor in determining the extent to which students become engaged in their studies (Baker & Wigfield, 1999; Mazzoni, Gambrell, & Korkeamaki, 1999). The section will conclude with a discussion of how my proposed study fills an existing gap in the research.

In the third part, I will focus on whether certain marginalized groups have been represented adequately within existing research. I will highlight the need to address these groups in research related to reading and motivation profiles and how the implications of the research will improve the outcomes for these groups of students. Last, I will finish the review with recommendations for future research based on the gaps within existing research.

Review of Reading Profiles Literature

The initial investigation by Buly and Valencia (2002) did not unveil a singular reason for failure on the mandated state proficiency test. Instead, the authors used cluster analysis to group non-proficient students based on similarities in their reading performance. They discovered profiles based on students’ strengths and weaknesses with identifying words quickly (word identification), reading fluently (fluency), or making meaning from the text read (meaning). Ten profiles were initially identified, grouped into four pairs followed by two singular profiles. The authors later (2004) combined the four pairs of profiles based on their statistical similarities, for a total of six profiles, to make the findings easier for practitioners to understand. Respectively, the grouped profiles included:


o Clusters 1 & 2: Automatic Word Callers (18%): The students in this cluster are stronger in word identification and fluency than meaning.

o Cluster 3: Struggling Word Callers (15%): The students in this cluster are experiencing some difficulty in word identification and meaning but are able to read with acceptable fluency.

o Cluster 4: Word Stumblers (18%): The students in this cluster exhibit strength in making meaning but have difficulty with word identification and fluently reading text.

o Clusters 5 & 6: Slow and Steady Comprehenders (24%): The students in this cluster are lacking in fluency, but their word identification and meaning abilities are relatively strong, with meaning being the stronger.

o Clusters 7 & 8: Slow Word Callers (17%): The students in this cluster are displaying difficulties in meaning and fluency, with their strength being in word identification.

o Clusters 9 & 10: Disabled Readers (9%): The students in this cluster are low in all three areas and represent the smallest cluster of students.
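
To make the cluster-analysis step described above more concrete, the following minimal sketch groups students by the similarity of their standardized scores on word identification, fluency, and meaning. The file name, column names, use of k-means, and choice of six clusters are illustrative assumptions only; this is not Buly and Valencia’s (2002) actual procedure.

    # Minimal, hypothetical sketch of a cluster-analysis grouping step.
    # File name, column names, and the six-cluster choice are assumptions.
    import pandas as pd
    from scipy.stats import zscore
    from sklearn.cluster import KMeans

    # One row per non-proficient student, with three reading scores.
    scores = pd.read_csv("reading_scores.csv")[["word_id", "fluency", "meaning"]]

    # Standardize each measure so no single scale dominates the distances.
    standardized = scores.apply(zscore)

    # Group students by the similarity of their score patterns; six clusters
    # simply mirrors the six combined profiles reported above.
    kmeans = KMeans(n_clusters=6, n_init=10, random_state=1)
    clusters = kmeans.fit_predict(standardized)

    # Each cluster's mean profile shows its relative strengths and weaknesses.
    print(standardized.assign(cluster=clusters).groupby("cluster").mean())

A sketch like this only groups students; interpreting and naming the resulting profiles, as the authors did, remains a substantive judgment.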

In their reformulation, Valencia and Buly (2004) focused on the implications for classroom practice for each of the profiles based on a prototypical student in each cluster. Accordingly, the authors identified an underlying structure: there was only one cluster with students performing low in all three areas, two clusters contained students with strengths in more than one area (word identification and fluency, word identification and meaning), and the remaining profiles consisted of strength in only one area with the other two areas as weaknesses. Each description provided an in-depth interpretation of the profiles and their respective representations with regard to the literacy components. Through their profiles, the researchers challenged whether one-size-fits-all phonics and word-identification programs were appropriate for all students, a caution against overgeneralizing students’ needs. As schools address the needs of non-proficient students, they need to go beyond the scores of state-mandated standardized assessments to identify student instructional needs.

The work of Buly and Valencia (2002, 2004) led to other studies where researchers attempted to identify and analyze reading profiles, particularly for those students who were identified as non-proficient (Dennis, 2013; Leach et al., 2003; Lesaux & Kieffer, 2010; Meyer et al., 2013; Rupp & Lesaux, 2006; Pierce et al., 2007). Using the work of Valencia and Buly (2002, 2004) as a template, I evaluated these studies by examining the nature and scope of their measures, how they identified non-proficient readers, and their analytical procedures. My intent was to determine the extent to which profiles for struggling readers varied across studies and whether there was an underlying structure beneath their profiles. Table 1 below presents a comparison of the studies, including the areas measured and assessments used. In this section I will review each study, in chronological order, and then provide an evaluation of their common properties and what they imply for the identification of reading profiles for struggling readers. For each study, researchers used either clusters or profiles to identify their categories; I will use the terms used by the researchers when describing their results.

Leach, Scarborough, and Rescorla (2003) used eight measures of students’ word recognition, fluency, vocabulary, and comprehension abilities to identify the profiles for 3rd, 4th, and 5th grade students (n=161) from both affluent and economically diverse populations, 5% of whom were minority students. Splitting scores into high and low categories on each of the measures, they identified four distinct profiles:

Profile 1 (8%): Comprehension deficit, no word deficit
Profile 2 (17%): Word deficit but no comprehension deficit
Profile 3 (16%): Deficit in both word and comprehension
Profile 4 (59%): No deficits in comprehension or word

The profiles supported the heterogeneity of reading development for these students, with students identified as having late-emerging reading disabilities being balanced in their presence within each profile.

Rupp and Lesaux (2006) investigated the profiles of proficient and non-proficient 4th grade students (n=1,111). A lack of proficiency was determined based on the students’ unsuccessful scores on district-mandated reading assessments. The authors assessed speed and accuracy for reading words, spelling, working memory, and phonological and syntactic awareness. They used factor analysis to develop high and low split scores for word-level skills, working memory, and language skills, then grouped students into four distinct clusters representing combinations of high and low scores:

Cluster 1 (34%): Low word, low memory
Cluster 2 (11%): Low word, high memory
Cluster 3 (16%): High word, low memory
Cluster 4 (39%): High word, high memory


Students classified as non-proficient were unequally represented in each cluster, with the majority (76.8%) falling into cluster one.

Pierce, Katzir, Wold, and Noam (2007) evaluated urban 2nd and 3rd grade students who attended an after-school program and scored more than two-thirds of a standard deviation below the mean on one of the subtests or the composite of the Test of Word Reading Efficiency (TOWRE). Using factor scores for decoding, fluency, text-level skills, and vocabulary, they generated a four-cluster model using cluster analysis:

Cluster 1 (27%): students with high vocabulary and decoding scores and low word- and text-level skill factors;
Cluster 2 (19%): students with high scores on word-level efficiency, text-level, and vocabulary factors and average decoding scores;
Cluster 3 (28%): students with low scores on vocabulary and at or above the mean level for decoding, word level, and text level;
Cluster 4 (26%): students low in all four areas.

These profiles provided support for the idea that non-proficient readers have varied reading strengths and weaknesses.

Lesaux and Kieffer (2010) identified the literacy profiles of 6th grade students who were language minorities and native English speakers in a low-income urban setting. The Gates-MacGinitie Reading Comprehension assessment was used to identify students who scored below the 35th percentile. Subsequent testing included vocabulary, decoding, passage fluency, and working memory assessments. Using latent class analysis, three profiles were generated:

Automatic Word Callers (18.3%): these students are characterized by above-average pseudo-word reading accuracy, substantially below-average vocabulary skills, and average-range fluency skills;
Slow Word Callers (60.3%): these students are characterized by above-average pseudo-word reading accuracy skills, far-below-average vocabulary skills, and low-average fluency skills; and
Globally Impaired Readers (21.4%): these students are characterized by below-average performance on all measures.

An analysis of these profiles found that language status was not a predictor of membership in the profiles.
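
For readers less familiar with latent class analysis, the rough sketch below uses a Gaussian mixture model as a latent-profile-style stand-in for that technique; the file name, measure names, and three-class solution are hypothetical assumptions, and this is not Lesaux and Kieffer’s (2010) implementation.

    # Rough, hypothetical sketch: a Gaussian mixture model as a stand-in for
    # the latent class analysis described above. Names and class count assumed.
    import pandas as pd
    from scipy.stats import zscore
    from sklearn.mixture import GaussianMixture

    measures = pd.read_csv("grade6_measures.csv")[
        ["vocabulary", "decoding", "passage_fluency", "working_memory"]
    ].apply(zscore)

    # Fit a three-class model and assign each student a most likely class.
    gmm = GaussianMixture(n_components=3, random_state=1)
    classes = gmm.fit_predict(measures)

    # Class sizes and mean score patterns, analogous to the reported profiles.
    summary = measures.assign(profile=classes)
    print(summary["profile"].value_counts(normalize=True))
    print(summary.groupby("profile").mean())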

Dennis (2013) evaluated 6th, 7th, and 8th grade students who failed the state reading assessments the previous school year. The researcher included assessments for phonemic awareness, phonics, fluency, vocabulary, and comprehension and identified the study as a purposeful extension of the work of Valencia and Buly (2004) with a different sample. Using cluster analyses, she identified four profiles using three factors, which she labelled meaning, decoding, and rate:

Cluster 1: Slow and Steady Comprehenders (24%): These students have the highest scores in meaning with low scores in decoding and rate;
Cluster 2: Slow Word Callers (26%): These students have the highest scores in decoding, with average meaning scores and low rate scores;
Cluster 3: Automatic Word Callers (24%): These students are highest in decoding, with average rate scores and low meaning scores;
Cluster 4: Struggling Word Callers (26%): These students are highest in rate, with average comprehension scores and low decoding scores.

These profiles demonstrate that non-proficient students have both capabilities and areas that require intervention.

Meyer et al. (2013) evaluated 5th and 6th grade students who were identified as non-proficient because they scored below the 50th percentile on mandated end-of-grade standardized reading tests. The authors assessed word recognition in isolation, oral reading accuracy and rate, vocabulary, and comprehension. They categorized students as high or low on each assessment area (print processing and vocabulary) to develop four profiles:

Cluster 1 (26%): These students have high scores on both print processing measures and vocabulary measures;
Cluster 2 (48%): These students scored high on print processing measures but low on vocabulary measures;
Cluster 3 (12%): These students scored low on print processing but high on vocabulary measures; and
Cluster 4 (14%): These students scored low on print processing and vocabulary measures.

Although the researchers assessed comprehension, it was not directly included as a part of the profiles generated.


Before determining the extent to which each of these studies supports the identification of a common set of profiles or an underlying structure, I examined each study to see if the same dimensions of reading performance were evaluated. For this purpose, I used the five components of reading as identified by the National Reading Panel (NICHD, 2000). Each of these components is interrelated, thereby raising the possibility of students being strong or weak in one or more areas. While researchers have criticized the Panel for its procedures and subsequent policies regarding literacy instruction (Allington, 2009), there is general agreement as to the importance of these components in promoting reading development. They include: (a) phonemic awareness, (b) phonics, (c) fluency, (d) vocabulary, and (e) comprehension. A definition of each component is presented below (Learning Point Associates, 2004):

A. Phonemic awareness—an awareness of and the ability to focus on and manipulate the individual sounds (phonemes) in spoken words. It includes skills such as isolating phonemes, blending onset-rimes, blending and deleting phonemes, adding and substituting phonemes, and segmenting words into phonemes. This part of reading instruction allows students to understand that spoken words are made up of individual sounds.

B. Phonics—the study and use of sound/spelling relationships and syllable patterns to help students read written words. It includes skills such as identifying letters and sounds and blending sounds. This part of reading instruction should not become a dominant component of a reading program, but a means toward the end goal of reading: comprehension.

C. Fluency—reading text with sufficient speed, accuracy, and expression to support comprehension. This includes grouping words into phrases that are easier to read. A lack of fluency requires a reader to use cognitive resources for reading the words that could otherwise be used to make meaning from the text.

D. Vocabulary—the body of words and their meanings that students must know and understand to comprehend text. This includes skills such as using word parts or roots and using context clues to gain the meaning of unknown words. In order for students’ vocabularies to increase, they must come in contact with words outside their current vocabularies.

E. Comprehension—the ability to make meaning, requiring specific skills and strategies, vocabulary, background knowledge, and verbal reasoning skills. Comprehension includes strategies such as comprehension monitoring, asking and answering questions, using prior knowledge, and summarizing what has been read. This is the final goal of reading instruction.

As Table 1 illustrates, there are similarities and differences between the reading profile studies; only Buly and Valencia (2002) and Dennis (2013) assessed each of the National Reading Panel’s (NICHD, 2000) five components of reading in their identification of reading profiles. Based on their factor analyses, phonemic awareness and phonics were included under the label “word identification” (Buly & Valencia, 2002), and vocabulary and comprehension were included under the label “meaning.” For both studies, fluency or rate remained an isolated factor. There are direct similarities between the strengths and weaknesses identified in four of Buly and Valencia’s (2004) profiles and those identified by Dennis (2013). The “Disabled Readers” group (9%) was not identified by Dennis, quite possibly because she included older students, representative of multiple grades, who probably did not have low scores on her word recognition measures because such indices were designed primarily for younger students. For the profile “Slow and Steady Comprehenders,” it is possible that these students were placed in other categories because of the nature of the clustering process. Both Buly and Valencia (2004) and Dennis (2013) used cluster analysis methodology, but there was not a consistent description across studies explaining the exact steps and criteria used to generate the profiles, so it could not be determined whether this was a cause of the difference in the number of profiles generated.


Table 1

Comparison of Areas Assessed in Reading Literature

Buly & Valencia (2002, 2004): phonemic awareness, phonics, fluency, vocabulary, comprehension. Number of profiles: 10, later combined to 6.

Leach, Scarborough, & Rescorla (2003): word recognition, fluency, vocabulary, comprehension. Number of profiles: 4.

Rupp & Lesaux (2006): phonological and syntactic awareness, spelling, word reading speed and accuracy, working memory. Number of profiles: 4.

Pierce, Katzir, Wold, & Noam (2007): decoding, fluency, text-level skills, vocabulary. Number of profiles: 4.

Lesaux & Kieffer (2010): decoding, passage fluency, vocabulary, working memory. Number of profiles: 3.

Dennis (2013): phonemic awareness, phonics, fluency, vocabulary, comprehension. Number of profiles: 4.

Meyer et al. (2013): word recognition, oral reading accuracy and rate, vocabulary, comprehension. Number of profiles: 4.


While these two studies failed to identify a similar number of profiles, they do support the idea of a common underlying structure for non-proficient students. The most common underlying structures were word recognition, fluency, and meaning (comprehension and vocabulary). Based on the nature of the sample and analytical procedures, researchers might discover additional profiles, particularly if the two components of meaning (vocabulary and comprehension) formed separate factors. A similar argument could be made if researchers evaluated the reading profiles of beginning readers and the components of word recognition (phonemic awareness and decoding) split into separate factors. Regardless, at this point, Buly and Valencia (2002) and Dennis (2013) identified multiple profiles, and their findings provided an underlying structure by which educators could develop more multi-dimensional intervention programs to meet non-proficient students’ reading needs.

Each of the remaining studies identified different profiles based on how the authors assessed reading. For example, Leach, Scarborough, and Rescorla (2003) and Rupp and Lesaux (2006) assessed fluency, but neither study found a separate factor for this construct. Furthermore, studies by Pierce, Katzir, Wold, and Noam (2007); Lesaux and Kieffer (2010); and Meyer et al. (2013) used different assessments for comprehension and word recognition. Despite these differences in how constructs were assessed across each of these studies, an underlying structure still appeared whereby non-proficient students had difficulty with one or more of the identified areas.

My examination of these studies showed that no one set of profiles represented all non-proficient readers; there are multiple profiles, with students’ performances varying across the dimensions of word recognition, meaning (vocabulary and comprehension), and fluency. There is heterogeneity within the non-proficient student classification, and this finding requires the development of corresponding interventions to address the unique needs of students. What still needs to be discovered, however, is whether different motivational profiles exist among categories of non-proficient students, similar to what was discovered in reading by Buly and Valencia (2002) and Dennis (2013). This link between motivation and reading is important because numerous studies document the critical role of motivation in understanding students’ academic performances (Mazzoni, Gambrell, & Korkeamaki, 1999). The next section will be used to evaluate existing studies focused on motivation for reading.

Review of Motivation Profiles Literature

Consistent with expectancy/value motivation theory, students’ willingness to invest time and effort in academic studies depends on their expectations for success and the perceived value of achievement (Atkinson & Feather, 1966; Eccles et al., 1983; Heckhausen, 1977). Researchers view motivation as complex and domain-specific (Paris & Turner, 1994; Wigfield, Guthrie, Tonks, & Perencevich, 2004) and as a multifaceted process, inclusive of choices and beliefs (Watkins & Coffey, 2004; Wigfield, Guthrie, Tonks, & Perencevich, 2004). This process explains why students either approach or avoid a task and the reasons for their engagement or lack thereof. Thus, on a daily basis, motivation helps teachers to understand what attracts a student to start, continue, end, or avoid an activity (Graham & Taylor, 2002). It is a key factor in understanding students’ choice to read, beliefs about reading, and the value they place upon the act of reading (Cambria & Guthrie, 2010; Eccles, Wigfield, & Schiefele, 1998; Schiefele, 1999). Similar to reading profile research, teachers who employ a one-size-fits-all model to promote motivation quite possibly fail to acknowledge its multifaceted nature, thereby ignoring the individualized needs of some students (Valencia & Buly, 2004).

Motivation researchers have used expectancy/value theory to conceptualize approaches for understanding the non-cognitive factors impacting student achievement. One popular approach focuses on learned helplessness, defined as a lack of persistence in tasks that could realistically be mastered, usually because of a lack of effort caused by repeated failures (Luchow, Crowl, & Kahn, 1985). Such behaviors are problematic because when students lack persistence, they give up and thus have minimal chance for success. Another example is the study of anxiety. People experience high levels of anxiety when they believe that they are not competent to perform a certain behavior (Stumpf, Brief, & Hartman, 1987), which interferes with their ability to attend, thereby having a negative influence on their beliefs and efficacy for learning. Both examples, learned helplessness and anxiety, show how students who lack expectancies for success and do not value learning become alienated from their academic studies. This lack of engagement leads to failure and eventually to being labeled as non-proficient. While these approaches help educators to understand the behaviors and attitudes of non-proficient students, they are not specific to any particular discipline; thus, we do not know how these profiles apply to reading.

Wigfield and Guthrie (1995) used expectancy-value theory to bridge the gap between reading and motivation research (Atkinson & Feather, 1966; Heckhausen, 1977, 1991). They developed the Motivations for Reading Questionnaire (MRQ) to define and evaluate students’ expectations and values regarding their motivation for reading. Initially, this measure had 82 items, most of which were taken from Eccles’ Achievement Motivation Research Project (Eccles et al., 1983): ability and efficacy beliefs, subjective task values, achievement goals, and intrinsic motivation, along with items related to attitudes about reading and motivation for reading.

In discussing motivation for reading, I will focus mainly on three studies. I will summarize motivation prior to returning to these studies to look at how they attempted to bridge the gap between motivation and reading. In the first study, Wigfield and Guthrie (1997) studied 4th and 5th graders across two semesters and identified 53 items and 11 constructs using the MRQ. The constructs and their definitions are presented in Table 3 (Wigfield, 1997). Using factor analysis, the researchers determined the existence of three higher-order dimensions of motivation from these 11 constructs: Extrinsic Motivation (social, efficacy, involvement, curiosity, recognition, and challenge), Intrinsic Motivation (compliance, grades, recognition [spring], and importance), and Competition and Work Avoidance (Wigfield & Guthrie, 1997). The dimensions’ structure was relatively stable across the two semesters.

In the second study, Baker and Wigfield (1999) extended the initial work of Wigfield and Guthrie (1997) by directly examining links between motivation and reading achievement, examining differences in motivation based on student characteristics, and determining what motivational profiles exist for students by using data from the MRQ, the Reading Activity Inventory, the Gates-MacGinitie Reading Test, the Comprehensive Test of Basic Skills (a standardized assessment), and a performance assessment. They used confirmatory factor analysis of their 5th and 6th graders’ data to validate the identification of the 11 constructs of Wigfield and Guthrie (1997).

Cluster analysis of the MRQ placed students into seven clusters (profiles):

1. Very Low Reading Motivation (n=14, 4%): The students in this cluster are characterized by low scores across all constructs except work avoidance, in which they scored the highest, just below the mean.

2. Low Reading Motivation (n=40, 11%): The students in this cluster had low ratings across nine constructs with the exception of work avoidance, which was the highest in this cluster. Their scores for competition were slightly below the mean.

3. Low Competition, Efficacy, and Recognition (n=28, 8%): The students in this cluster had the lowest scores in competition, efficacy, and recognition and slightly below-average scores for compliance, with the remaining constructs having average scores.

4. Low Importance (n=28, 8%): The students in this cluster had average scores on eight constructs. Importance scores were lowest, well below the mean, and competition scores were slightly below the mean, with social scores falling just above the mean.

5. Competitive and Work Avoidant (n=80, 21%): The students in this cluster have average scores on eight constructs. Challenge scores were slightly below the mean, with competition and work avoidance scores falling just above the mean.

6. Low Competition and Work Avoidance; High Importance and Compliance (n=58, 15%): The students in this cluster had average scores on seven constructs. Work avoidance and competition scores fell slightly below the mean, with compliance and importance scores slightly above the mean.

7. High Reading Motivation (n=123, 33%): This cluster contained the largest percentage of students, who had scores above the mean in all areas, with the exception of work avoidance, which fell slightly below the mean.

These clusters underscore the heterogeneity of students’ motivation for reading, as students were somewhat evenly split among the clusters, with no cluster containing more than 33% of the sample. Students in the first two clusters, labeled Very Low and Low Reading Motivation, were characterized by high work avoidance scores. Students in the last two clusters, Low Competition and Work Avoidance and High Reading Motivation, were the opposite, with low work avoidance scores. The remaining clusters varied by levels of competition, need for recognition, importance attached to reading, and efficacy. Only in one cluster, Low Competition and Work Avoidance; High Importance and Compliance, did students demonstrate high scores on all the positive reading indices.

In the third study, Guthrie, Coddington, and Wigfield (2009) determined that

existing motivation literature lacked a specific focus on the constructs (intrinsic

motivation, avoidance, self-efficacy and perceived difficulty) that contribute to positive

motivation or undermine motivation for reading. With this in mind, they wanted to

capture and better understand the relationships between these constructs. They evaluated

5th grade African American and Caucasian student responses for only two of the four


motivation dimensions, intrinsic motivation and work avoidance, to identify four profiles. Using a different analytical approach, the researchers ordered students’ scores from highest to lowest and then separated students into distinct high and low groups by splitting the scores at the median. This approach allowed the researchers to form profiles consisting of clear, independent constructs. They identified four profiles:

1. Avid (high intrinsic and low avoidance) students who have reading interests,

enjoy reading in and out of school, and do not avoid school reading.

2. Apathetic (low intrinsic and low avoidance) students who are low on intrinsic

reading and avoidance of reading.

3. Ambivalent (high on intrinsic and high avoidance) students who have intrinsic

motivation for some texts but not others with avoidance of reading high for

some kinds of reading.

4. Averse (low intrinsic and high avoidance) students who are actively opposed

to most kinds of reading and possess few reading interests.

The methodology used allowed for clearly defined profiles. Each profile contains

students with either high or low scores on each of the two constructs.
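To make this median-split logic concrete, the following is a minimal sketch in Python with hypothetical intrinsic motivation and avoidance scores; it illustrates the general approach described above and is not the authors’ actual analysis code.

import pandas as pd

# Hypothetical mean scores (1-4 scale) for a handful of students on the two
# dimensions used by Guthrie, Coddington, and Wigfield (2009).
scores = pd.DataFrame({
    "intrinsic": [3.2, 1.8, 3.5, 2.1],
    "avoidance": [1.5, 3.4, 3.1, 1.9],
})

# Split each dimension at its median; ties are assigned to the "high" group here,
# which is an assumption rather than a detail reported in the study.
high_intrinsic = scores["intrinsic"] >= scores["intrinsic"].median()
high_avoidance = scores["avoidance"] >= scores["avoidance"].median()

def profile(hi_intrinsic, hi_avoidance):
    # Cross the two splits to form the four profiles listed above.
    if hi_intrinsic and not hi_avoidance:
        return "Avid"
    if not hi_intrinsic and not hi_avoidance:
        return "Apathetic"
    if hi_intrinsic and hi_avoidance:
        return "Ambivalent"
    return "Averse"

scores["profile"] = [profile(i, a) for i, a in zip(high_intrinsic, high_avoidance)]
print(scores)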

Across the three studies, given the consistency with some constructs and the

stability of their findings at different points of time, these results underscored the

potential for the MRQ to develop specific profiles for different types of non-proficient

readers. The one challenge to developing these profiles relates to the types of reading

measures included or not included in these studies.


In the first study, Wigfield and Guthrie (1997) used the Reading Activity Inventory, a measure of the breadth and depth of students’ personal reading habits, as their reading component. They used this instrument because such behaviors were a

strong predictor of reading achievement. They found the social, self-efficacy, curiosity,

involvement, recognition, grades, and importance constructs had the strongest

relationship to reading activity (Wigfield & Guthrie, 1997). Because this measure looked

at personal reading habits instead of actual classroom performances, it is of limited use

for identifying different profiles in the classroom for non-proficient readers.

In the second study, Baker and Wigfield (1999) used the Reading Activity

Inventory, the Gates-MacGinitie Reading Test, the Comprehensive Test of Basic Skills

(Standardized Assessment) and a researcher-developed performance assessment, where

students read passages and answered open-ended questions to measure reading

performance. With the personal reading habits measure, the results were clear: students who had the greatest breadth and depth in their reading habits had more positive motivational outcomes than did students who read less widely or frequently. With the

formal reading measures, no differences were found between the clusters. Finally, with

the performance measure, only one difference was discovered: students in the sixth

profile significantly outscored students in the other profiles.

In the third study, Guthrie, Coddington, and Wigfield (2009) used the Gates-MacGinitie Reading Test, the Woodcock-Johnson Fluency Test, and a researcher-developed word recognition test to measure reading performance. The only finding was a positive relationship between scores on the Gates-MacGinitie and intrinsic motivation for the


Caucasian students. No such relationship was found for the African-American/Black

students.

The existing research has supported the need to understand non-proficient

students from both reading and motivation for reading perspectives. What is missing is

an understanding of which non-proficient students should be the focus of research. The

next section will examine the research to determine which groups of students are not

adequately represented and why they should be the focus of future research.

Need for Marginalized Groups

While there is evidence of a need to understand non-proficient students from both

a cognitive and non-cognitive perspective, there is a need to understand specific groups

of these students, particularly those students who have not performed well despite

schools’ efforts to address their needs. Au (2009) referred to students, primarily minority

students with families who lack economic resources, as residing in “zipcode” schools

because their geography was a better predictor of their overall achievements in public

schools than any other measure of their performance. One possible explanation for this

effect is the nature of our reforms to improve their performances, in that, students might

need interventions that go beyond the present focus on improving word recognition

abilities. As a result, I argue that we need accurate evaluations of their reading strengths

and weaknesses if we are to break this cycle of underachievement for these student

populations.

Since the Coleman Report in 1966, the distinguishable performance gap identified between Caucasian and African American students has been a concern for


educators, researchers, and policy makers (Clotfelter, Ladd, & Vigdor, 2009). While the

term achievement gap carries many negative connotations, it is used here because it is the term the cited researchers used to document the gap. The preferred term would be opportunity gap, as it removes the deficit perspective embedded in the other term. As well, the

achievement gap has been noted between students from low and high socioeconomic

backgrounds (Milner, 2013). This achievement gap is wide and despite numerous efforts

and initiatives, continues to exist (Lee, Grigg, & Donahue, 2007). NCLB placed a strong

focus on the differences in achievement by identifying equal performance requirements

for each group of students that must be met each year through minimal growth

requirements. To close the performance gap for minority students, schools must provide

high quality instruction (Au, 2009) and understand the performances represented in each

group. Without a direct focus on understanding what reading and motivational constructs

are represented in these groups, there will continue to be interventions and strategies that

fail to address the needs of students from these marginalized groups. To understand the

need for representation of certain students, a few questions must be answered. Is there an opportunity (achievement) gap? Which specific groups are truly a part of this gap?

The most recent NAEP assessment results (National Center for Education Statistics, 2015) showed a one percentage point decrease (4th grade) and a two percentage point increase (8th grade) in the percentage of students scoring below proficient on this assessment since the previous

assessment. Despite continued efforts at school reform via mandates, over half of our

students are still unsuccessful on reading assessments. Although we see a lack of success


overall, there are differences in achievement for many of the groups who are identified

through the NCLB legislation as marginalized groups. To visually represent the trends of

performance and the gaps that exist between the marginalized and non-marginalized

groups, I created the line graphs representing differences in proficiency for minority and

non-minority groups. These graphs represent the fourth-grade reading proficiency

performance on the NAEP from 2002, the first assessment prior to NCLB, through 2015,

the latest administration of the NAEP (National Center for Education Statistics, 2015).

The first graph represents the performance trends for Caucasian, African American, and

Hispanic students.

Figure 1 illustrates several key accountability patterns related to ethnicity. First,

there is an increase in the performance for the three groups from 2003 through the last

administration in 2015. Second, while there has been an increase in performance for

these three groups, the achievement gap between Caucasian students and ethnic minority

students continues to exist. The achievement gap since 2002 between Caucasian students and the ethnic minority groups has decreased minimally, but at no point after the implementation of NCLB in 2002 did the performance level for African American or Hispanic students meet the baseline of Caucasian students. Last, between the previous (2013) and current (2015) administrations of the NAEP, there was no growth noted for either the Caucasian or African American groups, with a single percentage point increase for Hispanic students.


Figure 1. NAEP Trends Ethnicity Comparisons

Need for Economically Disadvantaged Students

Similar to the first graph, Figure 2 identifies some key accountability patterns

related to economic status. First, there has been significant positive growth since 2003

for both economically advantaged and economically disadvantaged students. Second,

while there has been growth noted, there is still a significant achievement gap between

students representing these two economic categories. Students who are from

economically disadvantaged situations continue to perform at lower levels despite

continuous legislation. At no point did students in the lower economic group match the

baseline performance of the higher economic group. Last, although there is growth,

students from economically disadvantaged situations are not growing at a rate

comparable to their counterparts. Economically disadvantaged students have grown a

total of 6% in the 12 years since NCLB, while their counterparts have grown 10%.


The achievement gap from assessment data is a reality for both economically

disadvantaged and African American students. Despite continuous reforms and

legislation, the gap does not appear to be closing, but is larger than when initially

assessed via NAEP reading assessments. Therefore, it is essential to include these

marginalized populations in research to better understand their patterns of cognitive

and/or non-cognitive factors. It is essential because policy effects and responses must be

carefully studied to understand the impact on closing achievement gaps for marginalized

groups.

Figure 2. NAEP Trends Economic Comparisons


Are These Groups Represented?

I will use this next section to evaluate some of the existing literature on non-proficient readers and motivation for reading to determine whether marginalized students, such as African American, Hispanic, and economically disadvantaged students, were adequately represented. Although NCLB provides a direct focus on several marginalized groups, for purposes of this study, I will place emphasis on the representation of ethnic minority and economically disadvantaged students, as these groups have shown historically wide achievement gaps with their counterparts (see the NAEP graphs above).

Buly and Valencia’s (2002) work has served as a guide for many later research studies using reading profiles to understand non-proficient students. Neither of their publications adequately identified the characteristics of their students needed to determine whether they included the previously stated populations. The sample was taken from one school

district from the northwestern part of the U.S. with 57% of the students Caucasian and

43% noted as students of color. From this 43%, 11% were Hispanic and 11% were

African-American. Last, 47% of the students were considered economically

disadvantaged. While this study has great implications for research and practice, it

described the district’s characteristics and not the sample’s.

Leach, Scarborough, and Rescorla (2003) used eight literacy measures of students’ word recognition, fluency, vocabulary, and comprehension abilities with 3rd, 4th, and 5th grade students (n=161) with reading difficulties. The researchers split

students into groups with high and low combinations of word recognition and


comprehension performance to identify four profiles representing the patterns of these

students. While their study had a unique sample in the inclusion of students with

exceptional needs, their study did not adequately address other marginalized groups. Of the 12 elementary schools represented by the 161 students, only half had fewer than 9% economically disadvantaged students; the other half ranged from 12 to 60% of students from economically disadvantaged households. Of the sample, 95% of the

students were Caucasian, and the remaining percentage was a mix of African American,

Asian, and Hispanic. When looking at marginalized groups identified from NCLB, the

study did address students with special needs; however, it did not adequately represent

the patterns of performance for students who are minorities nor those who are

economically disadvantaged.

Pierce, Katzir, Wolf, and Noam (2007) evaluated urban 2nd and 3rd grade students

who scored more than two-thirds of a standard deviation below the mean on one of the

subtests and/or composites of the Test of Word Reading Efficiency (TOWRE)

assessment. They used factor analysis and a high/low split of scores to form their four

profiles. The 140 students in their study were from five schools between Phoenix and

Boston. The sample was composed of 60% Hispanic and 12% African American

students. This is not an adequate representation of African American students. Additionally, while the researchers described students’ income levels by sharing the percentage of mothers and fathers who earned less than $35,000 a year, this information was not linked to specific populations. To interpret poverty level, readers would need to know the family size as well as the specific amount each parent earns; therefore, this


information was not considered adequate to represent the economically disadvantaged NCLB group.

Lesaux and Kieffer (2010) identified the literacy profiles of 6th grade students who were language minorities and native English speakers in a low-income urban setting.

Their sample of 262 represented students from five middle schools and one elementary

school where 201 were language minority students (English as a Second Language) and

61 were native English speakers. The six schools had low-income populations ranging

from 44% to 100%, and 10% of the sample had special education designations. When

thinking about marginalized groups, this study adequately addresses the English Language Learner (ELL) population; however, the numbers within each ethnic group are unknown. The inclusion of students from economically disadvantaged populations was difficult to infer because it was not specifically identified.

Dennis (2013) evaluated 6th, 7th, and 8th grade non-proficient students who failed

the state reading assessments the previous school year. The sample for the study

included 94 middle school students from four middle schools. Of those 94, 56% were

Caucasian, 36% were African American, and 7% Hispanic. The percentage of students

from economically disadvantaged households was 82% with 36% receiving special

education services and 10% being classified as English learners. The representation of

African American students was marginal, which did not provide adequate representation

to understand the reading constructs present within the group. The sample does

adequately represent students from economically disadvantaged groups. With 82%


represented, these findings do have some generalizability to other students from

economically disadvantaged groups.

Meyer et al. (2013) evaluated 5th and 6th grade students who were identified as non-proficient on a mandated end-of-grade standardized reading test. The sample was described in minimal detail: the 65 students were from a small, rural school system with a total of 1,800 K-12 students. No information related to ethnicity or socio-economic

status was included.

The motivation studies did not provide adequate descriptions of their samples.

Baker and Wigfield (1999) included a sample of 371 students from five schools from a

large mid-Atlantic U.S. city. They did not have ethnicity and income level information

for 75 students due to a school removing itself prior to the final data collection. For the

sample information known, 52% were Caucasian, 46% were African American, and 54%

were economically disadvantaged. Additionally, they did not explain the percentage of

economically disadvantaged within the sample. Guthrie, Coddington, and Wigfield

(2009) included 245 5th grade students from three schools in a mid-Atlantic state. The

sample was 76% Caucasian, 24% African American, and 10% received special education

services. No information was reported to describe the income status of the participants.

Wigfield and Guthrie’s (1997) sample was 70% Caucasian and 30% African American students, and income status was not reported.

While each of these studies added pertinent information to the literature in their

respective areas, do they adequately represent the marginalized groups who have been

identified as non-proficient students? The answer is somewhat. There were positive and


negative implications from existing literature. First, the studies offered some adequate representation of students from certain marginalized groups, including students with special needs, Hispanic students, ELLs, and economically disadvantaged students; however, this representation was inconsistent across studies. Second, several studies failed to describe their specific sample or made only general references to the population, which limits knowledge of who the actual samples included. Next, with the exception of one study, all drew their samples from three or more schools, and some of those schools did not have high percentages of minority or economically disadvantaged students. Last, across all of the studies, there was not an adequate representation of African American students.

Recommendations

While these studies supported the notion of non-proficient students in reading

demonstrating variability in their motivational profiles, there are several caveats

regarding what needs to be accomplished in order to more accurately identify the

underlying dimensions of non-proficient readers’ motivational profiles. The following

recommendations are related to identification of these inclusive profiles.

First, based on its use in existing research studies, the MRQ is the best measure to

use when assessing the motivational profiles of non-proficient elementary grade students.

With the complexity of motivation, it needs to be assessed with instruments that are

specific to the content area of reading and age/grade appropriate. The MRQ encompasses

a holistic representation of a student’s motivational dimensions for reading through its


comprehensive measures. With the strong prevalence of the MRQ for developing an

understanding of students’ motivation for reading, it can serve as the foundation for

understanding the motivational components of non-proficient students.

Other researchers (e.g., Chapman & Tunmer, 1995; Gambrell, Palmer, Codling, & Mazzoni, 1996; McKenna, Kear, & Ellsworth, 1995) developed similar

instruments to assess motivational constructs. Each of these instruments has smaller

subset scales that are directly related to those of Wigfield and Baker (1997). The MRQ

has been utilized in research on reading motivation with students in the upper elementary grades and early adolescence to examine and determine the dimensions that exist for children’s motivations for reading. Given its coverage of these areas, it is the most relevant instrument for developing a better understanding of students’ reading values, beliefs, and behaviors in grades 3-5.

Second, similar to reading profile research, what is important is not the number of

profiles but the underlying structures of constructs. There is a need to include all the

items on the MRQ to allow underlying motivational dimensions to be revealed through

the appropriate methodology. Therefore, I believe a confirmatory factor analysis should

be used with the MRQ and exploratory factor analysis with reading assessments, similar

to what has been done by previous researchers. Using factor analysis, Wigfield and

Guthrie (1997) confirmed the existence of their suggested 11 motivation for reading domains with their instrument, while Buly and Valencia (2002) and Dennis (2013)

identified underlying constructs of meaning, fluency, and decoding. While previous

researchers identified these constructs and dimensions, are they consistent across multiple


samples? Watkins and Coffey (2004) found that there is variability within the way the

motivational constructs are represented in students. Along these lines, the same

variability could exist for the reading factors.

Third, a specific recommendation based on existing motivational profiles is that

profiles be examined that specifically reflect the motivational values of non-proficient

students. The existing profiles focus on groups defined on an ethnic or economic basis without a focus on non-proficient students as a distinct group. Future research should identify students

who are non-proficient with a goal of understanding the relationship between their

motivation and literacy profiles.

Fourth, future studies should include certain elements related to non-proficient students. The current studies seek to identify causes of disparities for non-proficient students, but to truly address the demands of NCLB and other accountability legislation, they should include students from economically disadvantaged homes, place an emphasis on ethnic minority students, and include students from schools with high percentages of economically disadvantaged students. The achievement gap is impacted by school experiences and research findings (Burchinal et al., 2011); therefore, researchers must include students from these specific marginalized populations with historic achievement gaps.

The final recommendation is to link motivation with reading. Without this

connection, educators will not understand how these two important areas are linked. A


fuller understanding of this connection will help educators to differentiate their

instruction.


CHAPTER III

METHODOLOGY

Research Design

The goal of the study was to identify reading and motivational profiles of non-

proficient marginalized students in zipcode schools, where schools within geographical

areas tend to represent a specific socio-economic status and ethnic makeup. With this

goal, I planned to discover different patterns of reading and motivation for reading of this

sample of students. The research questions suggested a quantitative study using several

multivariate analysis techniques to generate and analyze the reading and motivational

profiles of non-proficient students across the upper elementary grades. The research

questions that guided this study were as follows:

1. What trends of reading and motivation for reading are represented in 3rd, 4th

and 5th grade non-proficient students?

2. What underlying motivational and reading constructs represent non-proficient

3rd, 4th and 5th grade students?

3. What are the reading profiles of 3rd, 4th and 5th grade non-proficient students?

4. What are the motivation profiles of 3rd, 4th and 5th grade non-proficient

students?

5. What are the reading and motivation profiles of 3rd, 4th and 5th grade non-

proficient students?


6. What do these profiles tell us about the instructional needs of 3rd, 4th and 5th

grade non-proficient students?

Participants

The results of accountability legislation continue to directly impact students in

“zipcode” schools with large numbers of marginalized students who are living in poverty

and considered ethnic minorities (Gaddis & Lauen, 2014; Hursh, 2007; Jackson, Johnson,

& Persico, 2010). While these groups should be the focus of much research with reading

profiles, analysis of previous research demonstrated a lack of adequate representation of

these groups. As such, these schools and these students must be represented in research

to show the patterns that represent their reading and motivations for reading. Therefore,

my sample for this dissertation was purposefully representative of several of these

populations. For purposes of this study, non-proficient 3rd, 4th, and 5th grade students are

described as those who received a score of level 1 or 2 (out of 5) on the English

Language Arts End of Grade assessment for the previous academic year.

After university and district approval, contact was made with the principal and

administrative team of the identified schools. To follow FERPA and district regulations,

the schools were provided with recruitment letters and permission forms to share with

students who were eligible for the study (Scoring level 1 or 2 on the English Language

Arts End of Grade assessment). School administration determined eligible students based

on the non-proficient criteria and shared the forms with parents of those students as part

of the recruitment process. Upon return of the permission slips, students began

participation in the study.


The students were assessed by the author and his doctoral supervisor. Students

were assessed during times designated by the administration that limited distractions

from instructional time, including their intervention/enrichment and

specials/enhancements times. All measures were administered to students in three to four sessions. During session one, students signed the assent form and then completed the CTOPP, QRI word list, and PPVT assessments. In the second session, students completed the narrative and expository QRI passages. In the third session, conducted in a group setting, I administered the Motivation for Reading Questionnaire to students. The maximum time for assessment completion was two hours, with data collection occurring between January and March 2017.

The research study was conducted in schools in two neighboring school districts

in North Carolina. Three schools were selected for participation to acquire as many

participants as possible who fit the desired criteria of “zipcode” schools with high percentages of minority and low socioeconomic status populations. The schools are in close proximity to each other. All students in each school received free or reduced-price breakfast and lunch. One school was selected from District A and two from

District B. District A is a large school district that includes 40.62% Black/African-

American, 33.4% Caucasian, 15.2% Hispanic, 6.3% Asian, and 4.5% other race/ethnicity.

District B is smaller with 62.5% White, 20.2% Black/African-American, 11.4% Hispanic,

and 5.9% other race/ethnicity.

Demographic data summarizing the study were generated using SAS software

(SAS Institute, 2013). In District A, School 1 had a student population of 569 students:


67% Black, 17% Hispanic, 12% White, and 4% other, with all students receiving free lunch. The sample is representative of the overall school population, with the majority of the students identified as Black (81%) and the remaining 19% made up equally (6.3% each) of White, Hispanic, and Multi-racial/Other students. The sample contained an

unequal ratio of male (40.6%) to female (59.4%) students. When examining the grade

and age frequencies, almost half of the sample consisted of third graders (46.9%) with

fourth (31.3%) and fifth grades (21.9%) representing the rest of the sample.

In District B, the superintendent selected two of the district’s lower-performing schools that were in close proximity to each other and included high percentages of minority non-proficient students. School 2 has a student population of 540 students: 39.3% White, 31.5% Hispanic, 22.5% Black/African-American, and 6.7% Multi-racial/Other, with all students receiving free lunch. The sample is similar to the overall school population: the majority of the students (68.9%) identified as Hispanic (36.4%) or White (32.5%), with the remaining 31.1% split between Multi-racial/Other (16.9%) and African-American/Black (14.3%). The sample contained an unequal ratio of male (50.6%) to female (49.4%) students.

School 3 has a student population of 417 students: 57.1% Black/African American, 11.3% Hispanic, 22.3% White, and 9.3% Multi-racial/Other, with all students receiving free lunch. The sample is representative of the school population, with the majority of the students identified as Black (63%) and the remaining 37% made up of two of the other major ethnic groups: White (28.3%) and Hispanic (8.7%). The sample contained an unequal ratio of male (39.1%) to female (60.9%) students.


The combined sample included 187 non-proficient readers. The majority of the

sample were considered ethnic minorities (77.5%). Approximately half of the sample

were African-American/Black students (49.2%). The remaining half consisted of the

other three groups: White (22.5%), Hispanic (19.3%) and Other (9.1%). The sample

contained an unequal ratio of male (44.4%) to female (55.6%) students.

When examining the grade frequencies, almost half of the sample consisted of third graders (44.4%), with fourth (28.9%) and fifth (26.7%) graders almost equally representing the rest of the sample. The frequencies demonstrated the uniqueness and

importance of the sample in comparison to samples used within existing studies in that

this population primarily includes students of color from families with minimal economic

resources. A sample of this type is valuable as it is absent from existing research.

Measures

Reading is a complex process. Acknowledging this complexity, researchers have

assessed a variety of reading components, using a variety of assessments, to better

understand the patterns of students’ reading performance. With the need to focus on non-

proficient students, specifically ethnic and economic minorities, it is my belief that

assessments should be reflective of the five components of reading, as identified by the

National Reading Panel (NICHD, 2000). Each of these components is interrelated,

thereby raising the possibility of students being strong or weak in one or more areas.

While researchers have criticized the Panel for its procedures and subsequent policies regarding literacy instruction (Allington, 2009), there is general agreement among literacy researchers as to the importance of these components for promoting reading development. They


include: (a) phonemic awareness, (b) phonics, (c) fluency, (d) vocabulary, and (e)

comprehension. Each component is described below (Learning Point Associates, 2004).

As identified in Chapter 2, only the work of Buly and Valencia (2002) and Dennis

(2013) assessed all five components in the creation of their reading profiles. An analysis

of their samples revealed neither of their samples adequately addressed African American

students. However, Dennis’ sample did include students identified as economically

disadvantaged. Therefore, to extend their work, I addressed the missing diversity within

the existing samples by including larger numbers of ethnic minority students (>50% of

the sample) and those students representing economically disadvantaged backgrounds.

Similar to the analysis of reading work, the existing motivational work fails to

adequately address African American and economically disadvantaged groups. In the

study by Baker and Wigfield (1999), the actual composition of the sample cannot be known given the large amount of missing demographic information. Although there was some

representation of these marginalized groups, it was not an adequate representation.

With the complexity of motivation, it needs to be assessed with instruments that

are specific to the content area of reading and age/grade appropriate. The MRQ

encompasses a holistic representation of a student’s motivational dimensions for reading

through its comprehensive measures. With the strong prevalence of the MRQ for

developing an understanding of students’ motivation for reading, it can serve as the

foundation for understanding the motivational components of non-proficient students.

Based on its use in existing research studies (Baker & Wigfield, 1999; Guthrie et al., 2007; Unrau & Schlackman, 2006; Wang & Guthrie, 2004), the MRQ is the best measure


to use when assessing the motivational profiles of non-proficient elementary grade

students. The MRQ has been utilized in existing motivation research; however, the

instrument needs to be used with different samples to determine which of the suggested

domains are consistently represented (Yuan, 2005). Although the instrument has

theoretical support, the presence of each domain was validated using statistical analysis

to determine which were represented with this unique population of students.

In the next sections, I will present information on the suggested instruments to use

when collecting data to generate the reading and motivational profiles. A table is

included at the end of this section to summarize each area that will be assessed, the

assessments used, and the scores generated from the assessments (see Table 2). I will

start with summary information on each assessment. This will include an overview of the

reading construct(s) assessed by the instrument, how the assessment is administered, and

reliability and validity information for the instrument (if available). The assessments will

be summarized starting with the lower level reading skills moving to higher levels, then

finishing with motivation.

Phonemic Awareness

Phonemic awareness is the ability to recognize, identify, and manipulate sounds

in words. The Elision Subtest of the Comprehensive Test of Phonological Processing

(CTOPP) was used to measure phonemic awareness as it assesses the advanced levels of

the phonemic awareness developmental spectrum. This test required students to isolate

or remove syllables or phonemes within 20 spoken words (Wagner, Torgesen, & Rashotte, 1999). Although traditionally administered to students ages 5 and 6, the


Elision subtest has a form for older students, allowing it to be administered to ages 7-24. Test reliabilities for the assessment range from .74 to .97 for the subtests, and internal consistency coefficients for all composites of the assessment were .85 and above.

Comprehension

The Qualitative Reading Inventory-4 (QRI) (Leslie & Caldwell, 2006) is a

commercial informal reading inventory that determines narrative and expository reading

levels for students. The QRI-4 can be used for multiple purposes. For this study, it was

used to assess fluency (reading text with sufficient speed to support comprehension), as a

measure of students’ rate in word identification (word list and accuracy), and as a

measure of comprehension (ability to make meaning from the text read). The fluency and accuracy data were used to determine proficiency for each grade level text read; therefore, for a student to achieve a grade level, they must have had sufficient accuracy and fluency on that level of text. For this dissertation, only the word list grade equivalent

scores and the highest instructional comprehension level grade equivalent scores for

expository and narrative texts were reported and analyzed.

The appropriate starting level for narrative text was determined by having students read aloud word lists representing grade level words until they were no longer able to read the list with at least 90% accuracy. For expository text, based on pilot study data collection, the assessors used their judgment to start at the student’s finishing narrative level or one level below. For each text read, the assessor determined

the prior knowledge of the student for the text topic by asking the concept questions


followed by the student reading aloud predetermined narrative and expository passages.

All reading errors and times for reading were recorded to determine accuracy and rate.

The rate was not included in this study as this measure does not identify unique

information about students’ actual accuracy and rate independent of the text read. Upon

completion of oral reading, the student was asked a series of explicit (clearly stated in the

text) and implicit (implied or suggested, but not specifically stated in the text) questions.

The students read passages until the highest instructional reading level was determined

for each student with a minimum of 90% accuracy and 70-85% of the comprehension

questions answered correctly. Leslie and Caldwell (2006) reported inter-rater reliabilities

of .99 for oral reading miscues and .98 for comprehension. For data collection purposes,

the passages used span from QRI levels one to six.
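As a hedged illustration of the instructional-level decision rule just described (at least 90% oral reading accuracy and 70-85% of comprehension questions answered correctly), the short Python sketch below applies those thresholds to hypothetical per-level results; the values and structure are assumptions for demonstration only, not QRI scoring software.

def is_instructional(accuracy, comprehension):
    # Criteria taken from the description above: minimum 90% accuracy and
    # 70-85% of comprehension questions answered correctly.
    return accuracy >= 0.90 and 0.70 <= comprehension <= 0.85

# Hypothetical results for one student: QRI level -> (accuracy, comprehension).
results = {1: (0.98, 0.90), 2: (0.96, 0.80), 3: (0.93, 0.75), 4: (0.88, 0.60)}

instructional_levels = [level for level, (acc, comp) in results.items()
                        if is_instructional(acc, comp)]
highest_instructional = max(instructional_levels) if instructional_levels else None
print(highest_instructional)  # 3 for this hypothetical student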

Fluency

Fluency is a student’s ability to read text with sufficient speed, accuracy and

expression to support comprehension. The Dynamic Indicators of Basic Early Literacy

Skills (DIBELS) Oral Reading Fluency (DORF) (Good & Kaminski, 2011) assessment

was used to measure fluency of grade level text. Although the QRI can be used to

measure fluency of instructional level text, there is a specific need to understand student

reading fluency of grade level text with the demands of the ELA Common Core State

Standards (2010). The DIBELS DORF is a standardized, individually administered test

of accuracy and fluency with a set of three passages that are meant to identify children

who may need additional instructional support. It is also used to monitor progress of

students over time with grade level text. Each assessment measures the number of


correct words and the accuracy of a student’s reading of one minute of grade level text. Student performance is measured by having students read three passages aloud for one minute each, followed by the student retelling what was read. Words omitted or substituted and hesitations of more than three seconds are scored as errors, while self-corrections are

scored as accurate. The number of correct words per minute from the passage is the oral

reading fluency rate. The DORF assessment is required for students in grades K-3 as a part of the state assessment framework; however, two of the schools currently administer the assessment to students in grades K-5, while School 2 does not administer it to all students in K-5. The researchers administered this assessment to the students.

Good and Kaminski (2002) reported test-retest reliabilities for elementary

students ranging from .92 to .97 with alternate-form reliability of different reading

passages drawn from the same level ranging from .89 to .94. Criterion-related validity has been found in the range of .66-.77 for words correct and .54-.68 for accuracy. This

assessment was used to measure phonics (word recognition) and fluency (accuracy) for

the participants.

Vocabulary

Vocabulary has been shown to be a predictor of reading comprehension. In order to acquire information on vocabulary knowledge independent of decoding ability, the Peabody Picture Vocabulary Test-Revised (PPVT) (Dunn & Dunn, 2007) was

administered to each student. Buly and Valencia (2002) and Dennis (2013) used this

assessment as it measures receptive vocabulary knowledge not dependent upon the


students’ decoding abilities. The assessor started the assessment with pictures that represent the student’s suggested knowledge based on his or her age. The assessor said a term while the student looked at four pictures and identified the picture that matched the term. The assessor continued to present pictures until the student missed eight items in a section. The PPVT provides a normed scaled score and grade equivalent, with a test-retest reliability of .77 for standard scores. This assessment was used to measure vocabulary

for the participants.

Reading Proficiency

The North Carolina English Language Arts (ELA) READY End-of-Grade

Assessments (EOG) is a curriculum-based achievement test administered in grades 3–8.

Third grade students participate in a Beginning of Grade (BOG) assessment. Only fourth

and fifth grade students have EOG scores to determine proficiency for the study, so BOG

scores for third graders were used to determine their proficiency. The ELA/Reading

assessments are aligned to the Common Core State Standards (2010). The ELA

assessment is administered in a paper-and-pencil format with an initial time allotted of

180 minutes with up to an additional 60 minutes testing time if needed. The reading

selections are comprised of narrative and expository selections based on the Common

Core State Standards. Knowledge of vocabulary is assessed indirectly through

application and understanding of terms within the context of the selection and questions.

The EOG assessments of ELA/Reading at grades 3–5 contain 52 total test items, while

the BOG contains 44 items. Access to students’ BOG/EOG scores for this assessment was


not available to researchers per district policies. Because of this, the schools identified

candidates for participation with BOG/EOG scores of a level 1 or 2.

Motivation

Motivation is complex and domain specific (Wigfield, Guthrie, Tonks, &

Perencevich, 2004), which implies that it can change based on content area and thus will

need to be assessed with instruments that are specific to assessing motivation in that

content area. The MRQ is a domain-specific instrument used to assess the

multidimensionality of reading motivation. In comparison to other motivation

instruments, the MRQ addresses multiple dimensions of motivation within the area of

reading. It has been used by previous researchers to assess students’ motivation (Guthrie et al., 2007; Unrau & Schlackman, 2006; Wang & Guthrie, 2004; Wigfield & Baker, 2009), and with its ability to measure multiple motivations for reading, it was the best option for my dissertation.

The MRQ is a student self-rated assessment of the extent to which each student is

motivated to read (Wigfield & Guthrie, 1997). It contains 53 items that each student

completes independently within a group of approximately 10-15 students (see Appendix

A for MRQ items). The response format ranges from 1= “very different from me” to 4=

“a lot like me.” Scores are computed for each construct by averaging across each of the

respective 11 constructs: Reading Efficacy, Reading Challenge, Reading Curiosity,

Reading Involvement, Importance of Reading, Reading Work Avoidance, Competition in

Reading, Recognition for Reading, Reading for Grades, Social Reasons for Reading, and

Compliance. The assessment began with two practice items; students then completed the remainder of the questionnaire on their own while the assessor read each item aloud. Students finished the assessment in one 20-25 minute session. Reliabilities for the instrument have ranged from .52 to .81. This assessment was used to measure the motivational values of this sample.

Table 2

Instruments for Data Collection

Area to be Assessed    Assessment                                                                  Scores Used
Phonemic Awareness     Elision Test: Comprehensive Test of Phonological Processing                 Raw Score
Phonics                DIBELS (Dynamic Indicators of Basic Early Literacy Skills) DORF Assessment  Accuracy Scores; Standard Score
Vocabulary             Peabody Picture Vocabulary Test-Revised                                     Standard Score; Grade Equivalent
Fluency                DIBELS (Dynamic Indicators of Basic Early Literacy Skills) DORF             Standard Score
Comprehension          Qualitative Reading Inventory-4 (Narrative and Expository)                  Grade Equivalent Scores
Reading Proficiency    NC Ready EOG                                                                Scale Scores; Proficiency Levels
Motivation             Motivation for Reading Questionnaire                                        Domain Average Scores
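The construct scoring described above reduces to averaging item responses within each of the 11 constructs. The sketch below shows that computation in Python for a few hypothetical items; the item-to-construct mapping shown is invented for illustration, whereas the actual mapping follows Wigfield and Guthrie (1997) and the items in Appendix A.

import pandas as pd

# Hypothetical 1-4 responses for three students on six of the 53 MRQ items.
responses = pd.DataFrame({
    "item01": [4, 2, 3], "item02": [3, 2, 4],
    "item03": [1, 4, 2], "item04": [2, 3, 2],
    "item05": [4, 1, 3], "item06": [3, 2, 4],
})

# Invented item-to-construct mapping used only for this example.
construct_items = {
    "Reading Efficacy": ["item01", "item02"],
    "Reading Work Avoidance": ["item03", "item04"],
    "Reading Curiosity": ["item05", "item06"],
}

# A construct score is the mean of its member items for each student.
construct_scores = pd.DataFrame({
    construct: responses[items].mean(axis=1)
    for construct, items in construct_items.items()
})
print(construct_scores.round(2))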

Data Analysis

The study used the following quantitative methods for generating the profiles of

the suggested sample: descriptive statistics (means, standard deviations), factor analysis

(exploratory and confirmatory), and cluster analysis (hierarchical and non-hierarchical).

The study followed these steps to generate answers to each of the research questions: 1) generation of descriptive statistics, 2) identification of reading factors (constructs), 3) identification of reading clusters, 4) validation/identification of the MRQ instrument domains, including adjustments to identify the best model and domains for the MRQ, 5) identification of motivation clusters, and 6) identification of combined reading and motivation clusters (profiles).

Reading Profiles

First, descriptive statistics were run to provide summaries of the sample and the measures used. This included cross tabulations of the sample’s gender, ethnicity, and grade level. Next, I identified the means

and standard deviations for each grade level, as well as the entire sample, to determine if

the sample fell below grade level expectations for each reading measure. Last, I used

these descriptives to make general conclusions about the sample prior to subsequent

analyses.
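The study generated these descriptives in SAS; purely as an illustration of the same step, the following Python (pandas) sketch computes the cross tabulations and grade-level means and standard deviations, with an assumed file name and assumed column names.

import pandas as pd

# Assumed analysis file: one row per student with demographics and reading scores.
df = pd.read_csv("reading_data.csv")

# Cross tabulations of gender, ethnicity, and grade level.
print(pd.crosstab(df["grade"], df["gender"]))
print(pd.crosstab(df["grade"], df["ethnicity"]))

# Means and standard deviations by grade level and for the entire sample.
measures = ["word_list_ge", "ctopp_raw", "ppvt_standard",
            "qri_narrative_ge", "qri_expository_ge", "dorf_wcpm", "dorf_accuracy"]
print(df.groupby("grade")[measures].agg(["mean", "std"]).round(2))
print(df[measures].agg(["mean", "std"]).round(2))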


Next, an exploratory factor analysis was run for the reading data (Word list,

CTOPP, PPVT standardized scores, QRI Narrative and Expository, and DIBELS correct

words per minute and accuracy scores) using SAS software (SAS Institute, 2011).

Exploratory factor analysis is a data reduction technique that explains correlations

through unknown, unobserved (summary variables) factors (Timm, 2002). This process

identified factors, or unobserved variables, that produced the measured variables.

Although existing research (including my pilot study) has identified factors, the exploratory factor analysis was used to determine whether the underlying factors found in previous work were consistently represented within this sample.

Several steps were completed to accurately determine the factors representing the

data. The first step was to determine the number of factors to retain. When doing this

step, multiple methods were used to ensure the best selection (Henson & Roberts, 2006), including the Kaiser criterion, the percentage of variance explained, and the scree test. The Kaiser method determined the number of factors by retaining all factors with eigenvalues above 1 (Costello & Osborne, 2005). For the percentage of variance explained method, the researcher examined the percentage of variance explained by the factors and identified an acceptable cumulative percentage; for this study, the acceptable level was a solution explaining more than 65% of the variance in the data. In the scree test method, the researcher visually examined the scree plot’s graphical representation of the eigenvalues and identified the area where there was an elbow, or sudden drop (Cattell, 1966). After determining the factors that

represented the data, the communalities were examined. This process examined the

variance of each item that is accounted for by the factors. The higher the value of the


communality for a variable, the more of its variance was explained by the factors, with

the goal to explain as much variance as possible. Communalities are the sum of the

squared factor loadings across all the factors. When examining the extracted communalities, values greater than .60 are acceptable, as this criterion signifies that more than half the variance for that item is explained by the factors. Variables with less than 60% of their variance explained were deleted from the analysis.
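In standard exploratory factor analysis notation (supplied here for clarity rather than drawn from the study's materials), the communality of variable j across the m retained factors, and the retention criterion described above, can be written as:

h_j^2 = \sum_{k=1}^{m} \lambda_{jk}^{2}, \qquad \text{retain variable } j \text{ only if } h_j^2 \geq .60,

where \lambda_{jk} denotes the loading of variable j on factor k.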

Next, I examined the factor loadings using a Varimax rotation because it

maximized the variance of the loadings for each factor and provided the most

interpretable matrix (Kaiser, 1958). After determining if there were clear and

interpretable loadings for each variable on a factor, I named the underlying reading

constructs that represented this sample. Then, I saved the factors as variables to be used

to understand the profiles of this group of students within the subsequent cluster analysis.
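The EFA itself was conducted in SAS. As a hedged illustration of the same sequence of decisions, the Python sketch below uses the factor_analyzer package (a substitution, not the study's software) with assumed variable names to obtain eigenvalues for the Kaiser criterion and scree inspection, communalities for the .60 screen, varimax-rotated loadings for naming the constructs, and saved factor scores for the later cluster analysis.

import pandas as pd
from factor_analyzer import FactorAnalyzer

# Assumed data frame of the reading measures (column names are illustrative).
reading = pd.read_csv("reading_data.csv")[[
    "word_list_ge", "ctopp_raw", "ppvt_standard", "qri_narrative_ge",
    "qri_expository_ge", "dorf_wcpm", "dorf_accuracy"]].dropna()

# Eigenvalues for the Kaiser criterion (>1) and visual scree inspection.
fa_unrotated = FactorAnalyzer(rotation=None)
fa_unrotated.fit(reading)
eigenvalues, _ = fa_unrotated.get_eigenvalues()
print(eigenvalues)

# Fit the retained number of factors with a varimax rotation; two factors is a
# placeholder, since the real number comes from the retention criteria above.
fa = FactorAnalyzer(n_factors=2, rotation="varimax", method="principal")
fa.fit(reading)
print(fa.get_communalities())  # screen variables against the .60 criterion
print(fa.loadings_)            # rotated loadings used to name the constructs

# Save factor scores as variables for the subsequent cluster analysis.
factor_scores = pd.DataFrame(fa.transform(reading),
                             columns=["factor1", "factor2"],
                             index=reading.index)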

The factor scores representing the reading measures were then used within a

cluster analysis to generate the reading profiles of non-proficient students. Cluster

analysis is a technique that partitions a set of observations/variables into a distinct number of unknown groups such that all observations within a group are similar, while observations in different groups are dissimilar (Timm, 2002). It has been suggested that when the true number of clusters is unknown a priori, as in the current study, a combination of hierarchical and non-hierarchical clustering techniques be used to formulate the groups (Sharma, 1996).

Multiple hierarchical methods were used to identify the number of clusters and

cluster membership (profiles). First, I completed the hierarchical cluster analysis using


average, complete, single, and Ward linkage to group the data. In doing cluster analysis, it is vital to investigate the groups using multiple grouping methods. Then, dendrograms (icicle plots) were examined to determine the number of profiles present with each grouping method. This process involved examining the clusters being joined to visually decide where the largest distance between groups occurred and stopping the joining of groups at that point (Rencher & Christensen, 2012). Next, the cluster history was examined to

identify the best cluster solution which had the largest distance between clusters while

keeping the smallest root-mean-square standard deviation (RMSSTD) and Semi-partial

R-Squared values and the highest R-squared values (Rencher & Christensen, 2012),

measures used to determine the homogeneity within the clusters. This procedure required

the researcher to examine each line of the cluster history table, comparing values of the

line to those before and after it to determine where the largest “jumps” occurred, and to use the cluster solution above this point. From these two procedures, the number of clusters that best represented the data was chosen. After the number of clusters was determined from the hierarchical cluster method, the cluster memberships were refined using non-hierarchical k-means clustering (Timm, 2002). The cluster means generated in the hierarchical method were used as initial seeds to refine group

membership for the clusters in the non-hierarchical methods. From this process, the

group membership was defined for each cluster. I then used this information to generate

descriptive statistics of each profile to analyze the patterns present for these non-

proficient students.
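The clustering was likewise run in SAS; the sketch below mirrors the two-stage logic in Python using scipy and scikit-learn (again a substitution with assumed names): a hierarchical solution is inspected via a dendrogram, and the resulting cluster means seed a k-means refinement.

import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram, fcluster
from sklearn.cluster import KMeans

# factor_scores continues the hypothetical EFA sketch above.
X = factor_scores.to_numpy()

# Stage 1: hierarchical clustering. Ward linkage is shown; average, complete,
# and single linkage would be run the same way and compared.
Z = linkage(X, method="ward")
dendrogram(Z)
plt.show()  # inspect where the largest distance ("jump") between joins occurs

k = 4  # placeholder number of clusters, not a study result
hier_labels = fcluster(Z, t=k, criterion="maxclust")

# Stage 2: k-means refinement seeded with the hierarchical cluster means.
seeds = np.vstack([X[hier_labels == c].mean(axis=0) for c in range(1, k + 1)])
kmeans = KMeans(n_clusters=k, init=seeds, n_init=1, random_state=0).fit(X)

profiles = factor_scores.copy()
profiles["cluster"] = kmeans.labels_
print(profiles.groupby("cluster").agg(["mean", "std"]).round(2))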


Motivation Profiles

The MRQ has theoretical support for its structure (Baker & Wigfield, 1999;

Wigfield & Guthrie, 1997). Although the factor structure has a theoretical basis

established by the authors, my desired sample of students differed from the initial sample.

Because of this, prior to using this instrument to understand the motivation for reading of

these students, it was vital to confirm/identify the factor structure represented in this

sample (Yuan, 2005). First, I used LISREL 9.2 software to determine if the structure of

the MRQ is a fit for my data set. This step was important because the researcher needs to

be confident of the applicability of these motivation items to this sample. Goodness of fit

indices/statistics were used in the CFA to determine how well the MRQ’s a priori model

fits these data (McDonald & Ho, 2002). In using CFA, I determined whether the model was a good, marginal, or poor fit for the data set (Kline, 2010) and, if it was a poor or marginal fit, made modifications to achieve a good fit. There are many indices produced in

structural equation modeling, but only certain indices were interpreted. There are no

required indices to be included in decision making for structural models; however, it is

necessary to report a variety of indices to capture different parts of the model fit and

make reasonable determinations (Crowley & Fan, 1997). Kline (2010) suggests inclusion

of the Chi-square indices, root mean square error of approximation (RSMEA), the

standardized root mean square residual (SRMR), and comparative fit index (CFI) to make

decisions about structural model fit. To analyze structural models for this data set, those

indices were used for interpretation of model fit. The chi-square value is a traditionally utilized index to evaluate a model’s fit. With this index, a good fit is indicated by a


insignificant chi-square result (p > .05) (Kline, 2010). The root mean square error of approximation

(RMSEA) explained how well the model with estimates fit the covariance matrix of the

sample. For this index, good fit values ranged from .05-.08 (Steiger, 2007). The

RMSEA also produced lower (LB) and upper bound (UB) limits. The upper bound limit

was included and this should be preferably less than .08 for a good fit. The standardized

root mean square residual (SRMR) was used since the questionnaire uses a consistent response scale (i.e., 1-4) (Kline, 2010). Good fit values for SRMR were less than 0.08 (Hu &

Bentler, 1999). The comparative fit index (CFI) assessed the model fit. Values larger

than .90 are acceptable with a requirement of larger than .95 for a good fit (Hu & Bentler,

1999).
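To make the decision rule concrete, a minimal helper is sketched below; the cutoffs are the ones cited above, and the example values passed in are hypothetical.

```python
# Minimal sketch: applying the cutoffs cited above (Steiger, 2007; Hu &
# Bentler, 1999; Kline, 2010) to label a model's fit. The chi-square p-value
# is reported separately, since it is sensitive to non-normality.
def classify_fit(rmsea: float, rmsea_ub: float, srmr: float, cfi: float) -> str:
    if rmsea <= 0.08 and rmsea_ub < 0.08 and srmr < 0.08 and cfi > 0.95:
        return "good"
    if srmr < 0.08 and cfi > 0.90:
        return "marginal"
    return "poor"

# Hypothetical index values, for illustration only.
print(classify_fit(rmsea=0.05, rmsea_ub=0.06, srmr=0.07, cfi=0.96))
```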

After identifying items and factors that are represented in my sample, I named the

factors. Next, I generated and discussed descriptive statistics representing the means of

individual items and the means of the factors. These descriptives were used to make

general conclusions about this sample of non-proficient readers’ motivation for reading

prior to subsequent analyses. I then completed the same cluster analysis process with the

motivation average scores for each factor. This allowed me to generate profiles that can

be compared to the existing work of Baker and Wigfield (1999).

Reading and Motivation Profiles

Last, I used the reading factor scores and the motivation factor scores to generate

profiles that represented the reading and motivation patterns of these non-proficient

students. These profiles were generated with the same procedures as done for the reading

and motivation profiles in isolation. These profiles were generated as interpretations of


the reading profiles, with the reading and motivational factors clustered together. These new

profiles were analyzed with mean scores and other descriptive statistics that represent the

factors to determine the patterns that exist with these students from multiple analyses.

An analysis was also completed to compare differences between the single and combined

profiles.


CHAPTER IV

RESULTS

This study focused on understanding the reading and motivation profiles for

students who were not successful on their state-mandated reading assessment. This

included identifying underlying reading and motivational constructs and the

multidimensionality of these constructs. The data collected measured students’ skills in

reading by assessing the five major areas of reading: phonemic awareness, phonics (word

recognition), fluency, vocabulary and comprehension; and motivations for reading:

including efficacy, challenge, avoidance, curiosity, involvement, importance, recognition,

grades, competition, social, and compliance. The instruments used to collect the data

were discussed in the previous chapter.

I first discuss the reading descriptive data by analyzing each school, by grade

level, and the entire sample. This is followed by the results of the exploratory factor

analysis and cluster analysis. Then, I discuss the motivation descriptive data. This is

followed by results from a confirmatory factor analysis and cluster analysis of the

motivation data. Last, I present the results from the combined cluster analysis,

representing data from both the motivation and reading clusters. I have structured the

results analysis this way because previous studies have not examined profiles from a


combination of reading and motivation, but only in isolation. This organization of the results

presents initial results that can be evaluated to determine if the existing profiles and

patterns are consistent with my sample.

Reading Trends

In this section, I present descriptive information by grade and school level for

each of the reading variables assessed. This data was generated by grouping data based

on grade levels and schools to identify mean scores that could be used to examine

patterns present. A multivariate ANOVA (MANOVA) was conducted with the reading

variables to determine if significant differences existed between the school level data.

Table 3 presents descriptive information (means, standard deviation, and skew) of

student performance across the schools for the seven reading variables. From the seven

variables, all appear to be within a normal range except the DIBELS accuracy scores, which are negatively skewed. Because this lack of normality can be attributed to the nature of a population that includes only non-proficient students, the data were acceptable for subsequent analyses.

When examining Table 3, four general patterns appeared. First, as expected,

given the non-proficient status of the students, performances overall were below grade

level expectations. For each variable, grade level expectations were not met within the sample: for grades 3, 4, and 5 respectively, DIBELS expected accuracy is 96%, 97%, and 98%; DIBELS expected correct words per minute are 86, 103, and 112; and QRI scores for comprehension and the word list should be greater than or equal to the grade level.

Second, schools had different patterns of high and low scores in comparison to each


other. Of the seven variables assessed, School 3 had four variables with the highest

scores (QRI-Word list, DIBELS WPM, and QRI-narrative and expository). The other two

schools had fewer variables with the highest scores. School 1 had two with the highest

scores (CTOPP and PPVT) and School 2 had one (DIBELS Accuracy). Third, with

the exception of the CTOPP and DIBELS accuracy, School 2 had the smallest standard deviations for most of the variables assessed. Although this school had the most students in the sample, it provided the smallest spread of data.

differences between QRI expository and narrative text comprehension. The expository

comprehension levels were approximately one grade level lower than their narrative

scores. This difference suggests these students were not comprehending expository texts

at the same grade level as they were narrative texts.

Table 3

Descriptive Statistics for the Reading Data by School

Variable               School 1 (n=64)   School 2 (n=77)   School 3 (n=46)   Average of Scores (n=187)   Skew
Phonemic Awareness
  CTOPP*               14.77 (5.06)       9.94 (4.35)      11.61 (4.14)      12.50 (4.99)                 .01
Phonics/Word ID
  DIBELS ACC***        93.86 (11.55)     96.09 (6.08)      95.80 (5.16)      95.07 (8.58)               -5.18
  QRI Word List****     3.04 (1.65)       3.33 (1.41)       3.75 (1.52)       3.35 (1.57)                 .13
Vocabulary
  PPVT GE****           3.08 (1.67)       2.74 (1.58)       3.05 (1.61)       2.99 (1.63)                1.10
Fluency
  DIBELS               82.49 (37.34)     86.54 (29.71)     95.08 (37.27)     87.80 (35.83)                .02
Comprehension
  QRI E****             1.83 (1.06)       1.59 (.79)        1.98 (.98)        1.82 (.97)                 1.25
  QRI N****             2.70 (1.16)       2.72 (.98)        2.88 (1.19)       2.77 (1.13)                 .41

Note: E=Expository Text, N=Narrative Text, *Raw Score, ***Percent Correct, ****Grade Equivalent; standard deviations appear in parentheses.

1. Phonemic Awareness: The raw scores for the CTOPP range from 1-20, with average scores falling between 8 and 12.

2. Phonics/Word Identification: Percent correct; DIBELS expected accuracy is 96%, 97%, and 98% for grades 3, 4, and 5 respectively. The QRI word list expectation is to successfully read the grade level word list.

3. Vocabulary: The PPVT GE expectation is to identify vocabulary at grade level.

4. Fluency Scores: Expected correct words read per minute for DIBELS are 86, 103, and 112 for grades 3, 4, and 5 respectively.

5. Comprehension: For the QRI, all scores represent the highest instructional level text read, expressed as grade level equivalents for the passages.


In addition to the patterns and differences observed in the reading data, a

MANOVA was completed to determine if there were significant differences for the seven

variables between the three schools. Table 4 provides the results of the MANOVA test.

There was evidence of a significant difference between schools for at least one of the

reading variables, F(14, 356) = 5.26, p < .0001. Given the significant MANOVA result, univariate ANOVA tests were completed to determine which variables were significantly

different between schools. When examining the univariate tests, significant differences

were noted for the word list F (2, 184) = 3.68, p < .0271 and the CTOPP F (2, 184) =

17.76, p < .0001. A post hoc test was completed using the Tukey HSD test to determine

which means were different between the schools for the CTOPP and Word List variables

(see Table 5). For the word list variable, a significant difference was found between

School 1 and School 3 (p=.02). There were no significant differences between School 1

and School 2 (p=.58), nor between School 2 and School 3 (p=.34). For the CTOPP

variable, significant differences were found between School 1 and School 2 (p=.00), and

School 1 and School 3 (p=.00). No significant difference was found between School 2

and School 3 (p=.15). These results showed that School 1 is higher than School 2 and 3

for the CTOPP; while for the Word list, School 1 is lower than School 3.
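For readers who want to reproduce this sequence outside SAS, a hedged sketch with statsmodels follows; the file and column names are assumptions for illustration, not the study's actual variable names.

```python
# Hedged sketch (the study used SAS): MANOVA across the reading variables,
# a follow-up univariate ANOVA, and a Tukey HSD comparison with statsmodels.
# The file and column names are hypothetical.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.multicomp import pairwise_tukeyhsd

df = pd.read_csv("reading_by_school.csv")

mv = MANOVA.from_formula(
    "ctopp + dibels_acc + word_list + ppvt + dibels_wpm + qri_exp + qri_nar"
    " ~ C(school)", data=df)
print(mv.mv_test())                                        # Wilks, Pillai, etc.

print(anova_lm(ols("ctopp ~ C(school)", data=df).fit()))   # univariate follow-up
print(pairwise_tukeyhsd(df["ctopp"], df["school"], alpha=0.05))
```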

Table 4

MANOVA Test Results for Reading Differences Between Schools

Statistic                 Value   F Value   Num DF   Den DF
Wilks' Lambda              .69     5.26*      14       356
Pillai's Trace             .33     4.96*      14       358
Hotelling-Lawley Trace     .44     5.57*      14       281
Roy's Greatest Root        .40    10.19*       7       179

Note: * = p < .05. Dependent Variable = School; Independent Variables = Reading Assessments.

Table 5

Post Hoc Test Comparison Between Schools

Dependent Variable   School   School   Mean Difference   Std. Error
Word List               1        2           -.29            .29
                        1        3           -.71*           .26
                        2        3           -.42            .30
CTOPP                   1        2           4.83*           .86
                        1        3           3.16*           .78
                        2        3          -1.68            .89

Note: Tukey HSD used for comparisons. * = p < .05.

Table 6

Descriptive Statistics for the Reading Data by Grade Level

Variable               3rd (n=83)       4th (n=54)       5th (n=50)       Average of Scores (n=187)
Phonemic Awareness
  CTOPP*               11.76 (4.95)     12.26 (5.07)     13.98 (4.74)     12.50 (4.99)
Phonics/Word ID
  DIBELS ACC***        94.25 (7.61)     94.17 (12.21)    97.40 (3.66)     95.07 (8.58)
  QRI Word List****     2.89 (1.42)      3.28 (1.55)      4.20 (1.53)      3.35 (1.57)
Vocabulary
  PPVT GE****           2.54 (1.28)      2.82 (1.66)      3.90 (1.76)      2.99 (1.63)
Fluency
  DIBELS               76.84 (34.50)    86.93 (33.94)   106.92 (32.54)    87.80 (35.83)
Comprehension
  QRI E****             1.45 (.69)       1.89 (.77)       2.38 (1.26)      1.82 (.97)
  QRI N****             2.31 (.91)       2.70 (.94)       3.58 (1.20)      2.77 (1.13)

Note: Scores combined for grade levels represent all schools. E=Expository Text, N=Narrative Text, *Raw Score, ***Percent Correct, ****Grade Equivalent; standard deviations appear in parentheses.

1. Phonemic Awareness: The raw scores for the CTOPP range from 1-20, with average scores falling between 8 and 12.

2. Phonics/Word Identification: Percent correct; DIBELS expected accuracy is 96%, 97%, and 98% for grades 3, 4, and 5 respectively. The QRI word list expectation is to successfully read the grade level word list.

3. Vocabulary: The PPVT GE expectation is to identify vocabulary at grade level.

4. Fluency Scores: Expected correct words read per minute for DIBELS are 86, 103, and 112 for grades 3, 4, and 5 respectively.

5. Comprehension: For the QRI, all scores represent the highest instructional level text read, expressed as grade level equivalents for the passages.


Table 6 presents descriptive information (means and standard deviations) of

student performance across grade levels for the seven reading variables. With minimal differences between the school level data and the uneven numbers of students, grade level descriptive statistics were combined across schools and then evaluated. When examining

this table, five general patterns were identified. First, as observed in the school

comparisons, performances for each grade level were below grade level expectations.

For each variable, grade level expectations were not met within the sample: for grades 3, 4, and 5 respectively, DIBELS expected accuracy is 96%, 97%, and 98%; DIBELS expected correct words per minute are 86, 103, and 112; and QRI scores for comprehension and the word list should be greater than or equal to the grade level. Second, except for DIBELS

accuracy scores, as expected, students’ performance for the reading scores increased by

grade level. Third, students’ ability to manipulate sounds of words (CTOPP), read words

in isolation (Dibels ACC and QRI Word Lists), and identify single word meanings

(PPVT) were closer to grade level than were their ability to read quickly (DIBELS) and

comprehend texts, particularly for expository passages. Fourth, students’ ability to

recognize and decode words was at a higher level than their comprehension. The word list scores were all less than a grade level lower than expected based on grade level placements. Last, when examining the comprehension variables, the spread (standard deviation) grew as students increased in grade levels. This shows that the range of comprehension performance among non-proficient students widens in higher grade levels, and this area needs a greater emphasis than word recognition for these students. This finding signifies that not all non-proficient students were the same and that variance grows as


students proceed to higher grade levels. A MANOVA was not completed for

comparisons between grade levels because the expectation was for differences in

performances for each of the variables between grade levels.

In summary, the descriptive data presented in this section highlighted many of the

patterns present for the sample. The students were consistently below grade level

expectations regardless of the school. Only minimal significant differences were found

between mean scores for each school. These initial descriptive patterns provided the

introduction to later findings by identifying reading trends represented by these 3rd, 4th

and 5th grade non-proficient students. The initial differences suggested patterns of

performance for students, but with many individual characteristics to consider. The

larger standard deviations provided an initial support for the idea that non-proficient

readers were not identical in their needs. For the rest of this chapter, with the exception

of the motivation item level descriptive summaries, the schools were combined for

analysis. With no major differences and consistent trends in their performance, the data

was suitable for combining. A factor analysis determined if these multiple reading

characteristics could be reduced to fewer constructs, which were easier to interpret and

group. Then, these constructs were used to group students via the clustering process into

groups with similar patterns of performance.

Reading Factors

To better understand students’ strengths and weaknesses, factors were generated

to reduce the data into a smaller subset of interpretable values. This process identified

factors, or unobserved variables, that explained a large percentage of the measured


variables. An exploratory factor analysis was performed using SAS software with the 11

variables to identify these unobserved factors.

The first step of the exploratory factor analysis determined the number of factors

to retain. When doing this step, multiple criteria were examined to ensure the best

selection (Henson & Roberts, 2006). Three methods determined the number of factors: the Kaiser criterion, the scree test (Cattell, 1966), and the percent of variance explained. The Kaiser method determined the number of factors by retaining all factors with eigenvalues above 1 (Costello & Osborne, 2005). In the scree test method (see Figure 3), the researcher visually examined the scree plot's graphical representation of the eigenvalues and identified where an elbow, or sudden drop, occurred (Cattell, 1966). The percentage method retained the number of factors that explained a high percentage of the variance. Using these three methods, a two-factor solution was determined. The scree plot displayed a drop after two factors. While only one eigenvalue is above 1.0, the second is close at .93, and the two-factor solution explained 72% of the total variance, which is acceptable (see Table 7).
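A brief sketch of these retention checks is shown below; the input file is hypothetical and stands in for the standardized reading variables.

```python
# Sketch of the retention checks described above (Kaiser criterion, scree
# inspection, percent of variance explained). The input file is hypothetical;
# it is assumed to hold one column per reading variable.
import numpy as np
import pandas as pd

X = pd.read_csv("reading_variables.csv").to_numpy()   # hypothetical input
R = np.corrcoef(X, rowvar=False)                      # correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]

kaiser_k = int((eigvals > 1.0).sum())                 # eigenvalues above 1
cumulative = np.cumsum(eigvals) / eigvals.sum()       # percent explained
print(eigvals.round(2), kaiser_k, cumulative.round(2))
# A scree plot of `eigvals` would also be inspected for the "elbow."
```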

Figure 3. Scree Plot of the Reading Factors

Table 7

Total Variance of Reading Factors

Component   Initial Eigenvalue   % of Variance   Cumulative %
    1             3.40               56.73           56.73
    2              .93               15.55           72.29
    3              .72               11.96           84.25
    4              .48                7.95           92.20
    5              .26                4.36           96.56
    6              .21                3.44          100.00

Next, the communalities were examined. This process examined the variance of

each item that was accounted for by the factors. The higher the value of the communality

for a variable, the more of its variance is explained by the factors, with the goal to explain

as much variance as possible. When examining the communalities extractions, those


greater than .60 were acceptable, as this criterion signifies that more than half the

variance for that item was explained by the factors. Variables with less than half of their

variance explained by the factors were not accurately measured by the factors. With the

initial two-factor model, all communality extractions were greater than .6, with the exception of the CTOPP

(Table 8). Similar to the work of Buly and Valencia (2002, 2004), the CTOPP did not

have more than half of the variance of the variable explained by the factors and was

removed from subsequent analyses. When running the factor analysis again without the

CTOPP scores, all extractions were greater than .60 (see Table 8). This supported the

two-factor solution for this data set.

Table 8

Reading Factor Communalities

Variable          Initial Extraction   Revised Extraction
Word List                .78                  .78
CTOPP                    .33                Removed
PPVT Standard            .73                  .68
QRI- Narrative           .64                  .69
QRI- Expository          .66                  .74
DIBELS WPM               .79                  .79
DIBELS Accuracy          .63                  .66

Note: The CTOPP value (.33) falls below 50% of the variance explained by the factor structure.


The next step involved examining the factor matrix to determine if there were

unequivocal loadings for each variable on a factor. This step ensured that the two factors

are actually interpretable with representation of the variables. For this step, a Varimax

rotation (see Table 9) was used to determine which variables loaded onto which factor,

because it maximized the variance of the loadings for each factor and provided the most

interpretable matrix (Kaiser, 1958). The Varimax rotation produced all acceptable

loadings, with the lowest being .60. Next, with interpretable factors, the factors were

appropriately named, word recognition and meaning, similar to that of Buly and Valencia

(2002, 2004) and Dennis (2013).

Table 9

Reading Variables Factor Loadings

Variable          Word Identification   Meaning
Word List                 .81             .35
PPVT Standard             .00             .82
QRI- Narrative            .57             .60
QRI- Expository           .42             .75
DIBELS WPM                .85             .26
DIBELS Accuracy           .81             .02

Note: Rotation Method: Varimax with Kaiser Normalization. Each variable's heaviest loading determines its factor assignment.
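The extraction, rotation, and communality checks described above could be sketched in Python as follows (the study itself used SAS); the file name and column layout are assumptions.

```python
# Sketch of the extraction, varimax rotation, and communality checks using
# scikit-learn's FactorAnalysis (the author used SAS); the input file and
# column layout are hypothetical.
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

X = pd.read_csv("reading_variables.csv")              # six retained measures
Xz = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(Xz)

loadings = pd.DataFrame(fa.components_.T, index=X.columns,
                        columns=["Word Identification", "Meaning"])
communalities = (loadings ** 2).sum(axis=1)           # flag values below .60
print(loadings.round(2))
print(communalities.round(2))
```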


Factor 1: Word Identification

Factor one was labeled word identification, as each of the measures that loaded onto this factor was related to word recognition abilities, including fluency and rate. Each loading was greater than .80, indicating that a high percentage of each measure's variability was explained by the factor. These loadings included DIBELS accuracy scores, CWPM scores, and QRI word list scores. This factor accounted for 56.73% of the total variance. The items empirically and theoretically fit together on this factor because

accuracy and rate are often associated with each other in reading assessments.

Factor 2: Meaning

Factor two was labeled meaning because each of its measures reflected students' ability to make meaning from vocabulary knowledge and from texts read. The QRI measures of comprehension for expository and narrative text, as well as the PPVT, loaded here. Meaning accounted for 15.55% of the total variance.

Factor loadings for this factor were all above .6, thereby supporting this factor as

representing a large amount of variability of each loading. Although the PPVT does not

require students to comprehend a text, it loaded highly onto this factor. This assessment

is a test of receptive vocabulary and measures students’ ability to understand and identify

items.

The results of the exploratory factor analysis generated two factors, word

recognition and meaning, which represented the underlying reading constructs for this

group of students. However, factor analysis does not provide information on patterns of


performance for understanding the similarities and differences present within this group.

The next analysis addressed this question.

Reading Profiles

Cluster analysis was performed on the reading factors, word identification and

meaning, to generate profiles/clusters that represent non-proficient readers. The cluster

analysis is a statistical technique that divided students into similar groups (Timm, 2002).

It allowed the researcher to interpret the sample by identifying patterns amongst similar

groups of students. It has been suggested that when true numbers of clusters are

unknown a priori, like in the current study, the researcher should use a combination of

hierarchical and non-hierarchical clustering techniques to formulate groups (Sharma,

1996). Both clustering techniques were used in this study.

The SAS program was used to perform hierarchical and non-hierarchical cluster

analysis to identify the groups of students with similar patterns. The hierarchical cluster analysis was completed to determine the number of clusters, without a specific focus on group membership. Average, complete, single, centroid, and Ward's linkage methods were used within the analysis to group the data. These methods were used because they provided different ways to join groups: nearest neighbor or chaining (single), furthest neighbor (complete), average distance between clusters (average), the smallest increase in the within-cluster sum of squares (Ward's), and the distance between cluster mean vectors (centroid). The use of these

methods also helped to establish consistency when determining the number of clusters

present.
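As an illustration of this step, the sketch below draws dendrograms for each of those linkage methods with SciPy (the study itself used SAS); the input file is hypothetical and stands in for the reading factor scores.

```python
# Illustrative sketch: drawing dendrograms for the linkage methods named
# above with SciPy (the analyses reported here were run in SAS). The input
# file is hypothetical; X is a (students x factors) array of factor scores.
import pandas as pd
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import dendrogram, linkage

X = pd.read_csv("reading_factor_scores.csv").to_numpy()   # hypothetical input

methods = ["average", "complete", "single", "centroid", "ward"]
fig, axes = plt.subplots(1, len(methods), figsize=(20, 4))
for ax, method in zip(axes, methods):
    dendrogram(linkage(X, method=method), ax=ax, no_labels=True)
    ax.set_title(method)
plt.tight_layout()
plt.show()
```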


First, dendrograms and the cluster group history were examined using the average, complete, single, centroid, and Ward's linkage methods to determine the number of clusters. The dendrograms were examined visually to determine the number of groups by identifying where the largest distance between joined clusters occurred. This method provided less consistent results than desired. The single linkage dendrogram was visually uninterpretable due to the large sample size. The average and centroid methods visually displayed five groups, while the Ward's and complete methods displayed four groups.

Next, the cluster histories were examined to identify a solution which would have

the largest distance between clusters while keeping the smallest RMSSTD and Semi-

partial R-Squared values and the highest R-squared values. This procedure required the

researcher to examine each line of the cluster history table (see Table 10), comparing the

values to those of the line before and after to determine where the largest “jumps”

occurred. Then, the cluster solution above this one was selected. I reported the average method, as this was the method used to perform the hierarchical step of the clustering procedure. For this method, the largest jumps occurred between lines three and four for the RMSSTD, Semipartial R-squared, and R-Square values, while allowing for the smallest RMSSTD and Semipartial R-squared values and the largest R-Square values. As a result, a four-cluster solution was selected as the best representation of the data.

With the number of clusters selected, I completed non-hierarchical clustering of the data.

This step refined group memberships to determine the exact number of students in each


group. The initial seeds (mean scores) from the hierarchical stage were used as seeds to sort students into groups based on those means. This process ensured that students in each group were more similar to one another than to students in other groups.

Table 10

Reading Cluster History

Number of   Clusters     n New     RMS Std   Semipartial   R-Square   Maximum
Clusters    Joined       Cluster   Dev       R-Square                 Distance
   10       CL22 CL23       21       .41        .01           .83       1.22
    9       CL15 CL14       45       .46        .02           .82       1.25
    8       CL17 CL12       86       .55        .05           .77       1.38
    7       CL10 CL16       31       .56        .03           .74       1.42
    6       CL9  CL8       131       .64        .10           .64       1.52
    5       CL7  CL41       33       .61        .01           .63       1.73
    4       CL6  OB69      132       .65        .01           .62       2.19
    3       CL4  CL11      153       .82        .23           .39       2.42
    2       CL5  CL3       186       .96        .32           .08       2.52
    1       CL2  OB38      187      1.00        .08           .00       5.48

Note: Complete clustering method used. Only possible clusters less than 10 were included to save space. The four-cluster line reflects the greatest "jumps" and the best cluster solution.

Table 11

Reading Clusters Summary Data

Cluster    n    Maximum Distance from    Distance Between     Cluster Means
                Seed to Observation      Cluster Centroids    WordID   Meaning
   1      49          2.21                     1.44             .45      1.25
   2      35          4.03                     2.02           -1.58       .06
   3      52          1.75                      .97             .53      -.18
   4      51          1.50                      .97             .11     -1.06

The revised groups contained students who were distinct from the other clusters,

as shown by the maximum distance from seed to observation, and the distance between

cluster centroid columns of the cluster summary table above (see Table 11). Several

patterns emerged from the clusters, as noted in the cluster summary, with most of the

clusters having patterns of high and low scores with word identification and meaning.

These patterns must be interpreted within the context of students who were still performing below grade level expectations.

Cluster one featured 49 students who appeared the most successful with both word identification and meaning, as each of these scores was above the median. These students

had the highest meaning scores of the sample. While their word identification scores

were not the highest, they were still high based on a within group comparison. The

smallest cluster membership was for the students in cluster two. These 35 students had a

combination of low word identification and higher meaning scores. These students had


the lowest word identification scores for the sample and meaning scores slightly above

the median.

More than half of the sample (n=103) were grouped in clusters 3 and 4. These

two clusters have combinations of students, who had some of the lowest scores for the

sample in either word identification or meaning, with some scores above the median and

some below the median. Cluster 3 students had the highest word identification scores for

the sample, with meaning scores that just below the median. Cluster 4 students had the

lowest meaning scores for the sample with word identification scores just above the

median.

Motivation for Reading Trends

The Motivation for Reading Questionnaire (MRQ) was used to assess students’

motivation for reading within 11 predetermined dimensions noted by the authors. Table

12 highlights the descriptive statistics for each motivation question including means and

standard deviations. Similar to the reading descriptive data, several patterns emerged

from the motivation data. First, the data did not present motivation as uniformly high or low across all areas. The students' responses showed scores ranging from

2.13 to 3.65 across the motivation questions, with a possible score range from 1 to 4. The

question with the lowest score was 11, I visit the library often with my family. The

highest scores, 3.65, were associated with two questions: 17, I know that I will do well in

reading next year and 27, it is very important to me to be a good reader. Second, with a

format ranging from 1= “very different from me” to 4= “a lot like me,” the average range

would be considered to be scores between 2.0 and 3.0. This data set had 27 items with


scores above 3.0 and 26 items with scores between 2.0 and 3.0. These non-proficient

students had high scores on more than half of the items. While it may seem appropriate to say that the students had a high motivation for reading, this inference cannot be made from item agreement alone, because some motivation domains are represented by items that reflect participation rather than enjoyment, such as #11, I visit the library often with my family. A low score on this item does not mean reading is something the student does not enjoy, only that it is something in which they do not participate.

Analyses were run to determine the suitability of the data for further analyses.

The MRQ was found to be highly reliable (53 items; α = .87). The results of item-level Shapiro-Wilk tests for normality revealed that no item assessed with the MRQ was normally distributed. This was expected because the sample was specific to

non-proficient students and not a range of all students in 3rd, 4th, and 5th grades. Although

the data lacked normality, subsequent analyses were completed with the necessary

precautions.
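A minimal sketch of these two suitability checks is shown below; the coefficient alpha formula is standard, and the input file and column layout are hypothetical.

```python
# Sketch of the suitability checks described above: coefficient alpha for the
# 53 MRQ items and item-level Shapiro-Wilk tests. The file and column layout
# are hypothetical (one column per item, one row per student).
import pandas as pd
from scipy.stats import shapiro

items = pd.read_csv("mrq_items.csv")

def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

print(round(cronbach_alpha(items), 2))       # .87 reported for this sample
non_normal = sum(shapiro(items[col]).pvalue < .05 for col in items.columns)
print(non_normal, "of", items.shape[1], "items depart from normality")
```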

Table 12

Motivation Item Descriptive Statistics

Item #   Mean   Std. Deviation   Shapiro-Wilk Statistic

1 3.06 .93 .82*

2 2.66 1.03 .87*

3 3.64 .70 .57*

4 3.04 .89 .83*

5 2.69 1.16 .83*

6 2.73 1.01 .87*

7 3.65 .71 .55*

8 2.91 1.05 .84*

9 3.09 1.12 .75*

10 3.28 .93 .75*

11 2.13 1.15 .81*

12 3.05 1.04 .80*

13 2.63 1.12 .84*

14 2.80 1.12 .83*

15 3.30 .89 .75*

16 3.00 .98 .83*

17 3.65 .66 .59*

18 3.39 .90 .69*

19 3.15 .92 .81*

20 2.83 .99 .85*

21 2.66 1.03 .87*

22 2.99 1.05 .82*

23 2.42 1.29 .78*

24 2.37 1.17 .83*

25 3.56 .82 .59*

26 2.50 1.35 .75*

27 3.49 .80 .67*

28 3.64 .73 .55*

29 2.80 1.08 .87*

30 3.12 1.04 .77*

31 2.53 1.21 .82*

32 2.27 1.22 .80*

33 2.94 1.12 .80*

34 2.44 1.17 .83*


35 2.41 1.16 .84*

36 3.44 .89 .66*

37 2.64 1.20 .82*

38 3.56 .80 .60*

39 2.60 1.13 .85*

40 2.88 1.17 .79*

41 3.09 1.09 .77*

42 2.81 1.19 .80*

43 3.41 0.91 .68*

44 3.53 0.85 .61*

45 2.32 1.16 .83*

46 3.38 0.92 .69*

47 3.36 0.89 .72*

48 2.94 1.05 .82*

49 3.03 1.14 .77*

50 3.48 0.85 .65*

51 3.34 0.86 .74*

52 2.54 1.21 .82*

53 3.45 .92 .63*

Note: df=187. *= p<.05. The information is

collapsed across grades and schools and

represents the complete sample.

Motivation Factors

Motivation Factor Structure

Wigfield and Guthrie (1997) demonstrated that the 53 MRQ items load onto 11

reading motivation dimensions with their initial sample. Prior to using this instrument to

analyze this data set, it was vital to confirm that this factor structure (see Table 13) was

represented in this sample (Yuan, 2005). LISREL 9.2 software (Jöreskog & Sörbom,


1996) was used to complete a confirmatory factor analysis (CFA) of the MRQ data from

the dataset (n=187). This step was important because the researcher needed to be

confident of the applicability of these motivation items to this sample. The distribution

of the data did not fall within a normal distribution range as noted by the results of the

Shapiro-Wilk Test (see Table 12 above); however, maximum likelihood estimation was

used in the confirmatory factor analysis to account for the non-normality within the data.

Goodness of fit statistics were used in CFA to determine how well the a priori

model fit the specific data (McDonald & Ho, 2002). When determining the fit, the

researcher seeks to determine whether the model is a good, marginal, or poor fit for the data

set and to identify the acceptable model for interpretation (Kline, 2005). There are many

indices produced in structural equation modeling, but only certain indices were

interpreted in this study. There are no required indices to be included in decision making

for structural models; however, it is necessary to report a variety of indices to capture

different parts of the model fit and make reasonable determinations (Crowley & Fan,

1997). To analyze the structural models for this data set, the SRMR, RMSEA estimate,

RMSEA upper limit of the confidence interval, and CFI were used. Kline (2005) suggests

inclusion of these indices to make decisions about structural model fit.
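For completeness, the same kind of measurement model could also be fit with an open-source SEM package. The sketch below assumes the Python package semopy; its lavaan-style model syntax and the calc_stats helper used here are assumptions to verify against the package documentation (the study itself used LISREL 9.2), and only two factors with placeholder item names are shown.

```python
# Hedged sketch: a CFA specified with semopy (the author used LISREL 9.2).
# The model syntax, the calc_stats helper, and the item column names
# ("q1".."q53") are assumptions for illustration only.
import pandas as pd
import semopy

mrq = pd.read_csv("mrq_items.csv")          # hypothetical item-level data
desc = """
Efficacy   =~ q7 + q15 + q21
Importance =~ q17 + q27
"""
model = semopy.Model(desc)
model.fit(mrq)                              # maximum likelihood by default
print(semopy.calc_stats(model).T)           # chi2, CFI, RMSEA, etc.
```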


Table 13

Initial MRQ Factor Structure by Instrument Authors

Factor Items

Curiosity 10, 14, 19, 25, 29, 4

Involvement 6, 12, 30, 33, 35, 22

Competition 9, 41, 44, 49, 52

Social 11, 26, 42, 45, 48, 31, 39

Efficacy 7, 15, 21

Compliance 36, 46, 51, 23, 34

Importance 17, 27

Challenge 2, 5, 8, 16, 20

Avoidance 13, 24, 32, 40

Grades 3, 38, 50, 53

Recognition 18, 28, 37, 43, 47

Each of the above indices was included because of the unique information it presents about the model. The Chi-square value is a traditional fit index that evaluates the model's overall fit. With this index, a good fit generates a nonsignificant result (p > .05) (Kline, 2005). The root mean square error of approximation (RMSEA) explains

how well the model with estimates fit the covariance matrix of the sample. For this

index, good fit values range from .05-.08 (Steiger, 2007). The RMSEA also produces

lower (LB) and upper bound (UB) limits. The upper bound limit will be included as this


should be preferably less than .08 for a good fit. The standardized root mean square

residual (SRMR) is used since the questionnaire assesses consistent levels such as 1-4

(Kline, 2005). Good fit values for SRMR are less than .08 (Hu & Bentler, 1999). The

comparative fit index assesses the model fit, including smaller sample sizes. Values

larger than .90 are acceptable with a requirement of larger than .95 for a good fit (Hu &

Bentler, 1999). The ideal sample size for a CFA in a structural model is 200-400. This

sample fell just short, so the model was examined in multiple ways to ensure the best match between the motivational domains and the data. First, the model

was run with all parameters simultaneously to determine the fit. This initial model was

the item loadings noted by the authors of the instrument. The full model produced the

following results: SRMR = .07, RMSEA = .05, RMSEA UB = .06, CFI of 0.90, χ2=

1920.98, df = 1270, and p = 0.00. While the Chi-Square p-value is not greater than .05,

the remaining indices were at acceptable and good levels. When examining the

standardized loadings, there were several items (13, 24, 32, 40) that had insignificant

loadings (p > .05) and negative error variances. Because of these issues, the model fit, the error variances, and the nonsignificant loadings needed further examination, and the model needed to be examined with subsets of the data.

Baker and Wigfield (1999) had what they noted as an undesirable sample (n=371)

and decided to complete three separate CFA’s of their data. For their model, they used

subsets of the factors, ranging from two to six factors to measure the fit of the model for

their data. With this data set close to the ideal sample size, I needed to examine the

model with two subsets of the factor structure. The first half of the model (Challenge,


Grades, Curiosity, Involvement, Efficacy, and Avoidance) produced results of: SRMR =

.07, RMSEA = .05, RMSEA UB = .06, CFI of 0.88, χ2= 492.34, df = 335, and p = 0.00.

While the Chi-Square p-value was not greater than .05, the remaining indices were at

acceptable and good levels. Similar to the full model, this model produced several

insignificant standardized loadings. Unlike the full model, this model did not produce any negative error variances. The second half of the model (Competition, Importance, Recognition, Compliance, and Social) produced a SRMR = .07, RMSEA = .07, RMSEA UB = .07, CFI of .88, χ2 = 472.81, df = 265, and p = 0.00. For this half, the Chi-Square p-value was not greater than .05 and the CFI value was below the acceptable range; however, the remaining indices were at good levels. Unlike the full model and the other half of this model, there were neither insignificant standardized loadings nor negative error variances.

With the identified issues in both the full and split models, changes needed to be

made to the factor structure. LISREL software outputs information for modification

changes for a better fit for data; however, the suggested changes were determined to not

be realistic based on the actual survey questions and the domains. The avoidance factor

was responsible for the majority of the issues with the model. The scores were reversed

as suggested by the instrument’s authors, but it continued to produce issues in the models

run. The best modification was to delete the avoidance factor and its associated

questions. This was done because this factor was causing the negative variances and the

insignificant standardized loadings. Using this modification to the structure, the full

model produced these calculations: SRMR = .07, RMSEA = .05, RMSEA UB = .06, CFI

of .91 χ2= 1647.04, df = 1082, and p = 0.00. The modification removed the negative


variances as well as the insignificant loadings and produced a good fit for the data. To

verify the fit of the model, the structural model was evaluated with two subsets as

completed earlier. Both subsets suggested a good fit of the model to the data.

While completing the confirmatory analysis, some challenges developed and were

overcome. A consistent issue with the Chi-Square test was noted as all of the models

continued to produce significant values when an insignificant value is desired. Although

this was an issue, it did not impact the model’s acceptance as this test assumes

multivariate normality, which was not represented for this data set. Second, the large

number of items for the assessment made the path diagram uninterpretable. With this, it

was not included or interpreted. While determining if the factors and items would be

accepted, the loadings were examined for each item. Each item produced significant

loadings. This finding supports the use of the 10 factors for subsequent analyses. Table

14 identifies the final factor structure with the items that load onto each factor.

Table 14

Revised Final MRQ Factor Structure

Factor        Items
Curiosity     10, 14, 19, 25, 29, 4
Involvement   6, 12, 30, 33, 35, 22
Competition   9, 41, 44, 49, 52
Social        11, 26, 42, 45, 48, 31, 39
Efficacy      7, 15, 21
Compliance    36, 46, 51, 23, 34
Importance    17, 27
Challenge     2, 5, 8, 16, 20
Recognition   18, 28, 37, 43, 47
Grades        3, 38, 50, 53

Motivation Factors

The revised structural model for the data included ten motivation for reading

factors (domains) including Challenge, Competition, Compliance, Curiosity, Efficacy,

Grades, Recognition, Social, Importance and Involvement. These items were included in

the original 11 theoretical domains by the authors of the instrument. The difference was

that the dimension of avoidance was not clearly identified due to insignificant loadings

and variance errors. The ten domains identified were all labeled according to the original

theoretical model by Wigfield and Guthrie (1997) noted in Chapter 3.
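A brief sketch of how item responses can be averaged into these factor scores is shown below; the item-to-factor assignments follow Tables 13 and 14, and the column names "q1" through "q53" are placeholders for illustration.

```python
# Sketch: averaging MRQ items into the ten retained factor scores, using the
# final item-to-factor assignments (Tables 13 and 14). Column names are
# placeholders; the input file is hypothetical.
import pandas as pd

items = pd.read_csv("mrq_items.csv")
factor_items = {
    "Curiosity":   [4, 10, 14, 19, 25, 29],
    "Involvement": [6, 12, 22, 30, 33, 35],
    "Competition": [9, 41, 44, 49, 52],
    "Social":      [11, 26, 31, 39, 42, 45, 48],
    "Efficacy":    [7, 15, 21],
    "Compliance":  [23, 34, 36, 46, 51],
    "Importance":  [17, 27],
    "Challenge":   [2, 5, 8, 16, 20],
    "Recognition": [18, 28, 37, 43, 47],
    "Grades":      [3, 38, 50, 53],
}
factor_scores = pd.DataFrame({
    factor: items[[f"q{i}" for i in ids]].mean(axis=1)
    for factor, ids in factor_items.items()
})
print(factor_scores.describe().round(2))
```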


Motivation Factor Descriptive Data

Table 15 presents the descriptive information for the motivation domains of each

school. When examining the school performances, none of the scores were extremely

high or low in comparison to those of the other schools. When examining the individual

domains, there was no school that contained all high or low scores for every motivation

domain. School 3 had the highest scores in six domains (efficacy, challenge (tie),

curiosity, importance, grades, and competition). School 1 had the highest in four

domains (challenge (tie), involvement, recognition, and compliance).

In addition to the patterns and differences observed in the descriptive statistics, a MANOVA test was completed to determine if there were significant differences between the schools on the motivation variables. Table 16 provides the results of the MANOVA test. There was no evidence of an overall significant difference between the schools for the motivation variables: F(20, 350) = 1.39, p = .126.

Table 15

Descriptive Statistics for the Motivation Data by School

Variable      School 1      School 2      School 3      Total
              (n=64)        (n=77)        (n=46)        (n=187)
Social        2.53 (.61)    2.56 (.60)    2.55 (.69)    2.55 (.62)
Challenge     2.88 (.73)    2.73 (.64)    2.88 (.60)    2.82 (.67)
Involvement   2.89 (.48)    2.77 (.60)    2.80 (.57)    2.82 (.56)
Compliance    3.03 (.52)    2.99 (.48)    3.00 (.42)    3.00 (.48)
Competition   3.14 (.66)    2.90 (.63)    3.21 (.63)    3.06 (.65)
Curiosity     3.12 (.57)    3.05 (.65)    3.19 (.47)    3.11 (.58)
Efficacy      3.28 (.55)    3.02 (.65)    3.41 (.58)    3.20 (.62)
Recognition   3.40 (.54)    3.14 (.69)    3.38 (.49)    3.29 (.60)
Grades        3.58 (.55)    3.41 (.69)    3.66 (.39)    3.53 (.54)
Importance    3.60 (.60)    3.43 (.69)    3.75 (.42)    3.57 (.61)

Note: Variables are sorted by total averages in ascending order. Possible values range from 1 to 4. Standard deviations appear in parentheses.

Table 16

MANOVA Test Results for Motivation Differences Between Schools

Statistic                 Value   F Value   Num DF   Den DF
Wilks' Lambda              0.86     1.39      20      350
Pillai's Trace             0.15     1.37      20      352
Hotelling-Lawley Trace     0.17     1.40      20      292.24
Roy's Greatest Root        0.13     2.27      10      176

Note: * = p < .05. The F statistic for Wilks' Lambda is exact.

Table 17 presents the descriptive information for the motivation domains of each

grade level. When examining the grade level performances, there were three clear patterns that appeared. First, there were six domains which showed increases from 3rd to 5th grade

(Efficacy, Challenge, Curiosity, Importance, Competition, and Compliance). Second,

recognition was the only domain that showed a consistent decrease from 3rd to 5th grade.

Third, there were three domains which showed an increase from 3rd to 4th grade and then

a decrease from 4th to 5th (Involvement, Grades, and Social). Overall, the data supported

the notion that motivation is multifaceted and not a singularly represented concept. The

scores ranged from 2.55 (Social) to 3.57 (Importance). Only three domains could be

considered as low motivators for the students with scores below 3.0: Challenge,

Involvement, and Social. Two domains, Grades and Importance, could be considered as

strong areas of motivation for the students with scores above 3.5. The remainder of the


scores fell between 3.0 and 3.5 (Efficacy, Curiosity, Recognition, Competition, and

Compliance).

Similar to the reading data, a MANOVA test was completed to determine if there

were significant overall differences between grade level motivation variables. Table 18

provides the results of the MANOVA test. There was no evidence of an overall significant difference between the grade levels for the motivation variables: F(20, 350) = 1.43,

p =0.107.

Motivation Clusters

Similar to what was done previously with the reading factors, a cluster analysis

was completed with the motivation factor scores. The SAS program was used to generate

clusters using both hierarchical and non-hierarchical methods. The average, complete,

single, centroid, and median linkage methods were used to determine the most consistent

clustering structure. First, the dendrograms and cluster group history were examined to determine the number of clusters. The dendrograms were examined visually to determine where the largest distance between clusters occurred. With the large sample size, some of these were not visually interpretable. The complete and Ward's methods visually

showed six clusters.

Table 17

Descriptive Statistics for the Motivation Data by Grade Level

Variable      3rd           4th           5th           Total
              (n=83)        (n=54)        (n=50)        (n=187)
Social        2.49 (.68)    2.71 (.55)    2.46 (.57)    2.55 (.62)
Challenge     2.72 (.74)    2.82 (.62)    2.98 (.55)    2.82 (.67)
Involvement   2.76 (.64)    2.92 (.46)    2.81 (.49)    2.82 (.56)
Compliance    2.97 (.48)    3.03 (.49)    3.04 (.46)    3.00 (.48)
Competition   3.01 (.70)    3.09 (.64)    3.10 (.59)    3.06 (.65)
Curiosity     3.04 (.63)    3.15 (.58)    3.18 (.48)    3.11 (.58)
Efficacy      3.12 (.64)    3.23 (.64)    3.29 (.55)    3.20 (.62)
Recognition   3.32 (.64)    3.30 (.55)    3.21 (.60)    3.29 (.60)
Grades        3.43 (.62)    3.67 (.48)    3.55 (.42)    3.53 (.54)
Importance    3.51 (.69)    3.57 (.61)    3.65 (.45)    3.57 (.61)

Note: Scores combined for grade levels represent all schools. Variables are sorted by total averages in ascending order. Possible values range from 1 to 4. Standard deviations appear in parentheses.

Table 18

MANOVA Test Results for Motivation Differences Between Grade Levels

Statistic                 Value   F Value   Num DF   Den DF
Wilks' Lambda              0.85     1.43      20      350
Pillai's Trace             0.15     1.43      20      352
Hotelling-Lawley Trace     0.16     1.42      20      292
Roy's Greatest Root        0.09     1.59      10      176

Note: * = p < .05. Dependent Variable = Grade; Independent Variables = Motivation Domains.

The cluster history was examined with the same goals while identifying the

cluster solution that would have the largest distance while keeping the smallest RMSSTD

and Semi-partial R-Squared values and the highest R-squared values. This procedure

required the researcher to examine each line, comparing them to the line before and after

to determine where the largest "jumps" occurred and use the cluster solution above this. For this set, the largest jumps occurred at the line for cluster six for the semi-partial R-square and R-Square values, suggesting a six-cluster solution (see Table 19). A six-cluster solution

was selected as the best fit to proceed for non-hierarchical methods. The means from this

cluster solution were used as initial seeds, or starting points, to generate refined clusters

with students who were closest to the initial seeds.

Table 19

Motivation Cluster History

Number of   Clusters     n New     RMS Std   Semipartial   R-Square   Between Cluster
Clusters    Joined       Cluster   Dev       R-Square                 Sum of Squares
   10       CL15 CL24       35       .43        .02           .55        11.32
    9       CL12 CL16       28       .51        .02           .53        12.07
    8       CL19 CL17       32       .45        .02           .51        14.47
    7       CL13 CL14       51       .41        .02           .48        14.88
    6       CL9  CL18       36       .55        .03           .46        17.06
    5       CL7  CL23       79       .40        .03           .43        19.80
    4       CL6  CL11       41       .61        .03           .39        22.26
    3       CL10 CL8        67       .48        .04           .36        23.24
    2       CL3  CL4       108       .58        .10           .26        64.49
    1       CL2  CL5       187       .60        .26           .00       171.83

Note: Ward's clustering method used. Only possible clusters less than 10 were included to save space. The six-cluster line reflects the greatest "jumps" and the best cluster solution.

The k-means non-hierarchical clustering method was used to generate groups.

The Ward’s clustering method was selected as the best option for grouping because it

grouped more similar students together without having clusters with one to two students.

The results produced clusters with varying motivational patterns (see Table 20).

Table 20

Motivation Clusters Summary Data

Cluster   n    % of     Efficacy  Challenge  Curiosity  Involvement  Importance  Recognition  Grades  Competition  Social  Compliance
               Sample
   1      40   21.4%      3.88      3.56       3.62        3.16         3.95        3.71        3.92      3.73       3.05     3.17
   2      39   20.9%      3.32      3.06       3.44        3.12         3.85        3.53        3.74      3.10       2.79     3.08
   3      37   19.8%      2.80      2.42       3.09        2.99         3.41        3.31        3.46      2.94       2.72     3.03
   4      35   18.7%      3.31      2.71       2.87        2.35         3.73        3.35        3.51      3.09       1.90     3.12
   5      31   16.6%      2.63      2.38       2.49        2.51         2.87        2.54        3.10      2.45       2.27     2.65
   6       5    2.6%      2.53      1.44       1.97        1.73         2.70        1.96        2.05      1.80       1.54     2.28

When looking at each cluster, there were unique characteristics present. First, no

cluster contained more than 25% of the sample. This meant there were not groups that

were overly large with the most similar patterns. Second, there was no single consistently high motivator across the six groups. The domains of importance and grades were the highest motivators across the six groups.

A closer examination of the clusters provided a clearer interpretation. Clusters

one and six, respectively, represent the highest motivated students and the lowest

motivated students. The highest motivated students in cluster one represented the largest

cluster. They had relatively the highest scores for each motivational domain. The five students in cluster six had the lowest motivation scores for each motivational


domain. Clusters two and five have students with consistently high or average

motivators. Cluster two students have consistently high scores with the exception of their

lack of participation in social related reading activities. The students in cluster five are

average with most scores between 2.0 and 3.0 with the exception of Grades, which is

above 3.0. Last, the students in clusters three and four are characterized as “mixed

motivated.” They have combinations of high and low motivators, and no noticeable high

or low areas.

Motivation and Reading Clusters

Motivation and reading factors were previously identified within this dissertation.

The exploratory factor analysis identified two reading factors: word identification and

meaning. The confirmatory factor analysis confirmed ten motivation factors: Efficacy,

Challenge, Curiosity, Involvement, Importance, Recognition, Grades, Competition,

Social, and Compliance. Earlier, the cluster analysis procedure identified four clusters

that represented the reading factors, and six that represented the motivational factors.

While these analyses provided important information about the patterns of students, the

information is limited to grouping based on either motivation or reading. Previous

researchers (Baker & Wigfield, 1999; Guthrie, Coddington, & Wigfield, 2009) used

motivation to group students into clusters, then used summary statistics to describe the

patterns of performance for their sample. While this is acceptable, a grouping that takes

into account both the reading and motivational factors generates clusters that are

representative of students’ patterns of performance with both areas. This study had a

goal to better understand non-proficient students by grouping the students based on both


their motivation and reading descriptors using the clustering process. I use this section to

present results of the cluster analysis, including both reading and motivation factors

combined, to understand the different patterns of reading and motivation that exist

amongst non-proficient minority students. The process is explained, followed by a brief

description of the results of the process.

Although the number of clusters from the reading and motivation data were

identified separately, this is not representative of the true number of clusters when

including all factors together. Therefore, the complete process of using hierarchical and

non-hierarchical clustering techniques was completed to determine the number of groups

and define the membership (Sharma, 1996). As with the previous cluster analyses, the

clustering process was completed with the SAS program using both hierarchical and non-

hierarchical methods including analyses using all of the five linkage methods. See above

sections for the specific steps followed for this analysis.

The cluster history was examined with the same goals, while identifying the

cluster solution that would have the largest distance while keeping the smallest RMSSTD

and Semi-partial R-Squared values and the highest R-squared values. For this set, the

largest jumps occurred at the line for cluster six for Semi-partial R-square and R-Square,

suggesting a six cluster solution (see Table 21). Similar to the motivation cluster

analysis, a six cluster solution was selected as the best fit to proceed for non-hierarchical

methods. The means from this cluster solution were used as initial seeds, or starting

points, to generate refined clusters with students who were closest to the initial seeds.

Page 110: SMITH, HIAWATHA D., Ph.D. Digging Deeper: Understanding

102

10

2

Pag

e10

2

2

2 P

age1

02

22

Pag

e10

22

2

Table 21

Combined Cluster History

Number of Clusters   Clusters Joined   n (New Cluster)   RMS Std Dev   Semipartial R-Square   R-Square   Between-Cluster Sum of Squares

10   CL41, CL23    14   .54   .01   .48   4.23
 9   CL14, CL34    35   .50   .03   .45   4.31
 8   CL11, CL21    18   .63   .01   .44   4.41
 7   CL18, CL8     37   .62   .03   .41   4.60
 6   CL16, CL9     46   .58   .04   .37   5.24
 5   CL6, CL10     60   .65   .08   .29   5.95
 4   CL7, 65       38   .65   .02   .27   6.17
 3   CL5, CL12    148   .61   .08   .19   6.21
 2   CL4, 38       39   .69   .03   .16   7.56
 1   CL3, CL2     187   .68   .16   .00   8.61

Note: Ward's clustering method used. Only possible clusters less than 10 were included to save space. Shaded line identifies greatest "jumps" and the best cluster.
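To make the notion of a "jump" concrete, the following minimal sketch recomputes the change in the two fit statistics between successive solutions, using the values transcribed from Table 21 (an illustration only, not the SAS output itself):

    # Illustrative sketch: change in fit statistics as clusters are merged
    # (values transcribed from Table 21).
    import numpy as np

    n_clusters = np.arange(10, 0, -1)   # 10, 9, ..., 1
    sprsq = np.array([.01, .03, .01, .03, .04, .08, .02, .08, .03, .16])
    rsq   = np.array([.48, .45, .44, .41, .37, .29, .27, .19, .16, .00])

    # The "jump" discussed above is the increase in semi-partial R-square and the
    # accompanying drop in R-square when moving to the next-smaller solution.
    sprsq_jump = np.diff(sprsq)         # changes for merges 10->9, 9->8, ..., 2->1
    rsq_drop = -np.diff(rsq)

    for k, j, d in zip(n_clusters[:-1], sprsq_jump, rsq_drop):
        print(f"{k} -> {k - 1} clusters: semipartial R-square change {j:+.2f}, R-square drop {d:.2f}")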

With the number of clusters determined, the non-hierarchical procedure was used to refine the membership of each of the six groups. The means from each hierarchical cluster served as the initial seeds (starting points) for this refinement, which grouped students around those starting values and ensured that students were placed close to others with similar scores. The revised six-cluster solution produced clusters with varying

frequencies and distances from other clusters. Table 22 identifies the frequencies, along

with the summary data for each factor.

Table 22

Combined Clusters Summary Data

Cluster 1 2 3 4 5 6

Frequency 30 25 48 19 44 21

Compliance 3.13 3.07 2.91 2.97 3.20 2.58

Social 2.82 2.21 2.35 2.44 3.03 2.07

Competition 3.64 2.66 3.01 3.10 3.31 2.24

Grades 3.89 3.41 3.61 3.22 3.81 2.65

Recognition 3.65 2.84 3.30 3.27 3.70 2.41

Importance 3.92 3.42 3.64 3.24 3.85 2.79

Involvement 3.18 2.72 2.72 2.75 3.08 2.17

Curiosity 3.68 2.85 2.98 3.00 3.40 2.36

Challenge 3.52 2.76 2.55 2.56 3.15 2.06

Efficacy 3.83 3.16 3.11 2.86 3.37 2.51

Meaning .77 1.55 -.61 .04 -.50 -.53

Word ID .63 .29 .57 -2.08 -.21 -.22
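The refinement just summarized is, in effect, a k-means pass seeded with the hierarchical cluster means. The following is a minimal sketch under that interpretation (not the SAS procedure used in the study); scores and labels6 are hypothetical placeholders for the factor-score matrix and the six-cluster hierarchical memberships.

    # Illustrative sketch of the seeded, non-hierarchical refinement step.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.cluster.vq import kmeans2

    rng = np.random.default_rng(0)
    scores = rng.normal(size=(187, 12))      # placeholder for the real factor scores
    labels6 = fcluster(linkage(scores, method="ward"), t=6, criterion="maxclust")

    # Each hierarchical cluster's mean vector becomes an initial seed; the
    # non-hierarchical pass then reassigns each student to the nearest refined centroid.
    seeds = np.vstack([scores[labels6 == k].mean(axis=0) for k in range(1, 7)])
    centroids, refined = kmeans2(scores, seeds, minit="matrix")

    # Cluster frequencies and per-factor cluster means, analogous to Table 22.
    print(np.bincount(refined))              # number of students in each refined cluster
    print(np.round(centroids, 2))            # mean of each factor within each cluster

Seeding with the hierarchical means, rather than with random starting points, is what keeps the refined clusters anchored to the structure identified in the hierarchical stage.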

Each profile is unique, and the patterns present were used to better understand each group of students. Four patterns are present within the data. First, across all of the clusters, the motivation domains of Grades and Importance of reading were consistently the highest motivators for reading. Because these are high for the majority of the sample, they will not be used to describe and compare the clusters later. Second, the reading data within the clusters show combinations of strengths and weaknesses, with scores falling both above and below the median. Third, the Social domain has the lowest scores for all of the clusters except cluster six, where it comes close to the lowest. Last, the students in cluster one have the highest motivation-for-reading ratings and the highest word recognition scores, but do not have the highest meaning scores. The next paragraph provides descriptive information to differentiate the motivation for reading and the reading performances across the clusters.

Each cluster has unique characteristics that help in better understanding these non-proficient readers. Clusters one and five contain the students with the highest motivation for reading, with consistently high scores in each motivation domain compared to the other clusters. The students in cluster one have high word identification and meaning scores. The students in cluster five have low word identification and meaning scores, as both fell just below the median. The students in clusters two and three have higher word identification scores; however, their meaning abilities differ: cluster three students have the lowest meaning scores, while cluster two students have the highest. When examining their motivation for reading, Grades and Efficacy drive these students, while Challenge and Involvement do not necessarily drive them to participate in reading-related activities. Cluster four students have the lowest word recognition scores while having average meaning scores; their motivation for reading shows strengths in their Curiosity about reading and their competitive nature. Last, cluster six students have the lowest scores in all motivation-to-read areas; Grades are their highest motivator to participate in reading activities.

This chapter presented the results of the research study. Through the various analyses, the results highlighted the differences that exist among non-proficient students. Overall, these students are performing below grade level in all areas and show significant differences between their comprehension of narrative and expository texts. Using factor analysis, two factors were found to represent the reading data: word identification and meaning. Ten of the motivation-for-reading domains were identified using factor analysis of the motivation data set; however, the domain of avoidance was not present. These results provide support for three major ideas. First, there is heterogeneity in the academic performance and motivation for reading of non-proficient readers. Second, a one-size-fits-all model does not work for providing remedial instruction to non-proficient readers (Allington, 2009). Last, the results support the need to consider motivation when addressing the academic needs of non-proficient students.

CHAPTER V

DISCUSSION

The purpose of this study was to examine the reading and motivational profiles

for students who have been identified as non-proficient readers. Accountability policies

tend to unfairly penalize schools with high percentages of students from marginalized groups, including students of color and those living in poverty (Hursh, 2007). Because these schools often receive low-performing or failing designations, researchers need to include such students in their studies if we want to understand their reading strengths and weaknesses. Au (2009)

described the “zip code effect,” where schools within geographical areas tend to represent

a specific socio-economic status and ethnic makeup. This study purposefully included

zip code schools to ensure the representation of these marginalized students, who often

are criticized for being “unmotivated” and/or low achievers. Moreover, these schools

traditionally focus solely on word identification strategies when providing remediation. If

these strategies do not address students’ overall needs, then schools need to use a variety

of other assessments and interventions to address the heterogeneity that exists amongst

these students.

This chapter first discusses individual reading and motivational profiles. Within

this discussion, I focus mainly on how the profiles from the separate analyses in this

study differed from existing studies where such profiles also were generated. Second, I

place the larger emphasis of this chapter on a discussion of the combined reading and

motivational profiles, as these profiles were the goal of the study. Within this chapter, I highlight similarities and differences between the separate and combined clusters. Last, I share implications for practice and policy, followed by suggestions for future research.

Reading Profiles

Researchers have presented different profiles of readers (Buly & Valencia, 2002; Dennis, 2013; Leach et al., 2003; Lesaux & Kieffer, 2010; Meyer et al., 2013; Rupp & Lesaux, 2006; Pierce et al., 2007). When identifying such profiles, only two studies placed an emphasis on non-proficient readers and used assessments measuring the National Reading Panel's (NICHD, 2000) five components of reading (Buly & Valencia, 2002; Dennis, 2013). Both studies identified a three-factor representation of word recognition (decoding), fluency, and meaning, and both used the three factors to generate from four to 10 profiles.

In this study's reading-only analysis, two factors emerged: word identification and meaning. Thus, unlike most of the previous studies using exploratory factor analysis, except for the Rupp and Lesaux (2006) study, fluency did not appear as a separate factor. The two factors, word identification and meaning, were used in a cluster analysis and generated four reading profiles. There are three important things to note from these four profiles. First, unlike most existing research on reading profiles (Buly & Valencia, 2002; Leach et al., 2003; Lesaux & Kieffer, 2010; Meyer et al., 2013; Rupp & Lesaux, 2006; Pierce et al., 2007), this study's separate analysis failed to identify a group of non-proficient readers who had a combination of low scores on the

word identification and meaning factors. This difference mainly related to students' word recognition abilities, which were somewhat stronger in this study than in previous related studies. It is important to note, however, that the subsequent combined analyses did reveal such a pattern, although it represented only a small portion of the sample. Second, no

single set of profiles represented all non-proficient readers (Buly & Valencia, 2002);

differences in profiles are a direct reflection of the sample analyzed. This finding runs

counter to present accountability interventions (Allington, 2009) where non-proficient

readers are presented with one-size-fits-all remedial instruction. Third, overall, students

in each cluster scored higher on their ability to decode or recognize words than on their

comprehension of texts. Moreover, differences of almost a grade level existed between

students’ ability to comprehend narrative versus expository texts. Other profile specific

similarities and differences will be presented when the results from the combined analysis

are discussed.

Motivation Profiles

Motivation for reading explains why students approach or avoid reading

activities. While researchers have explored students’ motivation for reading, their

samples have not emphasized non-proficient readers. Based on its existing use with

elementary students and its range of motivation domains, the MRQ was used to generate

motivational profiles. Existing studies have identified six to 11 factors. This study’s

confirmatory factor analysis identified 10 of the 11 original factors. These included

Challenge, Competition, Compliance, Curiosity, Efficacy, Grades, Recognition, Social,

Importance and Involvement. Unique to this study, an avoidance factor was not

discovered. This is important because it documents students’ positive orientations

towards reading in comparison to some general expectations for this group of readers. In

other studies, avoidance was a common factor (Baker & Wigfield, 1999; Unrau &

Schlackman, 2006; Wigfield & Guthrie, 1997). Moreover, an avoidance factor is a central component of the expectancy-value model upon which the MRQ is based. More emphasis will be directed toward this point when discussing the combined analysis.

The 10 motivation factors were used in the clustering process to generate profiles. This analysis produced a six-cluster solution, that is, six profiles, which was fewer than the seven profiles in the work of Baker and Wigfield (1999). These differences in the number of profiles are a clear reflection of the sample used, in that this study's sample included greater heterogeneity. Subsequent discussion of these differences is presented with the analysis of the combined clusters.

Motivation and Reading Profiles

This study’s main purpose was to evaluate reading and motivation profiles in a

combined analysis for non-proficient readers. The analysis specifically addressed

marginalized groups, including high percentages of ethnic minority students whose

families lacked economic resources. Prior studies have not directly addressed such student populations, nor have they simultaneously analyzed the reading and motivation factors represented among non-proficient readers.

To better understand these students, combined reading and motivation profiles

were generated (see Table 29). The motivational domains of Grades and Importance

received the highest ratings across all profiles, whereas the Social domain had the lowest

ratings. As expected, the patterns of the ratings differed across the six profiles. When

interpreting these profiles, I used their reading performances as a basis to describe the

findings because reading proficiency was the main construct of the study. The profiles

were named using descriptive terminology that allows teachers to identify students' placement based on their diagnostic reading behaviors as well as their motivation-for-reading characteristics.

First, while two of the groups had reading performances above the mean for this sample, their motivational orientations differed markedly. Although their scores were above the sample mean, these students are still non-proficient and are not meeting grade-level expectations. Students in the first profile, Motivated Readers (n=30), performed higher on word recognition, yet lower on meaning, than students in the second profile, and had above-average ratings on each of the motivation indices. In particular, these students had the highest scores on Importance, Curiosity, Recognition, and Competition. Students in the second profile, Confident and Compliant Readers (n=25), performed above average on both meaning and word recognition measures, yet had average motivation scores for seven of the motivation measures, with below-the-mean scores for Social, Competition, and Curiosity. The students in both profiles scored above the mean for word identification and meaning; the difference between these profiles is attributed to the motivation scores being higher for profile one than for profile two.

The next two profiles had the lowest score in the sample on one of the two reading factors. Students in profile three, Competitive Word Callers (n=48), performed above average on word identification but had the lowest meaning scores in the sample, and had average ratings for eight of the motivation indices, with below-average ratings for Challenge and Social. Profile four students, Curious Comprehending Readers (n=19), performed at the mean for meaning but had the lowest word identification scores in the sample. Six of the motivation domains were average for these students, with below-the-mean ratings for Grades, Importance, Challenge, and Efficacy. These two groups had similar patterns of high and low motivation areas; the main difference between them is their reading performances. The students in profile three are "word callers" who are not making meaning, while the students in profile four are struggling with their word identification skills.

The final two profiles had scores below the mean for both reading factors, and both clusters had almost identical scores for word identification and meaning. The differences between these students were once again attributed to the motivation domains. Profile five students, Motivated Multi-Need Readers (n=44), had four motivation ratings above the mean, in Grades, Recognition, Importance, and Social, with the other motivators falling below the mean. The students in profile six, Compliant Multi-Need Readers (n=21), had scores below the mean for each of the motivators. Although they are below the mean, they still had a preferred motivator: compliance. The difference between these two profiles is attributed to motivation. The students in profile five are low performing but have high motivation scores, while the students in profile six have low reading performance patterns and low motivation scores.

Differences Between Conducting a Separate Versus a Combined Analysis

There are three clear similarities and differences to note between the reading profiles and the combined profiles. First, there were fewer profiles when the reading factors were evaluated separately. There are two reasons for this finding. Statistically, we would expect fewer profiles given the smaller number of variables considered. In addition, the motivation variables allowed for further discrimination when considering students' performances across the two domains. Second, the separate analysis of the reading clusters did not identify a group of students who had below-the-mean scores for both word identification and meaning, whereas the combined cluster analysis identified two such groups. The original four patterns were present within clusters one through four; however, by adding a motivational component, students who were low in both reading areas were separated into two unique groups. Last, each of the largest profiles in both analyses contained what Buly and Valencia (2002) term "word callers." These students had stronger word identification scores and weaker meaning scores, as they were unable to make meaning at the same level at which they could decode.

When examining differences between the separate and combined analyses for motivation, there were three similarities and differences to note. First, the separate motivation analysis produced a profile with only five students, which is quite a small group considering the sample size. In comparison, the smallest profile in the combined analysis contained 19 students. Second, in the combined clusters, Importance and Grades were consistently the highest motivators for each cluster, whereas in the motivation-only analysis, profile five had Recognition as the highest motivator. Third, both sets of clusters contained a profile with the highest scores for each motivation domain and another profile with the lowest scores. There are two possible explanations for these differences. Statistically, when one increases the number of variables, greater variation exists. In addition, due to the nature of the variables across the two domains of reading and motivation, greater differentiation resulted. These explanations might have operated together to produce these differences.

In summary, these profiles demonstrated a need to include multiple variables, particularly those related to motivation, to better understand non-proficient students. When examining the patterns present, it was essential to understand students' reading strengths and weaknesses as well as to identify their most and least preferred motivators. Instead of classifying non-proficient and proficient readers as "motivated" or "not motivated," we should adopt a more multi-dimensional perspective by evaluating the multiple reasons why students are or are not motivated to engage in a particular task (Baker & Wigfield, 1999). That is, motivation is multi-dimensional: it consists of patterns of underlying motivational constructs that guide students' participation in reading. These profiles consisted of patterns in which some preferences motivate students to participate in reading-related activities while other areas are less desired. The identification of motivation in addition to the reading patterns is also essential because the low-performing readers were not solely those with the lower motivations for reading. These lower-performing students are often considered unmotivated by teachers when, in fact, they simply have differing motivators. Contrary to existing work, this "zip code" population did not demonstrate an avoidance orientation.

The profiles demonstrated that non-proficient readers from marginalized populations are not a homogeneous group (Dennis, 2013). The profiles consisted of varying numbers of students and were made up of various ethnic groups; no single ethnic group was the dominant population in any of the profiles. Regardless of profile, these students have unique needs that must be addressed through assessments and interventions that build on their strengths while addressing their weaknesses. A one-size-fits-all instructional approach does not benefit these students because it fails to place a direct emphasis on their specific areas of strength and weakness.

Finally, when considering motivation, it is vital to dig deeper than simply acknowledging which domains are more or less preferred by students. In this study, the motivation areas of Importance, Grades, and Social had consistent patterns across the various profiles. As such, interpreting only these areas as part of the profiles would have made all profiles appear to have almost identical motivational patterns. It is important to take into consideration the full set of domains with higher and lower preference and to understand how they impact students' total performances.

Implications

Informing Practice

Differentiated instruction is a key component for these non-proficient students.

Identification of a student as non-proficient via a standardized assessment does not

adequately describe the many facets of a student’s reading or motivational performances.

The main point is that non-proficient students fall into multiple patterns. Teachers must

use assessments that measure beyond the word identification components of reading.

These assessments will help teachers to identify specific patterns of reading strengths and

weaknesses for these students. The patterns can then be used to target specific areas for intervention, assisting with the remediation of weaknesses and helping to close existing achievement gaps.

The results of the study support the need to understand and identify the motivation for reading of non-proficient students. Some students within a classroom may appear low in all reading skills; however, they may not have low motivation preferences across all domains. There will be clear preferences across their motivational orientations. When teaching non-proficient students, these areas can be addressed as motivators to encourage student participation in reading activities. In this study, social reading was a low preference across all profiles. The students in this study would not benefit from tasks that require them to read with a family member or to complete specific reading-related activities outside of class (e.g., visiting the library, helping friends with their schoolwork, telling their family and friends about what they are reading). Of course, this recommendation might change after students improve their reading. However, given the clearly high scores for the importance of reading, these students would benefit from the teacher emphasizing why particular activities are important. While multiple motivation surveys exist, the surveys used should measure multiple motivation domains. This would ensure that teachers adequately identify the multiple motivators present among their students.

These profiles suggest specific differentiations that can be used for the students represented within each profile. In differentiating, teachers should place emphasis on students' reading weaknesses, as these are the cause of students' reading failures, while also engaging students in reading activities based on the motivation domains with their highest scores. In describing implications for differentiating instruction for each of the profiles, I looked to those motivation factors that students in a profile rated most highly as areas upon which teachers could focus their efforts to increase engagement. The students described as Motivated Readers will benefit from interventions devoted to helping them make meaning when they read. Regarding motivation, they uniquely had high scores on 9 of the 10 domains, giving teachers multiple ways to promote their engagement. They particularly rated Grades, Importance, Curiosity, and Efficacy as high areas; therefore, teachers could appeal to these areas to promote their engagement, and they could also use any of the other five areas as an approach. These interventions and activities should center on things of interest to the students that will pique their curiosity and help increase their engagement while working on making meaning (Vansteenkiste, Lens, & Deci, 2006). The Confident and Compliant Readers believe they are decent readers, but they are also compliant, completing reading activities because of external requirements. Regarding motivation, their four highest ratings were the Grades, Importance, Efficacy, and Compliance factors. Interventions for these students should emphasize fluency to increase the number of words read per minute, while continuing to reinforce making meaning. These interventions should encourage students to continue to hold strong beliefs in themselves as good readers who meet comprehension expectations but need to read with increased speed and accuracy (Wigfield, Guthrie, Tonks, & Perencevich, 2004).

The Competitive Word Callers will benefit from interventions focused on making meaning. These students had the highest scores for Grades, Importance, Efficacy, and Competition, along with a strength in word recognition. These students must continue to hear and believe that they are good readers (Wigfield et al., 2004), but with the emphasis of being a good reader shifting from word identification to making meaning. Given their competitive nature, their meaning-focused interventions should include charts or lists that acknowledge their continued improvement.

The Curious and Competitive Comprehending Readers need an emphasis placed on their word recognition skills while still addressing meaning. These students had the highest scores for Grades, Importance, Recognition, and Competition. Comprehension (making meaning) is the ultimate goal of reading, so although interventions for this group should emphasize decoding and/or fluency, they should additionally address making meaning. Given their high desire to be recognized for their reading behaviors, these students should be recognized for accomplishments through a goal chart or a similar activity that allows the teacher to celebrate and recognize these students' success.

The Motivated Multi-Need Readers need interventions emphasizing both word recognition and meaning, as both are weaknesses. Regarding motivation, similar to profile one, these students had high scores on all 10 of the motivation domains, giving teachers multiple ways to promote their engagement. They particularly rated Grades, Importance, Curiosity, and Recognition as high areas; therefore, teachers could address these areas to promote reading engagement. Interventions for this profile could begin by introducing and reinforcing decoding strategies during intervention time. While addressing these strategies, the students would also benefit from explicit teaching of comprehension-related strategies through a variety of texts read by, with, and to the students. The interventions and activities for this group should encourage students' curiosity as they explore reading, by connecting to topics and subjects of high interest and using highly engaging strategies (Meece, Anderman, & Anderman, 2006).

The Compliant Multi-Need Readers represent the most challenging of the profiles. These students have low scores for word recognition and meaning, and after Grades and Importance, their highest motivational ratings are for Compliance. A teacher would not want to encourage compliance by itself; instead, teachers should focus on promoting these students' Curiosity, which received a nearly comparable rating, as a means to encourage engagement. Interventions for students in this cluster should emphasize meaning first, while also addressing word recognition skills. Interventions and activities for this group should encourage engagement by addressing students' curiosity through high-interest texts and hands-on activities (Guthrie et al., 2007).

Informing Policy

As federal and state governments continue to enact accountability legislation, they

must consider how and what interventions will be implemented to meet these

accountability requirements. Instead of promoting one-size-fits-all remediation

(Allington, 2009), federal funding should be used to support the identification of differentiation strategies for non-proficient readers. As emphasized in the implications for practice section, these strategies should not be limited to programs but should include actual reading and motivation strategies that teachers can use in their classrooms to differentiate instruction. Given the low proficiency rates of marginalized groups, these students must be direct beneficiaries of new interventions. The interventions must consider who will benefit from each strategy, with the emphasis placed on non-proficient students.

As mandates continue to be made with regard to education, policy makers need to make realistic considerations. The results of this study demonstrate the heterogeneity that exists among non-proficient students. With this range of patterns, it is not realistic to expect all students to meet grade-level expectations immediately. Mandates for

proficiency should take this into consideration. Proficiency models should also consider

growth as an accountability factor. These models should take into account growth of

individual reading skills that are not measured by a single end-of-grade assessment.

Last, policy makers on local levels should continue to encourage differentiation

and equitable interventions for each reader. This differentiation for reading could occur

in two ways. First, schools can continue to require an intervention and enrichment block.

This time could be used to provide direct needs-based instruction that is related to the

students’ weaknesses (Averill, Baker, & Rindaldi, 2014). Second, schools can continue

to require a guided reading block. Guided reading is a part of a balanced literacy

framework that provides small group explicit instruction rooted in the individual needs of

students (Fawson & Reutzel, 2000; Fountas & Pinnell, 2012). This guided reading time

is beneficial because it lets students practice a variety of reading-related skills daily with text on their instructional level. This allows non-proficient students like those in this study to receive instruction that addresses their weaknesses on a daily basis.

Future Work

Multiple follow-up studies could build on this work; I will discuss four possible avenues. First, in this sample of non-proficient readers, the motivation domain of avoidance was not represented in the data. The sample size of the data set was close to the acceptable range, but it fell just slightly below the recommended 200-400 range. Other

studies identified similar factor structures, but the MRQ structural model should be

examined using a larger sample, ideally 300-400 non-proficient students. This will

determine if there are consistent motivation domains represented across multiple samples.

Second, the motivation and reading profiles should be studied longitudinally

along with interventions based on their motivational preferences and their reading

strengths and weaknesses. Are memberships in profiles changing as would be desired, or

are they showing consistent increases in multiple areas and maintaining the same

patterns? While reading profiles are useful for understanding the patterns of non-

proficient students, the ultimate goal is reading proficiency. As such, we ideally want the

students’ patterns to change as they continue to grow as readers and hopefully achieve

proficiency. Are these patterns changing? Do components change? How do the changes

occur? These are all questions that could be investigated in a longitudinal study.

Third, fluency was not identified as a separate factor in this study. Theoretically, such a combined factor including rate and accuracy would be expected for students who are exposed to interventions and instruction focused on word recognition. Perhaps this finding is related to the types of interventions offered to students, where the emphasis was on the decoding of individual words with minimal focus on the reading of connected text. Is this the case with other non-proficient students? Future work should investigate a larger number of reading variables within the five areas of reading to determine whether these two areas combine or separate into isolated factors.

Fourth, the descriptive data revealed a difference of at least one grade level between the comprehension of narrative and expository texts for this sample. With the implementation of the Common Core State Standards, there has been a push to increase students' experiences with expository texts. There is a need to determine whether this recent emphasis on expository texts has resulted in increased comprehension of such texts. With this, we must determine whether this pattern is unique to this sample or is a common trend among all non-proficient readers.

Limitations

Limitations exist with the current study, although I attempted to avoid many of them. First, this study examined a specific sample that is not present within all schools: non-proficient students, specifically those from marginalized (ethnic and economic) groups. As a result, the findings of this study with regard to factors may not be generalizable to all non-proficient groups, nor to proficient students. Second, the largest possible sample representing "zip code" schools was used

in the study. A confirmatory factor analysis was used to analyze the fit of the MRQ

domains. While my sample came close, it was not within the desired range of 200-400

for a CFA. Future work should seek to identify and use a larger sample to confirm the

lack of representation of the avoidance factor. Last, in terms of evaluating

comprehension, students’ background knowledge impacts their ability to make meaning

from text. The passages used in this study were selected by staff at the schools so that all

students should have had background knowledge of the topics. However, there may have

been passages for which the students had little or no background knowledge, which

would impact their ability to connect and make meaning.

Conclusions

These results are beneficial to parents, teachers, administrators, and policy

makers. There is no one-size-fits-all approach to remediating non-proficient readers.

These readers have multiple patterns of strengths and weaknesses. In addition to their

patterns of reading strengths and weaknesses, these students also have multiple patterns

of motivation preferences. While this study identified six profiles representing reading and motivation patterns, the number of profiles is not what matters most. The patterns highlight the need to dig deeper and examine motivation preferences along with reading strengths and weaknesses. The designation of "non-proficient" from external assessments does not supply the information necessary to design a direct intervention to improve the academic outcomes for these students. This designation should merely be an identifier that gives a license to dig deeper through multiple assessments in order to understand what specifically needs to be addressed. These assessment data can be used to provide

meaningful differentiation for these non-proficient students. Through differentiation, the

ultimate goal of student growth and proficiency may be achieved.

REFERENCES

Afflerbach, P. (2004). Assessing adolescent reading. In T. L. Jetton & J. A. Dole (Eds.),

Adolescent literacy research and practice (pp. 369-391). New York: Guilford.

Allington, R. L. (2001). What really matters for struggling readers. New York:

Longman.

Allington, R. L. (2009). Literacy policies that are needed: Thinking beyond No Child

Left Behind. In Y. Goodman & J. Hoffman (Eds.), Changing literacies, changing

times: A historical perspective on the future of reading research, public policy,

and classroom practices (pp. 266-281). New York: Routledge.

Allington, R. L., & McGill-Franzen, A. (1992). Unintended effects of educational

reform in New York. Educational Policy, 6, 397-414.

Amrein, A. L., & Berliner, D. C. (2003). The effects of high-stakes testing on student

motivation and learning. Educational Leadership, 60, 32-38.

Atkinson, J. W., & Feather, N. T. (Eds.). (1966). A theory of achievement

motivation (Vol. 66). New York: Wiley.

Au, W. (2009). Unequal by design: High-stakes testing and the standardization of

inequality. London: Routledge.

Averill, O.H., Baker, D., & Rinaldi, C. (2014). A blueprint for effectively using RTI

intervention block time. Intervention in School and Clinic, 50, 29-38.

Baker, L., & Wigfield, A. (1999). Dimensions of children’s motivation for reading and

their relations to reading activity and reading achievement. Reading Research

Quarterly, 34, 452–477.

Brown-Jeffy, S., & Cooper, J. E. (2011). Toward a conceptual framework of culturally

relevant pedagogy: An overview of the conceptual and theoretical

literature. Teacher Education Quarterly, 38, 65-84.

Buly, M. R., & Valencia, S. W. (2002). Below the bar: Profiles of students who fail state

reading assessments. Educational Evaluation and Policy Analysis, 24, 219-239.

Burchinal, M., McCartney, K., Steinberg, L., Crosnoe, R., Friedman, S. L., McLoyd., V.,

& Pianta, R. (2011). Examining the Black-White achievement gap among low-

income children using the NICHD study of Early Child Care and Youth

Development. Child Development, 82, 1404-1420. doi:10.1111/j.1467-

8624.2011.01620.x

Cambria, J. & Guthrie, J.T. (2010). Motivating and engaging students in reading. New

England Reading Association Journal, 46, 16-29.

Cattell, R. B. (1966). The scree test for the number of factors. Multivariate Behavioral Research, 1, 245-276.

Chapman, J. W., & Tunmer, W. E. (1995). Development of young children's reading self-

concepts: An examination of emerging subcomponents and their relationship with

reading achievement. Journal of Educational Psychology, 87, 154.

Clotfelter, C. T., Ladd, H. F., & Vigdor, J. L. (2009). The academic achievement gap in

grades 3 to 8. The Review of Economics and Statistics, 91, 398-419.

Common Core State Standards Initiative. (2010). Common core state standards for ELA.

Retrieved from http://corestandards.org.

Costello, A. B. & Osborne, J. W. (2005). Exploratory Factor Analysis: Four

recommendations for getting the most from your analysis. Practical Assessment

Research & Evaluation, 10, 1-9.

Crowley, S. L., & Fan, X. (1997). Structural equation modeling: Basic concepts and

applications in personality assessment research. Journal of Personality

Assessment, 68, 508-531.

Dennis, D. V. (2013). Heterogeneity or homogeneity: What assessment data reveal about

struggling adolescent readers. Journal of Literacy Research, 45, 3-21.

Dunn, L. M., & Dunn, L. M. (2007). Peabody Picture Vocabulary Test (4th ed.). Circle

Pines, MN: American Guidance Service.

Eccles J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., &

Midgley, C. (1983). Expectancies, values, and academic behaviors. In J. T.

Spence (Ed.), Achievement and achievement motivation (pp. 75–146). San

Francisco, CA: W. H. Freeman.

Eccles, J.S., Wigfield, A., & Schiefele, U. (1998). Motivation to succeed. In N.

Eisenberg (Ed.), Handbook of child psychology (Vol. IV, pp. 1017–1095). New

York: John Wiley.

Elmore, R. F. (2002) Bridging the gap between standards and achievement: The

imperative for professional development in education. Washington, DC: Albert

Shanker Institute.

Fawson, P., & Reutzel, D. (2000). But I only have a basal: Implementing guided reading

in the early grades. The Reading Teacher, 54, 84-97.

Fisher, D., & Ivey, G. (2006). Evaluating the interventions for struggling adolescent

readers. Journal of Adolescent and Adult Literacy, 50, 180-189.

Fountas, I., & Pinnell, G. (2012). Guided reading: The romance and the reality. The

Reading Teacher, 66, 268-284.

Gaddis S. M., Lauen D. L. (2014). School accountability and the Black-White test score

gap. Social Science Research, 44, 15–31.

Gambrell, L. B., Palmer, B. M., Codling, R. M., & Mazzoni, S. A. (1996). Assessing

motivation to read. The Reading Teacher, 49, 518-533.

Good, R. H., & Kaminski, R. A. (Eds.). (2002). Dynamic indicators of early basic literacy

skills (6th ed.). Eugene, OR: Institute for the Development of Educational

Achievement.

Good, R. H., & Kaminski, R. A. (2011). DIBELS next assessment manual. Longmont,

CO: Sopris.

Graham, S., & Taylor, A. Z. (2002). Ethnicity, gender, and the development of

achievement values. In A. Wigfield, J. Eccles (Eds.), Development of

Achievement Motivation: A Volume in the Educational Psychology Series, (pp.

121-146). San Diego, CA: Academic Press.

Guthrie, J. T., Hoa, A. L. W., Wigfield, A., Tonks, S. M., Humenick, N. M., & Littles, E.

(2007). Reading motivation and reading comprehension growth in the later

elementary years. Contemporary Educational Psychology, 32, 282-313.

Guthrie, J. T., Coddington, C. S., & Wigfield, A. (2009). Profiles of motivation for reading

among African American and Caucasian students. Journal of Literacy Research,

41, 317-35.

Heckhausen, H. (1977). Achievement motivation and its constructs: A cognitive

model. Motivation and emotion, 1, 283-329.

Henson, R. K., & Roberts, J. K. (2006). Use of exploratory factor analysis in published

research common errors and some comment on improved practice. Educational

and Psychological measurement, 66, 393-416.

Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure

analysis: Conventional criteria versus new alternatives. Structural equation

modeling: a multidisciplinary journal, 6, 1-55.

Hursh, D. (2007). Assessing No Child Left Behind and the rise of neoliberal education

policies. American Educational Research Journal, 44, 493-518.

Jackson, C. K., Johnson, R. C., & Persico, C. (2015). The effects of school spending on

educational and economic outcomes: Evidence from school finance reforms. The

Quarterly Journal of Economics, 131, 157-218.

Kaiser, H. F. (1958). The varimax criterion for analytic rotation in factor analysis. Psychometrika, 23, 187-200.

Kline, R. B. (2010). Principles and practice of structural equation modeling (3rd ed.). New York: The Guilford Press.

Leach, J. M., Scarborough, H. S., & Rescorla, L. (2003). Late-emerging reading

disabilities. Journal of educational psychology, 95, 211.

Learning Point Associates. (2004). A closer look at the five essential components of

effective reading instruction: A review of scientifically based reading research for

teachers. Learning Point Associates.

Lee, J., Grigg, W., & Donahue, P. (2007). The nation's report card: Reading

2007 (NCES 2007–496). National Center for Education Statistics, Institute of

Education Sciences, U.S. Department of Education, Washington, D.C.

Lesaux, N. K., & Kieffer, M. J. (2010). Exploring sources of reading comprehension

difficulties among language minority learners and their classmates in early

adolescence. American Educational Research Journal, 47, 596-632.

Leslie, L., & Caldwell, J. S. (2006). Qualitative Reading Inventory (4th ed.). Boston,

MA: Allyn & Bacon.

Linn, R. L. (2000). Assessments and accountability. Educational Researcher, 29, 4-16.

Luchow, J. P., Crowl, T. K., & Kahn, J. P. (1985). Learned helplessness: Perceived

effects of ability and effort on academic performance among EH and LD/EH

children. Journal of learning disabilities, 18, 470-474.

Marinak, B. A., Malloy, J. B., Gambrell, L. B., & Mazzoni, S. A. (2015). Me and my

reading profile. The Reading Teacher, 69, 51-62.

Mazzoni, S. A., Gambrell, L. B., & Korkeamaki, R. L. (1999). A cross-cultural

perspective of early literacy motivation. Reading Psychology, 20, 237-253.

McCaslin, M. (2009). Co-regulation of student motivation and emergent

identity. Educational Psychologist, 44, 137-146.

McDonald, R. P., & Ho, M. H. R. (2002). Principles and practice in reporting structural

equation analyses. Psychological methods, 7(1), 64.

McKenna, M.C., Kear, D.J., & Ellsworth, R.A. (1995). Children's attitudes toward

reading: A national survey. Reading Research Quarterly, 30, 934–955.

Meyer, C. K., Trathen, W., Morris, D., McGee, J., Stewart, T. T., Vines, N. A., & Gill, T. (2013). Reading profiles of struggling middle-school readers: What does it mean in the era of the Common Core State Standards. Paper presented at the annual meeting of the Literacy Research Association, Dallas, Texas. Abstract retrieved from https://secure.literacyresearchassociation.org/papers/lra2013123204531.doc

Milner, H. R. (2013). Rethinking achievement gap talk in urban education. Urban Education, 48, 3-8.

Moje, E.B. (2004). Federal adolescent literacy policy: Implications for administration,

policy, and the adolescent literacy research community. Paper presented at the

annual meeting of the National Reading Conference, San Antonio, TX.

National Center for Education Statistics. (2015). The nation's report card: Reading

2015 (NCES 2015–457). Washington, DC: Institute of Education Sciences & U.

S. Department of Education.

National Reading Panel. (2000). Report of the National Reading Panel teaching children

to read: An evidence-based assessment of the scientific research literature on

reading and its implications for reading instruction. Washington, DC: National

Institute of Child Health and Human Development.

No Child Left Behind Act of 2001, Pub. L. No. 107-110.

Orfield, G., & Kornhaber, M. L. (Eds.) (2001). Raising standards or raising barriers?

Inequality and high-stakes testing in public education. New York: The Century

Foundation Press.

Paris, S., & Turner, J. (1994). Situated motivation. In P. Pintrich, D. Brown, & C.

Weinstein (Eds.), Student motivation, cognition, and learning: Essays in honor of

Wilbert J. McKeachie (pp. 213–237). Hillsdale, NJ: Lawrence Erlbaum.

Pierce, M. E., Katzir, T., Wolf, M., & Noam, G. G. (2007). Clusters of second and third

grade dysfluent urban readers. Reading and Writing, 20, 885-907.

Rencher, A. C. & Christensen, W. F. (2012). Methods of Multivariate Analysis (3rd ed.).

Hoboken, NJ: Wiley.

Rupp, A. A., & Lesaux, N. K. (2006). Meeting expectations? An empirical investigation

of a standards-based assessment of reading comprehension. Educational

Evaluation and Policy Analysis, 28, 315-333.

SAS Institute Inc. (2013). SAS 9.4 Help and documentation. Cary, NC

Schiefele, U. (1999). Interest and learning from text. Scientific Studies of Reading,

3, 257-279.

Schiefele, U., Schaffner, E., Möller, J., & Wigfield, A. (2012). Dimensions of reading

motivation and their relation to reading behavior and competence. Reading

Research Quarterly, 47, 427-463.

Sharma, S. (1996). Applied Multivariate Techniques. John Wiley, New York.

Shepard, L. (2000). The role of classroom assessment in teaching and learning,

CSE Technical Report 517, Los Angeles, CA, National Center for

Research on Evaluation, Standards, and Student Testing.

Slavin, R. E., Cheung, A., Holmes, G. C., Madden, N. A., & Chamberlain, A.

(2013). Effects of a data-driven district reform model on state assessment

outcomes. American Educational Research Journal, 50, 371-396.

Steiger, J. H. (2007). Understanding the limitations of global fit assessment in

structural equation modeling. Personality and Individual differences, 42, 893-898.

Stumpf, S. A., Brief, A. P., & Hartman, K. (1987). Self-efficacy expectations and

coping with career-related events. Journal of Vocational Behavior, 31, 91-108.

Swanson, C. B., & Barlage, J. (2006). Influence: A study of the factors shaping

education policy. Bethesda, MD: Editorial Projects in Education Research Center.

Timm, N. (2002). Applied multivariate statistics. Secaucus, NJ: Springer.

Unrau, N., & Schlackman, J. (2006). Motivation and its relationship with reading

achievement in an urban middle school. Journal of Educational Research, 100,

81-101.

Vansteenkiste, M., Lens, W., & Deci, E. L. (2006). Intrinsic versus extrinsic goal

contents in self-determination theory: Another look at the quality of academic

motivation. Educational Psychologist, 41, 19-31.

Valencia, S. W. (2011). Using assessment to improve teaching and learning. In. S.

J. Samuels & A. E. Farstrup (Eds.). What research has to say about reading

instruction (4th ed., pp.379-405). Newark, DE: International Reading Association.

Valencia, S. W., & Buly, M. R. (2004). Behind test scores: What struggling

readers really need. Reading Teacher, 57, 520-533.

Wagner, R. K., Torgesen, J. T., & Rashotte, C. A. (1994). The Comprehensive

Test of Phonological Processing. Austin, TX: PRO-ED.

Watkins, M. W., & Coffey, D. Y. (2004). Reading Motivation: Multidimensional

and Indeterminate. Journal of Educational Psychology, 96, 110-118.

Wang, J. H., & Guthrie, J. T. (2004). Modeling the effects of intrinsic

motivation, extrinsic motivation, amount of reading, and past

achievement on text comprehension between U.S. and Chinese students.

Reading Research Quarterly, 39, 162-186.

Wigfield, A. (1997). Children's motivations for reading and reading

engagement. In J.T. Guthrie & A. Wigfield (Eds.), Reading engagement:

Motivating readers through integrated instruction (pp. 14–33). Newark, DE:

International Reading Association.

Wigfield, A., & Guthrie, J. T. (1995). Dimensions of children’s motivations for

reading: An initial study (Reading Research Report No. 34). Athens, GA:

National Reading Research Center, Universities of Georgia and Maryland College

Park.

Wigfield, A., & Guthrie, J. T. (1997). Relations of children’s motivation for

reading to the amount and breadth of their reading. Journal of Educational

Psychology, 89, 430–432.

Wigfield, A., Guthrie, J. T., Tonks, S., & Perencevich, K. C. (2004). Children's

motivation for reading: Domain specificity and instructional influences. The

Journal of Educational Research, 97, 299-310.

Yuan, K. H. (2005). Fit indices versus test statistics. Multivariate Behavioral

Research, 40, 115–148.

APPENDIX A

MOTIVATION FOR READING QUESTIONNAIRE ITEMS

Students choose one of the following options for each statement below.

If the statement is very different from you, circle a 1.

If the statement is a little different from you, circle a 2.

If the statement is a little like you, circle a 3.

If the statement is a lot like you, circle a 4.

1. I like being the best at reading.

2. I like it when the questions in books make me think.

3. I read to improve my grades.

4. If the teacher discusses something interesting I might read more about it.

5. I like hard, challenging books.

6. I enjoy a long, involved story or fiction book.

7. I know that I will do well in reading next year.

8. If a book is interesting I don’t care how hard it is to read.

9. I try to get more answers right than my friends.

10. I have favorite subjects that I like to read about.

11. I visit the library often with my family.

12. I make pictures in my mind when I read.

13. I don’t like reading something when the words are too difficult.

14. I enjoy reading books about people in different countries.

15. I am a good reader.

16. I usually learn difficult things by reading.

17. It is very important to me to be a good reader.

18. My parents often tell me what a good job I am doing in reading.

19. I read to learn new information about topics that interest me.

20. If the project is interesting, I can read difficult material.

21. I learn more from reading than most students in the class.

22. I read stories about fantasy and make believe.

23. I read because I have to.

24. I don’t like vocabulary questions.

25. I like to read about new things.

26. I often read to my brother or my sister.

27. In comparison to other activities I do, it is very important to me to be a

good reader.

28. I like having the teacher say I read well.

29. I read about my hobbies to learn more about them.

30. I like mysteries.

31. My friends and I like to trade things to read.

32. Complicated stories are no fun to read.

33. I read a lot of adventure stories.

34. I do as little schoolwork as possible in reading.

35. I feel like I make friends with people in good books.

36. Finishing every reading assignment is very important to me.

37. My friends sometimes tell me I am a good reader.

38. Grades are a good way to see how well you are doing in reading.

39. I like to help my friends with their schoolwork in reading.

40. I don’t like it when there are too many people in the story.

41. I am willing to work hard to read better than my friends.

42. I sometimes read to my parents.

43. I like to get compliments for my reading.

44. It is important for me to see my name on a list of good readers.

45. I talk to my friends about what I am reading.

46. I always try to finish my reading on time.

47. I am happy when someone recognizes my reading.

48. I like to tell my family about what I am reading.

49. I like being the only one who knows an answer in something we read.

50. I look forward to finding out my reading grade.

51. I always do my reading work exactly as the teacher wants it.

52. I like to finish my reading before other students.

53. My parents ask me about my reading grade.