
A MODEL FOR CONDUCTING A FUNCTIONAL ANALYSIS OF ACADEMIC PERFORMANCE PROBLEMS

Abstract: The purpose of this article is to describe a simple conceptual framework for academic intervention that extends functional analysis procedures to basic academic skills. We organized the empirical research on academic intervention into five hypotheses that can guide the selection of interventions. Treatment protocols for six academic interventions and procedures for quickly testing their effectiveness are presented. The description of procedures for simple tests includes discussion of design issues, measuring outcomes, and guidelines for selecting treatments.

There is an expectation in our society that children will attend school and that school, at the very least, will impart important academic skills. Children are expected to be able to perform basic skills such as reading, mathematics, and writing. Some fail to accomplish what is expected of them. When what a child can actually do is at variance with what is expected of that child, there is often an interest in understanding or explaining the deficits in academic performance. The goal of this article is (a) to suggest some explanations for the failure of some children to perform, (b) to offer a series of direct "tests" for each of the common reasons children fail, and (c) to provide some simple interventions which are linked directly to the outcome of the tests.

The reasons why children fail are complex. Reading, for example, involves translating symbols from a page into meaningful language, and innumerable parts or combinations of this visual and neurological process can malfunction. When school psychologists are asked by a parent or teacher why a child does not learn, often the correct answer is, "I don't know." This response, however, is not very satisfying to either the speaker or the listener. Therefore, we often attempt to explain student failure using diagnostic terms such as dyslexia or learning disability or mental retardation. Relating academic performance to other observed or inferred student characteristics represents a structural approach to understanding failure (Nelson & Hayes, 1986). Structural explanations rely upon previous correlational research to help school psychologists organize what is often a complex picture of student strengths and weaknesses, and structural labels typically occasion major changes in a child's life such as special education placement. From an intervention perspective, however, structural explanations for student failure are limited in two respects. First, student performance as well as the traits inferred from such performance cannot be manipulated directly (Nelson & Hayes, 1986). Rather, student behavior can only be altered indirectly by manipulating one or more factors external to the child (e.g., what is taught or the amount of teacher assistance provided). Second, because structural explanations emphasize student traits as causal agents, they do not identify those factors external to the child that may be contributing to academic failure.

Relating academic performance to aspects of classroom instruction that precede and follow student performance represents a functional approach to understanding failure. Functional explanations appeal to factors external to the child that have been shown experimentally to affect academic performance, such as time for learning, feedback from the teacher, and reinforcement for correct responding. Because these factors are external to the child and subject to direct manipulation, functional explanations have the added advantage of identifying simple, practical targets for instructional programming.

Functional explanations of poor academic performance will perforce be related to what teachers do to teach students: arranging the instructional environment, sequencing how instruction is delivered as students progress through the curriculum, providing sufficient opportunities to respond, and/or structuring contingencies. Each of these teaching actions occurs as a result of instructional decision making. When students are not learning, the task is to analyze how these factors affect student performance to make explicit decisions about how best to teach the child. In this article, five common reasons why students fail will be described. These reasons are related to what teachers should be doing in the classroom to teach students effectively. Next, some simple methods for testing these hypotheses quickly and efficiently will be presented. This section will include protocols for effective academic interventions, how to structure these simple tests (i.e., design issues), and important outcome measures that can be obtained reliably. The goal of this article is to demonstrate that it is possible to apply the logic and procedures of functional analysis to academic skills, provide a framework for doing so, and offer protocols for effective academic interventions.

Applied behavior analysis has been particularly successful at linking assessment to effective interventions by identifying the functions of behavior. When the contingencies maintaining behavior are known, it is possible to rearrange the contingencies so that appropriate and adaptive choices are more likely and inappropriate responses are less likely (Martens, Witt, Daly, & Vollmer, in press). For example, when it is known that adult attention maintains tantrum behavior in a child (i.e., tantrum behavior functions to gain attention), the intervention might consist of not providing attention while a child is having a tantrum, but providing attention when it is sought in an appropriate manner. This line of research has been applied largely to aberrant social behaviors and has not yet been applied to academic skills in a way that systematically compares variables functionally related to student learning. A functional analysis of academic behaviors would provide information about the relative effects of different teaching strategies (e.g., modeling, error correction, or practice) on student performance. When the effects of different instructional strategies are known (e.g., modeling versus practice), it is possible to alter instruction to maximize the likelihood that the intervention will be successful.

Five common factors known to affect student academic performance can be used as a starting point for generating hypotheses that lead to interventions. Five of the most common reasons students perform academic work poorly are that (a) they do not want to do it, (b) they have not spent enough time doing it, (c) they have not had enough help to do it, (d) they have not had to do it that way before, or (e) it is too hard (see Figure 1).

In Figure 1, each is referred to as a "Reasonable Hypothesis" because these are factors over which educators have control and are functional reasons why students fail to succeed in school. More detail is provided about no. 3 because of the different forms of assistance that students may need. These hypotheses are not intended to be independent of each other. Academic skills require a sequence of instructional activities that build upon one another to increase response rates in the presence of curricular materials. Therefore, the strategies that teachers use are often interrelated (e.g., modeling and error correction are always confounded with opportunities to respond). Consequently, the hypotheses are intended to reflect increasing levels of intrusiveness of academic intervention rather than independent hypotheses.

Time is a precious commodity. Educators need to be efficient when problem solving. Under many circumstances, the most efficient thing to do is to test the easiest hypothesis first, implement an intervention, and monitor and evaluate outcomes. If that approach fails to improve student performance, then something progressively more time intensive can be attempted until the probable cause of failure is identified. Also, easier solutions are more likely to be implemented consistently, while solutions that are more time consuming or technically difficult for teachers and support personnel are less likely to be implemented correctly (Gresham, 1989). Therefore, the "reasonable hypotheses" presented in Figure 1 are ordered from those requiring the least intervention to those requiring the most intensive instructional intervention, based on logical considerations of classroom environments in general.

This sequence is intended to be heuristic and not an invariant sequence to be implemented by practitioners. When conducting assessment and intervention for academic problems, practitioners might want to consider factors like teacher skill, classroom routines, time required for implementing an intervention, and student skill level in the hypothesis generation phase. Therefore, practitioners might consider alternate sequences of the proposed hypotheses depending upon the unique characteristics of the settings in which they work (e.g., changing instructional materials might be very acceptable in some settings and not in others). In some cases, practitioners might already have assessment data that indicate that one hypothesis is more likely than another (e.g., variable student performance might suggest a performance deficit; alternately, many errors might suggest an accuracy problem). A direct test might then be conducted to confirm or disconfirm the hypothesis.
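
Read procedurally, this decision logic amounts to a simple loop over the ordered hypotheses. The Python sketch below is only an illustration of that sequencing; the hypothesis labels come from Figure 1, while the function name and the test_hypothesis callback are hypothetical, not part of any published protocol.

```python
# Sketch of the "test the easiest hypothesis first" sequence.
# The hypothesis labels are quoted from Figure 1; the loop and the
# test_hypothesis callback are hypothetical illustration only.

REASONABLE_HYPOTHESES = [
    "They do not want to do it.",                   # Reason #1
    "They have not spent enough time doing it.",    # Reason #2
    "They have not had enough help to do it.",      # Reason #3
    "They have not had to do it that way before.",  # Reason #4
    "It is too hard.",                              # Reason #5
]

def analyze(test_hypothesis):
    """Test hypotheses in order of increasing intrusiveness.

    test_hypothesis(number, hypothesis) should implement the brief
    intervention for one hypothesis and return True when student
    performance improves under that condition.
    """
    for number, hypothesis in enumerate(REASONABLE_HYPOTHESES, start=1):
        if test_hypothesis(number, hypothesis):
            return number, hypothesis  # probable cause of failure identified
    return None  # nothing differentiated; extend the analysis
```

A practitioner who reorders the hypotheses for a particular setting, as discussed above, would simply reorder the list.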

We will examine the empirical support for the role of each hypothesis in student academic performance and related interventions before we describe strategies to test for each hypothesis. Figure 2 contains examples of interventions that have been effective at improving student performance, and each is associated with presumed functions of the target behavior. Appendix A contains references that can be useful in developing interventions for each hypothesis.

The first question to ask is: Is the student not able to perform the skill (a skill deficit), or is the student able to perform the skill but "just doesn't want to"? The distinction between skill and performance deficits was clarified by Lentz (1988, p. 354), who stated, "Skill problems will require interventions that produce new behavior; performance problems may require interventions involving manipulation of 'motivation' through contingency management." It is relatively easy to test the hypothesis of a performance deficit. Incentives for reading (Ayllon & Roberts, 1974; Staats & Butterfield, 1964; Staats, Minke, Finley, Wolf, & Brooks, 1964) and math (Broughton & Lahey, 1978) have been effective in improving students' motivation and performance (i.e., increasing active participation and decreasing disruptive behaviors). If a student fails to respond to incentives for increased academic performance, then either the wrong incentives were used or the student does not have the skills to perform the task.

The literature contains numerous examples of interventions that can be used to test this hypothesis. For example, Lovitt, Eaton, Kirkwood, and Pelander (1971) improved students' oral reading fluency by offering incentives for reading faster. Another strategy for improving students' motivation that is relatively easy for most teachers to implement is offering students a choice of work to be performed (e.g., a story about baseball versus a fairy tale) or the order in which work is performed (e.g., allowing the child the choice of a vocabulary drill or silent reading first). Students' on-task behavior has been improved by giving students a choice of instructional activities (Dunlap et al., 1994; Seybert, Dunlap, & Ferro, 1996), a strategy that can be easily adapted to most instructional formats. It is noteworthy that in some of this research students displayed high rates of on-task behavior on those very assignments that they had previously refused to do. The only difference between assignments completed and assignments refused was that students were allowed to choose among several instructional assignments during seatwork. When students were allowed to choose the assignment, their completion of the work improved.

A student also may perform poorly academically because he or she simply has not spent enough time actively practicing the skill. There are large differences in the amount of time students spend actively engaged in academic responding (Rosenshine, 1980). Large differences also have been observed across socioeconomic levels (Greenwood, 1991). For instance, longitudinal studies conducted by researchers at the Juniper Gardens Children's Project have identified large cumulative differences across socioeconomic levels in the amount of time students are actively responding. These differences amount to greater than 1.5 years more schooling by the end of middle school for students of higher socioeconomic levels than for students of lower socioeconomic levels (Greenwood, Hart, Walker, & Risley, 1994).

In a review of the literature on academic engaged time, Ysseldyke and Christenson (1993) concluded that variability across classrooms and schools leads to large differences in the amount of time that students are academically engaged. These differences increase the salience of engaged time as an important variable in the investigation of a student's academic problems and underscore the importance of examining this factor on an individual basis. The implications for intervention are obvious. As a first step, a student's current rate of active responding in the problematic subject area or time of day should be estimated. This task can be accomplished through recent advances in observation techniques such as the Ecobehavioral Assessment Systems Software (Greenwood, Carta, Kamps, & Delquadri, 1995) and the Behavior Observation of Students in Schools (Shapiro, 1996), two observation tools that provide estimates of student active engagement. The second step involves increasing the student's active responding. Various strategies such as providing highly structured tasks, allocating sufficient time for instruction, providing continuous and active instruction, maintaining high success rates, and providing immediate feedback have been shown to improve student engagement rates (Denham & Lieberman, 1980; Stallings, 1975; Ysseldyke & Christenson, 1993).
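
Both observation tools report, in essence, the proportion of observed intervals in which the student is actively responding. A generic sketch of that estimate from interval data follows; the data are invented for illustration, and this is not the scoring procedure of either published tool.

```python
# Estimate active engagement from momentary time-sampling data.
# Each entry records whether the student was actively responding
# (e.g., reading aloud, writing, answering) when observed.
observations = [True, False, True, True, False, True, True, True]  # invented data

engagement_rate = sum(observations) / len(observations)
print(f"Actively engaged in {engagement_rate:.0%} of observed intervals")
# A low rate relative to classmates would support Hypothesis #2.
```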

Even simpler solutions may be equally effective; for instance, allocating more time for student responding and decreasing intrusions (e.g., transition time) into instructional time. Ysseldyke and Christenson (1993) warn, however, that engaged time is only moderately (though significantly) related to student achievement. Increasing time for engagement may not be sufficient if a student needs more help to perform instructional tasks successfully. Feedback for student responses may be necessary to assist a student to respond accurately and quickly (Heward, 1994). Feedback is an integral part of the learning trial, which consists of an instructional antecedent (e.g., "Who was the first president of the United States?"), an active student response (e.g., "George Washington."), and a consequence (e.g., "Correct!"). When teachers actively provide feedback to students for responding, they increase the likelihood of student achievement (Rosenshine & Berliner, 1978).

Belfiore, Skinner, and Ferkis (1995) showed that complete learning trials were more effective in helping students to master sight words than merely having students repeatedly say the correct response. A learning trial consists of an antecedent (e.g., a flashcard with "3 x 3") prior to a response and a consequence (e.g., "Correct!" or "No, the correct answer is 9.") following a response. Another strategy for increasing feedback via complete learning trials is choral responding. Choral responding involves having all students respond verbally during group lessons. Choral responding has been shown to improve learning rates for diverse groups of students, including preschool children with developmental disabilities, children identified as Severe Behavior Handicap, first-grade Chapter 1 students, general education students, and students identified as Developmentally Handicapped in special education classrooms (Heward, 1994). Choral responding has been shown to be more effective at improving learning rates when compared to on-task instruction in which the teacher praised students for paying attention while asking the same number and type of questions (Sterling, Barbetta, Heron, & Heward, 1997).
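
For record keeping, a complete learning trial can be represented as a three-part unit. A minimal Python sketch of that structure, with hypothetical names and example values taken from the text:

```python
from dataclasses import dataclass

@dataclass
class LearningTrial:
    """One complete learning trial: antecedent -> response -> consequence."""
    antecedent: str   # e.g., a flashcard with "3 x 3"
    response: str     # the student's active response
    consequence: str  # immediate feedback on the response

trial = LearningTrial(
    antecedent='flashcard: "3 x 3"',
    response='"9"',
    consequence='"Correct!"',  # or error correction: "No, the correct answer is 9."
)
print(trial)
```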

Another strategy for increasing feedback to students for nonverbal responses is response cards. To use response cards, teachers can instruct students to write the correct response on laminated cards during group instruction for math, spelling, or content lessons. When the teacher asks a question, the students are expected to write their answers on the cards and to hold up the correct response. The teacher scans students' responses and provides feedback to students. Heward (1994) provided guidelines for implementing the use of response cards. Response cards have been shown to improve (a) rates of responding and quiz scores relative to hand-raising during fourth-grade social studies recitation lessons (Narayan, Heward, Gardner, Courson, & Omness, 1990), (b) on-task behavior of students with disruptive and off-task behavior during social studies lessons (Gardner, Heward, & Grossi, 1994), and (c) quiz scores in earth science classes for high school students (Cavanaugh, Heward, & Donelson, 1996). The common strand of these strategies is that they increase the amount of feedback given to students immediately following responding. It is an opportunity to provide positive feedback for correct responses and to correct errors immediately rather than allowing a student to practice the wrong answer.

The Instructional Hierarchy. In the event that increasing feedback during time allocated for instruction is not sufficient for improving student performance, it may be necessary to look more carefully at a student's skill level as a basis for developing instructional interventions. How much assistance a student requires is dependent upon his or her level of skill mastery. Mastery, in turn, develops in a sequence of stages which lead to proficiency and use of the skill across time and contexts (Daly, Lentz, & Boyer, 1996; Haring, Lovitt, Eaton, & Hansen, 1978; Howell, Fox, & Morehead, 1993). Initially, effective instruction promotes accurate performance of the skill. At this stage, modeling the skill and observing the learner are critical components of good instruction. Also, explicit feedback about performance is necessary. After the learner becomes accurate, the next step is to become fluent in the use of the skill. For a skill (e.g., "4 x 2 = 8") to be useful in the future (e.g., with long division), the learner must be able to respond rapidly when presented with the problem. Practice improves skill fluency. Accurate and fluent skill use increases the chances that the learner will generalize the skill across time, settings, and other skills (Daly, Martens, Kilmer, & Massie, 1996; LaBerge & Samuels, 1974; Wolery, Bailey, & Sugai, 1988).

Daly, Lentz, and Boyer (1996) used the heuristic notion of an "instructional hierarchy" developed by Haring et al. (1978) to show that, in many studies of academic interventions, the effectiveness of instructional strategies for improving student accuracy and fluency could be predicted based on the strategies used (e.g., modeling versus practice). Although other instructional hierarchies have been developed during the past century, Haring et al.'s (1978) model is particularly useful because it explains patterns of results and allows us to predict which interventions are most likely to be effective based on the components of instruction and where students are in the learning sequence. The instructional hierarchy suggests that strategies that incorporate modeling, prompting, and error correction can be expected to improve accuracy and that strategies that incorporate practice and reinforcement for rapid responding can be expected to improve fluency. In a demonstration of the predictive power of this particular instructional hierarchy, Daly and Martens (1994) accurately predicted the pattern of results of three reading interventions based on the instructional hierarchy.
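
The hierarchy's predictive logic can be summarized as a mapping from the learner's stage to the instructional components expected to help at that stage. The sketch below merely restates the stages described above as a lookup; the stage names and function are a summary device, not code from the cited studies.

```python
# Learning stage -> instructional components expected to improve
# performance at that stage (after Haring et al., 1978).
INSTRUCTIONAL_HIERARCHY = {
    "accuracy": ["modeling", "prompting", "error correction", "explicit feedback"],
    "fluency": ["practice", "reinforcement for rapid responding"],
    "generalization": ["multiple examples", "practice in the natural setting"],
}

def likely_effective(stage):
    """Return the strategies the hierarchy predicts will help at `stage`."""
    return INSTRUCTIONAL_HIERARCHY[stage]

print(likely_effective("fluency"))  # ['practice', 'reinforcement for rapid responding']
```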

The hypotheses discussed to this point have addressed structured contingencies for performance, the amount of time students are actively engaged in schoolwork, how much feedback teachers give, and some important teaching strategies. If interventions based on these factors do not lead to improvements in academic performance, then the next factor to examine is the role of the instructional materials in improving student outcomes. When analyzing the instructional tasks and assignments as possible factors related to poor student performance, there are at least two reasons why they may be hindering student learning: either they are not helping the student to practice actual skill use (Vargas, 1984) (Reason #4) or they are too difficult (Reason #5) (Gickling & Armstrong, 1978; Gickling & Rosenfield, 1995).

The goal of instruction is to bring student responding under the control of the instructional materials (i.e., students do not need help to get the correct answer) so that students can apply their skills to real-life demands (Heward, Barbetta, Cavanaugh, & Grossi, 1996). For instance, when teaching a student to read the word "Name," the goal is for the printed word "Name" to elicit the student response (i.e., the student reads "Name"). In this way, the student will be able to respond to many demands for use of the skill, such as completing a job application. Practicing the skill means that instructional materials provide enough examples and nonexamples of the target skill so students know when to use the skill (Engelmann & Carnine, 1991). For instance, if you are teaching colors to students, you want to be sure to present enough examples of the colors being taught using several different shapes so that students do not confuse the color red with the shape of the objects you are using. Therefore, the instructional materials are critical for providing students enough opportunities to practice important academic skills while helping them recognize when to use the skill and when not to use the skill.

Practicing the skill also means that students need to know how to obtain the correct answer. Unfortunately, many instructional materials allow students to achieve the correct answer for the wrong reason (Vargas, 1984). Take Bill, for instance, who knows how to complete his vocabulary worksheets by looking at the number of blanks where he is supposed to write the response and counting the number of letters in each of the vocabulary words. He is not learning his vocabulary words. Instead, he has devised a strategy for obtaining the right answer for the wrong reasons. Finally, the instructional materials might not have the student respond in the way that he or she would be expected to respond according to the curriculum. Therefore, teaching spelling by having students circle spelling words in word puzzles or having them circle the correct answer among four options is inadequate preparation for spelling words for which dictation is required.

Interventions aimed at improving the instructional materials should be directed toward using materials that elicit the kinds of student responses that would be expected of students who have mastered the curriculum. As a first step, the student's objectives should be defined (e.g., will spell words with common consonant combinations accurately). Then, there must be assurance that the instructional materials provide enough practice in actual use of the skill (e.g., have students practice spelling words with common consonant combinations rather than having them circle correct spellings). Worksheets that only require the student to respond to a few items (e.g., each vocabulary word appears once) in a limited format (e.g., circling correct words versus spelling words) are not likely to improve student performance. Therefore, whereas Hypothesis #2 suggested that a student might not be spending enough time practicing the skill to perform it better, Hypothesis #4 suggests that the student might not be spending time on the right kinds of instructional activities to perform the skill better.

Hypothesis #5 suggests that students may perform poorly because the instructional materials are too difficult. Gickling and Armstrong (1978) improved students' accuracy of assigned work and on-task behavior by changing instructional materials to assure that they were not too difficult or too easy. Students are more likely to generalize what they have just learned to other similar instructional materials when they are instructed at their instructional level. Daly, Martens, Kilmer, and Massie (1996) found that providing reading instruction at a level that was more appropriately matched to students' skill levels resulted in greater generalization than when instruction was provided in more difficult materials. The importance of the effects of task difficulty on student learning is obvious and cannot be overstated. Unfortunately, teachers often have in their classrooms learners with multiple skill levels, and changing instructional materials to match each student's instructional level is a difficult task. Therefore, it is one of the last things that they may be willing to change. For this reason, we present it last. If, however, instructional changes based on the other factors are not effective, teachers will be hard pressed to insist that they do not want to change the difficulty level of the materials.

Procedures for conducting a functional analysis of behavior have virtually revolutionized the treatment of behavior problems in children and individuals with developmental disabilities (Horner, 1994; Iwata, Pace et al., 1994; Mace, 1994). The key steps in conducting a functional analysis of behavior involve (a) generating hypotheses about the possible ways in which problem behavior is being reinforced, and (b) exposing the individual to brief test conditions designed to mimic the hypothesized contingencies (Martens et al., in press). For example, based on teacher interview and classroom observation, one might suspect that a child is engaging in disruptive behavior to obtain attention from the teacher or to avoid schoolwork. To determine which of these hypotheses is correct, we might instruct the teacher to attend to the child for every misbehavior during some class periods and to send the child to the office for every misbehavior during other class periods. If we observe increases in disruptive behavior only under the condition during which attention is provided, then we can conclude with confidence that the child's behavior is being positively reinforced by attention from the teacher.

The use of brief test conditions to identify the functions of problem behavior was pioneered by Iwata and his colleagues in their work with self-injurious behavior (Iwata, Dorsey, Slifer, Bauman, & Richman, 1994). The impact of their work on the treatment of self-injurious behavior can be attributed, in part, to the logic underlying functional analysis procedures. First, although the possible functions of problem behavior are relatively few in principle (e.g., positive or negative reinforcement), it is assumed that these principles operate in ways that are idiosyncratic to the life of each individual. Second, because test conditions are designed to mimic and intensify the contingencies supporting problem behavior, individual responses to these conditions are based in large part on previous learning. Capitalizing on previous learning allows test conditions to be brief in duration (e.g., 10 min) yet still produce relatively large changes in behavior. Third, functional analysis can be characterized as an experimentally based assessment procedure in that it allows one to identify the functions of problem behavior by manipulating contingencies. Once identified, these functions can be selectively eliminated, reversed, or weakened by implementing the appropriate treatment procedure (Martens et al., in press). As an assessment technique, then, functional analysis has the potential to make all existing interventions more effective by allowing them to be matched with confidence to the causes of behavior problems. Interventions based on a functional analysis of behavior have been dramatically effective in decreasing self-injurious behavior, and test conditions also have been used to identify the causes of classroom behavior problems with excellent results (e.g., Broussard & Northup, 1995; Kern, Childs, Dunlap, Clarke, & Falk, 1994).

Unlike many behavior problems, which are supported by operant contingencies (i.e., performance deficits), students' academic problems often result from an absence of skills or inadequate skill development (i.e., skill deficits). How then is it possible to conduct a functional analysis of something like poor reading? If the goal of a typical functional analysis is to identify the operant contingencies affecting problem behavior, then the goal of a functional analysis of reading is to identify the instructional contingencies affecting academic behavior. Because skills cannot be unlearned once they are acquired, test conditions for academic problems involve brief presentations of two or more instructional strategies believed to improve performance applied to different sets of materials (Cooper et al., 1992; McComas et al., 1996). The following is a sequence of test conditions for identifying the instructional contingencies affecting academic performance problems. These conditions were designed based upon a review of research suggesting the availability of (a) instructional strategies that are brief, easy to implement, and produce immediate and substantial improvements in performance (e.g., repeated readings or listening passage preview); (b) measures that are sensitive to short-term gains in academic performance (e.g., curriculum-based measurement probes); and (c) a theoretical model for sequencing the presentation of test conditions.

The sequence of test conditions in a functional analysis is referred to as a multielement design (e.g., Iwata, Pace et al., 1994). A multielement design typically begins with a baseline period to establish levels of the behavior in question. Then, one or two sessions are conducted under the first test condition, followed by sessions conducted under the second, third, etc., condition until all conditions being compared have been alternated repeatedly in a semi-random sequence (e.g., Iwata, Dorsey et al., 1994). The data points for each condition (usually four or more) are then connected, producing separate data series which are examined for degree of overlap.
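
A sketch of that bookkeeping follows: grouping session scores by condition into separate series and inspecting their ranges for overlap. The condition names and scores are invented for illustration; in practice the comparison rests on visual inspection of the plotted series.

```python
# Group session scores by condition and inspect each series for overlap.
# Conditions and scores are invented; in practice the metric might be
# words correct per minute on curriculum-based measurement probes.
sessions = [
    ("baseline", 42), ("modeling", 55), ("practice", 61),
    ("baseline", 44), ("modeling", 58), ("practice", 66),
    ("baseline", 41), ("modeling", 54), ("practice", 63),
    ("baseline", 45), ("modeling", 57), ("practice", 65),
]

series = {}
for condition, score in sessions:
    series.setdefault(condition, []).append(score)

for condition, scores in series.items():
    print(f"{condition}: {scores}, range {min(scores)}-{max(scores)}")
# Non-overlapping ranges (here, practice 61-66 vs. baseline 41-45)
# suggest a differentiated effect of that condition.
```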

An alternative version of the multielement design requires that only one session be conducted per test condition. In these studies, test conditions were presented in a predetermined sequence that proceeded from easiest to implement (requiring the least amount of adult assistance) to most difficult to implement (requiring the most amount of adult assistance) (Harding, Wacker, Cooper, Millard, & Jensen-Kovalan, 1994; McComas et al., 1996). For example, Harding et al. (1994) presented test conditions in sequence until a significant increase in on-task behavior was observed. Once this occurred, the condition producing the increase was repeated with the condition just prior to it that did not produce an increase, thereby resulting in a mini-reversal design. By demonstrating improved performance on two occasions using the mini-reversal design, the researchers were able to conclude with greater confidence that a given condition would be an effective intervention in the natural setting. The brief multielement design has been used extensively for conducting functional analyses of social behaviors in clinic settings in which time limitations make it difficult to conduct extended analyses (Derby et al., 1992). Vollmer, Marcus, Ringdahl, and Roane (1995) recommend its use if differentiated response patterns can be obtained briefly before trying more extended experimental analyses. By virtue of its simplicity (relative to other kinds of evaluation designs), the brief multielement design is uniquely suited to the kinds of time limitations under which school psychologists often operate.
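
Paraphrased as a procedure, the brief version runs one session per condition from least to most adult assistance, stops at the first condition that meets a performance criterion, and then replicates it against the immediately preceding (ineffective) condition. A schematic sketch with hypothetical names; actual decisions would rest on visual inspection of the data rather than a single numeric threshold.

```python
def brief_analysis(conditions, run_session, criterion):
    """One session per condition, easiest first, then a mini-reversal.

    conditions: test conditions ordered by adult assistance required.
    run_session(condition): runs one session and returns the outcome score.
    criterion: predetermined performance level (e.g., 100 WCPM).
    """
    previous = "baseline"
    for condition in conditions:
        if run_session(condition) >= criterion:
            # Mini-reversal: repeat the immediately preceding (ineffective)
            # condition, then the effective one, to replicate the gain.
            if run_session(previous) < criterion <= run_session(condition):
                return condition  # improvement demonstrated on two occasions
        previous = condition
    return None  # no condition met criterion; extend the analysis
```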

Because the test conditions reported in this article target separate aspects of skill development and can be ordered according to ease of teacher implementation, we suggest that they be presented using a combination of strategies reported by Harding et al. (1994) and McComas et al. (1996). By way of example, Figure 3 presents data for Jill, a sixth-grade student being instructed at a fifth-grade reading level. At baseline, Jill correctly read 87 words per minute from the fifth-grade book of a basal reading series. In this condition no special assistance was provided (i.e., she read the passage independently while the examiner recorded her performance). This session was then followed by one session each of a series of test conditions in sequence until her reading fluency reached a predetermined criterion of 100 correctly read words per minute (Shapiro, 1996). Conditions were presented in the following order: Contingent Reinforcement and Repeated Readings. In the Contingent Reinforcement condition, Jill was offered incentives for reading at a rate of 100 correctly read words per minute with three or fewer errors. Jill read 72 words correctly in one minute in this condition, indicating that her low fluency rate is not a result of a performance deficit. In the Repeated Readings condition, Jill read the passage four times. After each reading she received feedback on how quickly she read the passage. On the fourth reading Jill read at a rate of 129 words correct per minute, a rate that exceeds the criterion. Next, a baseline condition was repeated (in which Jill read 94 words correctly per minute) followed by Repeated Readings (in which she read 117 words correct per minute) to produce a mini-reversal. It appears that Jill benefits from increased opportunities to respond and repeated practice. Notice that the number of test conditions which are implemented prior to the reversal will likely differ across children, and the criterion levels of improvement differ across grade levels.
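
Jill's sessions make the logic concrete. The numbers below are those reported for Figure 3; the code only restates the criterion check.

```python
# Jill's words correct per minute (WCPM) across conditions, per Figure 3.
CRITERION = 100  # predetermined fluency criterion (Shapiro, 1996)

sessions = [
    ("Baseline", 87),
    ("Contingent Reinforcement", 72),        # no gain: not a performance deficit
    ("Repeated Readings", 129),              # fourth reading: exceeds criterion
    ("Baseline (reversal)", 94),
    ("Repeated Readings (replication)", 117),
]

for condition, wcpm in sessions:
    verdict = "meets" if wcpm >= CRITERION else "below"
    print(f"{condition}: {wcpm} WCPM ({verdict} criterion)")
```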

Once the mini-reversal is completed, one may conclude that an instructional program containing an effective strategy from the test condition is more likely to improve student performance than an ineffective strategy. Appendix B contains intervention protocols (including protocols for oral reading and classroom assignment performance).

The objective of the test conditions is to identify the instructional intervention that produces the largest outcomes in the most efficient manner as a basis for recommending an intervention. At the very least, it may be possible to eliminate interventions that are not likely to be effective because they do not even produce immediate effects. Special considerations apply to the multielement design described previously. Assessment materials should be of equal difficulty level across conditions to assure that differences in outcomes are not due to differences in difficulty level. The assessment materials should, however, be sufficiently different from each other so that treatments do not interfere with outcomes across different conditions. Finally, it is important to maximize obtained treatment effects by ensuring that the assessment materials have considerable overlap with the instructional materials for a given condition (Daly, Martens et al., 1996). In other words, the assessment materials used for each condition should reflect what was taught in that condition and minimize the effects of what was taught in the other conditions, while still being of equal difficulty level.

In general, responses will be of two types: oral responses or written responses. For academic interventions in basic skills, we recommend using curriculum-based measures of oral reading, mathematics computation, written expression, and spelling. The materials can be designed to maximize overlap with respective test conditions, and the measures have good technical adequacy for instructional decision making (Shinn, 1989). For example, when reading interventions are being investigated, outcomes can be measured in the reading passages in which instruction was provided (Daly & Martens, 1994). When interventions are being investigated to improve mathematics computational skills, the examiner may assign different problem types to different conditions (e.g., 6s to one condition, 7s to another condition, and 8s to another condition). The assessment probes for each condition would contain problem types for the associated condition.
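
Keeping each probe aligned with its own condition's content can be made mechanical. A sketch of assigning multiplication fact families to conditions and generating a probe for each; the condition names and the 20-item probe length are assumptions for illustration.

```python
import random

# Assign multiplication fact families to conditions so that each probe
# reflects only what was taught in its own condition (6s, 7s, 8s).
CONDITION_FACTS = {"condition A": 6, "condition B": 7, "condition C": 8}

def make_probe(fact_family, n_items=20, seed=0):
    """Generate computation items drawn from a single fact family."""
    rng = random.Random(seed)
    return [f"{fact_family} x {rng.randint(0, 9)} =" for _ in range(n_items)]

for condition, fact in CONDITION_FACTS.items():
    print(condition, make_probe(fact)[:3], "...")
```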

Under some circumstances, it may be desirable to target work completion in classroom tasks. In this case, the examiner may choose workbook materials that meet the requirements described above (i.e., of equal difficulty level but independently reflecting the associated test condition). Test condition results are more likely to suggest meaningful differences if (a) the difficulty level of the materials is held constant across conditions, (b) differences between assessment materials are maximized across conditions, (c) high overlap between instructional conditions and assessment conditions is sought, and (d) global but sensitive outcome measures are used.

In reviewing more than 8,000 studies of academic performance, Walberg (1992) concluded that two factors produced the largest overall effects: providing formative and summative feedback, and giving incentives for slow or inaccurate performance. The procedures described herein, which derive directly from the empirical literature on student achievement, provide simple, efficient tests for these and other common causes of academic problems. Each "test" represents a brief intervention. The notion that intervention can be a form of assessment might appear to be contradictory, unless an intervention is defined as virtually any change in environmental conditions. When a positive result is obtained from a test in which the child is provided additional practice or an incentive, then we know that variable is functionally related to the student's academic performance. The process is different from an approach in which structural variables such as intelligence, learning style, or handicapping condition are inferred, presumed, or hypothesized to cause problem behavior based on correlational data. The interventions described have been shown empirically to improve student academic performance. We have not attempted, however, to provide an exhaustive list of academic interventions (see, for example, Shapiro & Bradley, 1995, for a discussion of cognitive-based interventions).

The most important advantage of this functional assessment method is that not only has the source of the problem been identified, but the chances of choosing an effective intervention have been increased. The assessment process provides a test of the intervention with the child, using the same classroom materials that would be used, in the environment in which the child would normally perform schoolwork. Individuals responsible for making decisions about the child are in the position of being able to say, "We tried this and it worked." The decision, then, is not what to do but what resources are needed to implement the intervention in a way that is sustainable. If a child requires additional practice or error correction, then who will provide the services, when will they be provided, and what additional resources are needed to insure implementation? The model presented, however, is not intended to replace or eschew the need for formatively evaluating intervention outcomes (Fuchs & Fuchs, 1986); it is merely intended to increase the likelihood of choosing the least intrusive intervention that has the greatest chances of positively affecting student academic performance.

Although the principal goal of this work was to summarize the literature relevant to simple methods for assessment and intervention of academic problems, a related aim was to stimulate research. Recent advances in the area of functional analysis have shown promising results for adapting this methodology to classroom settings (Broussard & Northup, 1995; Kern et al., 1994; Northup et al., 1995). Although all of the procedures presented are individually well supported by research, their use in combination or as part of a deliberate problem analysis process has not been investigated. Hence, our hope is that this article may serve to stimulate research in the development of procedures that are adaptable to educational settings (i.e., labor and time efficient) and that can be shown to improve student responding through improved instructional decision making.

There are two limitations of the functional assessment model presented. First, until more applied research that provides functional assessment protocols is conducted, practitioners will need to rely upon their own skills to apply the model. Given that the process is time consuming and that little training is available, it is not likely that practitioners will readily apply the model. Future research should be directed toward validating various treatment protocols and describing the conditions under which they are most likely to be indicated. Second, the long-term validation of experimentally derived academic interventions (using the brief multielement design) has not been conducted. Future research also should be directed toward examining applications and limitations of the brief multielement design reported. When incorporated as part of a broader model of functional analysis, the brief multielement design may be efficient for evaluating treatments when differentiated response patterns can be produced across conditions (Vollmer, Marcus, Ringdahl, & Roane, 1995).

Figure 1. Reasonable hypotheses for academic deficits.

REASON #1: They do not want to do it.
The student is not motivated to respond to the instructional demands.

REASON #2: They have not spent enough time doing it.
Insufficient active student responding in curricular materials.

REASON #3: They have not had enough help to do it.
Insufficient prompting and feedback for active responding.
Student displays poor accuracy in target skill(s).
Student displays poor fluency in target skill(s).
Student does not generalize use of the skill to the natural setting and/or to other materials/settings.

REASON #4: They have not had to do it that way before.
The instructional demands do not promote mastery of the curricular objective.

REASON #5: It is too hard.
Student's skill level is poorly matched to the difficulty of the instructional materials.

Figure 2. Academic interventions identified by the presumed function of the behavior.

Hypothesis: The student is not motivated to respond to the instructional demands.
Interventions: Increase interest in curricular activities: 1. Provide incentives for using the skill. 2. Teach the skill in the context of using the skill. 3. Provide choices of activities.

Hypothesis: Insufficient active student responding in curricular materials.
Interventions: Increase active student responding: 1. Estimate current rate of active responding and increase rate during allocated time.

Hypothesis: Insufficient prompting and feedback for active responding.
Interventions: Increase rate of complete learning trials: 1. Response cards. 2. Choral responding. 3. Flash card intervention with praise/error correction. 4. Peer tutoring.

Hypothesis: Student displays poor accuracy in target skill(s).
Interventions: Increase modeling and error correction: 1. Read passages to student. 2. Use cover-copy-compare. 3. Have student repeatedly practice correct response in context for errors.

Hypothesis: Student displays poor fluency in target skill(s).
Interventions: Increase practice/drill and/or incentives: 1. Have the student repeatedly read passages. 2. Offer incentives for beating the last score.

Hypothesis: Student does not generalize use of the skill to the natural setting and/or to other materials/settings.
Interventions: Instruct the student to generalize use of the skill: 1. Teach multiple examples of use of the skill. 2. Teach use of the skill in the natural setting. 3. "Capture" natural incentives. 4. Teach self-monitoring.

Hypothesis: The instructional demands do not promote mastery of the curricular objective.
Interventions: Change instructional materials to match the curricular objective: 1. Specify the curricular objective and identify activities that promote use of the skill in the context in which it is generally used.

Hypothesis: Student's skill level is poorly matched to the difficulty of the instructional materials.
Interventions: Increase student responding using better matched instructional levels: 1. Identify student's accuracy and fluency across instructional materials and use instructional materials that promote a high rate of responding.

GRAPH: Figure 3. Number of correctly read words per minute.

Ayllon, T., & Roberts, M. D. (1974). Eliminating discipline problems by strengthening academic performance. Journal of Applied Behavior Analysis, 7, 71-76.

Belfiore, P. J., Skinner, C. H., & F

erkis, M. (1995). Effects of response and trial repetition

on sightword training for students with learning disabilitie

s. Journal of Applied Behavior Analysis, 28, 347-348.

Brou

ghton, S. F., & Lahey, B. B. (1978). Direct and collateral

effects of positive reinforcement, response cost, and mixe

d contingencies for academic performance. Journal of School

Psychology, 16, 126-136.

Broussard, C. D., & Northup, J.

N. (1995). An approach to functional assessment and analysi

s of disruptive behavior in regular education classrooms. S

chool Psychology Quarterly, 10, 151-164.

Cavanaugh, R. A.,

Heward, W. L., & Donelson, F. (1996). Effects of response

cards during lesson closure on the academic performance of

secondary students in an earth science course. Journal of

Applied Behavior Analysis, 29, 403-406.

Cooper, L. J., Wac

ker, D. P., Thursby, D., Plagmann, L. A., Harding, J., Mill

ard, T., & Derby, M. (1992). Analysis of the effects of ta

sk preferences, task demands, and adult attention on child

behavior in outpatient and classroom settings. Journal of A

pplied Behavior Analysis, 25, 823-840.

Daly, E. J., III, L

entz, E E., & Boyer, J. (1996). The instructional hierarchy

: A conceptual model for understanding the effective compon

ents of reading interventions. School Psychology Quarterly,

11, 369-386.

Daly, E. J., III, & Martens, B. K. (1994). A

comparison of three interventions for increasing oral read

ing performance: Application of the instructional hierarchy.

Journal of Applied Behavior Analysis, 27, 459-469.

Daly,

E. J., III, Martens, B. K., Kilmer, A., & Massie, D. (1996

). The effects of instructional match and content overlap o

n generalized reading performance. Journal of Applied Behavi

or Analysis, 29, 507-518.

Denham, C., & Lieberman, A. (Eds

.). (1980). Time to learn. Washington, DC: National Institu

te of Education.

Derby, K. M., Wacker, D. P., Sasso, G.,

Steege, M., Northup, J., Cigrand, K., & Asmus, J. (1992). B

rief functional assessment techniques to evaluate aberrant b

ehavior in an outpatient setting: A summary of 79 cases. Jo

urnal of Applied Behavior Analysis, 25, 713-721.

Dunlap, G

., DePerczel, M., Clarke, S., Wilson, D., Wright, S., White

, R., & Gomez, A. (1994). Choice making to promote adaptive

behavior for students with emotional and behavioral challe

nges. Journal of Applied Behavior Analysis, 27, 505-518.

Englemann, S., & Carnine, D. W. (1991). Theory of instructi

on: Principles and applications (2nd ed.). New York: Irving

ton.

Fuchs, L. S., & Fuchs, D. (1986). Effects of systemat

ic formative evaluation: A meta-analysis. Exceptional Childr

en, 53(3), 199-208.

Gardner, R., III, Heward, W. L., & Gro

ssi, T. A. (1994). Effects of response cards on student par

ticipation and academic achievement: A systematic replicatio

n with inner-city students during whole-class science instru

ction. Journal of Applied Behavior Analysis, 27, 63-72.

Ge

ttinger, M. (1995). Increasing academic learning time. In A

. Thomas & J. Grimes (Eds.), Best practices in school psych

ology-III (pp. 943-954). Washington, DC: National Associatio

n of School Psychologists.

Gickling, E. E., & Armstrong, D

. L. (1978). Levels of instructional difficulty as related

to on-task behavior, task completion, and comprehension. Jou

rnal of Learning Disabilities, 11, 32-39.

Gickling, E. E.,

& Rosenfield, S. (1995). Best practices in curriculum-base

d assessment. In A. Thomas & J. Grimes (Eds.), Best practic

es in school psychology-III (pp. 587-596). Washington, DC:

National Association of School Psychologists.

Greenwood, C.

R. (1991). A longitudinal analysis of time, engagement, an

d achievement in at-risk versus non-risk students. Exception

al Children, 57, 521-535.

Greenwood, C. R., Carta, J. J.,

Kamps, D. M., & Delquadri, J. (1995). Ecobehavioral assessm

ent systems software: Technical manual and software. Kansas

City, KS: University of Kansas, Juniper Gardens Children's

Project.

Greenwood, C. R., Hart, B., Walker, D., & Risley

, T. (1994). The opportunity to respond and academic perfor

mance revisited: A behavioral theory of developmental retard

ation and its prevention. In R. Gardner III, D. M. Sainato,

J. O. Cooper, T. E. Heron, W. L. Heward, J. W. Eshleman,

& T. A. Grossi (Eds.), Behavior analysis in education: Focu

s on measurably superior instruction (pp. 213-224). Pacific

Grove, CA: Brooks/Cole.

Gresham, E M. (1989). Assessment

of treatment integrity in school consultation and prereferra

l intervention. School Psychology Review, 18, 3750.

Harding

, J., Wacker, D. E, Cooper, L. J., Millard, T., & Jensen-K

ovalan, E (1994). Brief hierarchical assessment of potential

treatment components with children in an outpatient clinic

. Journal of Applied Behavior Analysis, 27, 291-300.

Harin

g, N. G., Lovitt, T C., Eaton, M.D., & Hansen, C. L. (1978

). The fourth R: Research in the classroom. Columbus, OH: M

errill.

Heward, W. L. (1994). Three "low-tech" strategies

for increasing the frequency of active student response dur

ing group instruction. In R. Gardner III, D. M. Sainato, J.

O. Cooper, T. E. Heron, W. L. Heward, J. W. Eshleman, & T

. A. Grossi (Eds.), Behavior analysis in education: Focus o

n measurably superior instruction (pp. 283-320). Pacific Gro

ve, CA: Brooks/Cole.

Heward, W. L., Barbetta, P., Cavanaug

h, R. A., & Grossi, T. A. (1996). A dozen common teaching

mistakes and what to do instead. Workshop presented at the

22nj annual convention of the Association for Behavior Anal

ysis, San Francisco. Horner, R. H. (1994). Functional asses

sment: Contributions and future directions. Journal of Appli

ed Behavior Analysis, 28, 401-404.

Howell, K. W., Fox, S.

L., & Morehead, M. K. (1993). Curriculum-based evaluation:

Teaching and decision making (2nd ed.). Belmont, CA: Brooks

/ Cole.

Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauma

n, K. E., & Richman, G. S. (1994). Toward a functional ana

lysis of self-injury. Journal of Applied Behavior Analysis,

27, 215-240. (Reprinted from Analysis and Intervention in

Developmental Disabilities, 2, 1-20, 1982).

Iwata, B. A.,

Pace, G. M., Dorsey, M. F., Zarcone, J. R., Vollmer, T. R.

, Smith, R. G., Rodgers, T. A., Lerman, D.C., Shore, B. A.

, Mazaleski, J. L., Goh, H., Cowdery, G. E., Kalsher, M. J

., McCosh, K. C., & Willis, K. D. (1994). The functions of

self-injurious behavior: An experimental-epidemiological an

alysis. Journal of Applied Behavior Analysis, 27, 215-240.

Kern, L., Childs, K. E., Dunlap, G., Clarke, S., & Falk, G

. D. (1994). Using assessment-based curricular intervention

to improve the classroom behavior of a student with emotion

al and behavioral challenges. Journal of Applied Behavior A

nalysis, 27, 7-19.

LaBerge, D., & Samuels, S. J. (1974). T

oward a theory of automatic information processing in readi

ng. Cognitive Psychology, 6, 293-323.

Lentz, E E.(1988) Ef

fective reading interventions in the regular classroom. In

J. L. Graden, J. Zins, & M. J. Curtis (Eds.), Alternative educational delivery systems: Enhancing instructional options for all students (pp. 351-370). Washington, DC: National Association of School Psychologists.

Lovitt, T., Eaton, M., Kirkwood, M., & Pelander, J. (1971). Effects of various reinforcement contingencies on oral reading rate. In E. Ramp & B. Hopkins (Eds.), A new direction for education: Behavior analysis (pp. 54-71). Lawrence, KS: University of Kansas.

Mace, F. C. (1994). The significance and future of functional analysis methodologies. Journal of Applied Behavior Analysis, 27, 385-392.

Martens, B. K., Witt, J. C., Daly, E. J., III, & Vollmer, T. (in press). Behavior analysis: Theory and practice in educational settings. In C. R. Reynolds & T. B. Gutkin (Eds.), Handbook of school psychology (3rd ed.). New York: Wiley.

McComas, J. J., Wacker, D. P., Cooper, S. J., Asmus, J. M., Richman, D., & Stoner, B. (1996). Brief experimental analysis of stimulus prompts for accurate responding on academic tasks in an outpatient clinic. Journal of Applied Behavior Analysis, 29, 397-401.

Narayan, J. S., Heward, W. L., Gardner, R., III, Courson, F. H., & Omness, C. K. (1990). Using response cards to increase student participation in an elementary classroom. Journal of Applied Behavior Analysis, 23, 483-490.

Nelson, R. O., & Hayes, S. C. (1986). The nature of behavioral assessment. In R. O. Nelson & S. C. Hayes (Eds.), Conceptual foundations of behavioral assessment (pp. 3-41). New York: Guilford.

Northup, J., Broussard, C., Jones, K., George, T., Vollmer, T. R., & Herring, M. (1995). The differential effects of teacher and peer attention on the disruptive classroom behavior of three children with a diagnosis of attention deficit hyperactivity disorder. Journal of Applied Behavior Analysis, 28, 227-228.

Rosenshine, B. V. (1980). How time is spent in elementary classrooms. In C. Denham & A. Lieberman (Eds.), Time to learn (pp. 107-126). Washington, DC: U.S. Department of Education.

Rosenshine, B., & Berliner, D. C. (1978). Academic engaged time. British Journal of Teacher Education, 4, 3-16.

Seybert, S., Dunlap, G., & Ferro, J. (1996). The effects of choice making on the problem behaviors of high school students with intellectual disabilities. Journal of Behavioral Education, 6, 49-65.

Shapiro, E. S. (1996). Academic skills problems: Direct assessment and intervention (2nd ed.). New York: Guilford.

Shapiro, E. S., & Bradley, K. L. (1995). Treatment of academic problems. In M. A. Reinecke, F. M. Dattilio, & A. Freeman (Eds.), Cognitive therapy with children and adolescents (pp. 344-366). New York: Guilford.

Shinn, M. R. (1989). Curriculum-based measurement: Assessing special children. New York: Guilford.

Staats, A. W., & Butterfield, W. A. (1964). Treatment of nonreading in a culturally deprived juvenile delinquent: An application of reinforcement principles. Child Development, 36, 925-942.

Staats, A. W., Minke, K. A., Finley, J. R., Wolf, M. M., & Brooks, L. O. (1964). A reinforcer system and experimental procedure for the laboratory study of reading acquisition. Child Development, 35, 209-231.

Stallings, J. (1975). Implementation and child effects of teaching practices in follow through classrooms. Monographs of the Society for Research in Child Development, 40(7-8, Serial No. 163).

Sterling, R., Barbetta, P. M., Heron, T. E., & Heward, W. L. (1997). A comparison of active student response and on-task instruction on the acquisition and maintenance of health facts by students with learning disabilities. Journal of Behavioral Education, 7, 151-166.

Vargas, J. S. (1984). What are your exercises teaching? An analysis of stimulus control in instructional materials. In W. L. Heward, T. E. Heron, D. S. Hill, & J. Trap-Porter (Eds.), Focus on behavior analysis in education (pp. 126-141). Columbus, OH: Merrill.

Vollmer, T. R., Marcus, B. A., Ringdahl, J. E., & Roane, J. S. (1995). Progressing from brief assessments to extended experimental analyses in the evaluation of aberrant behavior. Journal of Applied Behavior Analysis, 28, 561-576.

Walberg, H. J. (1992). The knowledge base for educational productivity. International Journal of Educational Reform, 1, 1-10.

Wolery, M., Bailey, D. B., Jr., & Sugai, G. M. (1988). Effective teaching: Principles and procedures of applied behavior analysis with exceptional children. Boston: Allyn & Bacon.

Ysseldyke, J., & Christenson, S. (1993). The Instructional Environment System-II. Longmont, CO: Sopris West.

Curricular Activities

Broughton, S. F., & Lahey, B. B. (1978). Direct and collateral effects of positive reinforcement, response cost, and mixed contingencies for academic performance. Journal of School Psychology, 16, 126-136.

Dunlap, G., DePerczel, M., Clarke, S., Wilson, D., Wright, S., White, R., & Gomez, A. (1994). Choice making to promote adaptive behavior for students with emotional and behavioral challenges. Journal of Applied Behavior Analysis, 27, 505-518.

Dunlap, G., & Kern, L. (1993). Assessment and intervention for children within the instructional curriculum. In J. Reichle & D. P. Wacker (Eds.), Communicative approaches to the management of challenging behaviors (pp. 177-203). Baltimore: Paul Brookes.

Kern, L., Childs, K. E., Dunlap, G., Clarke, S., & Falk, G. D. (1994). Using assessment-based curricular intervention to improve the classroom behavior of a student with emotional and behavioral challenges. Journal of Applied Behavior Analysis, 27, 7-19.

Increasing Active Student Responding

Greenwood, C. R., Delquadri, J., & Hall, R. V. (1984). Opportunity to respond and student academic performance. In W. L. Heward, T. E. Heron, J. Trap-Porter, & D. S. Hill (Eds.), Focus on behavior analysis in education (pp. 58-88). Columbus, OH: Merrill.

Greenwood, C. R., Hart, B., Walker, D., & Risley, T. (1994). The opportunity to respond and academic performance revisited: A behavioral theory of developmental retardation and its prevention. In R. Gardner III, D. Sainato, J. Cooper, T. Heron, W. Heward, J. Eshleman, & T. Grossi (Eds.), Behavior analysis in education: Focus on measurably superior instruction (pp. 213-224). Pacific Grove, CA: Brooks/Cole.

Heward, W. L. (1994). Three "low-tech" strategies for increasing the frequency of active student response during group instruction. In R. Gardner III, D. Sainato, J. Cooper, T. Heron, W. Heward, J. Eshleman, & T. Grossi (Eds.), Behavior analysis in education: Focus on measurably superior instruction (pp. 283-320). Pacific Grove, CA: Brooks/Cole.

Increasing Rate of Complete Learning Trials

Albers, A. E., & Greer, R. D. (1991). Is the three-term contingency trial a predictor of effective instruction? Journal of Behavioral Education, 1, 337-354.

Belfiore, P. J., Skinner, C. H., & Ferkis, M. (1995). Effects of response and trial repetition on sight word training for students with learning disabilities. Journal of Applied Behavior Analysis, 28, 347-348.

Fantuzzo, J. W., King, J. A., & Heller, L. R. (1992). Effects of reciprocal peer tutoring on mathematics and school adjustment: A component analysis. Journal of Educational Psychology, 84, 331-339.

Fantuzzo, J. W., Polite, K., & Grayson, N. (1990). An evaluation of reciprocal peer tutoring across elementary school settings. Journal of School Psychology, 28, 309-323.

Greenwood, C. R. (1991). A longitudinal analysis of time, engagement, and achievement in at-risk versus non-risk students. Exceptional Children, 57, 521-535.

Heward, W. L., Courson, F. H., & Narayan, J. S. (1989). Using choral responding to increase active student response during group instruction. Teaching Exceptional Children, 21, 72-75.

Narayan, J. S., Heward, W. L., Gardner, R., III, Courson, F. H., & Omness, C. K. (1990). Using response cards to increase student participation in an elementary classroom. Journal of Applied Behavior Analysis, 23, 483-490.

Skinner, C. H., Bamberg, J. W., Smith, E. S., & Powell, S. S. (1993). Cognitive cover, copy, and compare: Subvocal responding to increase rates of accurate division responding. RASE: Remedial and Special Education, 14, 49-56.

Sterling, R., Barbetta, P. M., Heron, T. E., & Heward, W. L. (1997). A comparison of active student response and on-task instruction on the acquisition and maintenance of health facts by students with learning disabilities. Journal of Behavioral Education, 7, 151-166.

Van Houten, R., & Rolider, A. (1989). An analysis of several variables influencing flashcard instruction. Journal of Applied Behavior Analysis, 22, 111-118.

Increasing Modeling and Error Correction

Barbetta, P. M., Heron, T. E., & Heward, W. L. (1993). Effects of active student response during error correction on the acquisition, maintenance, and generalization of sight words by students with developmental handicaps. Journal of Applied Behavior Analysis, 26, 111-119.

Barbetta, P. M., & Heward, W. L. (1993). Effects of active student response during error correction on the acquisition and maintenance of geography facts by elementary students with learning disabilities. Journal of Behavioral Education, 3, 217-233.

Daly, E. J., III, & Martens, B. K. (1994). A comparison of three interventions for increasing oral reading performance: Application of the instructional hierarchy. Journal of Applied Behavior Analysis, 27, 459-469.

Drevno, G. E., Kimball, J. W., Possi, M. K., Heward, W. L., Gardner, R., III, & Barbetta, P. M. (1994). Effects of active student response during error correction on the acquisition, maintenance, and generalization of science vocabulary by elementary students: A systematic replication. Journal of Applied Behavior Analysis, 27, 179-180.

Hendrickson, J. M., & Gable, R. A. (1981). The use of modeling tactics to promote academic skill development of exceptional learners. Journal of Special Education Technology, 4, 20-29.

Hendrickson, J. M., Robert, M., & Shores, R. E. (1978). Antecedent and contingent modeling to teach basic sight vocabulary to learning disabled children. Journal of Learning Disabilities, 11, 69-73.

O'Shea, L. J., Munson, S. M., & O'Shea, D. J. (1984). Error correction in oral reading: Evaluating the effectiveness of three procedures. Education and Treatment of Children, 7, 203-214.

Increasing Practice/Drill and/or Incentives

Herman, P. (1985). The effect of repeated reading on rate, speech pause, and word recognition accuracy. Reading Research Quarterly, 20, 553-565.

Rashotte, C. A., & Torgesen, J. K. (1985). Repeated reading and reading fluency in learning disabled children. Reading Research Quarterly, 20, 180-188.

Rasinski, T. V. (1990). Effects of repeated reading and listening while reading on reading fluency. Journal of Educational Research, 83, 147-150.

Skinner, C. H., & Shapiro, E. S. (1989). A comparison of taped-words and drill interventions on reading fluency in adolescents with behavior disorders. Education and Treatment of Children, 12, 123-133.

Lovitt, T., Eaton, M., Kirkwood, M., & Pelander, J. (1971). Effects of various reinforcement contingencies on oral reading rate. In E. Ramp & B. Hopkins (Eds.), A new direction for education: Behavior analysis (pp. 54-71). Lawrence, KS: University of Kansas.

Instructing the Student to Generalize Use of the Skill

Fantuzzo, J. W., & Rohrbeck, C. A. (1992). Self-managed groups: Fitting self-management approaches into classroom systems. School Psychology Review, 21, 255-263.

Lenz, B. K. (1992). Self-managed learning strategy systems for children and youth. School Psychology Review, 21, 211-228.

Howell, K. W., Fox, S. L., & Morehead, M. K. (1993). Curriculum-based evaluation: Teaching and decision making. Pacific Grove, CA: Brooks/Cole.

Hughes, C. (1994). Teaching generalized skills to persons with disabilities. In R. Gardner III, D. Sainato, J. Cooper, T. Heron, W. Heward, J. Eshleman, & T. Grossi (Eds.), Behavior analysis in education: Focus on measurably superior instruction (pp. 335-348). Pacific Grove, CA: Brooks/Cole.

Skinner, C. H., & Smith, E. S. (1992). Issues surrounding the use of self-management interventions for increasing academic performance. School Psychology Review, 21, 202-210.

Stokes, T., & Baer, D. (1977). An implicit technology of generalization. Journal of Applied Behavior Analysis, 10, 349-367.

Stokes, T. F., & Osnes, P. G. (1988). The developing applied technology of generalization and maintenance. In R. H. Horner, G. Dunlap, & R. L. Koegel (Eds.), Generalization and maintenance (pp. 5-19). Baltimore: Paul Brookes.

Changing Instructional Materials to Match the Curricular Objective

Vargas, J. S. (1984). What are your exercises teaching? An analysis of stimulus control in instructional materials. In W. Heward, T. Heron, D. Hill, & J. Trap-Porter (Eds.), Focus on behavior analysis in education (pp. 126-141). Columbus, OH: Merrill.

Increasing Student Responding Using Better Matched Instructional Levels

Daly, E. J., III, Martens, B. K., Kilmer, A., & Massie, D. (1996). The effects of instructional match and content overlap on generalized reading performance. Journal of Applied Behavior Analysis, 29, 507-518.

Gickling, E. E., & Armstrong, D. L. (1978). Levels of instructional difficulty as related to on-task behavior, task completion, and comprehension. Journal of Learning Disabilities, 11, 559-566.

REASONABLE HYPOTHESIS: The student does not want to respond to the instructional demands.

Performance Deficit Protocol for Classroom Assignments

1. Obtain three previously completed assignments from the teacher. Each assignment should be one that the child has failed or on which the child has performed markedly below expectations.

2. Present the first assignment (with previous answers removed) to the child and say, "If you are able to get X number of items (80%) correct, then we can (get a coke, play a game, go outside for 10 minutes, etc.)." If the child increases his or her score by 25% or obtains a score of 80% or above, then move to the next step.

3. Present the child with a list of teacher-approved reinforcers and have the child indicate which one he or she would like to work for in the future.

4. Test reinforcer efficacy. Present the previously failed assignment to the child and say, "If you are able to get X number (80%) correct, then you can select one of these (show list of rewards). If you get X or fewer correct, then you will not get to select anything."

5. Evaluate outcomes. If the child markedly increases performance when offered incentives, suspect a performance deficit. If performance remains constant under reinforced and non-reinforced trials, then suspect a skill deficit. (This decision rule is sketched in the example following the protocol.)
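Because step 5 reduces to comparing two percentages, the decision rule can be stated compactly. The following is a minimal sketch, assuming Python; the function and variable names are illustrative (the protocol prescribes no code), and the 25% increase is read here as a relative gain over the baseline score rather than as percentage points.

```python
# Minimal sketch of the step 5 decision rule. Scores are percent
# correct on the same assignment without and with incentives offered.
# Names are illustrative; the 25% gain is read as a relative increase.

def classify_deficit(baseline_pct: float, incentive_pct: float) -> str:
    """Apply the 25%-gain / 80%-criterion rule from steps 2 and 5."""
    improved_markedly = (
        incentive_pct >= baseline_pct * 1.25  # 25% increase over baseline
        or incentive_pct >= 80.0              # or meets the 80% criterion
    )
    return "suspect a performance deficit" if improved_markedly else "suspect a skill deficit"

# Example: 48% correct without incentives vs. 85% with incentives.
print(classify_deficit(48.0, 85.0))  # -> suspect a performance deficit
```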

Performance Deficit Protocol for Oral Reading Fluency

1. Obtain those reinforcers that the student identifies as being his or her first, second, and third choices of things to work for.

2. Identify the reading passage to be instructed for the day.

3. Compute the passage's criterion rate, that is, the time, in seconds, within which the student must read the passage in order to receive his or her top choice of reinforcers, using instructional placement guidelines for mastery from Shapiro (1996). The criterion is simply the number of words in the passage for first- and second-grade readers (equivalent to a rate of 60 correctly read words per minute) and the number of words in the passage divided by 1.67 for third- through sixth-grade readers (equivalent to a rate of 100 correctly read words per minute). (This arithmetic is sketched in the example following the protocol.)

4. Explain to the student that he or she will (a) receive his or her first choice if he or she reads faster than the criterion and misses fewer than 3 words, (b) receive his or her second choice if he or she reads at the criterion and misses fewer than 3 words, and (c) receive his or her third choice if he or she reads almost at the criterion and misses fewer than 3 words.

5. Have the student read the passage aloud, timing how long it takes him or her to read the passage and marking errors on your copy.

6. Offer the student the reinforcer corresponding to the conditions described in step 4.
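The arithmetic in step 3 is easy to misread, so a minimal sketch follows, assuming Python. The function name and example passage are illustrative; the 60 and 100 words-per-minute mastery rates are those the protocol attributes to Shapiro (1996).

```python
# Minimal sketch of the step 3 criterion computation. The criterion is
# expressed as a time in seconds: reading an n-word passage in n
# seconds is 60 words per minute; in n / 1.67 seconds, roughly 100.

def criterion_seconds(n_words: int, grade: int) -> float:
    """Seconds allowed for the passage at the grade's mastery rate."""
    if grade <= 2:
        return float(n_words)  # n words in n seconds = 60 wpm
    return n_words / 1.67      # n words in n/1.67 seconds = ~100 wpm

# Example: a 150-word passage read by a fourth grader.
print(round(criterion_seconds(150, 4), 1))  # -> 89.8 seconds
```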

REASONABLE HYPOTHESIS: The student displays poor fluency in target skill(s).

Repeated Readings

1. Identify the reading passage to be instructed for the day.

2. Tell the student that you will time him or her while reading.

3. Have the student begin reading the passage and start the stopwatch.

4. If the student misses a word or hesitates for 3 seconds, say the word and have the student repeat it.

5. At the end of the passage, tell the student the time it took to read the passage.

6. Repeat steps 3 through 5 three more times. (A sketch for logging the timed readings follows the protocol.)
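Nothing in the protocol requires record keeping beyond a stopwatch, but logging each timed reading makes the step 5 feedback concrete. A minimal sketch, assuming Python; the function name, example times, and words-per-minute conversion are illustrative additions, not part of the protocol.

```python
# Illustrative logger for steps 3-6: report each reading's time and
# the equivalent rate so the student can see the trend across trials.

def log_repeated_readings(times_sec: list[float], n_words: int) -> None:
    for trial, t in enumerate(times_sec, start=1):
        wpm = n_words / (t / 60)  # words read per minute for this trial
        print(f"Reading {trial}: {t:.0f} s ({wpm:.0f} words per minute)")

# Example: four readings of a 120-word passage, each faster than the last.
log_repeated_readings([95.0, 82.0, 74.0, 68.0], n_words=120)
```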

REASONABLE HYPOTHESIS: The student displays poor accuracy in target skills.

Listening Passage Preview and Phrase Drill Error Correction

1. Identify the reading passage to be instructed for the day.

2. Read the passage aloud while having the student follow along on his or her copy.

3. Have the student read the passage aloud while you underline errors on the examiner copy and supply the words for mispronunciations, omissions, and hesitations of more than 3 seconds.

4. Show the student each underlined error word in the passage.

5. Say the word and have the student read each phrase containing the error word three times.

REASONABLE HYPOTHESIS: The student does not generalize use of the skill to the natural setting and/or to other materials/settings.

Generalization Training for Phonics Objectives

1. Obtain passages containing high percentages of target phonics skills.

2. Create word lists using the phonetically regular words.

3. Explain the phonics rule to the student and read the words on the word list to the student.

4. Have the student read the word list and correct errors.

5. Read the passage to the student.

6. Have the student read the passage and correct errors.

REASONABLE HYPOTHESIS: The student's skill level is poorly matched to the difficulty of the instructional materials.

Instructional Match

1. Identify the curriculum for reading instruction.

2. Obtain passages from books below, at, and above the student's current reading level based on teacher estimates.

3. Obtain local norms (e.g., building-level CBM norms; Shinn, 1989) or refer to conventional placement standards (e.g., Shapiro, 1996).

4. Have the student read each passage for 1 minute while you record the number of correctly read words per minute and the errors.

5. Determine the student's instructional level either by comparing to average fluency rates for students at different grade levels (when using local norms) or by comparing to instructional placement standards. (This comparison is sketched in the example following the protocol.)
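Step 5's comparison amounts to classifying a 1-minute fluency score against a placement band. A minimal sketch, assuming Python; the band boundaries below are placeholders to be replaced with values from local CBM norms (Shinn, 1989) or published placement standards (e.g., Shapiro, 1996), and the function name and labels are illustrative.

```python
# Illustrative step 5 classifier. The low/high boundaries are
# placeholders -- supply values from local norms or placement standards.

def placement(wcpm: float, low: float, high: float) -> str:
    """Classify words correct per minute against a placement band."""
    if wcpm < low:
        return "frustrational (material too difficult)"
    if wcpm <= high:
        return "instructional"
    return "mastery (material too easy)"

# Example: 62 correctly read words per minute against a 70-100 band.
print(placement(62, low=70, high=100))  # -> frustrational (material too difficult)
```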

By Edward J. Daly III, University of Cincinnati; Joseph C. Witt, Louisiana State University; Brian K. Martens, Syracuse University; and Eric J. Dool, University of Cincinnati.

Address all correspondence concerning this article to Edward J. Daly III, School Psychology Program, College of Education, P.O. Box 210002, University of Cincinnati, Cincinnati, OH 45221-0002.

Edward J. Daly III, PhD, is Assistant Professor of School Psychology at the University of Cincinnati, Cincinnati, OH. His research interests include intervention design for academic performance problems.

Joseph C. Witt, PhD, has research interests in the area of helping teachers and other professionals to be more effective in assessing and intervening with learning and behavior problems. He is currently editor of School Psychology Quarterly.

Brian K. Martens, PhD, is an Associate Professor of Psychology at Syracuse University. His research interests include applied behavior analysis, instructional intervention, and school consultation.

Eric J. Dool, MEd, is a doctoral student in the School Psychology program at the University of Cincinnati, Cincinnati, OH. He is currently completing his internship in the Cincinnati Public Schools.

Copyright of School Psychology Review is the property of National Association of School Psychologists, and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use.
