
DOCUMENT RESUME

ED 481 629    SO 035 370

TITLE            Evidence-Based Research in Education.
INSTITUTION      Southwest Educational Development Lab., Austin, TX. National Center for the Dissemination of Disability Research.
SPONS AGENCY     National Inst. on Disability and Rehabilitation Research (ED/OSERS), Washington, DC.
PUB DATE         2003-00-00
NOTE             17p.
CONTRACT         H133A990008-A
AVAILABLE FROM   National Center for the Dissemination of Disability Research, Southwest Educational Development Laboratory, 211 East Seventh Street, Suite 400, Austin, TX 78701-3253. Tel: 800-266-1832 (Toll Free); Fax: 512-476-2286; e-mail: [email protected]; Web site: http://www.ncddr.org/.
PUB TYPE         Collected Works - Serials (022); Reports - Descriptive (141)
JOURNAL CIT      Research Exchange; v8 n2 2003
EDRS PRICE       EDRS Price MF01/PC01 Plus Postage.
DESCRIPTORS      Academic Achievement; *Disabilities; *Educational Research; Evaluation; Standards
IDENTIFIERS      Department of Education; *Research in Education; What Works

ABSTRACT
This educational newsletter highlights a lead article, "Evidence-Based Research in Education." The article explains that evidence-based research emerged in the field of medicine over 50 years ago, resulting in major advances in the treatment and prevention of disease. It adds that clinical guidelines and protocols are based on the results of controlled experiments following rigorous standards of science. The U.S. Department of Education is embracing evidence-based research to improve the effectiveness of educational interventions and, in turn, academic achievement. The article discusses the No Child Left Behind Act and the Coalition for Evidence-Based Policy. It recommends strategies for the Department of Education to bring evidence-driven progress to education. Other information materials in the newsletter are: "The What Works Clearinghouse"; "NIDRR Grantees Review WWC Draft Standards"; and "Resolution of the AAMR on Evidence-Based Research and Intellectual Disability." (BT)

Reproductions supplied by EDRS are the best that can be made from the original document.



U.S. DEPARTMENT OF EDUCATION
Office of Educational Research and Improvement
EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)

This document has been reproduced as received from the person or organization originating it.
Minor changes have been made to improve reproduction quality.
Points of view or opinions stated in this document do not necessarily represent official OERI position or policy.

At a Glance

EVIDENCE-BASED RESEARCH IN EDUCATION ... 1

A WORD FROM THE DIRECTOR
Evaluating the Quality of Research Findings ... 2

THE WHAT WORKS CLEARINGHOUSE ... 4
The Campbell Collaboration ... 5

NIDRR GRANTEES REVIEW WWC DRAFT STANDARDS ... 6
Comments on Study DIAD and CREAD ... 6
Comments from Researchers on Application of the WWC Standards to NIDRR Research ... 11
Response from the WWC to NIDRR Researchers' Comments ... 13

RESOLUTION OF THE AAMR ON EVIDENCE-BASED RESEARCH AND INTELLECTUAL DISABILITY ... 12

ONLINE RESOURCES ... 13

WHO'S IN THE NEWS ... 14

NIDRR GRANTEE AND STAFF RECOGNITION ... 15

This newsletter is available in alternate formats upon request.

SOUTHWEST EDUCATIONAL DEVELOPMENT LABORATORY
Building Knowledge to Support Learning

A project funded by the National Institute on Disability and Rehabilitation Research

Volume 8, Number 2, 2003

Evidence-based Research in Education

Evidence-based research emerged in the field of medicine over 50 years ago, and has resulted in major advances in the treatment and prevention of disease. Clinical guidelines and protocols are based on the results of controlled experiments following rigorous standards of science. Improvements in public health and reduced death rates have occurred not by chance but due to the requirements of the Food and Drug Administration (FDA) and National Institutes of Health (NIH) to implement effective, evidence-based research (Coalition for Evidence-Based Policy, 2002).

The Cochrane Collaboration (http://www.cochrane.org/) is an international organization "that aims to help people make well informed decisions about health care by preparing, maintaining and ensuring the accessibility of systematic reviews of the effects of health care interventions" (Cochrane Collaboration, 2003). The first Cochrane Center was established in 1992 to fulfill the legacy of Archie Cochrane, a British epidemiologist, who suggested the need for a systematic review of randomized controlled trials in order to understand health care and what was most effective. The Cochrane Collaboration includes groups of researchers (and others) in many countries that focus on reviewing studies in different health care areas and developing reliable reports to be archived in accessible databases.

The U.S. Department of Education (ED) is embracing evidence-based research in order to improve the effectiveness of educational interventions and, in turn, academic achievement. The "No Child Left Behind" Act


Evidence-based Research in Education, continued on page 3



Evaluating the Quality of Research Findings

Quality is an elusive characteristic that often derives as much from the "eye of the beholder" as it does from an objective application of a measurement standard. Additionally, quality often becomes defined in relation to what has happened, what has been the case, or what has been produced in the past. This is why statements associated with establishing criteria to distinguish quality from non-quality frequently change over time due to the relative and changing nature of our perception of what quality is.

It is also the case that quality of information derived from research faces acceptance or rejection by others based on its perceived credibility. In the world of research, this is referred to as face validity. The perception of "high quality research results" can be influenced by a variety of factors ranging from the area selected for research and why, how the research was designed, diversity of the sample, data collection methods, partners in the research activity, data analysis methods, and statement of implications of the findings, just to name a few of the factors involved. In the real world, those learning of research findings usually are most concerned with: (a) Does it relate to me and a need I have today? and (b) Can I access it, use it, and benefit from it?

NIDRR's Rehabilitation Research and Training Centers (NCDDR & NARRTC, 2003) used a consumer-oriented category system in summarizing their accomplishments. This system was based around quality of life questions a person with a disability often raises (Corrigan, 1994). These include:

- How long will I live?
- What will I live on?
- Where will I live?
- What will I do?
- Whom will I love?
- How much choice will I have?

Most consumers, the ultimate beneficiaries of disability research efforts, would likely not find research to be of "high quality" unless it addressed at least one of these major quality of life questions.

Today, the quality of research has been implied in a variety of national policies under a different name: scientifically based (or evidence-based) research. The National Research Council (2002) has suggested several principles to guide us in distinguishing scientifically based research from other forms of research. These principles describe scientifically based research as being:

- driven by significant questions;
- empirical in nature;
- theory based;
- designed around a sound linkage between the research question and the research method;
- based on clear inferential reasoning;
- capable of being replicated, producing similar results;
- generalizable to significant audiences; and
- available for professional scrutiny.

In linking quality considerations with information dissemination, a growing demand exists for information that has been determined to have reached an acceptable level of quality prior to making it available for utilization by others. This issue of The Research Exchange discusses one of these efforts being conducted by the What Works Clearinghouse. Its draft criteria to judge the quality of research results were shared with a group of NIDRR-sponsored researchers for their reactions and comments. These are included in this issue along with news about special recognitions and media featuring NIDRR grantees.

John D. Westbrook, Ph.D.
Director

National Research Council. (2002). Scientific research in education. Washington, DC: National Academies Press.

NCDDR & NARRTC. (2003). RRTC highlights of accomplishments. Austin, TX: SEDL. Available: http://www.ncddr.org/du/products/rrtc-hilights/index.html


Evidence-based Research in Education, continued from page 1

(NCLB), signed into law by President Bush in January 2002, emphasizes accountability and the use of scientifically based research.

The Coalition for Evidence-Based Policy has worked with ED since 2001 to help define and forge a policy that supports evidence-based research. It sponsored a forum on November 18, 2002, "Rigorous Evidence: The Key to Progress in Education," that involved staff from ED, the White House, and the Office of Management and Budget, as well as representatives from education committees in the Congress, educators, advocates, and researchers.

Bringing Evidence-Driven Progress To Education: A Recommended Strategy for the U.S. Department of Education is a collaborative report by the Coalition that was released at the forum. The Executive Summary of the report offers the following suggestions:

1. A major, Department-wide effort should be launched to:
   (i) Build the knowledge base of educational interventions proven effective through randomized controlled trials, not just in small demonstration projects but in large-scale replications; and
   (ii) Provide strong incentives for the widespread use of such proven interventions by recipients of federal education funds.

2. In order to build the knowledge base of effective, replicable interventions in High Priority Areas:
   The Department should focus its discretionary funds for research and evaluation on randomized trials to identify such interventions.
   The Department's grant programs should give applicants major incentives to use their discretionary funds to carry out such randomized trials.

3. In order to provide strong incentives for the widespread use of proven interventions, the Department's grant programs should require applicants to provide a plan for widespread implementation of research-proven interventions, with quantifiable goals.

Coalition for Evidence-Based Policy. (2002). Bringing Evidence-Driven Progress To Education: A Recommended Strategy for the U.S. Department of Education (Executive Summary). Washington, DC: Author. Available: http://www.excelgov.org/usermedia/images/uploads/PDFs/CoalitionExSum.pdf

The Education Sciences Reform Act (ESRA) of 2002 (P.L. 107-279) was signed on November 5, 2002. The purpose of the ESRA is "to provide for improvement of Federal education research, statistics, evaluation, information, and dissemination..." ESRA established the Institute of Education Sciences (IES) within ED, replacing the Office of Educational Research and Improvement (OERI).

The IES reflects the intent of the President and Congress to advance the field of education through support of rigorous evidence-based educational research. Dr. Grover J. (Russ) Whitehurst was appointed as the first Director of the IES (http://www.ed.gov/about/offices/list/ies/index.html). Whitehurst describes the current state of education as 90 percent professional wisdom based on experience and 10 percent empirical evidence based on research. He wants to swap the percentages, using evidence-based or scientifically based research to guide educational improvement (Whitehurst, 2002. http://www.ed.gov/nclb/methods/whatworks/eb/edlite-slide021.html).

The Individuals with Disabilities Education Act (IDEA) is due for reauthorization in 2003. Following the lead of the NCLB, Secretary of Education Rod Paige has emphasized the need for increased accountability and "doing what works" to improve educational outcomes for students with disabilities.

"IDEA should ensure that schools, local education agencies, state education agencies and the Federal Department of Education quickly adopt research and evidence-based practices. OSERS research and training activities should be aligned with the work of the Department's Institute of Education Sciences. Additionally, information should be provided to families and teachers on effective programs based on rigorous research, including requiring the federally funded parent training centers to educate parents about effective research that improves results for students with disabilities. IDEA should also reflect the research principles outlined by the President's Commission on Excellence in Special Education while adhering to the standards for high quality research established by the Education Sciences Reform Act of 2002" (http://www.ed.gov/news/pressreleases/2003/02/02252003.html).

The reauthorization of the Rehabilitation Act of 1973, as amended, is scheduled to occur prior to October 1, 2003. As this date draws near, it is likely that the changes seen in previous legislation will be discussed as part of the reauthorization process.

Many NIDRR grantees are located in medical centers that are involved with evidence-based clinical research. However, many NIDRR grantees may not be conducting experimental research through randomized trials.

Research involving low-incidence populations of students and adults with disabilities who have unique conditions may not be feasible with designs that rely on randomized trials. Groups such as the American Association for Mental Retardation have issued position statements describing the potential exclusion of people with developmental disabilities from "approved" evidence-based research (see related article, "Resolution of the AAMR on Evidence-Based Research and Intellectual Disability," on page 12).


The What Works Clearinghouse

"The What Works Clearinghouse (WWC), launched in 2002, has specified clear and rigorous methodological standards for demonstrations of program effectiveness in education. ... We expect the Clearinghouse to become the principal source of valid information on effective educational practice" (Whitehurst, 2003).

In his March 13, 2003 statement before the U.S. House Subcommittee on Labor/HHS/Education Appropriations, Assistant Secretary Whitehurst noted that the WWC was the focal point of the nearly $20 million requested for dissemination activities within the IES budget (Whitehurst, 2003. http://www.ed.gov/news/speeches/2003/03/03132003a.html).

The WWC was funded to identify educational interventions and programs that are based on research conducted with rigorous experimental methods. Whitehurst noted that in the No Child Left Behind Act (NCLB), the term scientifically based research is found 111 times. The NCLB defines scientifically based research as "...research that involves the application of rigorous, systematic, and objective procedures to obtain reliable and valid knowledge relevant to education activities and programs" (Whitehurst, 2002. http://www.ed.gov/nclb/methods/whatworks/eb/edlite-slide013.html).

Providing evidence. Improving education.

The WWC has been established by the U.S. Department of Education's Institute of Education Sciences to provide educators, policymakers, and the public with a central, independent, and trusted source of scientific evidence of what works in education. The WWC is administered by ED through a contract to a joint venture of the American Institutes for Research and the Campbell Collaboration, internationally recognized leaders in the fields of education research and rigorous reviews of scientific evidence.

The Cochrane Collaboration serves as a model for the Campbell Collaboration, which focuses on the "social, behavioral, and educational arenas" (Campbell Collaboration, 2003). Established in 2000, the Campbell Collaboration works closely with the Cochrane group (see related article on page 5).

Subcontractors for the WWC include Aspen Systems Corporation and Caliber Associates, Inc. Aspen provides technological, communications, and meeting support, including designing and maintaining the WWC Web site and databases, publishing an electronic news bulletin, and producing outreach products. Caliber Associates is contributing to the first-year education topic reviews for the WWC.

The WWC seeks input from the education community and encourages you to visit the Web site (http://w-w-c.org/) for more information on the WWC and how you can be involved. To receive regular updates on the work of the WWC, you may send your name and email address to [email protected], or subscribe to WWCUpdate at: http://w-w-c.org/list.html

Scope of Work

The WWC scope of work includes support for informed education decision-making by producing WWC Evidence Reports, high-quality reviews of scientific evidence of the effectiveness of educational programs, practices, products, and policies. Through a set of accessible Web-based databases, the WWC will provide decision makers with the information they need to make choices based on high-quality scientific research. The WWC is to provide:

- reviews of potentially replicable interventions (programs, products, and practices) that are intended to enhance student outcomes;

- information about the evaluation studies on which intervention reviews have been based;

- scientifically rigorous reviews of test instruments used to assess educational effectiveness; and

- outcome evaluators (individuals and organizations) willing to conduct evaluations of educational interventions.

The WWC databases are intended to complement those of the current ERIC Clearinghouses. The administration of the ERIC system is being recompeted, and the current Clearinghouses are scheduled to begin phasing out in December 2003.

WWC Standards

The WWC's standards for reviewing studies and interventions for scientific evidence of educational effectiveness will be used to prepare WWC Evidence Reports on identified topic areas.

These documents, the Study Design and Implementation Assessment Device (Study DIAD) and the Cumulative Research Evidence Assessment Device (CREAD), will be used to review and report on the characteristics of individual educational-effectiveness studies, identify the strengths and weaknesses of each study's methodology, and assess the strength of the conclusions that can be drawn from a body of research on specific educational interventions, that is, well-defined and replicable programs, products, practices, and policies. (http://www.w-w-c.org/pr_062003.html)


The draft Study DIAD, used to review individual studies and interventions, was posted on the WWC Web site, and online comments were accepted from November through December 2002. In addition, a forum through which more public input was received was held on November 22, 2002. Feedback, a summary document, and a report on the forum were posted on the WWC's Web site in January 2003. An initial draft of the CREAD, which is designed for use in reviewing a cumulative body of work, was posted in January 2003, and comments were taken.

Versions 1.0 of the Study DIAD and of the CREAD were approved by the WWC's Technical Advisory Group (TAG) in June 2003. The Study DIAD, Version 1.0, was posted on the WWC Web site in July, and Version 1.0 of the CREAD will be posted in August 2003.

First Year Topic Areas

WWC Evidence Report topic areas are chosen to meet the needs of K-12 educators, adult educators, and education decision makers to identify and implement effective approaches to improving student outcomes. Each year, the WWC is to develop an agenda for the education topic areas it will review. Nominations for future topic areas are continually accepted via the WWC Web site. Nominations should meet the definitions and criteria outlined at http://www.w-w-c.org/topicnom.html

The topic areas identified by the WWC for first year review are:

- Interventions for Beginning Reading

- Curriculum-Based Interventions for Increasing K-12 Math Achievement

- Preventing High School Dropout

- Programs for Increasing Adult Literacy

- Peer-Assisted Learning in Elementary Schools: Reading, Mathematics, and Science Gains

- Interventions To Reduce Delinquent, Disorderly, and Violent Behavior in Middle and High Schools

- Interventions for Elementary School English Language Learners: Increasing English Language Acquisition and Academic Achievement.

The Campbell Collaboration

The Campbell Collaboration is an international organization that aims to prepare, maintain, and disseminate high-quality, systematic reviews of studies of effectiveness of social and educational policies and practices. By supporting the production of these reviews and by disseminating results in an accessible fashion, the Campbell Collaboration intends to contribute to decisions in practice, policy and public understanding.[1]

The Campbell Collaboration collaborates with its sibling organization, the Cochrane Collaboration, which prepares and maintains systematic reviews of the effects of interventions in health care. Established in 2000, the Campbell Collaboration is named after an American psychologist and thinker, Donald Campbell, who drew attention to the need for societies to assess more rigorously the effects of their social and educational experiments, that is, the policies and practices that they introduce and promote.[2]

The Campbell Education Group aspires to help teachers, parents, students, school administrators, and education policy makers make well-informed decisions about teaching and learning by putting the best available evidence from systematic reviews of educational research at the heart of educational policies and practice.[3]

The Campbell Collaboration-American Institutes for Research Joint Venture was established specifically to develop and maintain the What Works Clearinghouse, and brings together nationally recognized leaders in the field of rigorous reviews of scientific evidence. The chair of the Steering Group of the Campbell Collaboration, Robert Boruch, serves as Principal Investigator for the WWC.[4]

Sources:
1. Campbell Collaboration Methods Group Guiding Principles: http://www.missouri.edu/~c2method/
2. The Campbell Collaboration, A Brief Introduction: http://www.aic.gov.au/campbellcyintro.html
3. Campbell Collaboration Coordinating Groups: http://www.campbellcollaboration.org/cccgroup.html
4. U.S. Department of Education Awards Contract for 'What Works Clearinghouse': http://www.ed.gov/news/pressreleases/2002/08/08072002a.html


The WWC Web site information does not specifically mention, nor at this time include, specific interventions for students with disabilities. In the future, the WWC hopes to produce Evidence Reports that specifically focus on interventions targeting students with disabilities. At present, whenever possible, if there are high-quality studies that break out findings by subcategories, the WWC will report findings on students with disabilities within all relevant Evidence Reports. It should be noted that the WWC does not conduct field research, and therefore depends on existing research to produce its Evidence Reports.


NIDRR Grantees Review WWC Draft Standards

Although much NIDRR-funded research is focused on areas outside of K-12 education, the NCDDR believes it is important to make all NIDRR grantees aware of the evidence-based and scientifically based emphasis that ED is placing on research and interventions for education. For projects that do focus on K-12, it is critical for their NIDRR-funded research to be included in the WWC databases. The NCDDR has begun an activity to gather information developed by NIDRR grantees that addresses infants, children, and youth with disabilities. This activity is being conducted in order to bring forward NIDRR-sponsored research that may appropriately be included in the WWC research collection.

NCDDR staff asked researchers from Rehabilitation Research and Training Centers (RRTCs) and a sample of other NIDRR grantees to review the draft documents Study DIAD, Version 0.6, and CREAD, Version 0.6, along with other information available from the WWC Web site.

Study DIAD

The Study Design and Implementation Assessment Device (Study DIAD) "is a system for assessing the degree to which the design and implementation of individual evaluations permit conclusions about the causal effects of an intervention" (p. 1, Valentine & Cooper, 2003). The designers of the Study DIAD attempted to create one instrument "that can be used to answer questions at four different levels of specificity. We wanted the most general level to be understandable to an audience of nonresearchers and the most detailed level to be specific enough to satisfy researchers' desire for comprehensiveness and explicitness. The questions are arranged so that answers at one level, the most specific level, feed into a set of design and implementation questions at a second level, a set of Composite Questions at a third level, and then into a fourth level of even more general questions" (pp. 2-3, Valentine & Cooper, 2003).

Valentine, J. C., & Cooper, H. (2003). What Works Clearinghouse Study Design and Implementation Assessment Device (Version 1.0). Washington, DC: U.S. Department of Education.

NIDRR grantees reviewed the draft Version 0.6 (posted 3/11/03): http://w-w-c.org/standards06.doc

Approved in June 2003, Version 1.0 is now available (posted 7/17/03): http://w-w-c.org/DIAD_Final.doc
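As a rough illustration of the four-level roll-up described above, here is a minimal Python sketch. It is an editorial illustration only: the question names, groupings, and aggregation rule are hypothetical and are not taken from the Study DIAD itself.

    # Hypothetical sketch of a four-level question roll-up (illustrative only;
    # not the actual Study DIAD algorithm or item names).
    from statistics import mean

    def judge(score):
        # Map a 0-1 score onto the four-valued answer scale mentioned in this
        # issue (YES / MAYBE YES / MAYBE NO / NO).
        if score >= 0.9:
            return "YES"
        if score >= 0.5:
            return "MAYBE YES"
        if score >= 0.25:
            return "MAYBE NO"
        return "NO"

    # Level 1: most specific questions (1 = yes, 0 = no), grouped under the
    # level-2 design-and-implementation question they feed into.
    level1 = {
        "intervention_described": [1, 1, 1],
        "outcome_measure_relevant": [1, 0, 1],
    }

    # Level 2 answers feed a level-3 Composite Question, which in turn feeds
    # a level-4 general question.
    level2 = {name: mean(answers) for name, answers in level1.items()}
    level3 = mean(level2.values())
    level4 = judge(level3)

    print({name: judge(score) for name, score in level2.items()}, level4)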

CREAD

Companion to the Study DIAD is the Cumulative Research Evidence Assessment Device (CREAD), Version 1.0. The CREAD is a device that is meant to be "an expression of the confidence with which a conclusion can be drawn about the existence of causal effects of an intervention based on an entire body of accumulated evidence." ... "The CREAD will assess confidence in causal inferences based upon the (a) number, (b) validity characteristics, (c) variations, and (d) consistency in results of studies included in the evidence base" (p. 1, Cooper & Valentine, 2003).

The CREAD and the Study DIAD both utilize Cook & Campbell's "threats to validity" framework. The CREAD uses three dimensions that are unique to evaluating the certainty of inferences permitted by a set of studies, or an evidence base: depth, breadth, and consistency (p. 2, Cooper & Valentine, 2003).

Cooper, H., & Valentine, J. C. (2003). What Works Clearinghouse Cumulative Research Evidence Assessment Device (Version 0.6). Washington, DC: U.S. Department of Education.

NIDRR grantees reviewed the draft Version 0.6 (posted 1/23/03): http://w-w-c.org/creadv06.doc

Approved in June 2003, Version 1.0 will be posted on the WWC Web site in August 2003.

Comments on Study DIAD and CREAD

Selected NIDRR researchers were asked to submit any questions about how the standards and review process might apply to NIDRR-funded research, specifically:

(a) Overarching questions regarding how "quality of research results" will be established through these standards, and

(b) Questions that may be relevant to establishing the extent to which research results that may be included in the WWC's databases would adequately address the needs of individuals with disabilities.

Comments and questions about the Study DIAD and CREAD, versions 0.6, were received from 14 NIDRR grantees, and they were forwarded to staff of the WWC for their comment.

Respondent 1:

- Currently, the Study DIAD only covers papers reporting intervention studies, and within that category only those that were designed as intervention studies. Other designs that often are used to evaluate interventions (any retrospective designs, e.g., case control) are not covered. Certain prospective intervention designs are also not covered (e.g., single subject designs, case series, and other "weak" designs). Many of these approaches to research yield useful data that would either be not counted or discounted by the Study DIAD.

- NIDRR grantees doing intervention research often produce multiple publications, all relevant to the main research question or major research hypothesis. The Study DIAD has no mechanism to handle this. (However, if the publications do not "contradict" one another - different number of cases, etc. - the outcomes could be combined for a single Study DIAD scoring.)

- Based on their research data, NIDRR grantees often also publish papers that address side questions, ad hoc hypotheses, etc., for which the research design that was used to answer the main question may not be optimal. The Study DIAD should not be used to condemn papers addressing "peripheral" issues and ad hoc questions; as pilot/exploratory work they may help the science move forward, and should be published with credit given to NIDRR for facilitating "innovation."

- The Study DIAD does not come up with one total score/judgment, but with four; possible answers include "YES," "NO," "MAYBE YES," and "MAYBE NO." These latter two judgments are not sufficiently grounded to serve as a basis for grant program management decision making. Similarly, there is no mechanism to combine just YES and NO judgments into an overall judgment of a paper, let alone a grant-funded research program.

- While criteria for what is good and what is poor research practice (see p. 5 of the Study DIAD) are pretty clear and agreed upon among researchers and writers of research methodology textbooks, there is less consensus as to what is good or best or acceptable practice in a resource-poor environment: limited funding, overly restrictive human subjects requirements that result in large and selective attrition, and other logistical problems that are outside the investigator's control (especially as uncontrolled/unplanned events unfold after a study has been started, e.g., the introduction of HIPAA). The Study DIAD does not allow a judgment of "adequate given the circumstances."

Bringing forward the science of effective interventions is a good thing; the judgment on the usefulness and effect size of an intervention should be based strictly on the results of a study as implemented. For management of a research program this is a bad thing; judgments on productivity should take into account the fact that resources may be limited, or that after the start of a research study there were changes in the environment of the study that made implementation of the original plan difficult if not impossible.

Possible refinements to Study DIAD:

- Include questions addressing whether the experimental or the control groups may have been exposed to interventions outside the study (co-interventions) that may have enhanced or diminished the effect of the experimental intervention. Similarly missing from the Study DIAD are other aspects of history, maturation, testing, instrumentation, etc., that imperil internal validity.

Following are specific comments about the eight Composite Questions in the Study DIAD. These will be identified as CQ1 (Composite Question #1), etc.

- Construct validity of the intervention, addressed in CQ1, and construct validity of outcome measures, addressed in CQ2, do not seem to reflect traditional issues of construct validity (which refer to the adequacy of the data produced by an instrument in quantifying a construct), but to other issues such as:

  - Was the intervention properly defined, described, and implemented? (issues of treatment fidelity)

  - Was the outcome measure relevant to the intervention? (Only CQ2.2 addresses construct validity proper.)

- It is possible to have a misguided, muddled, undefined, undescribed intervention that is measured (exquisitely, but wrongly) using a psychometrically strong instrument. The same is possible for outcomes: the instrument is a great one, but it has just no relevance to the concept the researcher aims to measure. The correspondence between concept and measurement data is the domain of construct validity; to use the same term for other issues just confuses us even more than we already are with the 213 types of "validity" researchers have invented to date.

- CQ3, dealing with internal validity (selection) for randomized experiments, quasi-experimental, and regression discontinuity designs, does not address adequacy of the randomization scheme and of the randomization concealment. These have been shown to be major problems in clinical medical research.

- CQ5 combines questions on sampling of cases, sampling of outcomes, and sampling of post-intervention time points. The latter two issues might be better placed in CQ2, which deals with outcome measures.

- Issues of negative side effects and the balance of benefits and risks are not included. Are these irrelevant to study quality, and should they be picked up elsewhere by the evidence report team? Or is a measure of poor study quality the fact that side/negative effects are not mentioned?

- CQ6.3 in Composite Question #6 addresses testing across outcomes. This would also better be placed in CQ2, dealing with outcome measures. CQ6.4 asks about testing for time of measurement; it needs clarification by rephrasing. CQ6.5, on testing intervention variations, might be better placed in CQ1 (construct validity of the intervention).

- CQ7 addresses statistical validity (effect size estimation); however, the valid use of statistical tests is not judged. CQ7.3 and CQ7.4 address sample size and outcome measurement error. These roll up into the precision of effect sizes, for example, the confidence interval. Effect sizes can be calculated whatever the sample size and measurement error; whether the estimate of effect size is "sufficiently precise" is fairly irrelevant. If the various studies surviving Study DIAD review are combined, a joint study effect size will be calculated, with its own precision estimate. That is the time to worry about "sufficiently precise."

- CQ8 addresses the adequacy of reporting of statistical tests, NOT whether the tests performed themselves were adequate or appropriate (e.g., that parametric tests were not used for non-parametric data). In this Composite Question, Q8.3a is a relevant question: could effect sizes be estimated for important measured outcomes? If the answer to Q8.3a is YES, the answer to Q8.1 (were sample sizes reported/estimable?) is YES, and the answer to Q8.2 (could directions of effects be identified?) is YES. If the answer to Q8.3a is NO, questions Q8.1, Q8.2, and CQ8.3b would be irrelevant. Either you can calculate an effect size (using standard procedures or advanced mathematics), or you cannot. If you cannot, the study cannot contribute to the Evidence Report - unless studies summarized as "effect: yes, effect size: unknown" are included.

- If some of the items in CQ7 and CQ8 are deleted, it may be useful to combine the remaining questions into one Composite Question.

CREAD

The Cumulative Research Evidence Assessment Device (CREAD) suggests that Study DIAD Composite Question #8 is based on a judgment of censoring/publication bias. The Study DIAD does not mention this problem; in fact, a judgment of publication bias cannot be made based on a single study, and that is likely why the Study DIAD is silent as to this problem.
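To make this respondent's point about effect-size precision (CQ7.3 and CQ7.4) concrete, the standard formulas for a standardized mean difference between a treatment and a control group can be used; this is an editorial illustration, not part of the respondent's comments or of the WWC documents:

$$ d = \frac{\bar{X}_T - \bar{X}_C}{s_p}, \qquad SE(d) \approx \sqrt{\frac{n_T + n_C}{n_T\, n_C} + \frac{d^2}{2(n_T + n_C)}} $$

The effect size d can be computed for any group sizes n_T and n_C, but its standard error, and therefore the width of its confidence interval, shrinks only as the groups grow; this is the sense in which sample size and measurement error roll up into the precision of an effect size rather than into whether one can be estimated at all.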

Respondent 2:

I appreciate the work that the panel put into the project. I would like to emphasize that some of my most serious concerns are related not to the documents per se, but rather to what actions the developers can take to ensure that the information is used in the most appropriate ways.

First, I would like to applaud the overall quality of the effort. The transparency of the process and algorithms for including, excluding, and aggregating the studies is a very positive step. I appreciate the efforts of the group to be maximally scientific and dispassionate in applying the criteria to the various studies. This is particularly true of the Study DIAD. The CREAD, and then the actual reports as written, will obviously contain relatively greater elements of subjectivity. While this is not necessarily a bad thing, I think that it is important to acknowledge that there is no failsafe road to objective recommendations. Since most people will not be aware of the limitations of the method involved in reaching the final report, I think it is important to include relevant disclaimers and cautions, and remind people that this is an interpretation of available information. I do not think this is a fault with the method of the WWC; rather, it is inherent in this sort of effort.

I do have several concerns with the particulars of the Study DIAD/CREAD, however, which I list below. I will include first those most directly related to the nature of disability studies, and follow with comments related to studies in education more generally.

My primary concern, from the perspective of someone who has a focus on children with disabilities, stems from the restrictions regarding the design of studies that are to be considered for review under the guidelines for the Study DIAD. The foreword to the document notes that the review will target studies that are capable of adducing causality; however, randomized trials and quasi-experimental methods are certainly not the only means for doing so.

In researching interventions for children with emotional and behavioral disorders (EBD) and/or developmental disabilities (DD), much well-regarded research rests on within-subjects designs which, when well done, certainly can assess causal effectiveness and deserve to be called "scientific." By omitting these types of designs, the universe of "scientific" studies leaves out much of what we know to be effective in educational settings with children with EBD and DD. For example, there are many studies on positive behavioral supports, effective teacher interventions to increase academic engagement, peer-mediated learning, and social skills strategies.

In a similar vein, though of less particular concern to the population we tend to study, other sorts of designs are also left out (e.g., regression discontinuity - although I suppose this can be considered a form of quasi-experimental design, it doesn't appear to fit the criteria as currently listed). Methodologically speaking, I believe inferences regarding causality drawn from regression discontinuity designs are on a par with those from randomized experiments.

My other concerns relate to interventions in education more generally, and are thus relevant both to disabled (EBD/DD) and nondisabled populations. I will use examples from research on reading interventions, since this is a hotly contested issue currently and also one of the first topics that the WWC is intending to address.

First, I appreciate the concern of the developers of the Study DIAD with the alignment of outcomes, particularly the over-alignment of outcomes with interventions. For example, this will address in a productive way the issue of whether teaching kids phonics by reading pseudowords, and then measuring the intervention group versus control on reading pseudowords, actually has anything to say about decoding of real words, much less comprehension. In other studies, there is an even trickier distinction to make (e.g., is fluency accurately measured by read-aloud outcomes only?). This can only be a judgment call, and I think that consumers of the WWC reports should be made aware of how the reporters have come down on this and other similarly controversial issues.

Even with this attention to alignment of immediate outcomes, I have serious concerns about long-term outcomes from reading interventions as well as the effect over time that might result from people consulting the WWC or similar sources. There is a sort of dynamic over time that can occur, and I think we have seen evidence for some of this already in the area of reading: Short-term studies with easy-to-measure outcomes are the easiest to undertake. Many are done, these approaches appear promising (and may be recommended as "what works," which is legitimate), and they are funded more frequently. Textbook companies structure materials around these short-term approaches (since political pressures push for short-term, easy-to-measure outcomes), and these approaches acquire the status of dogma even though there is little evidence of positive long-term outcomes that are meaningful in terms of preparing kids for advanced study and careers.

Recent international comparisons show, for example, that U.S. kids receive more hours per week of reading instruction than international peers. And while average reading performance is on par with international peers (at fourth-grade measures), an astonishingly high percentage of U.S. fourth graders NEVER read for fun outside of school (32 percent vs. 18 percent internationally). Longer-term outcomes for reading levels of U.S. high school students compare less favorably to international outcomes, particularly for certain groups of kids (who happen to be also those least likely to read for fun). I think it is certainly a reasonable hypothesis that increasing demand to focus on drill-type, skill-building, direct instruction methods for reading can have the perverse effect of increasing short-term outcomes while having a long-term negative impact on kids' desire to read and their ultimate academic success.

In the results from the National Reading Panel, we saw this sort of dynamic more or less in effect, with the result that there has been increasing pressure on schools to adopt a very limited number of prepackaged reading programs. And while the studies cited in the NRP report found the largest effect for relatively small amounts of instruction in some of these basic skills (e.g., phonemic awareness and phonics), the pressure in classrooms has been to do more and more, with very little attention to comprehension or stimulation of interest in reading.

I realize that these concerns are somewhat outside the immediate purview of the Study DIAD/CREAD developers, and I do not mean to imply that it is a shortcoming of the approach; however, I also think that the developers have a responsibility to anticipate how this sort of information will be used, to package and contextualize it, and to provide explicit commentary on what the results do and do not allow us to say. Hopefully, this will minimize the potential for misinterpretation or misuse of the information. Thank you for the opportunity to provide input into this interesting and useful project.

Respondent 3:

Overall, I found the standards to be clear and comprehensive as they pertained to research that lends itself to a clean experimental or quasi-experimental design. The value of the guidelines is less clear in areas of disability support and disability policy where experimental designs are more difficult to implement, or where, because of the dispersed nature of the population and the difficulty of exerting experimental controls in community environments, establishing adequate power and comparable populations will be more difficult to achieve. These issues are particularly true for low-incidence populations.

A second area of concern is the role of the growing emphasis on experimental design in relation to the broad mission of NIDRR around disability policy and the structure and organization of disability and generic supports. This concern is particularly evident for a Center like ours that focuses on state service systems. Our broad mission is to describe and improve the delivery of employment supports. Our work is at a state structure and state policy level, and interventions necessarily involve a complex set of variables that don't lend themselves well to experimental control. For example, the total n, from a state perspective, is 50 (or 51 with DC). An intervention in one state is very unlikely to be directly comparable to an intervention in another state given the variation in political and cultural factors, lack of experimental controls over the implementation process, and a host of other variables.

Some general questions are listed below. Thanks for the opportunity to comment.

- To what extent will the WWC guidelines be applied to nonacademic areas including transition planning, employment supports, and development of self-determination skills?

- How will effective practices for low-incidence populations be addressed in the implementation of the guidelines?

- How will the guidelines address larger issues like self-determination, employment support, and related transition needs for students from low-incidence populations? It will be difficult to amass sufficient evidence for many of these populations.

- How will qualitative research be incorporated in the overall guidance that the WWC provides to practitioners? In particular, qualitative research may serve two roles. As a precursor to the review process, it serves to frame hypotheses and the parameters of interventions to allow for effective design of experimental studies. Second, qualitative research may provide closer description of the intervention, and help practitioners to identify the key parameters and relevant variations in the intervention. It seems that the WWC should provide for review and incorporation of qualitative research as part of its process.

- How will the WWC address complex systems-level change such as the design and implementation of employment supports within state government?

- How might single subject experimental designs be incorporated in the analysis?

Respondent 4:

- I am enthusiastic about the adoption of Campbell's framework regarding threats to validity. I've used Campbell & Stanley, and Cook & Campbell, for many years to prepare young researchers to evaluate their (and others') research designs. The approach is eminently reasonable for application to the breadth of research funded by NIDRR, OSEP, IES, and other entities. I think it serves us well.

- The hierarchical arrangement of questions in both tools is also reasonable and, furthermore, I found the wording of the questions to be comprehensible and suitable.

- In terms of questions, I admit to an inadequate understanding of some key aspects of the presentation. For instance:

  - It is not clear to me how the tools would be used, by whom, at what point, and for what particular purposes. A number of the questions call for judgments, so the issue of who the reviewers would be is vital.

  - Some parts of the operations (procedures) are confusing to me. For example, I admit (with hesitation) that I do not grasp the statement on the tables, "Read down to determine the answer to the question."

  - In the CREAD, what determines the set of studies that would be included in the evaluation of the consistency, depth, and breadth of the evidence base?

Respondent 5:

Before I make specific comments on these documents, I pose a couple of caveats based upon my own research and that of my agency.

(1) Although much of our research concerns "intervention," none of our research concerns "educational" interventions, which is the specific research focus of these devices.

(2) As stated in the introductory remarks, these devices do not address qualitative research (the primary type of research I do) nor other types of research that are not randomized or focused on uncovering causal relationships.

As such, it is difficult for me to evaluate how well these assessments apply to the types of research we are involved in here. I did find parts of the introductory explanations to be lacking sufficient clarity.

I'm concerned about how well either of these evaluation devices works in practice. That is, I'd like to see evidence that these evaluation devices actually reflect how the evaluator(s) would otherwise assess the research. Have these evaluation devices been piloted with people outside the group that developed them? Following are some specific concerns about the Study DIAD:

EP On page 4: The fourth assumptiondiscussed is that Donald Campbell'sapproach is "an obvious and naturalchoice." I'm not familiar withhis work, so it doesn't seem eitherobvious or natural to me. It's notthat I'd reject his approach, but I'dcertainly want more discussion ordescription to inspire a little moreconfidence.

- On page 6: If the authors of a study did not report how participants were allocated to groups, that study would be given a "no" on the randomized assessment question. Why "no" instead of "not available"? From this example, I'm concerned that the overall push to reduce the research to fit an algorithmic flow chart under- or over-estimates confidence in these evaluations.

- CQ6: Was the intervention tested for its effect within important subgroups of participants, settings, outcomes, occasions, and intervention variations? What does "important" mean here?

- On page 7: I appreciate the attempt to be transparent and flexible regarding the definition of terms. However, if "vague or ambiguous" terms are given meaning each time by different leaders doing the evidence reports, then how will possible disagreements or differences be accounted for when trying to do cumulative evaluations under the CREAD?

Concern about the CREAD, page 4: The authors raise the question of how inconsistency of findings will be treated, and suggest that some issues "will be resolved uniformly" while "other issues may be left up to the evidence report team," which "will have to exercise good professional judgement in making these determinations." While I appreciate the complexity of this issue, it leaves me uncertain how consistent and objective these cumulative evaluations would be.

I think questions of consistency across projects are particularly noteworthy when considering much of NIDRR's work across different disabilities. That is, how will possibly similar interventions with different disability populations be addressed?

Respondent 6:

This is an excellent project. I have some comments regarding the two documents:

Social science research design and analysis is also tied to ethical considerations. That is, all else equal, research with better methodological design is of higher ethical quality. Unless I missed this point, much more could be said about the quality of the research endeavor and ethical considerations. For instance, with external validity issues, meaningful involvement of persons with disabilities in designing and interpreting disability policy research can be important. Regarding the CREAD, I have argued that meta-analysis, where possible, is an ethical imperative.

Respondent 7:

I read over the WWC's CREAD and the Study DIAD, and I think that they are both excellent. Clearly, a lot of work has gone into these documents. My question would be, are they going to be doing something similar for single subject research designs?

These documents appear to orient towards research that is looking at interventions done at the classroom level. A lot of rehabilitation - say, rehabilitation for kids with cerebral palsy who might have impaired speech or language processing, or kids with other physiological disorders who might have impaired cognitive processing - is a tailored program where the person serves as his or her own control, because there is no way that there would ever be enough kids, except in the very, very largest districts, to get a large group and then randomize them to interventions.

Respondent 8:

My primary concern is that all evidence-based practice carry with it documentation of the representativeness (external validity) of the population. Specifically, for example, if an intervention or practice is considered evidence-based and recommended or used with American Indians, American Indians should be documented as having been included in the research sample.

Respondent 9:

The documents look very good. They reflect the best in the construction of group designs. The lack of discussion of single subject design is of concern, however. First-rate group design can be very difficult to do with low-incidence populations, where sample sizes are small and populations may be pretty heterogeneous. For these populations, single subject design can be the best way to demonstrate treatment effects.

Comments from Researchers on Application of the WWC Standards to NIDRR Research

Several NIDRR grantee respondents expressed concerns that the Study DIAD and CREAD could be applied to NIDRR-funded research for purposes other than consideration for the WWC databases. The following comments reflect some of those perceptions:

Respondent 1:

In general, our biggest issue was how these documents would be used by NIDRR. As a tool to evaluate "centers of excellence," investigators, or research, more work needs to be done.

- The Study DIAD cannot be used effectively prior to study implementation and reporting. The quality and quantity of a study's implementation (e.g., treatment fidelity, number of subjects) affect Study DIAD scores, as does the quality of reporting of the study results. Thus, it is of no use when applied to the evaluation of research proposals, i.e., peer review.

• NIDRR grantees doing intervention research often produce multiple publications, all relevant to the main research question/major research hypothesis. Study DIAD has no mechanism to handle this. (However, if the publications do not "contradict" one another - different number of cases, etc. - the outcomes could be combined for a single Study DIAD scoring.)


• Based on their research data, NIDRR grantees often also publish papers that address side questions, ad hoc hypotheses, etc., for which the research design that was used to answer the main question may not be optimal. Study DIAD should not be used to condemn these papers addressing "peripheral" issues and ad-hoc questions: as pilot or exploratory work they may help the science move forward, and should be published with credit given to NIDRR for facilitating "innovation."

Respondent 3:

Establishing true experiments on a scale adequate for meeting the guidelines will require a substantial change in the allocation of funds to research activities. What should the role of NIDRR be, and what are the implications for funding levels usually associated with FIP and RRTC activities? Might NIDRR serve as an R&D arm of ED, framing issues that need a larger-scale experimental analysis?

Respondent 5:

Overall, I do appreciate the attempt to standardize how research design and implementation are assessed. However, I definitely have questions about how this type of evaluation would be applied to NIDRR research (i.e., by ignoring other types of research and/or implying that other types of research are less desirable if they are not oriented to causal effect-randomized samples).

If NIDRR were to consider using these evaluation tools for (at least some of) its research, I think NIDRR needs to articulate what the purpose would be (as described in the Study DIAD).

Respondent 6:

What is the intended use of these instruments, as it relates to the functioning and evaluation of NIDRR's Rehabilitation Research and Training Centers?

Is the idea to apply this model that is designed for educational research to the clinical, policy, and socio-environmental research conducted by NIDRR RRTC grantees?

• Would the model be applied as is, or modified?

• If to be modified, who would be involved in the modification process?

• If to be modified, would one new tool with modifications be applied to all RRTCs, or would specific modifications be made to cohorts of RRTCs with similar functions and focuses?

• Should these tools be applied, who would be the reviewers that would use the tools to review RRTC functioning?

• Would results of the review be used for program evaluation purposes?

• Who would use these results (NIDRR staff, external reviewers, or the program staff themselves for internal program evaluation purposes)?

Resolution of the AAMR on Evidence-Based Research and Intellectual Disability

The move toward evidence-based research as specified in the No Child Left Behind legislation is wise, but its overly restrictive definition harms people with intellectual disabilities and those with other severe disabilities. Any definition of evidence-based research must include methodologies which are widely regarded as scientifically defensible (such as single subject experimental design).

The requirement that research methods be restricted to group design with a preference for randomized clinical trials will significantly inhibit the development and validation of new scientific knowledge in education. There are many situations in which it is not feasible to assign children at random to school placements or to types of instructional settings. Yet these situations do not preclude other viable research designs. These concerns are especially great for people with intellectual and developmental disabilities given their low incidence and their unique characteristics which require individualization of educational intervention.


We strongly urge Congress, the U.S. Department of Education, and other federal agencies to recognize and support the continued use of the full array of research methodologies that are scientifically accepted in the fields of education, psychology, and child development. We request that a committee of highly respected educational researchers in low incidence disabilities be convened without delay to develop appropriate language on the array of scientific research methods for incorporation into the reauthorization of IDEA.

January 13, 2003
AAMR Board of Directors

American Association on Mental Retardation Board of Directors. (2003). Resolution of the AAMR on Evidence-Based Research and Intellectual Disability. Retrieved August 1, 2003, from http://www.aamr.org/Reading_Room/pdf/resolution_research.pdf

Copyright 2003 AAMR. Reprinted with permission.


Response from the WWC to NIDRR Researchers' Comments

Full responses from the WWC were not received in time for this issue, but the NCDDR will share them in a separate document at a later date. Following are the WWC's preliminary comments:

The WWC thanks the NIDRR grantees for your thoughtful comments on the Study Design and Implementation Assessment Device (Study DIAD), Version 0.6 and the Cumulative Research Evidence Assessment Device (CREAD), Version 0.6. We do want to note that the Study DIAD and CREAD were posted for public comment (November-December 2002 and January-February 2003, respectively) and that the opportunity for public comment was widely circulated among education constituencies. The Feedback on Draft Standards can be viewed at http://w-w-c.org/standards.html

The FAQs posted at http://w-w-c.org/faqs/index.html provide additional context and clarification about the standards and other aspects of the WWC's work. As you will see if you review our Web site at http://w-w-c.org/, many of the comments and questions you have raised are addressed in these documents.

Public comments, as well as those offered by the WWC's Technical Advisory Group (TAG), were incorporated in the revised standards. Version 1.0 of both the Study DIAD and the CREAD were recently approved by the TAG. The WWC will revisit, and if necessary revise and improve upon, the standards periodically, based on comments such as the ones you have offered and experiences in using the standards in Evidence Reports.

The WWC looks forward to providing responses to your questions and comments in the near future. We welcome your further comments or questions on the WWC. Please address them to: [email protected]

Online Resources

Coalition for Evidence-Based Policy
http://www.excelgov.org/displayContent.asp?Keyword=prppcHomePage

Bringing Evidence-Driven Progress to Education: A Recommended Strategy for the U.S. Department of Education (Executive Summary)
http://www.excelgov.org/usermedia/images/uploads/PDFs/CoalitionExSum.pdf

Rigorous Evidence: The Key to Progress in Education
http://www.excelgov.org/displayContent.asp?Keyword=prppcEvidence

Education Sciences Reform Act (ESRA)
http://www.ed.gov/policy/rschstat/leg/PL107-279.pdf

Evidence Based Education (EBE) (Whitehurst, 2002)
http://www.ed.gov/nclb/methods/whatworks/eb/edlite-index.html

Institute of Education Sciences (IES)
http://www.ed.gov/about/offices/list/ies/index.html

No Child Left Behind Act (NCLB)
http://www.ed.gov/policy/elsec/leg/esea02/index.html

NCLB Web site
http://www.nclb.gov

Reforming and Reauthorizing IDEA. H.R. 1350, Improving Results for Children With Disabilities Act of 2003
http://edworkforce.house.gov/issues/108th/education/idea/idea.htm

Reauthorization of the Rehabilitation Act (ILRU/TIRR)
http://www.ilru.org/ilnet/rehabl

Resolution of the AAMR on Evidence-Based Research and Intellectual Disability
http://www.aamr.org/Reading_Room/pdf/resolution_research.pdf


RSA Legislation & Policy
http://www.ed.gov/about/offices/list/osers/rsa/policy.html

Statement before the House Subcommittee on Labor/HHS/Education Appropriations (Whitehurst, 2003)
http://www.ed.gov/news/speeches/2003/03/03132003a.html

What Works Clearinghouse
http://w-w-c.org/

American Institutes for Research (AIR)
http://www.air.org/

Aspen Systems Corporation
http://www.aspensys.com/

Caliber Associates, Inc.
http://www.calib.com/home/index.cfm

Campbell Collaboration
http://www.campbellcollaboration.org/

EVIDENCE-BASED MEDICINE

Agency for Healthcare Research and Quality (AHRQ)
http://www.ahrq.gov/

Centre for Evidence-Based Medicine
http://www.cebm.net/

Centre for Health Evidence - Users' Guides to Evidence-Based Practice
http://www.cche.net/usersguides/main.asp

Cochrane Collaboration
http://www.cochrane.org/


Who's in the News

Members of the NCDDR staff are on the lookout for popular and disability media pieces that present research funded by NIDRR. In this issue, we share items from:

• Disability Network newsletter

• The Boston Globe, EContent Magazine, and Government Computer News

• Daily Mason Gazette, George Mason University

Who's in the News is a feature of The Research Exchange that shares some of the stories about NIDRR grantees and their research that have appeared in national media sources. Staff members will talk with grantees and media representatives about the origin and evolution of the stories, and their interactions with media representatives. Sharing this may be helpful to other grantees who would like to establish relationships with journalists and work with them to make information about their research available to the public.

Please let the NCDDR know when an item representing your NIDRR-funded project appears in the media. Call us, 1-800-266-1832, or send email to [email protected], and the item will be reviewed for Who's in the News. You may also use an online form: http://www.ncddr.org/forms/submitnews.html

The announcement of the IT Works Ability Awards was highlighted in Diversity World's Disability Network newsletter, February 7, 2003. The Awards were established by the Information Technology Association of America (ITAA), in cooperation with the University of Iowa Law, Health Policy & Disability Center's NIDRR-funded IT Works project. Read the Diversity World newsletter article:

http://www.diversityworld.com/Disability/newsletter.htm#February%207,%202003

IT Works Ability Awards is a national awards program "to stimulate interest in employing individuals with disabilities and to give public recognition and reward to IT firms that have developed effective strategies that promote the employment and advancement of people with disabilities."

(ITAA, 2003, http://www.itaa.org/workforce/gendoc.cfm?DocID=101)

Press release from University of Iowa:
http://www.uiowa.edut-nurnews/2002/may/0509disability-center.html

The 2003 award winners were announced on May 5 at the ITAA's sixth annual National IT Workforce Convocation in Arlington, Virginia, and included IBM (Recruitment and Hiring), Microsoft (Accommodations), and Xerox (Developing Accessible IT Products and Services).

Press Release about the Awards announcement on the ITAA Web site:
http://www.itaa.org/news/pr/PressRelease.cfm?ReleaseID=1052825625

For more information, contact IT Works Project Director, James Schmeling: [email protected]

The Boston Globe (Business section), EContent Magazine, and Government Computer News picked up a press release about the new Accessible Technology Knowledgebase (ATKB) implemented by the NIDRR-funded Information Technology Technical Assistance & Training Center (ITTATC). Based in Atlanta at the Georgia Institute of Technology, ITTATC has added a powerful and user-friendly natural language search capability to their Web site.

Searches will automatically query the Web sites and databases of the following federally funded disabilities-related resources:


AccessIT
http://www.washington.edu/accessit/

AssistiveTech.net
http://www.assistivetech.net

DisabilityInfo.gov
http://www.disabilityinfo.gov

ITTATC.org
http://www.ittatc.org

Section508.gov
http://www.section508.gov

Partners in the ATKB project include iPhrase Technologies, Inc.; IDEAL Group, Inc.; NCR Corporation; and ITTATC. iPhrase is completing the ATKB's year-end advanced tuning process. Advanced tuning helps to ensure the results of a search produce knowledge that is as current and relevant as possible.

For more information, contact Mimi Kessler, ITTATC Project Director: [email protected]

Boston Globe, February 24, 2003, by Peter J. Howe, "iPhrase behind government initiative to help disabled"
http://www.iphrase.com/PDFs/02-24-03_BostonGlobe.pdf

EContent Magazine, February 25, 2003
http://www.econtentmag.com/Articles/ArticlePrint.aspx?ArticleID=4078&CategoryID=18

Government Computer News, April 1, 2003
http://www.gcn.com/vol1_no1/daily-updates/21544-1.html

iPhrase Press Release, February 24, 2003
http://iphrase.com/news/feb2403.html

About the ITTATC Accessible Technology Knowledgebase (ATKB)
http://www.ittatc.org/search_tips.cfm

An article by Robin Herron appeared in the Daily Mason Gazette, George Mason University, on March 11, 2003, describing the Middle School Phonemic Awareness Study. The study is a field-initiated project funded by NIDRR and directed by Dr. Barbara Given, Associate Professor, Graduate School of Education at George Mason University. Dr. Given is director of adolescent and adult learning research projects at the Krasnow Institute for Advanced Study at GMU.

The project focuses on the role of phonemic awareness in adolescents with low reading skills. Herron's article describes the study and how it is intended to help students make reading gains at a middle school in the Fairfax County Public Schools in Alexandria, Virginia. Glasgow Middle School, site of the Middle School Phonemic Awareness Study, was also named an "Accelerative Learning Model School" by the International Alliance for Learning, for its part in Dr. Given's research efforts.

For more information about the study, contact Dr. Barbara Given: [email protected]

"Reading Research Program BenefitsFairfax County Middle SchoolStudents"http://gazette.gmu.edu/articles/index.php?id=4385


NIDRR Grantee and Staff Recognition

The NCDDR continues to share the recognition given to NIDRR-funded researchers and their staff. The items presented in The Research Exchange demonstrate the wide variety and prestige of special awards made to staff members of NIDRR-funded projects across the country.

All grantees are encouraged to send this information to the NCDDR for future issues. Send email to [email protected], call 1-800-266-1832, or use the online form available on the NCDDR Web site: http://www.ncddr.org/forms/submitrecog.html

Judy Brewer, Director of the Web Accessibility Initiative (WAI) at the World Wide Web Consortium (W3C), was honored by the Alliance for Public Technology (APT) with the Susan G. Hadden Pioneer Award. She was recognized "for pioneering efforts in telecommunications and consumer access" at a luncheon held February 21, 2003, at the National Press Club in Washington, DC.

The Susan G. Hadden Pioneer Award honors the memory of APT's Public Policy Committee Chair, who was killed while a tourist in Cambodia in January 1995. APT leaders called Dr. Hadden, who was Professor of Public Affairs at the LBJ School of Public Affairs at the University of Texas in Austin, "unique in her mastery of the complex tangle of regulatory and technical issues that accompany the current debate over telecommunications." The award recognizes those who continue Hadden's legacy of ensuring equitable access to technology as a democratizing principle.

For more information, contact Judy Brewer: [email protected]

Media Access Generator software, MAGpie 2.01, recently received Honorable Mention for Best Educational Streaming Program in Streaming Magazine's 2003 Reader's Choice awards. Developers of Web- and CD-ROM-based multimedia need an authoring tool for making their materials accessible to persons with disabilities. The CPB/WGBH National Center for Accessible Media (NCAM) has developed MAGpie versions 1.0 and 2.01 for creating captions and audio descriptions for rich media. Funding was received from NIDRR, the Trace Research and Development Center, and the Mitsubishi Electric America Foundation.

For more information about MAGpie 2.01, contact: [email protected]


Dr. Judith Cook has received the American Sociological Association's (ASA) 2003 William Foote Whyte Distinguished Career Award. Dr. Cook is the Principal Investigator of the NIDRR-funded UIC National Research and Training Center on Psychiatric Disability, University of Illinois at Chicago.

Dr. Cook was honored for "her successin applied research, her commitmentto participatory action research (PAR),and her effective work as a mentor to alarge number of students." The WilliamFoote Whyte Distinguished Career Awardis presented by the Sociological PracticeSection of the ASA to an individualwho has made notable contributions tosociological practice which can includeseveral of the following elements:outstanding clinical or applied work,exceptional service to the section,publications that advance both thetheory and methods of sociologicalpractice, or mentoring and trainingof students for careers in sociologicalpractice. The award honors the ASA's72nd President.

For more information, contact Ms. Edie Bamberger: [email protected]

In October 2002, Mitchell Rosenthal, Ph.D., Project Director of the Traumatic Brain Injury National Data Center, was awarded the 2002 Gold Key Award of Merit from the American Congress of Rehabilitation Medicine (ACRM), "in recognition of extraordinary service to the cause of rehabilitation medicine." It is the highest honor given by ACRM.

Dr. Rosenthal was also honored recently as the Leonard Diller Lecturer by the Division of Rehabilitation Psychology of the American Psychological Association at its conference "Rehabilitation Psychology 2003," held in Tucson in April 2003. Dr. Rosenthal is Vice President for Research at Kessler Medical Rehabilitation Research and Education Corporation and Professor of Physical Medicine and Rehabilitation at the University of Medicine and Dentistry of New Jersey - New Jersey Medical School.

For more information, contact Mitchell Rosenthal: [email protected]

A number of NIDRR grantees were named in the 14th annual U.S. News and World Report's Best Hospitals. Seven of the top 17 hospitals on the 2003 Honor Roll have NIDRR grants. These medical centers and hospitals were highly ranked in 6 or more of the 17 Best Hospitals specialty areas, out of 203 hospitals ranked.

1. Johns Hopkins Hospital, Baltimore

2. Mayo Clinic, Rochester, MN

6. Duke University Medical Center, Durham, NC

9. University of Michigan Medical Center, Ann Arbor

10. University of Washington Medical Center, Seattle

15. Stanford Hospital and Clinics, Stanford, CA

17. Vanderbilt University Medical Center, Nashville

See original article:

http://www.usnews.com/usnews/nycu/health/hosptl/honorroll.htm

A total of 18 of the top 23 Rehabilitation Hospitals are current grantees of NIDRR. The hospitals ranked in this list were identified by at least three percent of the board-certified physicians who responded to U.S. News surveys in 2001, 2002, and 2003.

1. Rehabilitation Institute of Chicago

2. TIRR, The Institute for Rehabilitation and Research, Houston

3. University of Washington Medical Center, Seattle

4. Mayo Clinic, Rochester, MN

5. Craig Hospital, Englewood, CO

6. Kessler Institute for Rehabilitation, West Orange, NJ

8. Thomas Jefferson University Hospital, Philadelphia

9. Spaulding Rehabilitation Hospital, Boston

10. Ohio State University Medical Center, Columbus

11. Rancho Los Amigos National Rehabilitation Center, Downey, CA

12. Johns Hopkins Hospital, Baltimore

13. National Rehabilitation Hospital, Washington, DC

14. University of Michigan Medical Center, Ann Arbor

15. Moss Rehabilitation Hospital, Albert Einstein Medical Center, Philadelphia

16. Shepherd Center, Atlanta

17. Mount Sinai Medical Center, New York City

18. Stanford Hospital and Clinics, Stanford, CA

23. University of Alabama Hospital at Birmingham

See original article:

http://www.usnews.com/usnews/nycu/health/hosptl/rankings/specrepreha.htm

See full report: U.S. News and World Report's Best Hospitals 2003
http://www.usnews.com/usnews/nycu/health/hosptl/tophosp.htm


How To Contact The National Center for the Dissemination of Disability Research

Call Us
1-800-266-1832 or 512-476-6861 V/TT

8 A.M.-NOON and 1 P.M.-5 P.M. CT, Monday-Friday (except holidays), or

record a message 24 hr./day

Explore Our Web Site

http://www.ncddr.org/

E-mail Us

[email protected]

Write Us

National Center for the Dissemination of Disability Research

Southwest Educational Development Laboratory

211 East Seventh Street, Suite 400
Austin, Texas 78701-3281

Visit Us
Downtown Austin, Texas, 4th floor, Southwest Tower, Brazos at 7th St.

8 A.M.-NOON and 1 P.M.-5 P.M. CT, Monday-Friday (except holidays)

Fax Us

512-476-2286

The NCDDR is operated by the Southwest Educational Development Laboratory (SEDL). SEDL is an Equal Employment Opportunity/Affirmative Action Employer and is committed to affording equal employment opportunities for all individuals in all employment matters. Neither SEDL nor the NCDDR discriminates on the basis of age, sex, race, color, creed, religion, national origin, sexual orientation, marital or veteran status, or the presence of a disability. This document was developed under grant H133A990008-A from the National Institute on Disability and Rehabilitation Research (NIDRR) in the U.S. Department of Education's Office of Special Education and Rehabilitative Services (OSERS). However, these contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal government.

Copyright 2003 by the Southwest Educational Development Laboratory

An electronic version of The Research Exchange, Vol. 8, No. 2 is available on the Internet at URL <http://www.ncddr.org/du/researchexchange/>

The Research Exchange is available in alternate formats upon request.

John Westbrook, Director
Magda Acuña, Web Author
Sean W. Claes, Communications Specialist, Graphic Design
Un HaMs, Information Specialist
John Middleton, Web Administrator
Joann Starks, Program Associate
