Using NAEP Data on the Web for Educational Policy Research
AERA
April 14, 2018
Emmanuel Sikali, NCES
Paul Hilliard, ETS
Amanda Papa, ETS
Debbie Kline, ETS
Dave Freund, ETS
National Assessment of Educational Progress (NAEP)
• The only nationally representative and continuing assessment of what America’s students know and can do.
• Also known as “The Nation’s Report Card”
• http://nces.ed.gov/nationsreportcard/
• http://nationsreportcard.gov/
NAEP is a large-scale survey
• Administered to a representative sample of students (not all students)
• Provides estimates of group performance, not individual performance. For example:
- Female students
- Hispanic students
- Students whose teachers have a master’s degree
- Students in California (for state-level assessments only)
• Covers a broad domain of content – by design, any given student responds to only a small portion of the content
Subjects and Cohorts
• Conducted periodically in mathematics, reading, science, and writing
- National-, state-, and district-level results
- About 140,000–200,000 students nationally
- About 2,000–3,000 students per state
• Also conducted in the arts, civics, economics, geography, U.S. history, and Technology & Engineering Literacy (TEL)
- National-level results only; about 10,000–20,000 students
• Students assessed in grades 4, 8, and 12
• Assessment schedule: http://nces.ed.gov/nationsreportcard/about/assessmentsched.asp
Digital Transition in 2017
• The 2017 assessment at grades 4 and 8 (reading and math) included a bridge study of paper-based assessments (PBA) and digitally-based assessments (DBA)
• Sample sizes in each state were about 500 (PBA) and 2,200 (DBA) per grade/subject
• Extensive evaluation of PBA and DBA led to the desired outcome: the 2017 reported data are based on DBA
Digital Transition in 2017 (cont.)
• Successful design and linking serve the intended goals of:
- Reporting trends for subgroups, states, and urban districts
- Enabling within-year comparisons of 2017 DBA performance for subgroups, states, and urban districts
- Establishing a trend line in 2017 that will be DBA-based in 2019 and beyond
Types of data
• Subject-matter achievement, in the form of:
- Overall (composite) content area scores (e.g., Mathematics)
- Subscale scores within a content area (e.g., Algebra)
- Each of these is reported as average scale scores (on either a 0–500 or 0–300 scale)
- Achievement level percentages (e.g., percent at or above Basic)
- Percentiles (e.g., 90th percentile: the scale score below which 90 percent of students fall and above which 10 percent fall)
• Responses to student, teacher, and school surveys in areas such as instructional experiences and school environment
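The three reporting statistics above can be sketched numerically. This is a toy illustration only, not NAEP's actual estimation (which uses plausible values and jackknife replicate weights); the scores and cut points below are invented.

```python
# Toy illustration of NAEP's three reporting statistics, computed from
# invented scale scores on a 0-500 scale. Cut scores here are made up;
# real cut scores vary by subject and grade.
import numpy as np

scores = np.array([210, 235, 248, 252, 261, 270, 284, 290, 301, 315])

# 1. Average scale score for the group
avg = scores.mean()

# 2. Achievement-level percentages (hypothetical cut scores)
basic_cut, proficient_cut = 240, 280
pct_basic = (scores >= basic_cut).mean() * 100        # percent at or above Basic
pct_proficient = (scores >= proficient_cut).mean() * 100

# 3. Percentile: the score below which 90 percent of students fall
p90 = np.percentile(scores, 90)

print(avg, pct_basic, pct_proficient, p90)
```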
Survey Instruments in NAEP
• Student Questionnaire
• Teacher Questionnaire
• School Questionnaire
• Some census-type school and district information provided by assessment contractors
• http://nces.ed.gov/nationsreportcard/bgquest.asp
• http://nces.ed.gov/nationsreportcard/researchcenter/variablesrudata.asp
What NAEP Does Not Report
• NAEP does not provide scores for:
- Individual students
- Individual schools
- Individual teachers
• No Personally Identifiable Information (PII) is retained or used during any analyses
• The teacher sample is not representative of the teacher population
- Teacher data can only be reported in the context of the student, i.e., “20 percent of students had teachers who reported that…”
NAEP Frameworks
• What is assessed in each academic subject?
• NAEP is built around an organizing framework that:
- is developed by the National Assessment Governing Board
- guides the development of the assessment instrument
- determines the content to be assessed
- provides item development specifications
- provides information on skills appropriate for each grade level
NAEP Framework Example
• The mathematics framework classifies assessment questions along two dimensions: content area and mathematical complexity
• Mathematics content areas:
- number properties and operations
- measurement
- geometry
- data analysis, statistics, and probability
- algebra
• Mathematical complexity:
- Low
- Moderate
- High
• Details on the frameworks: http://nces.ed.gov/nationsreportcard/frameworks.asp
Understanding Assessment Results
• NAEP presents student performance results in two ways:
1. Average scores on the NAEP subject scale
2. Percentages of students attaining NAEP achievement levels
Average Scale Scores
• Average scale scores are aggregated and reported at the student group level for the nation, states, and districts.
• They can also be used for comparisons among states, districts, and student groups.
Achievement Levels
• Achievement levels show how student performance measures against defined expectations for achievement.
• NAEP reports three achievement levels for each grade assessed:
- Basic: partial mastery of prerequisite knowledge
- Proficient: competency over challenging subject matter
- Advanced: superior performance
• Developed under the direction of the National Assessment Governing Board
• Defined and reviewed by a representative panel of teachers, education specialists, and members of the general public
Item Maps
• Illustrate what students know by positioning individual assessment items (and descriptions of them) along the NAEP scale at each grade level
• An item is placed on the map at the location where students of that ability level are likely to answer it successfully
• https://www.nationsreportcard.gov/itemmaps/?subj=MAT&grade=4&year=2017
Caveats
• NAEP is an observational study, not an experiment
• NAEP allows patterns to be observed, but not causality to be established
• Student responses to questions about their classroom experiences and home environments may be unreliable
• It is a snapshot in time, not a longitudinal study (different cohorts)
The NDE Variable Structure
• “Select Variables” Groups:
- Major Reporting Groups
- Student Factors
- Instructional Content & Practice
- Teacher Factors
- School Factors
- Community Factors
- Factors Beyond School
- Government Factors
• Examples:
- Student Factors: All (when you want everyone), disability, gender, school lunch, parental education, race, ELL
- School Factors: public, nonpublic, private, charter school, school location
- Community Factors: region of the country
• Each group has several subcategories
• Many variables are based on survey question responses
• A “Search Variables” feature is available (located above “Select a Subcategory”)
• Variable levels may be combined or compared via cross tabulation
Hands on the Data Explorer!
• http://nces.ed.gov/nationsreportcard/naepdata/
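The cross-tabulation feature above can be mimicked outside the tool. A minimal sketch with pandas, using invented student records (the NDE itself computes these estimates from the underlying survey data):

```python
# Sketch of what the NDE's crosstab does: average scores within each
# cell of two crossed grouping variables. Records here are invented.
import pandas as pd

students = pd.DataFrame({
    "ell":    ["Yes", "No", "No", "Yes", "No", "No"],
    "gender": ["Female", "Female", "Male", "Male", "Male", "Female"],
    "score":  [248, 271, 265, 240, 280, 262],
})

# Average scale score in each ELL-by-gender cell
cell_means = pd.crosstab(students["ell"], students["gender"],
                         values=students["score"], aggfunc="mean")
print(cell_means)
```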
General Research Questions
• NAEP can help with questions like these:
- Do high school students … ? (12th-grade NAEP)
- Do students in cities … ? (urban districts, aka TUDAs)
- Do students whose teachers are more experienced … ?
- Do students who are read to in class perform better on NAEP (on average) compared to … ?
- Which states have scored significantly higher than the national average on … ?
- Has an achievement gap between two subgroups narrowed or widened over time?
NAEP Research Question #1
• Do boys score higher than girls in math (within year)? Is there a change over time?
• Select the following in NDE:
- Mathematics, grade 8; 2017, 2015, 2013, 2011, 2009, 2007; Composite scale
- National
- Gender
- Average scale scores
- Trend charts (Table and Line Chart)
- Significance test
- Gap analysis
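The within-year significance test above can be sketched as a two-sided z-test on the difference of two group means, using each group's standard error. This is a simplification of the Explorer's actual procedure (which also handles dependent samples and multiple-comparison adjustments), and the means and SEs below are invented, not actual results:

```python
# Sketch of a within-year gap significance test: two-sided z-test on
# the difference of two independent group means. Numbers are invented.
from math import sqrt, erf

def gap_test(mean_a, se_a, mean_b, se_b):
    """Return (difference, z statistic, two-sided p-value)."""
    diff = mean_a - mean_b
    se_diff = sqrt(se_a**2 + se_b**2)   # groups treated as independent
    z = diff / se_diff
    # Two-sided p-value under the normal approximation
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return diff, z, p

diff, z, p = gap_test(284.0, 0.4, 282.0, 0.4)
print(round(diff, 1), round(z, 2), round(p, 4))
```

A significant z (roughly |z| > 1.96 at the 5 percent level) corresponds to the asterisks the Data Explorer places on significantly different estimates.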
NAEP Research Question #2
• Do students in charter schools perform better on NAEP than students in public schools?
- Math, grade 8, 2017, Composite scale
- National
- School Factors: School identified as charter?
- Select within the “Major Reporting Groups” category and the “School Factors” subcategory
- Scale scores and Achievement levels (Cumulative): select “at or above Basic” and “at or above Proficient”
- Table
- Significance test
NAEP Research Question #3A–B
• 3A – There has been an increase in English Language Learner (ELL) students in the past several years. How does this affect overall NAEP results by race/ethnicity?
- Reading, grade 8, 2017, Composite scale
- National Public, NJ, NY, PA, New York City, Philadelphia
- Status as ELL (2 categories); Race/Ethnicity used to report trends
- Combine Variable Categories: collapse all other race/ethnicity categories as “Not Hispanic”
- Create Crosstab option: variable “ELL by Race”
- Statistic Options: average scale scores, percentages
- Table; Bar Charts for scale scores and percentages
• 3B – Repeat using Mathematics
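The "Combine Variable Categories" step in 3A amounts to a simple recode: every category except Hispanic is collapsed into a single "Not Hispanic" group. A sketch with invented labels:

```python
# Sketch of "Combine Variable Categories": collapse all race/ethnicity
# categories other than Hispanic into one "Not Hispanic" group.
# The category values here are illustrative, not the NDE's exact labels.
import pandas as pd

race = pd.Series(["White", "Hispanic", "Black", "Asian", "Hispanic", "White"])

# Keep "Hispanic" as-is; replace everything else with "Not Hispanic"
collapsed = race.where(race == "Hispanic", other="Not Hispanic")
print(collapsed.value_counts().to_dict())
```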
NAEP Research Question #4
• Do students in my state read outside of school? Is that related to how well they read?
- Reading, grade 8, 2017, Composite scale
- National Public, All States
- Factors Beyond School category
- Time Use Outside of School subcategory: “Time spent reading outside of school”
- Remove standard errors (SE) from printing, and no decimals
- Average scale scores and percentages
- Table
- Significance test map for percentages (“About 30 minutes” level)
- Significance test map for average scale scores (“About 30 minutes” level)
NAEP Research Question #5
• Has there been a closing of the achievement gap between eighth-grade White and Hispanic students in math? How about locally and in the region?
- Math, grade 8; 2017, 2015, 2013, 2011, 2009, 2007; Composite scale
- National Public, New York, Northeast Region
- Race/ethnicity used to report trends: collapse all races besides White and Hispanic into one category by selecting “Combine Variable Categories”
- Achievement level of “at or above Proficient”
- Table
- Line Charts
- Gap Analysis (across years, between White and Hispanic groups)
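The across-years gap analysis above asks whether the gap itself changed between two assessment years. With four independent group estimates, a simple sketch takes the standard error of the change as the root sum of the four squared SEs; the Explorer's actual computation may differ, and all numbers below are invented:

```python
# Sketch of an across-years gap analysis: did the White-Hispanic gap
# change significantly between two years? All estimates are invented.
from math import sqrt

def gap_change(g1_y1, se1_y1, g2_y1, se2_y1,
               g1_y2, se1_y2, g2_y2, se2_y2):
    """Return (change in gap from year 1 to year 2, z statistic)."""
    gap_y1 = g1_y1 - g2_y1
    gap_y2 = g1_y2 - g2_y2
    change = gap_y2 - gap_y1
    # Treat all four estimates as independent
    se = sqrt(se1_y1**2 + se2_y1**2 + se1_y2**2 + se2_y2**2)
    return change, change / se

change, z = gap_change(288, 0.5, 262, 0.9,   # year 1: White, Hispanic
                       290, 0.5, 268, 0.8)   # year 2: White, Hispanic
print(change, round(z, 2))
```

Here a negative change means the gap narrowed; the z statistic tells you whether the narrowing is statistically significant.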
NAEP Research Question #6
• Rural areas have experienced an economic decline in the past several years. Have we observed a recent decline in NAEP scores in such areas?
- Reading, grade 8; 2017, 2015, 2013, 2011, 2009, 2007; Composite scale
- National
- Major Reporting Groups category, School Factors subcategory, School Location (4 categories)
- Average scale scores
- Table, Line Chart, Significance Test (select Rural)
- Repeat for Mathematics, grade 8
NAEP Research Question #7
• Has there been a significant overall change in scale scores in New York City public schools (2007 to 2017)? How does this compare to other large city school districts?
- Reading, grade 8; 2017, 2015, 2013, 2011, 2009, 2007; Composite scale
- National Public, Large City, New York City along with all districts (All Students variable)
- Average scale scores
- Line Chart (compare National Public, Large City, NYC)
- Significance tests: table for New York City only, across years
- Comparison with other large districts nationwide (Large City plus Atlanta, Baltimore, Chicago, Dallas, Detroit, District of Columbia, Houston, Los Angeles, Philadelphia)
- 2017 and 2015 (click the filter arrows in the table and select New York City)
NAEP Research Question #8
• What is the relationship between access to a computer/tablet and student achievement? How does my state (New York as an example) compare to other states and the nation as a whole?
- Math, grade 4, 2017, Composite scale
- National Public, All States
- Factors Beyond School category; Time Use Outside of School subcategory: “Have at home: Computer or tablet you can use”
- Average scale scores; Achievement levels (select “at or above Proficient” only)
- Data tables
- Significance test maps for achievement levels, “Access = Yes” versus “Access = No”
A Practice Exercise
• Choose a subject, grade, and jurisdiction
• Choose a variable that you believe is correlated with performance. Confirm whether or not it appears to be related to performance.
• Add race or percent eligible for free/reduced price school lunch.
• Obtain cross tab for both scale score and percentages.
• Run a significance test and/or gap analysis.
Thank you for participating
Emmanuel Sikali (Emmanuel.Sikali@ed.gov)
Paul Hilliard (philliard@ets.org)
Amanda Papa (apapa@ets.org)
Dave Freund (dfreund@ets.org)
Debra Kline (dkline@ets.org)