
Perkins IV Accountability Data Guide
Carl D. Perkins Career and Technical Education Act of 2006

MPR Associates, Inc.
2150 Shattuck Avenue, Suite 800
Berkeley, CA 94704

Editor
Jim Schoelkopf
[email protected]
503-963-3756

September 2009

Prepared under contract to
Office of Vocational and Adult Education,
U.S. Department of Education


Contents


Common Perkins IV Accountability Terms: A Hyperlink Index

Introduction and Overview

Student Definitions

Secondary CTE Measures

Postsecondary/Adult CTE Measures

Secondary Tech Prep Indicators

Postsecondary Tech Prep Indicators

Perkins IV Validity and Reliability Checklist: Sample Decision Tree Diagram

Cross Measure Issues

Definitions: Race/Ethnicity, Special Populations and Other Student Categories

A Guide to Crosswalking Nontraditional Occupations and Programs

Strategies for Identifying Nontraditional CTE Programs

Annotated Resource Directory

Appendices


Common Perkins IV Accountability Terms: A Hyperlink Index

This hyperlink index is intended to enable users to search for and locate relevant Perkins IV accountability guidance more efficiently.

In addition to the text within the Perkins IV Accountability Data Guide, each section may contain a number of web hyperlinks to additional resources. As appropriate, you may wish to refer to these documents when researching answers to your Perkins IV accountability questions.

Perkins IV Accountability Provisions

Student definitions

Concentrator

Secondary concentrator

Postsecondary concentrator

Participant

Secondary participant

Postsecondary participant

Tech Prep

Secondary Tech Prep student

Postsecondary Tech Prep student

Section 113 Core Indicators

Core Indicator 1: Student Attainment

1S1—Secondary academic attainment—Reading/Language Arts

1S2—Secondary academic attainment—Mathematics

2S1—Secondary technical skill attainment


1P1—Postsecondary technical skill attainment

Core Indicator 2: Credential Attainment

3S1—Secondary school completion

4S1—Secondary school graduation rates

2P1—Postsecondary credential, certificate, or degree

Core Indicator 3: Placement and Retention

5S1—Secondary placement

3P1—Postsecondary education retention or transfer

4P1—Postsecondary placement

Core Indicator 4: Participation in, and Completion of, Nontraditional Programs

6S1—Participation in secondary nontraditional programs

6S2—Completion in secondary nontraditional programs

5P1—Participation in postsecondary nontraditional programs

5P2—Completion in postsecondary nontraditional programs

Nontraditional programs and occupations crosswalks

Section 203 Tech Prep Indicators

Secondary Tech Prep Indicators

1STP1—Enrollment in postsecondary education

1STP2—Enrollment in the same field or major

1STP3—Completion of a State or industry-recognized certification or licensure

1STP4—Completion of courses that award postsecondary credit

1STP5—Enrollment in remedial postsecondary courses

Postsecondary Tech Prep Indicators

1PTP1—Placement in a related field of employment


1PTP2—Completion of a State or industry-recognized certification or licensure

1PTP3—Completion of a two-year degree or certificate program

1PTP4—Completion of a baccalaureate degree program

Cross Measure Issues

Performance indicators (or performance measures)

Performance levels

Disaggregation of performance data

Race/Ethnicity definitions

Special populations and other student categories definitions

Record matching

Reliability

Reporting year

Validity

Monitoring


Introduction and Overview

Purpose: This Accountability Data Guide provides resource materials and background information that State teams can use as a program development and improvement tool for the field. The guide serves two purposes. First, in conjunction with the Appendices, it is a resource for identifying the requirements contained in the legislation and in non-regulatory guidance. Second, it offers a host of suggestions that should be viewed as a best-practices resource that States may (but are not required to) use when implementing Perkins IV. The guide is intended as a reference for reviewing and reporting Perkins IV accountability data, for ensuring that State systems are valid and reliable, and for updating definitions to address changes in State accountability measures.

Tools: The Appendices contain a variety of resources for use by States, including a copy of the legislation itself as well as previously published non-regulatory guidance and Frequently Asked Questions (FAQ). They also provide a number of other items that States may find valuable when reviewing their own programs or conducting program evaluations (such as Appendix O, “Self-Evaluation Checklist,” or Appendix P, “Perkins Monitoring Document”). The tools and the non-regulatory guidance are exactly that: they are not mandated, but their use by States is highly recommended.

Definitions of Terms: The terms and accompanying definitions contained in this guide reflect the “recommended” guidance provided to States at the inception of Perkins IV. Since that time, States have negotiated terms and definitions that vary somewhat from those contained in this guide. It is impractical to list all of the terms and definitions negotiated with each State in this document (please refer to the Perkins IV Report provided to Congress annually to see those). Negotiated terms and definitions are reviewed by States annually, and several States adjust theirs each year through the negotiation process. The “recommended” terms and definitions in this guide DO NOT supersede those negotiated with each State.


This guide opens with an overview of the Perkins IV Core Indicator Framework and definitions of key terms and components. This is followed by a description of recommended measure constructions for each measure contained within the Carl D. Perkins Career and Technical Education Act of 2006. Information provided for each of the performance measures addresses several issues:

Performance Measure Definition—Presents the indicator definition offered in OVAE’s March 2007 nonregulatory guidance and provides definitions of key terms. Where appropriate and needed, definitions are provided for the “sub-indicators” within selected performance measures.

Issues and Considerations—Offers probing questions and considerations that States should address as they implement strategies for collecting and reporting data on each of the performance measures.

Measurement Approaches—Details the definitions of allowable measurement approaches for each indicator, including recommended assessment and data-collection strategies.

Quality Criteria—Defines criteria for performance measurement to ensure sufficient validity and reliability of State performance measurement and reporting.


Section 113(2) Accountability: Indicators of Performance Framework

Purpose of Core Indicator Framework: Defining Sufficient Rigor, Validity, and Reliability

Perkins IV defines major roles for OVAE and States in establishing performance accountability systems for career and technical education. States are given the responsibility of developing performance measures and data-collection systems for the required core indicators. OVAE is responsible for ensuring that State performance measures are sufficiently valid and reliable for setting and adjusting performance levels (sec. 113(b)(3)(A)(i–vi)) and reporting State data (sec. 113(c)(2)(A–B)). The major purpose of the framework is to establish and define State measures, measurement approaches, and quality criteria for each core indicator to ensure validity and reliability within State accountability systems.

Components of Core Indicator Framework

The core indicator framework addresses core indicators and performance measures. These indicators and performance measures are shown below.

Core Indicators and Performance Measures

Core Indicator 1. Student Attainment
1S1: Secondary Academic Attainment–Reading/Language Arts
1S2: Secondary Academic Attainment–Mathematics
2S1: Secondary Technical Skill Attainment
1P1: Postsecondary Technical Skill Attainment

Core Indicator 2. Credential Attainment
3S1: Secondary School Completion
4S1: Secondary School Graduation
2P1: Postsecondary Credential, Certificate, or Degree


Core Indicator 3. Placement and Retention
5S1: Secondary Placement
3P1: Postsecondary Education Retention or Transfer
4P1: Postsecondary Placement

Core Indicator 4. Participation in, and Completion of, Nontraditional Programs
6S1: Participation in Secondary Nontraditional Programs
6S2: Completion of Secondary Nontraditional Programs
5P1: Participation in Postsecondary Nontraditional Programs
5P2: Completion of Postsecondary Nontraditional Programs

As described below, the core indicator framework defines the performance measures and State measurement approaches for each of the indicators. It also suggests the quality criteria for assessing State measurement approaches.

Performance Measure Definitions. The definition of the performance measure for each indicator, including the definition of its numerator and denominator. The Act allows numbers or percentages for the purpose of accountability; States must use a number for the numerator and a number for the denominator in order to have a complete measure definition (a brief computational sketch follows these definitions).

Performance Measurement Approaches. The major State approaches for performance measurement for each performance measure. These approaches include assessment and data-collection strategies.

Quality Criteria for Performance Measurement. The quality criteria for performance measurement to ensure sufficient validity and reliability of State performance measurement and reporting.
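To make the numerator-and-denominator structure concrete, the following minimal Python sketch shows how a complete measure definition yields a reportable number or percentage; the function name and the zero-denominator guard are illustrative assumptions, not prescribed procedures.

def performance_rate(numerator, denominator):
    """Return a Perkins measure as a percentage, or None when no students
    fall in the denominator (guards against division by zero)."""
    if denominator == 0:
        return None
    return 100.0 * numerator / denominator

# Example: 845 concentrators met the standard out of 1,010 in the denominator.
print(performance_rate(845, 1010))  # roughly 83.7, reportable as a percentage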

Although Perkins IV requires that States report information on all students participating in career and technical education (CTE), the OVAE core indicator framework applies only to those students who concentrate at a State-defined level of career and technical education. Throughout the framework, these students are defined as CTE concentrators. The only exceptions are the two indicators for participation in nontraditional programs: 6S1—Participation in Secondary Nontraditional Programs and 5P1—Participation in Postsecondary Nontraditional Programs. These two indicators address all CTE participants—that is, students who have earned at least one credit in any CTE program area.


OVAE has worked cooperatively with States to develop the definition of State-defined threshold level of career and technical education for both secondary and postsecondary CTE. States have the flexibility to develop their own definitions of threshold level of CTE within the context of their own unique career and technical education system, as long as these definitions provide the foundation for the development of high-quality performance measures for all core indicators.


Section 203(e) Tech Prep Indicators of Performance and Accountability Framework

Purpose of Tech Prep Indicator Framework: Defining Sufficient Rigor, Validity, and Reliability

For those States that choose to maintain a dedicated Perkins IV Tech Prep funding stream, the Act requires States to establish a Tech Prep accountability system. States are given the responsibility for developing performance measure definitions and data-collection systems for the required Tech Prep indicators. The major purpose of the Tech Prep accountability framework is to establish and define Tech Prep measures, suggest considerations for implementation, and suggest resources to ensure validity and reliability within the State’s Tech Prep accountability system.

Components of the Tech Prep Indicator Framework

The Tech Prep indicator framework addresses indicators and performance measures. These indicators and performance measures are shown in detail in the Tech Prep section of this Guide.

Tech Prep Indicators and Performance Measures

Secondary Tech Prep Indicators: The number and percentage of secondary education Tech Prep students enrolled in the Tech Prep program who:

1STP1—enroll in postsecondary education;

1STP2—enroll in postsecondary education in the same field or major in which they were enrolled at the secondary level;

1STP3—complete a State or industry-recognized certification or licensure;

1STP4—successfully complete, as a secondary school student, courses that award postsecondary credit at the secondary level;


1STP5—enroll in remedial mathematics, writing, or reading courses upon entering postsecondary education.

Postsecondary Tech Prep Indicators: The number and percentage of postsecondary education Tech Prep students who:

1PTP1—are placed in a related field of employment not later than 12 months after graduation from the Tech Prep program;

1PTP2—complete a State or industry-recognized certification or licensure;

1PTP3—complete a two-year degree or certificate program within the normal time for completion of such program;

1PTP4—complete a baccalaureate degree program within the normal time for completion of such program.


Student Definitions

Secondary Student Definitions

Definitions

CTE Participant: A secondary student who has earned one (1) or more credits in any career and technical education (CTE) program area.

CTE Concentrator: A secondary student who either:

1. Has earned three (3) or more credits in a single CTE program area (e.g., health care or business services); OR

2. Has earned two (2) credits in a single CTE program area, but only in those program areas where two credit sequences at the secondary level are recognized by the State and/or its local eligible recipients.
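As a rough illustration of how a State data system might operationalize these secondary definitions, the Python sketch below assumes a hypothetical record layout (credits earned by CTE program area) and a State-maintained list of recognized two-credit sequences; it is not a prescribed implementation.

def classify_secondary_student(credits_by_area, two_credit_areas):
    """Classify a secondary student under the recommended definitions above.

    credits_by_area  -- hypothetical mapping of CTE program area -> credits earned
    two_credit_areas -- program areas the State recognizes as two-credit sequences
    """
    for area, credits in credits_by_area.items():
        if credits >= 3 or (credits >= 2 and area in two_credit_areas):
            return "CTE concentrator"
    if sum(credits_by_area.values()) >= 1:
        return "CTE participant"
    return "not a CTE participant"

# Example: two credits in health care, a State-recognized two-credit sequence.
print(classify_secondary_student({"health care": 2.0}, {"health care"}))  # CTE concentrator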

Issues and Considerations

Issues and considerations are offered to assist with the development and implementation of data-reporting strategies associated with each of the Perkins IV performance indicators. Specific answers to the questions posed are influenced by circumstances for data collection and reporting in each State. Additional questions may surface as consideration is given to the implementation approach to each performance measure.

How are credits defined?

Carnegie units or some other measurement unit?

Are the credits based on a year? A semester? A trimester?

Are credits earned at an Area CTE center equivalent to the credits earned at the sending high school?

How many courses does one credit represent?

The credit definition may impact the number of courses in a CTE sequence and influence the concentrator definition.


Is there a minimum grade required for a student to earn credit in a course that’s part of a CTE sequence?

Do credits earned in an “applied” or “integrated” academic course count toward earning credits in a CTE program sequence or program area?

Is there an explanation or data file specifications of how the definitions will be operationalized using the State’s secondary data system? For example, if one credit is equal to one course in a two- or three-course sequence, is that included in the definition?

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Next Steps Work Group: http://cte.ed.gov/accountability/nswg.cfm

Peer Collaborative Resource Network (PCRN) Data Quality Institutes: http://cte.ed.gov/accountability/dqi.cfm


Postsecondary/Adult Student Definitions

Definitions

CTE Participant: A postsecondary/adult student who has earned one (1) or more credits in any CTE program area.

CTE Concentrator: A postsecondary/adult student who either:

1. Completes at least 12 academic or CTE credits within a single program area sequence that comprises 12 or more academic and technical credits and terminates in the award of an industry-recognized credential, a certificate, or a degree; OR

2. Completes a short-term CTE program sequence of fewer than 12 credit units that terminates in an industry-recognized credential, a certificate, or a degree.
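The sketch below illustrates one way the two parts of this postsecondary concentrator definition could be applied, assuming hypothetical program metadata (the total credits in the sequence and whether it terminates in a credential, certificate, or degree); it is offered only as an illustration.

def is_postsecondary_concentrator(credits_earned, program_total_credits, awards_credential):
    """Apply the recommended postsecondary concentrator definition above.

    credits_earned        -- academic/CTE credits the student completed in the sequence
    program_total_credits -- total credits in the program sequence (hypothetical metadata)
    awards_credential     -- True if the sequence terminates in an industry-recognized
                             credential, a certificate, or a degree
    """
    if not awards_credential:
        return False
    if program_total_credits >= 12:
        return credits_earned >= 12                     # part 1 of the definition
    return credits_earned >= program_total_credits      # part 2: completed short-term sequence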

Issues and Considerations

Is it possible for a student to be a participant if he/she earns one academic credit and no CTE credits in a CTE program?

Do courses represent more than one credit (in general)?

Does the State have any programs that are 11 credits or fewer (i.e., any programs that would fit into section 2 of the concentrator definition)?

Must a student declare a major to be considered a CTE concentrator?

Does the State’s CTE Concentrator definition reflect the appropriate number of credits based on the State’s credit system (e.g., semester vs. quarter hour)?

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm


Peer Collaborative Resource Network (PCRN) Next Steps Work Group: http://cte.ed.gov/accountability/nswg.cfm

Peer Collaborative Resource Network (PCRN) Data Quality Institutes: http://cte.ed.gov/accountability/dqi.cfm


Secondary Tech Prep Student Definitions

Definitions

Secondary Tech Prep Student: A secondary education student who has enrolled in two courses in the secondary education component of a Tech Prep program.

Issues and Considerations

Is there a clear definition for a “Tech Prep program”?

Are courses defined by Carnegie credits? Or, by some other factor?

If courses are defined by Carnegie credits, are two courses defined as two 1-credit courses? Or, two .5-credit courses?

Resources

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf

Peer Collaborative Resource Network (PCRN) Next Steps Work Group: http://cte.ed.gov/accountability/nswg.cfm

Peer Collaborative Resource Network (PCRN) Data Quality Institutes: http://cte.ed.gov/accountability/dqi.cfm


Postsecondary Tech Prep Student Definitions

Definitions

Postsecondary Tech Prep Student: A student who has:

1. Completed the secondary education component of a Tech Prep program; and

2. Enrolled in the postsecondary education component of a Tech Prep program at an institution of higher education described in clause (i) or (ii) of section 203(a)(1)(B).

Issues and Considerations

How will the postsecondary institution determine if a student has completed the secondary component of a Tech Prep program?

How does the postsecondary institution define the postsecondary education component of a Tech Prep program?

Resources

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf

Peer Collaborative Resource Network (PCRN) Next Steps Work Group: http://cte.ed.gov/accountability/nswg.cfm

Peer Collaborative Resource Network (PCRN) Data Quality Institutes: http://cte.ed.gov/accountability/dqi.cfm


Secondary CTE Measures

1S1: Academic Attainment—Reading/Language Arts

All students who reach State-defined CTE concentration should also achieve mastery of academic knowledge and skills that meet State academic standards. To assess attainment of academic knowledge, Congress requires State and local education agencies to report on the academic outcomes of students who have concentrated in CTE and left secondary education.

Numerator: Number of CTE concentrators who have met the proficient or advanced level on the Statewide high school reading/language arts assessment administered by the State under Section 1111(b)(3) of the Elementary and Secondary Education Act (ESEA) as amended by the No Child Left Behind Act based on the scores that were included in the State’s computation of adequate yearly progress (AYP) and who, in the reporting year, left secondary education.

Denominator: Number of CTE concentrators who took the ESEA assessments in reading/language arts and whose scores were included in the State’s computation of AYP and who, in the reporting year, left secondary education.

1S2: Academic Attainment—Mathematics

Numerator: Number of CTE concentrators who have met the proficient or advanced level on the Statewide high school mathematics assessment administered by the State under Section 1111(b)(3) of the Elementary and Secondary Education Act (ESEA) as amended by the No Child Left Behind Act based on the scores that were included in the State’s computation of AYP and who, in the reporting year, left secondary education.

Denominator: Number of CTE concentrators who took the ESEA assessments in mathematics and whose scores were included in the State’s computation of AYP and who, in the reporting year, left secondary education.
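For illustration only, the following Python sketch applies the numerator and denominator rules above to hypothetical matched concentrator records; the field names are assumptions, not required data elements.

def academic_attainment_measure(matched_records):
    """Compute a 1S1/1S2-style numerator and denominator from matched records.

    Each record is a hypothetical dict for a CTE concentrator with boolean flags:
    left_in_reporting_year, took_esea_assessment, included_in_ayp,
    proficient_or_advanced.
    """
    denominator = [r for r in matched_records
                   if r["left_in_reporting_year"]
                   and r["took_esea_assessment"]
                   and r["included_in_ayp"]]
    numerator = [r for r in denominator if r["proficient_or_advanced"]]
    return len(numerator), len(denominator)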


Issues and Considerations

Are all CTE concentrators who took the assessments included?

Include all students who: (1) achieved the CTE concentrator threshold, (2) who left secondary education in the reporting year, and (3) who took a State academic exam. Include concentrators who graduated early or on time, or who were recorded as dropouts in the reporting year. Note that States may have the option of extending the reporting period into the summer of the reporting year to account for concentrators who completed their program by attending summer school. Exclude concentrators who transferred to another institution.

Is the data for the current reporting year? Or, can the measure include academic test scores from different years?

Report on concentrators who were recorded as having left high school in the reporting year. In practice, this means that the measure may include data on students who took the State academic exam in different years.

The State’s assessment may be administered before a student reaches CTE concentrator status.

Since States may administer their ESEA academic assessments at different times, students may sit for an academic assessment before attaining concentrator status. As a consequence, State academic data may not provide useful information on the contribution that CTE makes to student attainment of academic skills.

How is the State defining “left secondary education”? (Graduation only? Inclusive of dropouts? Inclusive of GED recipients?)

Does the State report 1S1 performance on the State’s reading assessment only? Does the State report 1S1 performance on a combined score from the State’s reading and writing assessment?

States may choose to report data on concentrators’ performance on the State reading assessment alone or as a combination of the State’s reading and writing assessments. Either approach is permissible, so long as the State clearly articulates its testing approach in the State plan. The State also must set consistent performance benchmarks and targets in consultation with its federal Regional Accountability Specialist and in keeping with Department of Education guidance.

Does the State permit students to retake any of its ESEA assessments? How often? When?

States may permit concentrators to retake any of the State’s ESEA assessments, so long as these students are held to the same standard as all other students in the State. In practice, States should apply the same testing methodology and performance expectations that apply to all other students.

Over time, the State may institute different proficiency cut-points and/or testing rules for ELL students or students with disabilities that may affect the comparability of 1S1/1S2 from year to year.

State academic exams may change over time as States modify exams, institute different proficiency cut-points, or impose new testing rules for ELL students or students with disabilities. States should contact their federal Regional Accountability Specialist to discuss any changes to State testing and any accommodations that may be needed to negotiated rates or performance levels.

Is the State able to look at previous years to identify scores for students who took the assessment in a year other than the year they left?

Can the State account for early graduates, late graduates, or students who do not take the exam?

Quality Criteria

A. Alignment to State Academic Standards: Attainment measures and assessment systems are aligned (validity) to State academic content and performance standards.

B. Scope of Attainment Measurement: Attainment measures provide a representative coverage (validity) for reading/language arts and mathematics in State academic standards and assessment systems that have been developed for all students in the State.

C. Timing of Attainment Measurement: Attainment is measured concurrent with or after concentrated participation in CTE.


D. Reliability of Assessment Instruments: Attainment is measured using reliable assessment instruments.

E. Reliability of Assessment Administration: Attainment is measured using assessment instruments that are administered consistently within assessment systems.

F. Student Coverage in Attainment Measurement: Performance measurement reports attainment data for all students reaching State-defined CTE concentration in the State.

Measurement Approaches

The measurement approach for these indicators must be the student academic proficiency assessments administered under the ESEA.

A disaggregated, student-level administrative record match of CTE concentrators with the State’s ESEA reading/language arts and mathematics assessment results.

Under these indicators, a State would not include in the data it reports under the Perkins Act a CTE concentrator who at the time of the administration of the State assessment has not attended public schools within the State for a full academic year, as defined in the State’s Consolidated State Application Accountability Workbook. Likewise, the State would not include this student in its computation of Statewide AYP under the ESEA.

Under these indicators, a State would include in the data it reports under the Perkins Act a CTE concentrator who took the reading/language arts and mathematics assessments in the 10th grade and dropped out in the 11th grade, if the student’s 11th grade year is the reporting year.

Under these indicators, if a State’s Consolidated Application Accountability Workbook allows for the State to report a student’s last score on the reading/language arts and mathematics assessments for accountability purposes under the ESEA, the State may follow the same procedures for reporting the number of CTE concentrators who met the proficient or advanced level of an ESEA assessment. That’s because a State would report the same score for a CTE student as reported under the ESEA.


Calculate the 1S1 and 1S2 measures consistently with the State’s overall ESEA assessment results, and at the same time those results are calculated.

If the State formula allows, include concentrators who:

left secondary education in the reporting year, even if they did not take the assessment that year, including those who dropped out without graduating but still completed the State assessment (numerator and denominator);

were included in the State’s computation of AYP (numerator and denominator); and

who met the proficient or advanced level on the assessment (numerator).

Concentrators who never took the assessment should be excluded from the numerator.

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

EDFacts Initiative and EDEN (Education Data Exchange Network) Submission: http://www.ed.gov/about/inits/ed/edfacts/index.html

EDEN File Specifications for Data Reporting: http://www.ed.gov/about/inits/ed/edfacts/file-specifications.html

2S1: Technical Skill Attainment

All students who reach State-defined CTE concentration should master knowledge and skills that meet State-defined and industry-validated career and technical skill standards. To assess whether CTE concentrators have attained technical skills, State and local education agencies should report on the CTE outcomes of students who have left secondary education.

Numerator: Number of CTE concentrators who passed technical skill assessments that are aligned with industry-recognized standards, if available and appropriate, during the reporting year.

Denominator: Number of CTE concentrators who took the assessments during the reporting year.

Issues and Considerations

Are all CTE concentrators who took the assessment included? (States need only include CTE concentrators who took the skill-attainment assessments.)

Is the data for the current reporting year?

Is the assessment aligned with the curriculum (validity)?

Is there an indication of consistent scoring (reliability)?

Are the measure, instrument, and item(s) standardized (e.g., administration, grading, reporting) throughout the district and State?

Are the results meaningful and timely enough to improve instruction and to inform policy or administrative decisions at the school, district, and/or State level?

Quality Criteria

A. Alignment to Industry Standards: Attainment measures are aligned (validity) to established, industry-validated skill standards—both content and performance standards.

B. Scope of Attainment Measurement: Attainment measures provide a representative coverage (validity) of established, industry-validated skill standards.

C. Timing of Attainment Measurement: Attainment is measured concurrent with or after concentrated participation in CTE.


D. Reliability of Assessment Instruments: Attainment is measured using reliable assessment instruments.

E. Reliability of Assessment Administration: Attainment is measured using assessment instruments that are administered consistently within assessment systems.

F. Student Coverage in Attainment Measurement: Performance measurement reports attainment data for all students reaching State-defined CTE concentration status.

Measurement Approaches

Commercially developed assessments administered by a third-party testing provider.

State occupational licensure examinations.

State developed, industry-validated technical skill assessments based on State-adopted technical skill standards.

Locally developed, industry-validated technical skill assessments based on State-adopted technical skill standards.

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm


3S1: Secondary School Completion

All students who reach concentration of CTE coursework should go on to complete their academic and CTE studies and complete secondary school. To assess students’ success in attaining their educational goals, Congress requires State and local education agencies to report completion rates for secondary students who concentrate in CTE.

Numerator: Number of CTE concentrators who earned a regular secondary diploma, earned a General Education Development (GED) credential as a State-recognized equivalent to a regular high school diploma (if offered by the State) or other State-recognized equivalent (including recognized alternative standards for individuals with disabilities), or earned a proficiency credential, certificate, or degree in conjunction with a secondary school diploma (if offered by the State) during the reporting year.

Denominator: Number of CTE concentrators who left secondary education during the reporting year.

Supplementary Disaggregated Indicator Definitions

The core indicator 3S1: Secondary School Completion as defined in the nonregulatory guidance lists additional or supplementary data indicators that further refine the data collected for secondary school completion of CTE concentrators. OVAE will collect a count of students attaining each of the listed school completion awards through the annual Perkins Consolidated Annual Report (CAR) submission.

Diploma: Document awarded by the State, local school district, or other authorized entity indicating a student has met the graduation conditions of that granting entity and meets the conditions established by Section 1111(b)(2)(C)(vi) of the ESEA.

General Education Development (GED) or other State-recognized equivalent: A general equivalency diploma awarded by a recognized State entity with authority to award an official GED certificate; or a document awarded by the State, local school district, or other authorized entity in lieu of a diploma to students not meeting the criteria of a diploma.

Proficiency Credential, Certificate, or Degree: A technical skill proficiency credential, technical skill or CTE program completion certificate, or CTE program degree granted to students in conjunction with a diploma.

Local secondary agencies in States that do not award proficiency credentials are not required to report on this sub-indicator; State CTE administrators will inform OVAE about the types of diplomas and credentials their State offers. Secondary educators in States offering proficiency credentials are required to collect data on student attainment of these credentials and should consult their State CTE administrative staff or State reporting guidelines to clarify how the data should be collected.

Issues and Considerations

Note: 3S1: School Completion is a different standard from 4S1: School Graduation, which is defined by the State for annual reporting purposes through EDFacts/EDEN.

Is a Statewide database being used to match or identify CTE concentrators as a subset of all secondary school completers?

How will the State identify concentrators who left without earning a diploma or equivalent?

Does the State collect regular diploma data only?

Is the State collecting data on students obtaining a GED from a reliable Statewide database?

Is the State collecting data on students obtaining a proficiency credential, certificate, or degree from a reliable Statewide database?

Quality Criteria

A. Alignment of Completion Measure to State Graduation and Completion Requirements: Completion measure reports only those students meeting all State requirements for a regular secondary school diploma, GED, or other State-recognized equivalent.


B. Scope of Completion Measurement: Completion measure includes all students reaching CTE concentration, whether or not they received a diploma, GED, or State-recognized equivalent.

C. Timing of Completion Measurement: Completion is measured at the same time for all students reaching CTE concentration.

D. Reliability of Completion Measurement: Completion measurement is based on consistent definitions of State requirements and is reported using standardized methods for calculating rates.

E. Student Coverage in Completion Measurement: Performance measurement reports completion data for all students reaching CTE concentration.

Measurement Approaches

Use a disaggregated, student-level administrative record match to identify CTE concentrators as a subset of all high school completers within the State’s high school completion database.

Use a follow-up survey of students who completed high school in the reporting year, including only those who were secondary CTE concentrators.

As part of the State’s measurement approach, count CTE concentrators who:

earned a regular high school diploma, a GED, or a State-recognized equivalent during the reporting year;

earned a technical skill proficiency credential, technical skill or CTE program completion certificate, or CTE program degree granted to students in conjunction with a diploma; and

left secondary education during the reporting year.
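The following sketch illustrates how these counting rules might be applied to hypothetical records of concentrators who left secondary education in the reporting year, producing a nonduplicated completer count plus the disaggregated counts reported in the CAR; the record layout is an assumption.

def secondary_completion_measure(leavers):
    """Compute a 3S1-style numerator, denominator, and disaggregated counts.

    'leavers' holds hypothetical records for CTE concentrators who left secondary
    education in the reporting year; each record's 'awards' set may contain
    'diploma', 'ged_or_equivalent', and/or 'proficiency_credential'.
    """
    denominator = len(leavers)
    numerator = sum(1 for c in leavers if c["awards"])   # nonduplicated completer count
    disaggregated = {award: sum(1 for c in leavers if award in c["awards"])
                     for award in ("diploma", "ged_or_equivalent", "proficiency_credential")}
    return numerator, denominator, disaggregated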

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm


Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm


4S1: Student Graduation Rates

Numerator: Number of CTE concentrators who, in the reporting year, were included as graduated in the State’s computation of its graduation rate as described in Section 1111(b)(2)(C)(vi) of the ESEA.

Denominator: Number of CTE concentrators who, in the reporting year, were included in the State’s computation of its graduation rate as defined in the State’s Consolidated Accountability Plan pursuant to Section 1111(b)(2)(C)(vi) of the ESEA.

Issues and Considerations

Does the State have the capability to match CTE concentrators within a State database with records of all secondary school students?

Is the State able to apply its ESEA graduation rate computation to only those concentrators who graduated, instead of to all graduates?

Quality Criteria

A. Alignment of Completion Measure to State Graduation Requirements: Graduation measure reports only those students meeting all State requirements for a regular secondary school diploma as described in Section 1111(b)(2)(C)(vi) of the Elementary and Secondary Education Act (ESEA).

B. Scope of Completion Measurement: Graduation measure includes all students reaching CTE concentration, whether or not they received a regular secondary school diploma.

C. Timing of Completion Measurement: Graduation is measured at the same time for all students reaching CTE concentration.

D. Reliability of Completion Measurement: Graduation measurement is based on consistent definitions of State requirements and is reported using standardized methods for calculating rates.

E. Student Coverage in Completion Measurement: Performance measurement reports graduation data for all students reaching CTE concentration.


Measurement Approaches

A disaggregated, student-level administrative record match of CTE concentrators with the State’s student graduation rate database.

Calculate the 4S1 measure consistently with the State’s overall ESEA graduation rate results, and at the same time those results are calculated.

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

EDFacts Initiative and EDEN (Education Data Exchange Network) Submission: http://www.ed.gov/about/inits/ed/edfacts/index.html

EDEN File Specifications for Data Reporting: http://www.ed.gov/about/inits/ed/edfacts/file-specifications.html


5S1: Secondary Placement

Secondary CTE program concentrators should obtain skills that will prepare them for successful transition to postsecondary education or advanced training, employment, and/or military service.

Numerator: Number of CTE concentrators who left secondary education and were placed in postsecondary education or advanced training, in military service, or in employment in the second quarter following the program year in which they left secondary education (i.e., the unduplicated placement status of CTE concentrators who graduated by June 30, 2007 would be assessed between October 1, 2007 and December 31, 2007).

Denominator: Number of CTE concentrators who left secondary education during the reporting year.
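To illustrate the placement window, the sketch below computes the second calendar quarter following the program year in which a concentrator left secondary education; it reproduces the June 30, 2007 example above and is offered only as a worked illustration, not a required procedure.

from datetime import date, timedelta

def second_quarter_window(program_year_end):
    """Start and end of the second calendar quarter following the program year
    in which a concentrator left secondary education (hypothetical helper).

    For a program year ending June 30, 2007 this returns October 1, 2007
    through December 31, 2007, matching the example above.
    """
    q = (program_year_end.month - 1) // 3        # 0-based quarter of the end date
    target = q + 2                                # two quarters later
    year = program_year_end.year + target // 4
    target %= 4
    start = date(year, target * 3 + 1, 1)
    if target == 3:                               # October-December quarter
        end = date(year, 12, 31)
    else:
        end = date(year, target * 3 + 4, 1) - timedelta(days=1)
    return start, end

print(second_quarter_window(date(2007, 6, 30)))   # October 1, 2007 through December 31, 2007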

Supplementary Disaggregated Indicator Definitions

The core indicator 5S1: Secondary Placement as defined in the nonregulatory guidance lists additional or supplementary data indicators that further refine the data collected for secondary placement of CTE concentrators. OVAE will collect a count of students who are placed in each of the listed post-high school locations through the Perkins Consolidated Annual Report (CAR) annual submission.

Postsecondary Education or Advanced Training: Continuation of formal education and training in a recognized or accredited postsecondary institution or training program following secondary school exit.

Employment: Engagement in lawful activities resulting in compensation.

Military: Active enlistment in a branch of the United States armed forces (Army, Navy, Marines, Air Force, Coast Guard, as well as Army National Guard, Air National Guard or Active Reserves).


Issues and Considerations

Is the State collecting data on all concentrators from a survey or matching concentrator records to a reliable national or Statewide database?

If using a survey, are all concentrators sent surveys, has the survey been field tested for validity, and is the response rate greater than 50 percent?

Can the State collect data on each of the sub-indicators of 1) postsecondary education or advanced training, 2) military, and 3) employment?

Is the State tracking all concentrators, or only those who completed their CTE program?

Is the State tracking all concentrators, or only those who earned a secondary diploma/equivalent?

How will the State identify who left secondary education, if including graduates and nongraduates?

Can the State track placement in apprenticeship programs?

Quality Criteria

A. Alignment to Definitions of Three Types of Placement: Surveys, placement forms, or administrative record exchanges use consistent definitions of the three types of placement—postsecondary education or advanced training, employment, and military.

B. Timing of Placement Measurement: Placement is measured during the same measurement intervals for all concentrators (e.g., three months, six months after concentrators leave secondary education).

C. Reliability of Placement Measurement: Placement measurement reports placement data based on consistent surveys, forms, and administrative records and measurement procedures.

D. Student Coverage in Placement Measurement: Placement measurement attempts to gather data on all concentrators who graduate, across all CTE programs within the State.


E. Response/Match Capacity: Placement measurement achieves acceptable and consistent survey response rates or sufficient match capacity.

F. Nonduplicated Counts: Placement measurement collects and reports placement information for each type of placement, but reports only nonduplicated counts in calculating the overall performance level.

Measurement Approaches

Administrative record match using student Social Security Number or other unique student identifier to electronically track secondary graduates as they transition to further education, employment, or the military. Data sources include State postsecondary education record, Unemployment Wage Record information, Federal Employment Data Exchange System (FEDES), National Student Clearinghouse, and Wage Record Interchange System (WRIS).

Mail or telephone student follow-up survey of all CTE program completers at the end of the State-designated placement period.

Low response rates are a challenge when conducting follow-up surveys of students. Initial response rates of less than 40 percent are not uncommon for the first round of a follow-up survey. Consider taking a number of steps to increase responses, such as:

Mailing a postcard or e-mail two weeks prior to the survey to check for invalid addresses and prepare students;

Providing an online, web-based survey available to various mobile technologies;

Offering a sweepstakes prize eligible to all students responding by a given date;

Including a coupon for free or reduced merchandise from a local business(es);

Sending a reminder postcard or e-mail to nonrespondents a week following the deadline;

Calling nonrespondents;

Requesting forwarding information from the person answering the phone; or


Using State or national databases to track students who may have moved within or outside the State.

For either measurement approach, include all concentrators who left secondary education and for whom there is documentation of placement in postsecondary education or advanced training, employment, or the military during the second quarter following the end of the program year in which they left secondary education. Although data must be reported for each disaggregated indicator, “total placement” is the only level that is negotiated for performance targets.
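As a minimal illustration of the administrative record-match approach described above, the sketch below intersects a set of concentrator identifiers with hypothetical extracts from placement data sources and returns a nonduplicated placement count along with disaggregated counts; the data structures are assumptions, not required file formats.

def match_placements(concentrator_ids, postsecondary_ids, wage_record_ids, military_ids):
    """Unduplicated placement count via administrative record matching.

    All arguments are sets of unique student identifiers; the last three are
    hypothetical extracts from a postsecondary enrollment file, UI wage records,
    and a military enlistment match (e.g., FEDES). A student found in more than
    one source is counted only once in the overall placement count.
    """
    placed = concentrator_ids & (postsecondary_ids | wage_record_ids | military_ids)
    by_category = {
        "postsecondary_or_advanced_training": len(concentrator_ids & postsecondary_ids),
        "employment": len(concentrator_ids & wage_record_ids),
        "military": len(concentrator_ids & military_ids),
    }
    return len(placed), len(concentrator_ids), by_category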

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

Federal Employment Data Exchange System (FEDES): http://www2.ubalt.edu/jfi/fedes/index2.cfm

National Student Clearinghouse: http://www.studentclearinghouse.org

Wage Record Interchange System (WRIS): http://www.doleta.gov/performance/WRIS.cfm

Caution

The sharing of student record information for administrative record matching may be limited or restricted given local or State interpretation of the Family Educational Rights and Privacy Act (FERPA). Consult your State’s interpretation of the FERPA guidance to determine how your State approaches the issue of matching student data with other agencies. Guidance can be found at:


Family Educational Rights and Privacy Act: http://www.ed.gov/policy/gen/guid/fpco/ferpa/index.html


6S1: Participation in Secondary Nontraditional Programs

All secondary students should have the opportunity to pursue studies in a CTE program area of their choice, including those that are nontraditional for their gender. To ensure all students have access to CTE programs, Congress requires State and local educational agencies to track student participation in and completion of career and technical education programs that lead to nontraditional training and employment.

Numerator: Number of CTE participants from underrepresented gender groups who participated in a program that leads to employment in nontraditional fields during the reporting year.

Denominator: Number of CTE participants who participated in a program that leads to employment in nontraditional fields during the reporting year.

6S2: Completion in Secondary Nontraditional Programs

Numerator: Number of CTE concentrators from underrepresented gender groups who completed a program that leads to employment in nontraditional fields during the reporting year.

Denominator: Number of CTE concentrators who completed a program that leads to employment in nontraditional fields during the reporting year.

Supplementary Definitions

Nontraditional Training and Employment: Occupations or fields of work, including emerging high-skill occupations, for which individuals from one gender comprise less than 25 percent of the individuals employed in each such occupation or field of work.


Nontraditional CTE Program: A CTE program area that addresses occupational areas in which underrepresented gender groups comprise less than 25 percent of employed persons.

Crosswalks of Occupations and CTE Programs: A list that associates occupations or fields of work identified as nontraditional in the labor market with the CTE program areas that prepare students for entry into these fields.
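The sketch below illustrates one way a State might apply the 25 percent rule through a crosswalk, assuming hypothetical inputs: a mapping of CTE programs to the occupations they lead to, and gender shares of employment for each occupation. It is a rough sketch, not a substitute for the NAPE crosswalks referenced below.

def nontraditional_programs(crosswalk, gender_shares):
    """Flag CTE programs as nontraditional using the 25 percent rule.

    crosswalk     -- hypothetical mapping of CTE program area -> occupations it leads to
    gender_shares -- mapping of occupation -> {"male": share, "female": share}
    A gender is underrepresented in a program when it holds less than 25 percent
    of employment in an occupation the program prepares students for.
    """
    flags = {}
    for program, occupations in crosswalk.items():
        underrepresented = set()
        for occupation in occupations:
            for gender, share in gender_shares[occupation].items():
                if share < 0.25:
                    underrepresented.add(gender)
        if underrepresented:
            flags[program] = underrepresented
    return flags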

Issues and Considerations

Which students should be included in the denominator for this measure?

The denominator of this measure should include all students, male or female, who participate in a CTE program area designated as nontraditional by your State. Typically, a CTE field in which one gender makes up less than 25 percent of employed workers is considered “nontraditional.”

Is the State aligning with nontraditional cluster crosswalks?

Consult with State CTE administrators to identify national or State data sources on occupations that are out of balance in the workplace, and develop a consistent set of guidelines to assist LEAs in determining which CTE courses and program areas are nontraditional, as identified in the National Alliance for Partnerships in Equity (NAPE) crosswalks.

How is “nontraditional” participation defined?

Nontraditional CTE participation describes a student who enrolls in a CTE program area or course that prepares individuals for entry into a nontraditional occupation.

Quality Criteria

A. Accurate Classification of Programs as Nontraditional: Nontraditional programs are accurately defined at the State and local levels based on a State or national crosswalk between programs and State or national occupational data.


B. Reliability of Participation or Completion Measurement: Student participation or completion of nontraditional programs is measured based on consistent definitions of, and criteria for, participation.

C. Student Coverage in Reporting Nontraditional Programs: All secondary CTE participants are measured and reported if they participate in nontraditional programs.

Measurement Approaches

A disaggregated, student-level administrative record match of CTE concentrators from underrepresented gender groups with the State’s CTE program completion database for programs leading to employment in nontraditional fields.

A follow-up survey of CTE concentrators from underrepresented gender groups who completed a CTE program leading to employment in nontraditional fields in the reporting year.

For 6S1, include only those secondary participants who:

met the definition of participant (numerator and denominator), and

participated in a program that leads to employment in a nontraditional occupation (as identified by NAPE) (numerator and denominator), and

are from the underrepresented gender group for the nontraditional occupation (numerator only).

For 6S2, include only those secondary concentrators (male and female) who:

completed a CTE program that leads to employment in a nontraditional field (include students who received a degree, certificate, or credential) (numerator and denominator), and

are from the underrepresented gender group for the nontraditional occupation (numerator only).

Note: 6S2 does not require a student to graduate in order to be included in the numerator or denominator.
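For illustration, the sketch below computes 6S1- and 6S2-style numerators and denominators from hypothetical participant and completer records flagged by underrepresented gender group; it is not a prescribed calculation.

def nontraditional_rates(participants, completers):
    """Compute 6S1- and 6S2-style numerators and denominators.

    participants -- hypothetical records of CTE participants in programs leading to
                    nontraditional fields, each with an 'underrepresented_gender' flag
    completers   -- records of CTE concentrators who completed such a program,
                    with the same flag
    """
    s1 = (sum(1 for p in participants if p["underrepresented_gender"]), len(participants))
    s2 = (sum(1 for c in completers if c["underrepresented_gender"]), len(completers))
    return s1, s2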


Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

Peer Collaborative Resource Network (PCRN) Perkins IV Crosswalks: http://cte.ed.gov/docs/nswg/Perkins_IV_Crosswalks_on_PCRN.pdf

National Alliance for Partnerships in Equity: http://www.napequity.org/page.php?24


Postsecondary/Adult CTE Measures

1P1: Technical Skill Attainment

All students who reach State-defined CTE concentration should master knowledge and skills that meet State-defined and industry-validated career and technical skill standards. To assess whether CTE concentrators have attained technical skills, State and local postsecondary institutions should report on the CTE outcomes of students who have stopped program participation at the postsecondary level.

Numerator: Number of CTE concentrators who passed technical skill assessments that are aligned with industry-recognized standards, if available and appropriate, during the reporting year.

Denominator: Number of CTE concentrators who took technical skill assessments during the reporting year.

Issues and Considerations

Is the assessment aligned with the curriculum (validity)?

Is there an indication of consistent scoring (reliability)?

Is the measure, instrument, or item(s) standardized (e.g., administration, grading, reporting) throughout the district or State?

Are the results meaningful and timely enough to improve instruction or to inform policy and administrative decisions at the postsecondary institution and/or State level?

Quality Criteria

A. Alignment to Industry Standards: Attainment measures are aligned (validity) to established, industry-validated skill standards—both content and performance standards.

B. Scope of Attainment Measurement: Attainment measures provide a representative coverage (validity) of established, industry-validated skill standards.


C. Timing of Attainment Measurement: Attainment is measured concurrent with or after concentrated participation in CTE.

D. Reliability of Assessment Instruments: Attainment is measured using reliable assessment instruments.

E. Reliability of Assessment Administration: Attainment is measured using assessment instruments that are administered consistently within assessment systems.

F. Student Coverage in Attainment Measurement: Performance measurement reports attainment data for all students reaching State-defined CTE concentration status.

Measurement Approaches

Commercially developed assessments administered by a third-party testing provider.

State occupational licensure examinations.

State developed, industry-validated technical skill assessments based on State-adopted technical skill standards.

Locally developed, industry-validated technical skill assessments based on State-adopted technical skill standards.

Some other “proxy” measurement that meets the criteria of “valid and reliable” and provides evidence of skill attainment. This should be identified in the State plan, reported on in the annual CAR narrative, and used only after approval by the Department.

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm


2P1: Credential, Certificate, or Degree

All students who reach concentration of CTE coursework should go on to complete their academic and CTE studies and earn a postsecondary credential, certificate, or degree. To assess student success in attaining their education goals, Congress requires State and local postsecondary institutions to report credential, certificate, or degree completion rates for postsecondary students who concentrate in CTE.

Numerator: Number of CTE concentrators who received an industry-recognized credential, certificate, or a degree during the reporting year.

Denominator: Number of CTE concentrators who left postsecondary education during the reporting year.

Issues and Considerations

How is “leaving postsecondary education” defined? How long does a student need to be gone to identify him/her as having left?

Is your State including credentials students obtain from external organizations (i.e., from outside of the community college system)?

If so, how will your State obtain credential/certificate information from a Statewide or national database or certified grantor?

Quality Criteria

A. Alignment of Completion Measure to Standards: Completion measure is aligned to program-defined academic and industry-validated skill standards and related program standards for receiving credentials, certificates, or degrees.

B. Scope of Completion Measurement: Completion measure includes all students reaching concentration, whether or not they receive credentials, certificates, or degrees.

C. Timing of Completion Measurement: Completion is measured at the same time for all students reaching CTE concentration.


D. Reliability of Completion Measurement: Completion measurement is based on consistent definitions of State requirements and is reported using standardized methods for calculating rates.

E. Student Coverage in Completion Measurement: Performance measurement reports completion data for all students reaching CTE concentration.

Measurement Approaches

Include only those postsecondary concentrators who:

received a degree, certificate, or industry-recognized credential during the reporting year (for numerator), and

did not re-enroll in postsecondary education or advanced training for a specified period of time to be determined by the State (for numerator and denominator).
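
As a purely illustrative sketch of how these criteria combine into a 2P1 rate, consider the following; the field names and data are hypothetical, and the exit window used to decide that a student has left postsecondary education is a State policy choice, not something specified here.

```python
# Illustrative 2P1 calculation; data, field names, and the exit rule are hypothetical.
reporting_year = ("2008-07-01", "2009-06-30")

concentrators = [
    # credential: earned an industry-recognized credential, certificate, or degree during
    # the reporting year; reenrolled: re-enrolled within the State-specified window after
    # last enrollment (a State would derive this flag from a later enrollment file).
    {"id": "A1", "credential": True,  "reenrolled": False},
    {"id": "B2", "credential": False, "reenrolled": False},
    {"id": "C3", "credential": False, "reenrolled": True},   # still enrolled; excluded
]

# Denominator: concentrators who left postsecondary education during the reporting year.
leavers = [s for s in concentrators if not s["reenrolled"]]

# Numerator: leavers who received a credential, certificate, or degree during the reporting year.
completers = [s for s in leavers if s["credential"]]

print("2P1:", 100.0 * len(completers) / len(leavers) if leavers else None)  # 50.0 here
```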

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Student Clearinghouse: http://www.studentclearinghouse.org/


3P1: Student Retention or Transfer

Numerator: Number of CTE concentrators who

1. remained enrolled in their original postsecondary institution or transferred to another two- or four-year postsecondary institution during the reporting year; AND

2. were enrolled in postsecondary education in the fall of the previous reporting year.

Denominator: Number of CTE concentrators who were enrolled in postsecondary education in the fall of the previous reporting year and who did not earn an industry-recognized credential, a certificate, or a degree in the previous reporting year.

Issues and Considerations

What methods is the State using to track further education? Can the State determine if the student is continuing in the same institution or another institution within the State’s postsecondary system?

Does the State have access to the National Student Clearinghouse?

On what schedule will the State report these results? Data are often not available in time for the December CAR report.

If using a survey, are all concentrators sent surveys, has the survey been field tested for validity, and is the response rate greater than 50 percent?

Quality Criteria

A. Alignment to Definitions of Retention or Transfer: Surveys, placement forms, or administrative record exchanges use consistent definitions for retention or transfer.

B. Reliability of Retention or Transfer Measurement: Retention or transfer measurement reports placement data based on consistent surveys, forms, administrative records, and measurement procedures.

C. Student Coverage in Retention or Transfer Measurement: Retention or transfer measurement attempts to gather data for all program concentrators who transferred or are retained in a postsecondary educational program.

D. Response/Match Capacity: Retention or transfer measurement achieves acceptable and consistent survey-response rates or sufficient match capacity.

Measurement Approaches

Currently, there are two different ways to operationalize measure 3P1.

1. States can identify the concentrators enrolled during the reporting year and then look back to see if they were enrolled in the fall of the prior year. In this scenario, the base group of concentrators will remain consistent with all the other measures, but the measure looks back instead of forward. In addition, it is not a good measure of retention; instead, it determines, “of those enrolled now, what percentage were enrolled the fall before.”

2. States can identify the concentrators enrolled in the fall of the year prior to the reporting year, and then determine what percentage of those students were enrolled at an institution at any time during the reporting year. This scenario does look at retention/transfer, and it looks forward in time. However, the base group of concentrators under study will not match any other measure.

States may wish to consider a third option, which will allow this measure to be more consistent with other measures.

3. Identify all concentrators enrolled during the reporting year and then determine what percentage of those students were enrolled at any time during the following year. This construction allows the base group of concentrators to remain consistent with other measures, rather than limiting it to only fall concentrators or selecting a group of concentrators from a prior year.

If a State were to select scenario three, the issue of reporting year arises as it does with other measures. If the State identifies its group of concentrators in a year, it will need to wait until the following academic year has ended in order to determine whether students were enrolled anywhere during that year. In addition, it will take some time to obtain that data once the following year has ended. For example, the State would identify its concentrator group for the 2008–09 academic year and wait to see if those students attended a postsecondary institution during the 2009–10 academic year. Data will likely become available for analysis in fall 2010, and the results could be reported in the December 2010 CAR.

Timeline Example for option #3 to measure retention or transfer:

2008–2009: Identify postsecondary concentrator group to measure for retention or transfer.

2009–2010: Monitor and/or follow the 2008–2009 retention or transfer concentrator group to determine if any students in the group attended a postsecondary institution during the 2009–2010 academic year.

December 2010 CAR: Report the retention or transfer rate for the 2008–2009 postsecondary concentrator group in the December 2010 CAR submission.
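
A minimal sketch of option #3 follows. The identifiers and data are hypothetical; in practice the enrollment check would come from the State's postsecondary data system and/or a National Student Clearinghouse match, and the denominator exclusion mirrors the 3P1 denominator definition (students who already earned a credential, certificate, or degree are not counted).

```python
# 3P1, option #3: concentrators identified in the reporting year (2008-09) are checked for
# enrollment at any institution at any time during the following year (2009-10).
# Data and identifiers are illustrative only.

concentrators_2008_09 = [
    {"id": "A1", "earned_award_2008_09": False},
    {"id": "B2", "earned_award_2008_09": True},   # earned a credential/degree; excluded from denominator
    {"id": "C3", "earned_award_2008_09": False},
]

# Hypothetical result of a record match against 2009-10 enrollment files.
enrolled_2009_10 = {"A1"}

denominator = [s for s in concentrators_2008_09 if not s["earned_award_2008_09"]]
numerator = [s for s in denominator if s["id"] in enrolled_2009_10]

print("3P1 retention/transfer rate:",
      100.0 * len(numerator) / len(denominator) if denominator else None)  # 50.0 here
```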

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Student Clearinghouse: http://www.studentclearinghouse.org/


4P1: Student Placement

All students who complete a postsecondary CTE program should obtain skills that will prepare them for successful transition to employment, military service, and/or apprenticeship programs. Congress requires State and local postsecondary institutions to report the postsecondary outcomes of students who complete a postsecondary program in the reporting year.

Numerator: Number of CTE concentrators who were placed or retained in employment, or placed in military service or apprenticeship programs in the second quarter following the program year in which they left postsecondary education (i.e., unduplicated placement status for CTE concentrators who graduated by June 30, 2007 would be assessed between October 1, 2007 and December 31, 2007).

Denominator: Number of CTE concentrators who left postsecondary education during the reporting year.
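
The “second quarter following the program year” can be computed mechanically. The small sketch below assumes a July 1 through June 30 program year, as in the parenthetical example above; the function name is simply illustrative.

```python
from datetime import date

def second_quarter_after_program_year(program_year_end_year):
    """Return the start and end dates of the second calendar quarter following a program
    year that ends June 30 of the given year (e.g., 2007 -> Oct 1, 2007 through Dec 31, 2007)."""
    return date(program_year_end_year, 10, 1), date(program_year_end_year, 12, 31)

start, end = second_quarter_after_program_year(2007)
print(start, end)  # 2007-10-01 2007-12-31
```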

Supplementary Disaggregated Indicator Definitions

The core indicator 4P1: Student Placement, as defined in the nonregulatory guidance, lists additional or supplementary data indicators that further refine the data collected for postsecondary placement of CTE concentrators. OVAE will collect a count of students who are placed in each of the listed placement locations through the Perkins Consolidated Annual Report (CAR) annual submission.

Employment: Engagement in lawful activities resulting in compensation.

Military: Active enlistment in a branch of the United States armed forces (Army, Navy, Marines, Air Force, Coast Guard, as well as Army National Guard, Air National Guard or Active Reserves).

Apprenticeship: Participation in a federal or State formal preparation program leading to recognition as a journey-level worker in a skilled trade and acknowledged by union and nonunion employers.


Issues and Considerations

Is the State collecting data from a survey or from a reliable national or Statewide database?

If using a survey, are all concentrators sent surveys, has the survey been field tested for validity, and is the response rate greater than 50 percent?

Can the State collect data on each of the subindicators of 1) employment or retention in employment, 2) military, and 3) apprenticeship?

Does the State plan to track all concentrators, or only those who completed their CTE program?

Can the State track placement in apprenticeship programs?

Quality Criteria

A. Alignment to Definitions of Three Types of Placement: Surveys, placement forms, or administrative record exchanges use consistent definitions of the three types of placement: employment, military, and apprenticeship.

B. Timing of Placement Measurement: Placement is measured during the same intervals for all concentrators (e.g., three months, six months after concentrators leave postsecondary education).

C. Reliability of Placement Measurement: Placement measurement reports placement data based on consistent surveys, forms, and administrative records and measurement procedures.

D. Student Coverage in Placement Measurement: Placement measurement attempts to gather data for all program concentrators who graduate from all CTE programs within the State.

E. Response/Match Capacity: Placement measurement achieves acceptable and consistent survey-response rates or sufficient match capacity.

F. Non-Duplicated Counts: Placement measurement collects and reports placement information for each type of placement, but reports only nonduplicated counts in calculating the overall performance level.


Measurement Approaches

Administrative record match using student Social Security Number or other unique student identifier to electronically track students who have left postsecondary education as they transition to further education, employment, or the military. Data sources include State postsecondary education records, Unemployment Wage Record information, Federal Employment Data Exchange System (FEDES), National Student Clearinghouse, and Wage Record Interchange System (WRIS).

Mail or telephone student follow-up survey of all CTE program completers at the end of the State-designated placement period.

Low response rates are a challenge when conducting follow-up surveys of students. Initial response rates of less than 40 percent are not uncommon for the first round of a follow-up survey. Consider taking a number of steps to increase responses, such as:

Mailing a postcard or e-mail two weeks prior to the survey to check for invalid addresses and prepare students;

Providing an online, web-based survey available to various mobile technologies;

Offering a sweepstakes prize eligible to all students responding by a given date;

Including a coupon for free or reduced merchandise from a local business(es);

Sending a reminder postcard or e-mail to nonrespondents a week following the deadline;

Calling nonrespondents;

Requesting forwarding information from the person answering the phone; or

Using State or national databases to track students who may have moved within or outside the State.

Include in the measurement approach only those postsecondary concentrators who:

completed their CTE program during the reporting year (completion includes those who received a degree, diploma, or credential) (include if State chooses to include only students who have completed their program) (numerator and denominator) and

left postsecondary education (specify how students are identified as having left) (numerator and denominator) and

were identified as 1) placed or retained in employment, 2) placed in military service, or 3) placed in an apprenticeship program in the second quarter following the end of the reporting year (numerator only).
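
To illustrate the non-duplicated count requirement (quality criterion F above), here is a small hypothetical sketch: a student placed in both employment and an apprenticeship appears in each supplementary count but only once in the overall 4P1 numerator. Field names and data are invented for illustration.

```python
# Each record lists every placement type found for a student who left postsecondary education.
placements = [
    {"id": "A1", "types": {"employment"}},
    {"id": "B2", "types": {"employment", "apprenticeship"}},  # placed in two categories
    {"id": "C3", "types": set()},                             # no placement found
]

# Supplementary counts may include the same student under more than one type.
supplementary = {t: sum(1 for p in placements if t in p["types"])
                 for t in ("employment", "military", "apprenticeship")}

# The overall 4P1 numerator counts each placed student only once.
numerator = sum(1 for p in placements if p["types"])
denominator = len(placements)

print(supplementary)                       # {'employment': 2, 'military': 0, 'apprenticeship': 1}
print("4P1:", 100.0 * numerator / denominator)   # about 66.7 with the sample data above
```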

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm

Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

Federal Employment Data Exchange System (FEDES): http://www2.ubalt.edu/jfi/fedes/index2.cfm

National Student Clearinghouse: http://www.studentclearinghouse.org

Wage Record Interchange System (WRIS): http://www.doleta.gov/performance/WRIS.cfm


5P1: Participation in Postsecondary Nontraditional Programs

Each individual should have the opportunity to pursue studies in a CTE program area of their choice, including those that are nontraditional for their gender. To ensure individuals have access to CTE programs, Congress requires State and local educational agencies to record student participation in, and completion of, career and technical education programs that lead to nontraditional training and employment.

Numerator: Number of CTE participants from underrepresented gender groups who participated in a program that leads to employment in nontraditional fields during the reporting year.

Denominator: Number of CTE participants who participated in a program that leads to employment in nontraditional fields during the reporting year.

5P2: Completion in Postsecondary Nontraditional Programs

Numerator: Number of CTE concentrators from underrepresented gender groups who completed a program that leads to employment in nontraditional fields during the reporting year.

Denominator: Number of CTE concentrators who completed a program that leads to employment in nontraditional fields during the reporting year.

Supplementary Definitions

Nontraditional Training and Employment: Occupations or fields of work and other emerging high-skill occupations, for which individuals from one gender comprise less than 25 percent of the individuals employed in each such occupation or field of work.


Nontraditional CTE Program: A CTE program area that addresses occupational areas in which underrepresented gender groups comprise less than 25 percent of employed persons.

Crosswalks of Occupations and CTE Programs: A list that associates occupations or fields of work that are identified as nontraditional in the labor market with the CTE program areas that prepare students for entry into these fields.
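
A minimal sketch of applying the 25 percent rule above to occupational employment data follows. The occupation names and employment shares are invented; a State would use Bureau of Labor Statistics or State labor market data.

```python
# Occupation -> gender shares of employment; values are illustrative only.
employment_shares = {
    "Carpenters":        {"F": 0.03, "M": 0.97},
    "Registered nurses": {"F": 0.91, "M": 0.09},
    "Accountants":       {"F": 0.60, "M": 0.40},
}

THRESHOLD = 0.25  # "less than 25 percent" test from the definition above

def underrepresented_gender(shares):
    """Return the gender making up less than 25 percent of employment, or None if neither does."""
    for gender, share in shares.items():
        if share < THRESHOLD:
            return gender
    return None

nontraditional = {}
for occupation, shares in employment_shares.items():
    gender = underrepresented_gender(shares)
    if gender is not None:
        nontraditional[occupation] = gender

print(nontraditional)  # {'Carpenters': 'F', 'Registered nurses': 'M'}
```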

Issues and Considerations

Which students should be included in the denominator for this measure?

The denominator of this measure should include all students, male or female, who participate in or complete a CTE program area or course designated as nontraditional by your State.

Is the State aligning with nontraditional cluster crosswalks?

Consult with State CTE administrators to identify national or State data sources that can be used to identify occupations that are out of balance in the workplace, and develop a consistent set of guidelines to assist the LEA in identifying CTE courses and program areas that are nontraditional as identified in the National Alliance for Partnerships in Equity (NAPE) crosswalks.

How is “nontraditional” participation defined?

Nontraditional CTE participation describes a student who enrolls in a CTE program area or course that prepares individuals for entry into a nontraditional occupation.

Are 5P2 data reported on CTE program completion and not secondary school completion?

Nontraditional CTE completion describes a student who fulfills a set of State-defined criteria that signifies that he or she has mastered a set of academic and/or technical skills to prepare him or her for future education and career success. Consult the definitions developed by your State to determine what constitutes completion in your State.


Quality Criteria

A. Accurate Classification of Programs as Nontraditional: Nontraditional programs are accurately defined at the State and local levels based on a State or national crosswalk between programs and State or national occupational data.

B. Reliability of Participation or Completion Measurement: Student participation or completion of nontraditional programs is measured based on consistent definitions of, and criteria for, participation.

C. Student Coverage in Reporting Nontraditional Programs: All postsecondary CTE participants are measured and reported if they participate in nontraditional programs.

Measurement Approaches

A disaggregated, student-level administrative record match of CTE concentrators from underrepresented gender groups with the State’s CTE program completion database for programs leading to employment in nontraditional fields.

A follow-up survey of CTE concentrators from underrepresented gender groups who completed a CTE program leading to employment in nontraditional fields in the reporting year.

Include only those postsecondary participants who:

met the definition of participant (numerator and denominator);

participated in a program that leads to employment in a nontraditional occupation (as identified by NAPE) (numerator and denominator); and

are from the underrepresented gender group for the nontraditional occupation (numerator only).

Resources

Office of Vocational and Adult Education (OVAE) Non-Regulatory Guidance: http://cte.ed.gov/perkinsimplementation/nrg.cfm


Peer Collaborative Resource Network (PCRN) Core Indicators of Performance: http://cte.ed.gov/accountability/coreindicators.cfm

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

Peer Collaborative Resource Network (PCRN) Perkins IV Crosswalks: http://cte.ed.gov/docs/nswg/Perkins_IV_Crosswalks_on_PCRN.pdf

National Alliance for Partnerships in Equity: http://www.napequity.org/page.php?24


Secondary Tech Prep Indicators

1STP1: Enrollment in Postsecondary Education

Numerator: Number of secondary Tech Prep students who graduated last year and are enrolled in postsecondary education during the current year.

Denominator: Number of secondary Tech Prep students who graduated last year.

Issues and Considerations

Access to data. Is the data available from the postsecondary institution or institutions?

How is “graduated” being defined? Will the 3S1 or 4S1 definitions be used?

What defines “enrolled”? What defines full time? Part time? Is part-time enrollment defined by a minimum number of credits? Is one credit enough?

Resources

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf


1STP2: Enrollment in Postsecondary Education in the Same Field or Major

Numerator: Number of secondary Tech Prep students who graduated last year and are enrolled in postsecondary education during the current year in the same major or cluster/pathway as they were in high school.

Denominator: Number of secondary Tech Prep students who graduated last year.

Issues and Considerations

Access to data. Is the data available from the postsecondary institution or institutions?

How is “graduated” being defined? Will the 3S1 or 4S1 definitions be used?

What defines “enrolled”? What defines full time? Part time? Is part-time enrollment defined by a minimum number of credits? Is one credit enough?

How is the “same field” being defined? By career cluster? By Classification of Instructional Programs (CIP) code or similar classification?

Has the secondary-postsecondary Tech Prep program been explicitly defined?

Must the Tech Prep program be followed without variation from secondary into postsecondary?

Resources

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf


1STP3: Complete a State or Industry-Recognized Certification or Licensure

Numerator: Number of secondary Tech Prep students who graduated last year with a State or industry-recognized certification or licensure.

Denominator: Number of secondary Tech Prep students who graduated last year.

Issues and Considerations

Access to data. Are data accessible from certification or licensure exams?

How is “graduated” being defined? Will the 3S1 or 4S1 definitions be used?

Resources

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf


1STP4: Complete Courses That Award Postsecondary Credit

Numerator: Number of secondary Tech Prep students who graduated last year with postsecondary credit.

Denominator: Number of secondary Tech Prep students who graduated last year.

Issues and Considerations

What defines “postsecondary credit”? Does it need to be “college level”?

Can the “postsecondary credit” be elective to the Tech Prep program? Or does it need to be a required credit in the Tech Prep program?

Does graduating with “any” postsecondary credit count in the numerator?

How is “graduated” being defined? Will the 3S1 or 4S1 definitions be used?

Resources

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf


1STP5: Enroll in Postsecondary Remedial Mathematics, Writing, or Reading Courses

Numerator: Number of secondary Tech Prep students who graduated last year and enrolled in postsecondary remedial mathematics, writing, or reading courses upon entering postsecondary education.

Denominator: Number of secondary Tech Prep students who graduated last year and enrolled in postsecondary education.

Issues and Considerations

What defines “remedial”? Is the remedial definition uniform across the postsecondary institution, or is remedial defined differently for various programs within the institution?

Do all postsecondary institutions in the State define “remedial” the same?

Will you count enrollment in remedial courses during any term after entering postsecondary education, or only during the first term upon entering postsecondary education?

How is “graduated” being defined? Will the 3S1 or 4S1 definitions be used?

Resources

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf


Postsecondary Tech Prep Indicators

1PTP1: Employment After Graduation

Numerator: Number of postsecondary Tech Prep students placed in a related field no later than 12 months after graduation.

Denominator: Number of postsecondary Tech Prep students who graduated last year.

Issues and Considerations

How is “graduated” being defined?

How is “placed” being defined? Full time? Part time?

How is “related field” being determined? Will there be a classification code matching system used to determine related field?

Will data be collected through administrative record matching? Or use of a student follow-up survey?

Resources

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf


1PTP2: Complete a State or Industry-Recognized Certificate or Licensure

Numerator: Number of postsecondary Tech Prep students who left postsecondary education this year with a State or industry-recognized certification or licensure.

Denominator: Number of postsecondary Tech Prep students who left postsecondary education this year.

Issues and Considerations

How is “left postsecondary education” defined? Many community colleges do not have formal drop-out procedures.

What if students obtain credentials from external organizations (i.e., from outside of the community college system)?

Can the certification or licensure information be obtained from a Statewide or national database or certified provider?

Will the certification or licensure information be obtained from the student through a student follow-up survey? Or, is there the ability to do an administrative record match?

Resources

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf


1PTP3: Complete a Two-Year Degree or Certificate Program

Numerator: Number of postsecondary Tech Prep students who entered postsecondary education three years ago and who completed a two-year degree or certificate program.

Denominator: Number of postsecondary Tech Prep students who entered postsecondary education three years ago.

Issues and Considerations

Can the postsecondary data collection reach back in time to capture student data?

What constitutes “entered postsecondary education”? Does a middle or early college program constitute “entered postsecondary education”?

Resources

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf


1PTP4: Complete a Baccalaureate Degree Program

Numerator: Number of postsecondary Tech Prep students who entered postsecondary education six years ago and completed a baccalaureate degree program.

Denominator: Number of postsecondary Tech Prep students who entered postsecondary education six years ago.

Issues and Considerations

Can data be collected between community college and university data systems?

How does the Tech Prep-accountable agency access the data needed to report this measure?

When does a State begin reporting data for this measure (1PTP4)?

Resources

Peer Collaborative Resource Network (PCRN) Support to States: http://cte.ed.gov/accountability/supporttostates.cfm

National Association for Tech Prep Leadership: http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf

Perkins IV Validity and Reliability Checklist: Sample Decision Tree Diagram

[Decision tree diagram] Beginning at “START HERE,” the reviewer works through a series of yes/no questions about a State’s plan for each indicator: Is the State using a measurement approach clearly aligned with the nonregulatory guidance (cite the approach), or does the approach need to be more clearly composed in order to align? Is the State using the numerator and the denominator from the nonregulatory guidance (cite each)? Are all concentrators being reported, or does only a subset of concentrators align with the nonregulatory guidance? Is the data from the current reporting year, or is data from the previous reporting year allowed for this indicator? If every answer is YES, the plan is submitted to policy staff as is, with no need to contact the State. For each NO answer, the reviewer notes it, contacts the State to discuss alterations that would generate YES answers, and checks whether the plan explains how the validity and reliability of the State’s measure is being assured for each remaining NO item, acquiring any needed explanation from the State. The plan is then submitted to policy staff, denoting for each question that remains a NO how the State has explained its method for assuring validity and reliability.


Cross Measure Issues

Performance indicators (or performance measures)

Under the new law, States and local programs will be required to report on separate core performance indicators for secondary and postsecondary students. Measures for each indicator must be valid and reliable. At different times of the year, different accountability activities occur with regard to collecting data, reporting data, negotiating performance targets, and making changes to the State plan (see Appendix X-Calendar-X).

Performance levels

Plans submitted to the State Department of Education by LEAs must, at a minimum, include objective, quantifiable, and measurable target performance levels for each of the core performance indicators for secondary and postsecondary students. The performance levels established in State negotiations with LEAs must demonstrate continual progress toward improving the performance of career and technical education students. The law requires the State Department of Education to negotiate performance measures.

Local programs must either accept the State-adjusted levels of performance as the local levels of performance or negotiate with the State to reach agreement on new local adjusted levels of performance. Local performance levels must also be objective, quantifiable, and measurable. Local programs and States must reach agreement on local adjusted levels of performance for the first two program years covered by the local plan, and the subsequent third and fifth years.

If unanticipated circumstances arise in a State or local area, resulting in a significant change that would prevent the State or local area from meeting the agreed-upon performance levels defined in the State plan, the State or local area may request that the U.S. Secretary of Education or the State, respectively, revise its adjusted levels of performance.


The State negotiates performance targets on each indicator with the U.S. Department of Education and the two entities reach a Final Agreed Upon Performance List (FAUPL) that includes the performance targets, the measurement tools, and the definitions for each indicator. This FAUPL becomes a contract between the two entities. States must achieve at least 90 percent of each agreed indicator target to be “in compliance.” States can negotiate changes to the measurement tools and definitions, but the changes would have to be valid and reliable. The State reports the performance data and other information on a yearly basis in the Consolidated Annual Report (CAR) and can identify problems or corrective actions that the State is undertaking to improve its programs.
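
The 90 percent compliance test is simple arithmetic. A minimal sketch is shown below; the target and actual values are invented for illustration.

```python
def in_compliance(actual, target):
    """A State meets an indicator if actual performance is at least 90 percent of the agreed target."""
    return actual >= 0.90 * target

print(in_compliance(68.0, 75.0))  # True: 68.0 is about 90.7 percent of a 75.0 target
print(in_compliance(65.0, 75.0))  # False: 65.0 is about 86.7 percent of a 75.0 target
```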

Monitoring

The U.S. Department of Education will conduct on-site monitoring of 10 to 15 States per year. This accomplishes three purposes: validating the information reported by the State, ensuring that the program is being implemented in accordance with legislation, and providing the States with some technical assistance. A department risk-analysis procedure will select States for on-site monitoring. On average, a State could be expected to be monitored once every five years, yet the actual scheduled monitoring visits could occur as often as once every three years or as infrequently as once in 10 years, all influenced by a number of factors.

Monitoring looks at the whole Perkins program in detail, including the performance of LEAs and their local applications. The monitoring team uses a monitoring check sheet to review the programs. States will be notified at least three months in advance (sometimes as much as a year) and can schedule the monitoring visit to minimize disruption to their own activities. States also have available a checklist that they can use on a yearly basis to do a self-assessment to evaluate their own program.

The monitoring team will visit the State and remain for approximately one week. It will consist of several members who will review different aspects of the Perkins program within the State. The State Regional Accountability Specialist (RAS) will be part of that team to review the performance and the indicators at the State and LEA level. The team (possibly the RAS) will also look at Tech Prep programs, Special Populations, and any “performance gaps” that may exist between population subgroups. The team will also be reviewing other aspects including administration, finances, programs of study, and articulation agreements. The team will develop a final official report, which may require the State to implement some modifications in procedures, but it is primarily interested in helping the State to improve performance and make the program better for the State and its customers, the students.

Disaggregation of performance data

The 2006 Act requires State and local programs to annually report on the performance of CTE students. Data must be disaggregated by race/ethnicity, special populations and other student categories as defined in the Act, and subgroups as defined under the No Child Left Behind Act (State department’s ESEA guidelines). Unless groups are too small to preserve student anonymity, disparities between subgroups and all other students must be identified and quantified (a brief computational sketch follows the category list below). Reports must also include quantifiable descriptions of the progress being made under each subcategory of students being served. The report must be made available in a variety of formats, including electronically.

The following special populations and other student categories will be used for the disaggregation of data in the Perkins CAR submission:

Individuals with disabilities (ADA)—primarily for use with postsecondary/adult students

Disability status (ESEA/IDEA)—primarily for use with secondary students

Economically disadvantaged

Single parents

Displaced homemakers

Limited English proficient

Migrant status—secondary student reporting only

Nontraditional enrollees


Tech prep
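
Following up on the disaggregation requirement above, the sketch below shows one way to quantify disparities with small-cell suppression. The counts and the suppression threshold are illustrative only; the actual threshold is a State decision made to protect student anonymity, and in practice special populations can overlap, which this simple sketch ignores.

```python
# Hypothetical results for one indicator: (numerator, denominator) by group; illustrative data.
results = {
    "All students":               (450, 600),
    "Economically disadvantaged": (120, 180),
    "Limited English proficient": (4, 7),      # small cell; suppressed below
}

SUPPRESSION_N = 10  # illustrative minimum cell size chosen to protect student anonymity

all_num, all_denom = results["All students"]

for group, (num, denom) in results.items():
    if group == "All students":
        continue
    if denom < SUPPRESSION_N:
        print(f"{group}: suppressed (group too small to preserve anonymity)")
        continue
    rate = 100.0 * num / denom
    other_rate = 100.0 * (all_num - num) / (all_denom - denom)  # "all other students"
    print(f"{group}: {rate:.1f}% (disparity vs. all other students: {rate - other_rate:+.1f} points)")
```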

Record matching

One State’s secondary sector has relied on teacher reporting to determine if former students enter employment or postsecondary education after high school. This process begins late in the year following spring graduation, and teachers are allowed to report what they’ve learned about students’ employment and education activities through February of the year after students graduate.

The option of using Unemployment Insurance (UI) records and matching postsecondary data through the State’s college data system(s) and the National Student Clearinghouse was originally addressed through a prior round of State technical assistance in 2007; a recommendation was made at that time for the State to consider transitioning to record matching.1 With the introduction of Perkins IV, the State needed to focus on interpreting the new measures and determining how they should be operationalized using its current systems. Now that that work is nearing an end, the State may be ready to address the issue of data matching to improve the validity and reliability of the secondary placement measure.
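
A schematic example of such a record match is sketched below. The identifiers, file layouts, and match logic are hypothetical; real matches against UI wage records or the National Student Clearinghouse involve data-sharing agreements and more careful identity resolution.

```python
# Graduates to be placed, keyed by a unique student identifier (illustrative data only).
exiters = [
    {"id": "111", "name": "Student A"},
    {"id": "222", "name": "Student B"},
    {"id": "333", "name": "Student C"},
]

# Hypothetical extracts from UI wage records and postsecondary enrollment files.
ui_wages = {"111": 6240.00}   # id -> wages reported in the placement quarter
enrollment = {"222"}          # ids found enrolled in a postsecondary institution

for student in exiters:
    employed = student["id"] in ui_wages
    enrolled = student["id"] in enrollment
    student["placement"] = ("employment" if employed
                            else "postsecondary education" if enrolled
                            else "unknown")

print([(s["id"], s["placement"]) for s in exiters])
```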

Reporting year

Within the Perkins IV measures, “reporting year” indicates the most recent full academic year ending prior to the submission of the Consolidated Annual Report (CAR) in December of any year. For example, for a CAR submitted in December 2009, the reporting year would be academic year 2008–09 (July 1, 2008 through June 30, 2009). However, several secondary and postsecondary measures pose challenges for States trying to report data six months after the end of an academic year.

1 The 2007 Alabama technical assistance report, Recommendations to Improve the Collection of Perkins Placement Data in Alabama, may be found at http://www.mprinc.com/products_and_publications/pdf/Collecting_Perkins_Data_Alabama.pdf .
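
A small sketch of the reporting-year convention described above follows; the July 1 through June 30 academic-year boundaries mirror the example given, and the function name is simply illustrative.

```python
from datetime import date

def reporting_year(car_submission_year):
    """For a CAR submitted in December of a given year, return the academic year reported on,
    e.g., 2009 -> July 1, 2008 through June 30, 2009."""
    return date(car_submission_year - 1, 7, 1), date(car_submission_year, 6, 30)

print(reporting_year(2009))  # (datetime.date(2008, 7, 1), datetime.date(2009, 6, 30))
```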


3S1: Secondary School Completion

There may be a question as to whether General Educational Development (GED) data will be available in time to report for the December CAR.

5S1: Secondary Placement

Information regarding student enrollment in postsecondary education is not available from postsecondary sources until after the first term of the year is complete. If a State’s postsecondary institutions are on a semester system, the data may not become available until January. If the State uses UI records to identify students employed in the second quarter after exit (October 1 through December 31), those data are also not available until January, after the CAR has been submitted.

1P1: Technical Skill Attainment

Although information on technical skill assessments taken within the community college is immediately available, data on assessments taken through external licensing and certification organizations may be delayed. In some cases, assessment data through a certification organization may not be directly available to the institution, but given only to the student taking the assessment. Colleges may have to rely on self-reporting by the student through a survey or other means, or develop an arrangement with the certification organization for access to the assessment data.

Some licensing boards offer assessments only a few times per year, and students may not have an opportunity to complete the assessment process by December following the reporting year. In addition, external organizations may need time to process and submit data to community colleges.

2P1: Credential, Certificate, or Degree

It is more difficult to identify students who “leave” community colleges than those who leave high schools. Instead of formal drop-out procedures, community college students simply stop enrolling in courses. And many community college students take time between courses to work, making it even harder to differentiate between students who “stop out” and those who drop out.


States are struggling with the timeline they should use to determine if a student has exited. Some States have chosen to say that students have left postsecondary education if they do not return in the fall following an academic year. Other States have chosen to look at the entire academic year and identify students who exited and did not return at any time during the academic year following the reporting year. Either way, the data are generally not available in time to report in the December CAR.

In addition to allowing sufficient time to identify students who exit, States are confronting issues with obtaining information about external certificates and credentials that students earn. As in measure 1P1, some external organizations offer assessments only a few times per year and those organizations may need time to process and submit data to community colleges, or they may release assessment data only to the student.

3P1: Student Retention or Transfer

As it is currently constructed, there is no reporting year issue with measure 3P1. However, this measure presents other challenges that may be resolved by adopting a new strategy. See the section on measure 3P1 for more information.

4P1: Student Placement

If the State uses UI records to identify students employed in the second quarter after exit (October 1 through December 31), those data are also not available until January, after the CAR has been submitted.

Two of the eight secondary and four of the six postsecondary Perkins IV measures are affected by a reporting year issue. During on-site technical assistance visits, State staff and MPR researchers discussed several options for addressing this challenge.

1. The State could choose to vary its reporting year by individual measure. That would mean, for example, that for the December 2009 CAR, the State would report on 2008–09 concentrator outcomes for measures 1S1, 1S2, 2S1, 4S1, 6S1, 6S2, 5P1, and 5P2 and would report on 2007–08 concentrator outcomes for 3S1, 5S1, 1P1, 2P1, 3P1, and 4P1.


2. The State could choose to vary its reporting year by individual measure for secondary and use the prior reporting year for postsecondary. That would mean, for example, that for the December 2009 CAR, the State would report on 2008–09 secondary concentrator outcomes for measures 1S1, 1S2, 2S1, 4S1, 6S1, and 6S2 and would report on 2007–08 concentrator outcomes for 3S1 and 5S1. The State would then report on 2007–08 concentrator outcomes for all postsecondary measures (1P1, 2P1, 3P1, 4P1, 5P1, and 5P2).

3. The State could use the prior reporting year for all secondary and postsecondary measures. That would mean that for the December 2009 CAR, the State would report on 2007–08 concentrator outcomes for all measures (1S1, 1S2, 2S1, 3S1, 4S1, 5S1, 6S1, 6S2, 1P1, 2P1, 3P1, 4P1, 5P1, and 5P2).


Definitions: Race/Ethnicity, Special Populations, and Other Student Categories

In completing the Perkins IV CAR forms, each State must use the definitions for race, ethnicity, special populations, and other student categories as described below.

Definition of Terms

a. Race and Ethnicity Categories in the 1977 Standards

Note: Secondary use of the 1977 standards will end with the 2009–2010 academic year. Postsecondary use of the 1977 standards will end with the 2010–2011 academic year.

A State may report disaggregated data by race and ethnicity using the following categories and definitions based on “The Standards for the Classification of Federal Data on Race and Ethnicity (Statistical Policy Directive No. 15)” that was issued by the Office of Management and Budget (OMB) in 1977:

American Indian or Alaskan Native–A person having origins in any of the original peoples of North America and who maintains cultural identification through tribal affiliation or community recognition.

Asian or Pacific Islander–A person having origins in any of the original peoples of the Far East, Southeast Asia, the Pacific Islands, or the Indian subcontinent including, for example, China, India, Japan, Korea, the Philippine Islands, and Samoa.

Black (not Hispanic)–A person having origins in any of the Black racial groups of Africa.

Hispanic–A person of Mexican, Puerto Rican, Cuban, Central or South American, or other Spanish culture or origin regardless of race.


White (not Hispanic)–A person having origins in any of the original peoples of Europe, North Africa, or the Middle East.

Race and/or Ethnicity Unknown–A postsecondary student only who does not self-identify a race and/or ethnicity on a local information collection.

b. Race and Ethnicity Categories in the 1997 Revised Standards

Note: The 1997 standards will be required for secondary use beginning with the 2010–2011 academic year and the December 2011 CAR submission. The 1997 standards will be required for postsecondary use beginning with the 2011–2012 academic year and the December 2012 CAR submission.

A State may report disaggregated data by race and ethnicity using the following categories and definitions, which are based on the Revisions to the Standards for the Classification of Federal Data on Race and Ethnicity issued by OMB in 1997:

American Indian or Alaskan Native–A person having origins in any of the original peoples of North and South America (including Central America), and who maintains a tribal affiliation or community attachment.

Asian–A person having origins in any of the original peoples of the Far East, Southeast Asia, or the Indian subcontinent including, for example, Cambodia, China, India, Japan, Korea, Malaysia, Pakistan, the Philippine Islands, Thailand, and Vietnam.

Black or African American–A person having origins in any of the Black racial groups of Africa.

Hispanic or Latino–A person of Cuban, Mexican, Puerto Rican, South or Central American, or other Spanish culture or origin, regardless of race.

Native Hawaiian or Other Pacific Islander–A person having origins in any of the original peoples of Hawaii, Guam, Samoa, or other Pacific Islands.

White–A person having origins in any of the original peoples of Europe, the Middle East, or North Africa.


Two or More Races–A person belonging to two or more racial groups.

Race and/or Ethnicity Unknown–A postsecondary student only who does not self-identify a race and/or ethnicity on a local information collection.

c. Race and Ethnicity Categories Approved under Department ESEA Guidelines

A State may report disaggregated data by race and ethnicity using any additional or combined categories approved under Department ESEA guidelines as discussed above. In such a case, the State must report these categories in the “additional information” section on each form.

d. Special Populations and Other Student Categories Described in Department ESEA Guidelines

Unless otherwise noted, the following categories and definitions are described in section 3 of Perkins IV.

Disability Status: The term “disability status” as used in section 1111(h)(1)(C)(i) of the ESEA refers to a “child with a disability,” which under section 9101 of the ESEA has the same meaning as the term in section 602 of the Individuals with Disabilities Education Act. Under section 602(3) of the IDEA, the term “child with a disability” means a child “(i) with mental retardation, hearing impairments (including deafness), speech or language impairments, visual impairments (including blindness), serious emotional disturbance (referred to in this title as ‘emotional disturbance’), orthopedic impairments, autism, traumatic brain injury, other health impairments, or specific learning disabilities; and (ii) who, by reason thereof, needs special education and related services.”

Displaced Homemaker: An individual who—

(i) has worked primarily without remuneration to care for a home and family and for that reason has diminished marketable skills;


(ii) has been dependent on the income of another family member but is no longer supported by that income; or

(iii) is a parent whose youngest dependent child will become ineligible to receive assistance under part A of title IV of the Social Security Act (42 U.S.C. 601 et seq.) not later than two years after the date on which the parent applies for assistance under this title; and

(iv) is unemployed or underemployed and is experiencing difficulty in obtaining or upgrading employment.

Economically Disadvantaged: Individuals from economically disadvantaged families, including foster children.

Individual with Limited English Proficiency: A secondary school student, an adult, or an out-of-school youth, who has limited ability in speaking, reading, writing, or understanding the English language, and—

(i) whose native language is a language other than English; or

(ii) who lives in a family or community environment in which a language other than English is the dominant language.

Individual with a Disability: The term “individual with a disability” means an individual with any disability (as defined in section 3 of the Americans with Disabilities Act of 1990 (ADA)). Under section 3(2) of the ADA, the term “disability” means, with respect to an individual, (A) a physical or mental impairment that substantially limits one or more of the major life activities of such individual; (B) a record of such an impairment; or (C) being regarded as having such impairment.

Migrant Status: (Secondary Student Reporting ONLY) The term “migrant status” as used in section 1111(h)(1)(C)(i) of the ESEA is not defined; however, the Department strongly encourages a State to use the same definition of “migrant status” as a State uses in its annual State report card and as approved in its Consolidated State Accountability Workbook.

Nontraditional Fields: Occupations or fields of work, including careers in computer science, technology, and other current and emerging high-skill occupations, for which individuals from one gender comprise less than 25 percent of the individuals employed in each such occupation or field of work.

Single parents: The term “single parents” includes single pregnant women.

Special populations: The term “special populations” means—

(A) Individuals with disabilities;

(B) Individuals from economically disadvantaged families, including foster children;

(C) Individuals preparing for nontraditional fields;

(D) Single parents, including single pregnant women;

(E) Displaced homemakers; and

(F) Individuals with limited English proficiency.


A Guide to Crosswalking Nontraditional Occupations and Programs

Step 1: Identify Occupations That Are Nontraditional in the Workforce

In collaboration with State CTE administrators, identify a set of occupations—based on State or national data—that are nontraditional for either gender. State-specific occupational data can typically be obtained from your State’s department of economic development or other employment agency. To assist States, OVAE has identified nontraditional occupations based on national data collected by the U.S. Department of Labor, Bureau of Labor Statistics.

Step 2: Identify Work Skills Associated With Nontraditional Occupations

Begin by identifying the skills associated with each nontraditional occupation. Skill lists can be obtained by reviewing O*NET, developed by the U.S. Department of Labor. Access the site by linking to the following URL: http://www.doleta.gov/programs/onet/

Next, search for the occupation you’ve identified as nontraditional and identify the skills that are required for workers in this field. Alternatively, you may consult with industry associations or educators in your State to identify the skills required for success in a given nontraditional occupation.

Step 3: Crosswalk Nontraditional Occupations with CTE Programs

Identify CTE programs within your State that prepare students for entry into the nontraditional occupations you identified above. Depending upon your State, you may have a number of options for associating occupations with CTE programs.


1: State Classification Systems

If your State maintains a standardized classification system for CTE that all LEAs use to code courses, then you may want to base your crosswalk on this system. For each nontraditional occupation, link the occupational skills you identified with a vocational program area code identified by your State. Ideally, each occupation will correspond to a single course sequence; however, don’t be surprised if one CTE program area prepares students for multiple occupations. Consult with State CTE curriculum experts if you are not sure of the skills taught within a given CTE sequence.

2: National Classification Systems

If your State relies on local agencies to develop their own course and program codes, you may want to consider using the Classification of Instructional Programs (CIP) codes developed by the U.S. Department of Education to crosswalk occupations with CTE programs. You may access the most current CIP by linking to the following URL: http://nces.ed.gov/pubs2002/cip2000/

For each occupational skill, search the CIP for the CTE program area that provides students with the skills required for success in the nontraditional occupations you have identified.
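
A toy sketch of such a crosswalk is shown below. The occupation titles, CIP codes, and mappings are placeholders rather than an official crosswalk; States should rely on the OVAE/NAPE crosswalks or their own validated mappings.

```python
# Nontraditional occupations identified in Step 1, each mapped to the CIP code(s) of the CTE
# program(s) that teach the associated skills (all values are placeholders).
occupation_to_cip = {
    "Carpenters (nontraditional for women)":      ["46.0201"],
    "Registered nurses (nontraditional for men)": ["51.3801", "51.3901"],
}

# Invert the mapping to produce the list LEAs need: which CIP program codes count as nontraditional.
nontraditional_programs = {}
for occupation, cip_codes in occupation_to_cip.items():
    for cip in cip_codes:
        nontraditional_programs.setdefault(cip, []).append(occupation)

for cip, occupations in sorted(nontraditional_programs.items()):
    print(cip, "->", "; ".join(occupations))
```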

Step 4: Develop and Circulate Instructions to LEAs

Using the list of CTE programs you identified above, develop written guidelines to assist LEAs in identifying nontraditional programs. Ideally, these instructions will contain a list of course codes or descriptions of CTE programs that will enable all LEAs in the State to report on students participating in similar courses, irrespective of the program classification system used locally.

Strategies for Identifying Nontraditional CTE Programs

[Decision tree diagram; its questions and the associated actions are listed below.]

- Has your state identified nontraditional occupations? If not, consult with state staff to identify nontraditional occupations, using national or state data. (See the list of nontraditional occupations developed by OVAE.)

- Has your state crosswalked the identified nontraditional occupations to CTE programs that prepare students for entry into these fields?

- Does your state have a standardized CTE course and/or program classification system? If yes, develop a list of CTE courses and program areas that prepare students for entry into nontraditional occupations; if no, identify the skills associated with nontraditional occupations that each LEA can use to crosswalk its own programs. (See Guide to Crosswalking.)

- Have lists of course or program codes been shared with the field in written form? If not, develop written guidelines and circulate them to LEAs, then provide ongoing technical assistance.


Annotated Resource Directory Accountability Data Guide

Accountability (extract from Act); PL 109-270, Perkins IV Section 113.

An extract from the accountability section of the full Carl D. Perkins Career and Technical Education Act of 2006 (Perkins IV) is included in the Accountability Data Guide. This extract provides the statutory language that authorizes establishment of the core indicators of performance. The full text of Perkins IV can be accessed at: http://cte.ed.gov/docs/perkins_iv.pdf

Check Sheets for Monitoring Grants Awarded under Title I and Title II of the Carl D. Perkins Career and Technical Education Act of 2006 (Perkins IV); U.S. Department of Education, Office of Vocational and Adult Education (OVAE), Division of Academic and Technical Education (DATE).

The check sheets are a tool that outlines the scope and detail of the on-site State monitoring visits conducted by OVAE. The check sheets cover State administration, fiscal program responsibility, local applications, programs of study, Tech Prep programs, special populations, and accountability. An extract of the Tech Prep, Special Populations, and Accountability sections is included in the Accountability Data Guide appendix. The full check sheet document can be accessed at http://cte.ed.gov/stategrants/statemonitoring.cfm.

Core Indicators of Performance; Peer Collaborative Resource Network (PCRN): http://cte.ed.gov/accountability/coreindicators.cfm

This document provides a concise listing of the Perkins IV student definitions and core indicators of performance for both secondary and postsecondary education.

Crosswalks—Programs/Occupations, Clusters/Pathways, Nontraditional; Peer Collaborative Resource Network (PCRN): http://cte.ed.gov/accountability/crosswalks.cfm

A 2006 crosswalk of Bureau of Labor Statistics occupations reported to be nontraditional to NCES Classification of Instructional Program Codes and Education Program titles, to USDOE Career Clusters, and to USDOE Pathways.

Data Quality Institutes; Peer Collaborative Resource Network (PCRN): http://cte.ed.gov/accountability/dqi.cfm

OVAE and DATE sponsor an annual Data Quality Institute (DQI) for State teams of secondary and postsecondary career and technical education directors and their accountability staffs.

EDFacts Initiative and EDEN (Education Data Exchange Network) Submission: http://www.ed.gov/about/inits/ed/edfacts/index.html

EDFacts relies on the Education Data Exchange Network (EDEN), a centralized portal through which States submit their educational data to the U.S. Department of Education. EDEN comprises three main components: (1) the EDEN Submission System (ESS), an electronic data system capable of receiving data on more than 100 data groups at the State, district, and local levels; (2) the EDEN Survey Tool (EST), which collects data supplementary to the ESS data; and (3) the EDEN staging database, a holding area for newly submitted data. Future secondary Perkins IV data may use EDEN for the annual CTE data submission rather than the current Consolidated Annual Report (CAR).

EDEN File Specifications for Data Reporting: http://www.ed.gov/about/inits/ed/edfacts/file-specifications.html

This web site provides the detailed, technical file specifications used for the CTE data collections submitted through EDEN.

Family Educational Rights and Privacy Act: http://www.ed.gov/policy/gen/guid/fpco/ferpa/index.html

The Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. § 1232g; 34 CFR Part 99) is a federal law that protects the privacy of student education records. The law applies to all schools that receive funds under an applicable program of the U.S. Department of Education.

Federal Employment Data Exchange System (FEDES): http://www2.ubalt.edu/jfi/fedes/index2.cfm

The FEDES initiative provides information on federal employment to participating States to help them meet their reporting requirements. Quarterly data exchanges are conducted with three federal agencies: the Office of Personnel Management (OPM); the Department of Defense, Defense Manpower Data Center (DMDC); and the U.S. Postal Service (USPS).

National Alliance for Partnerships in Equity: http://www.napequity.org/page.php?24

The National Alliance for Partnerships in Equity (NAPE) is a consortium of State and local agencies, corporations, and national organizations committed to the advancement of equity and diversity in classrooms and workplaces.

National Association for Tech Prep Leadership (NATPL): http://www.natpl.org/docs/mastel-docs2009/04-02/Tech-Prep-Indicators-Definitions-PM-formulas-OVAE-NSWG-2-23-09.pdf

OVAE has not issued specific nonregulatory guidance for Tech Prep. NATPL has created a set of performance definitions and measures for the Tech Prep performance indicators that can be accessed on NATPL's web site. The suggested definitions and measures are available for State use, but States can establish their own Tech Prep performance definitions and measures. NOTE: Tech Prep accountability is required for those States keeping a portion or all of their Title II Tech Prep funds separate from their Title I Perkins funds.

National Student Clearinghouse: http://www.studentclearinghouse.org

The National Student Clearinghouse streamlines the student record-verification process for colleges and universities, high schools and high school districts, students and alumni, lending institutions, employers, the U.S. Department of Education, and other organizations. The Clearinghouse maintains a comprehensive electronic registry of student records that provides a single, automated point of contact for organizations and individuals requiring timely, accurate verification of student enrollment, diploma, degree, and loan data.

Negotiating and Reporting Dates: http://cte.ed.gov/lib/libraryCategory.cfm?classnumber=6

This document, posted on PCRN, lists the dates associated with Perkins IV performance negotiations, State plan updates, and CAR submission due dates.

Next Steps Work Group; Peer Collaborative Resource Network (PCRN): http://cte.ed.gov/accountability/nswg.cfm

The Next Steps Work Group (NSWG) is a voluntary information-sharing network, composed primarily of State and Office of Vocational and Adult Education (OVAE) staff, that discusses topics of common interest related to Carl D. Perkins Act compliance and data reporting. NSWG summaries attempt to capture many of the questions and comments but are not intended to be the definitive source for specific OVAE policy.

Q&A Regarding the Implementation of the Carl D. Perkins Career and Technical Education Act of 2006—Version 1.0, released Jan. 9, 2007: http://cte.ed.gov/perkinsimplementation/nrg.cfm

This document provides OVAE responses to questions generated by States regarding implementation of Perkins IV. Topics include State plans, accountability, definitions, fiscal considerations, incentives and sanctions, and Tech Prep programs.

Q&A Regarding the Implementation of the Carl D. Perkins Career and Technical Education Act of 2006—Version 2.0, released June 7, 2007: http://cte.ed.gov/perkinsimplementation/nrg.cfm

This document provides additional OVAE responses to questions generated by States regarding implementation of Perkins IV. Topics include State plans, accountability, definitions, fiscal considerations, incentives and sanctions, Tech Prep programs, and occupational and employment information.

Q&A Regarding the Implementation of the Carl D. Perkins Career and Technical Education Act of 2006—Version 3.0, released June 2, 2009: http://cte.ed.gov/perkinsimplementation/nrg.cfm

This document provides additional OVAE responses to questions generated by States regarding implementation of Perkins IV. Topics include State plans, accountability, definitions, fiscal considerations, incentives and sanctions, Tech Prep programs, occupational and employment information, participation of private school students and personnel, and articulation agreements.

State Career and Technical Education (CTE) Self-Assessment—An Extract; Office of Vocational and Adult Education, U.S. Department of Education (currently being revised, August 2009); full version available at: http://cte.ed.gov/lib/libraryCategory.cfm?classnumber=6

The relevant Self-Assessment sections on Tech Prep, Special Populations, and Accountability have been included in the Accountability Data Guide. The Self-Assessment Tool is a comprehensive, voluntary resource designed to help States with their CTE program improvement efforts, and it is being updated by OVAE to better reflect Perkins IV provisions.

Student Definitions and Measurement Approaches for the Core Indicators of Performance under the Carl D. Perkins Career and Technical Education Act of 2006 (Perkins IV):

http://cte.ed.gov/perkinsimplementation/nrg.cfm

The purpose of this document is to offer guidance regarding student definitions and measurement approaches for the core indicators of performance under Perkins IV and to assist States in building valid and reliable CTE accountability systems. The guidance incorporates and builds upon the extensive work done by States through the Data Quality Institutes (DQIs) and Next Steps Work Group (NSWG) conference calls, and it better aligns the student definitions and measurement approaches for the core indicators of performance with the requirements of the new Act. Please note that under the final Perkins IV State Plan Guide (OMB Control Number 1830-0029), a State may choose to propose other student definitions and measurement approaches for the core indicators of performance in its State plan.

Support to States; Peer Collaborative Resource Network (PCRN): http://cte.ed.gov/accountability/supporttostates.cfm

To assist States in improving the quality of their Career and Technical Education accountability systems, OVAE invites State directors of career and technical education to submit requests for individualized technical assistance. The reports posted on PCRN summarize individual State assistance projects and offer recommendations that States may wish to consider.

Tech Prep Program | Indicators of Performance and Accountability (extract from Act); PL 109-270, Perkins IV Section 203(e).

An extract from the Title II, Tech Prep section of Perkins IV is included in the Accountability Data Guide. This extract provides the statutory language that authorizes establishment of Tech Prep indicators of performance for those eligible agencies that keep a portion or all of their Title II Tech Prep funds separate from their Title I Perkins funds. The full text of Perkins IV can be accessed at: http://cte.ed.gov/docs/perkins_iv.pdf

Wage Record Interchange System (WRIS): http://www.doleta.gov/performance/WRIS.cfm

The Wage Record Interchange System (WRIS) facilitates the exchange of wage data among participating States for the purpose of assessing and reporting on State and local employment and training program performance, evaluating training provider performance, and for other purposes allowed under the WRIS Data Sharing Agreement. The exchange permits State workforce program performance agencies to secure wage data on individuals who participated in workforce investment programs in one State and subsequently secured employment in another.