The Program Assessment Rating Tool (PART)
Mary Cassell, Office of Management and Budget
April 28, 2011


TRANSCRIPT

Page 1: The Program Assessment Rating Tool (PART)

The Program Assessment Rating Tool (PART)

Mary Cassell, Office of Management and Budget

April 28, 2011

Page 2

Overview

• What is the PART?
• How was it developed?
• What are the components?
• Quality controls
• How was the PART used?
  – Budget
  – Program improvements
• Lessons learned


Page 3

“In God we trust…

…all others, bring data.”

– W. Edwards Deming


Page 4

Introduction

• The PART was a component of the Bush Administration’s Management Agenda that focused on Budget and Performance Integration

• The PART promoted efforts to achieve concrete and measurable results

• The PART supported program improvements


Page 5

What is the Program Assessment Rating Tool (PART)?

• A set of questions that evaluates program performance in four critical areas:
  – Program Purpose and Design
  – Strategic Planning
  – Program Management
  – Program Results and Accountability
• A tool to assess performance using evidence
• Provides a consistent, transparent approach to evaluating programs across the Federal government


Page 6

Why PART?

• Measure and diagnose program performance
• Evaluate programs in a systematic, consistent, and transparent manner
• Inform agency and OMB decisions on resource allocations
• Focus on program improvements through management, legislative, regulatory, or budgetary actions
• Establish accountability for results


Page 7

How did the PART work?

• Answers to questions generated scores, which were weighted to tally to a total score
• Based on evidence, evaluations, and data
• Ratings based on total scores: Effective, Moderately Effective, Adequate, Ineffective
• Results Not Demonstrated assigned to programs that do not have performance measures or data, regardless of overall score
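The rating logic described above can be sketched in code. This is an illustrative reconstruction, not OMB's actual implementation; the cut-off scores (85, 70, 50) are assumptions based on the commonly cited PART rating bands.

```python
def part_rating(total_score: float, has_measures_and_data: bool) -> str:
    """Map a weighted PART total score (0-100) to a rating.

    Illustrative sketch only. The band cut-offs below are assumed,
    not taken from OMB's actual scoring rules.
    """
    # Programs lacking performance measures or data were rated
    # "Results Not Demonstrated" regardless of their overall score.
    if not has_measures_and_data:
        return "Results Not Demonstrated"
    if total_score >= 85:
        return "Effective"
    if total_score >= 70:
        return "Moderately Effective"
    if total_score >= 50:
        return "Adequate"
    return "Ineffective"
```

Note how the Results Not Demonstrated check comes first, mirroring the slide: it overrides the numeric score entirely.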


Page 8

PART Questions and Process

• Roughly 25-30 analytical questions; explanations and evidence are required
• Standards of evidence hold programs to a high bar
• Question weight can be tailored to reflect program specifics
• Interactions between questions
• Yes/No answers in diagnostic sections; four levels of answers in results section
• Collaborative process with agencies; OMB had the pen

Page 9

How was the PART developed?

• Designed by 12 OMB career staff, including one representative from each division
• Piloted with about 60 programs
• Pilot generated extensive input from agencies that resulted in several revisions – changes in scoring, elimination of a question about whether the program served an appropriate federal role
• Conducted trial runs with research institutions
• Agency roll-out:
  – OMB training
  – Agency meetings
  – Agency trainings
  – Incorporation into 2002 budget decisions and materials
• Development, pilot, and revision process took about 6 months, including development of guidance and training


Page 10

PART Program Types

• Direct Federal
• Competitive Grant
• Block/Formula Grant
• Regulatory Based
• Capital Assets and Service Acquisition
• Credit
• Research and Development


Page 11

PART Questions

• Section I: Program Purpose & Design (20%)
  – Is the program purpose clear?
  – Does the program address an existing problem or need?
  – Is the program unnecessarily duplicative?
  – Is the program free of major design flaws?
  – Is the program targeted effectively?
• Section II: Strategic Planning (10%)
  – Does the program have strong long-term performance measures?
  – Do the long-term measures have ambitious targets?
  – Does the program have strong annual performance targets?
  – Does the program have baselines and ambitious targets?
  – Do all partners agree to the goals and targets?
  – Are independent evaluations conducted of the program?
  – Are budgets tied to performance goals?
  – Has the program taken steps to correct strategic planning deficiencies?


Page 12

PART Questions

• Section III: Program Management (20%)
  – Does the program collect timely performance information and use it to manage?
  – Are managers and partners held accountable for program performance?
  – Are funds obligated in a timely manner?
  – Does the program have procedures (IT, competitive sourcing, etc.) to improve efficiency?
  – Does the program collaborate with related programs?
  – Does the program use strong financial management practices?
  – Has the program taken meaningful steps to address management deficiencies?
  – Additional questions for specific types of programs
• Section IV: Program Results (50%)
  – Has the program made adequate progress in achieving its long-term goals?
  – Does the program achieve its annual performance goals?
  – Does the program demonstrate improved efficiencies?
  – Does the program compare favorably to similar programs, both public and private?
  – Do independent evaluations show positive results?
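The four section weights (20%, 10%, 20%, 50%) combine into a total score roughly as follows. A minimal sketch, assuming each section's answers have already been rolled up to a 0-100 section score; the function name and example numbers are illustrative, not OMB's actual code.

```python
# Default section weights from the slides; question weights within a
# section could be tailored, but section weights summed to 100%.
SECTION_WEIGHTS = {
    "Program Purpose & Design": 0.20,
    "Strategic Planning": 0.10,
    "Program Management": 0.20,
    "Program Results": 0.50,
}

def total_part_score(section_scores: dict) -> float:
    """Weighted total on a 0-100 scale.

    `section_scores` maps section name -> score on a 0-100 scale
    (e.g., the share of weighted "Yes" answers in that section).
    Illustrative sketch only.
    """
    return sum(SECTION_WEIGHTS[name] * score
               for name, score in section_scores.items())

# Hypothetical example: results (50% weight) dominate the total.
scores = {
    "Program Purpose & Design": 100,
    "Strategic Planning": 75,
    "Program Management": 88,
    "Program Results": 60,
}
# 0.20*100 + 0.10*75 + 0.20*88 + 0.50*60 = 75.1
```

The heavy Results weight means a program with perfect design and management scores can still fall out of the top rating bands if it cannot demonstrate outcomes.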


Page 13

Performance Measures, Data and Evaluations

• Strong focus on performance measures. Performance measures should capture the most important aspects of a program’s mission and priorities.

• Key issues to consider: 1) performance measures and targets; 2) focus on outcomes whenever possible; 3) annual and long-term timeframes

• Efficiency measures required

• Rigorous evaluations are strongly encouraged


Page 14

Quality Controls

• The PART is a tool used to guide a collective analysis, not a valid and reliable evaluation instrument. It therefore required other mechanisms to promote consistent application:
  – Guidance and standards of evidence
  – Training
  – On-going technical assistance
  – Consistency check
  – Appeals process
  – Public transparency


Page 15

How was the PART used? A Focus on Improvement

• Every program developed improvement plans
• Focus on findings in the PART assessments
• Implementation of plans and reports on progress
• Reassessments occurred once the program had made substantive changes


Page 16

The Use of the PART in the Budget Process

• Informed budget decisions (funding, legislative, and management)

• Increased prominence of performance in the Budget

• Increased accountability and focus on data and results


Page 17

Example: Migrant Education and the PART

• Collaborative process between OMB and program office.

• Program office provided evidence to back up PART answers (such as monitoring instruments, State data, action plans, etc.)

• OMB and ED met to discuss evidence
• OMB and ED shared PART drafts
• ED developed follow-up actions


Page 18

Migrant Education PART

PART Findings:
• Program is well-designed and has a good strategic planning structure
• Program is well-managed
  – Issues relating to possible inaccuracies in the eligible student count are being addressed
• States are showing progress in providing data and in improving student achievement
• Results section:
  – Ensure all States report complete and accurate data
  – Continue to improve student achievement outcomes
  – Improve efficiencies, in particular in the migrant student records transfer system
  – Complete a program evaluation

Areas for Improvement and Action Steps for Migrant Education
• Complete national audit of child eligibility determinations
• Implement and collect data on Migrant Student Information Exchange (MSIX)
• Use data, in particular on student achievement, to improve performance


Page 19

Distribution of Ratings Government-wide

[Stacked bar chart: distribution of PART ratings across all assessed programs by year: 2002 (234 programs), 2003 (407), 2004 (607), 2005 (821), 2006 (977). Rating categories: Results Not Demonstrated, Ineffective, Adequate, Moderately Effective, Effective.]

Page 20

Department of Education Cumulative Ratings

[Stacked bar chart: Department of Education cumulative PART ratings by year: 2002 (18 programs), 2003 (33), 2004 (56), 2005 (74), 2006 (89). Rating categories: Results Not Demonstrated, Ineffective, Adequate, Moderately Effective, Effective.]


Page 22

Lessons Learned

• Pros
  – Focus on results, data, performance measurement, evaluation
  – Program improvements
  – Common analysis
  – Transparency
  – Cross-program and cross-agency comparisons between similar programs
  – Identification of best practices
  – Informed budget decisions


Page 23

Lessons Learned

• Cons
  – Not consistent enough to allow trade-offs between unlike programs
  – Better for program improvement than accountability, unless coupled with strong evaluation
  – Became too burdensome
  – Not fully embraced by agencies or Congress
