GPRAMA Implementation After
Five Years
GWU/Washington Evaluators Presentation
October 27, 2015
GAO’s Strategic Issues Team
For more information, contact Kathleen Padulchick (padulchickk@gao.gov). Page 1
Overview
• The GPRA Modernization Act (GPRAMA) includes a provision
for GAO to, by September 30, 2015, evaluate and report on:
• how implementation of the act is affecting performance
management at the CFO Act agencies; and
• crosscutting goal implementation.
• GAO’s assessment focused on four key areas:
(1) Addressing crosscutting efforts;
(2) Ensuring performance information is useful and used;
(3) Aligning daily operations with results; and
(4) Communicating performance information.
Page 2
GAO’s findings are reported in: Managing for Results: Implementation of GPRA Modernization Act Yielded Mixed Progress in Addressing Pressing Governance Challenges, GAO-15-819 (Washington, D.C.: September 30, 2015).
Background
• GPRAMA provides important tools that can help decision
makers address challenges facing the federal government.
• Full and effective implementation of GPRAMA will also be
instrumental in addressing these pressing governance issues
in anticipation of the transition to the next presidential
administration in 2017.
Page 3
Background – Priority Goals and Objectives
• GPRAMA includes requirements that OMB and agencies
establish different types of government-wide and agency-level
performance goals.
• Government-wide:
• Cross-agency priority (CAP) goals - outcome-oriented
goals covering a limited number of policy areas as well
as goals for management improvements needed across
the government
• Agency-level:
• Agency priority goals (APGs) - reflect the highest
priorities of each agency and are to be informed by the
CAP goals
Page 4
Background – Leadership Positions and
Councils
• GPRAMA provided a statutory basis for selected senior
leadership positions that had been created by executive
orders, presidential memorandums, or OMB guidance.
• Chief operating officer (COO)
• Performance improvement officer (PIO)
• Goal leaders
• Performance Improvement Council (PIC)
Page 5
Background – Performance Reviews
• GPRAMA and related OMB guidance require the regular
review of progress in achieving goals and objectives through
performance reviews.
• Strategic reviews: leadership-driven, annual reviews of
their progress toward achieving each strategic objective
• Data-driven reviews: regularly scheduled structured
meetings used by organizational leaders and managers to
review and analyze data on progress toward key
performance goals and other management-improvement
priorities
Page 6
Background – Transparency and Public
Reporting
• GPRAMA includes several provisions related to transparency
and public reporting of performance information:
• Performance.gov
• Program inventory
• Performance information quality
• Major management challenges
Page 7
Background – Key GPRAMA Requirements and
Their Frequency
Page 8
Notes: *This figure addresses GPRAMA requirements and OMB implementation guidance.
Strategic and annual performance plans and performance reports were required under the Government Performance and
Results Act of 1993.
Implementation Status of GAO’s Recommendations
Made under GPRAMA, from 2012 through September
2015
Page 9
Crosscutting Issues
• Many of the meaningful results that the federal government
seeks to achieve—such as those related to protecting food
and agriculture, providing homeland security, and ensuring a
well-trained and educated workforce—require the coordinated
efforts of more than one federal agency and often more than
one sector and level of government.
• GPRAMA takes a crosscutting and integrated approach to
achieving results and improving agency performance.
Page 10
Collaboration Is Key to Addressing Challenge
Areas Across the Federal Government
Page 11
Issue Area – Example of Related Challenge
• GAO’s High Risk List – Federal Food Safety: The safety and quality of the U.S. food supply is governed by a highly complex system stemming from at least 30 laws related to food safety that are collectively administered by 15 federal agencies. GAO has identified the need for a government-wide food safety performance plan and for formalizing the Food Safety Working Group in statute to help ensure sustained leadership across food safety agencies over time.
• Fragmentation, Overlap, and Duplication – Nonemergency Medical Transportation: 42 programs across six federal departments provide funding for nonemergency medical transportation. Lead agencies are working to develop a new 2-year coordination strategy.
Collaboration Is Key to Addressing Challenge
Areas Across the Federal Government
Issue Area – Example of Progress Being Made
• GAO’s High Risk List – Sharing and Managing Terrorism-Related Information: In December 2012, the President signed the National Strategy for Information Sharing and Safeguarding. In response to the strategy, the Strategic Implementation Plan was released in 2013. The plan:
• Assigns stewards to coordinate each priority objective
• Provides time frames and milestones for achieving the outcomes in each objective
Page 12
GPRA Modernization Act Has the Potential to
Address Crosscutting Issues
1. Cross-agency Priority (CAP) Goals – GAO-14-526
2. Strategic Reviews – GAO-15-602
3. Data-Driven Reviews – GAO-15-579
4. Government-wide Program Inventory – GAO-15-83
Page 13
CAP Goals
Page 14
[Figure: current management CAP goals]
CAP Goals
Page 15
[Figure: current mission CAP goals]
OMB Has Increased Emphasis on Crosscutting Issues
through CAP Goal Guidance and Governance
• OMB is taking steps to enhance governance and
implementation
• Capacity building – 2+ CAP goal leaders
• OMB and PIC provide ongoing support
• New improved reporting tools
• Initial progress, but challenges remain in measuring progress
• 5 of the 7 CAP goals we reviewed have indicators under
development for some of their goals.
• More work needed to establish targets
Page 16
Effective Implementation of Strategic Reviews
Could Help Address Crosscutting Issues
• In July 2015 we reported on seven practices that can help
ensure agencies conduct effective strategic reviews.
Such as:
• Identifying the various strategies that influence outcomes and
determining which are most important
• Identifying key stakeholders to participate in the review
• Assessing the effectiveness in achieving strategic objectives
• Identifying actions to improve implementation and impact
Page 17
Data-Driven Reviews Should Be Collaborative
• Treasury officials said their reviews allowed different functional
management groups and program areas within their agencies
to collaborate and identify strategies that led to performance
improvements.
Page 18
Data-Driven Reviews Have Had a Positive Effect on
Collaboration, but Agencies Are Still Missing
Opportunities
• 21 of the 22 agencies we surveyed that reported holding
in-person data-driven reviews said that the reviews have had a
positive effect on collaboration.
But…
• Agencies are still missing opportunities to include
stakeholders from other federal agencies
Page 19
Issues with Program Inventories Limit Their
Usefulness
• Inconsistent approaches in defining programs across agencies
• We were unable to identify a large majority of the programs
previously identified in our work:
• Of 179 programs identified in our work, only 59 were found
in the inventory
• Plans for updating the inventories are on hold
• We recommended that OMB accelerate efforts to produce a
federal program inventory
Page 20
Ensuring Performance Information Is Useful
and Used by Managers Remains a Challenge
• Agencies can use performance information to:
• Identify performance improvement opportunities
• Improve program implementation and organizational
processes
• Inform management and resource allocation decisions.
• Agencies continue to have problems effectively using
performance information.
Page 21
Ensuring Performance Information Is Useful
and Used by Managers Remains a Challenge
Federal Agencies’ Average Scores on Use of Performance Information Index—2007 and 2013
Page 22
Agency Performance Reviews Should Improve the
Use of Performance Information for Decision Making
• Data-Driven Reviews. Agency Performance Improvement
Officers (PIO) reported that data-driven reviews have had
positive effects on the use of performance information in their
agencies. For example, PIOs from nearly all CFO Act
agencies reported:
• Their agencies always or often use data-driven reviews to
assess progress on APGs, and to identify goals at risk and
strategies for improvement.
• Reviews have had a positive effect on APG progress and
their ability to identify and mitigate risks to goal
achievement.
Page 23
Agency Performance Reviews Should Improve the
Use of Performance Information for Decision Making
• Strategic Reviews. Agency officials should use relevant
performance information and evidence during the strategic
review process to:
• Assess whether strategies are being implemented as
planned and whether they are having the desired effect.
• Identify areas where action is needed to improve
implementation and impact.
• Identify evidence gaps or areas where additional analyses
of performance data are needed.
Page 24
Page 25
Practices and Related Federal Managers Survey Questions Statistically and Positively
Related to the Use of Performance Information Index
Other Tools with the Potential to Increase Use of
Performance Information – Program Evaluation
Page 26
Agencies Reporting Changes since 2010 in Citing Evaluations as Supporting Evidence in Decisions
Other Tools with the Potential to Increase Use of
Performance Information – Pay-for-Success
Types of Organizations Most Commonly Involved in Pay for Success Projects and the Roles They Play
Page 27
Agencies Continue to Face Challenges Linking
Individual and Agency Performance to Results
• Our previous work has highlighted the importance of creating
a “line of sight” showing how unit and individual performance
can contribute to overall organizational goals.
• GPRAMA and related requirements that support this
alignment:
• Goal setting
• Goal leader designation
• Data-driven reviews
• Strategic reviews
Page 28
Goal Leader Designation and Performance
Review Meetings Provide Accountability
Page 29
Goal leader designation (GAO-14-639): Most of the 46 goal leaders we interviewed told us that the designation provides accountability. They also cited other positive effects, such as greater visibility for the goal.
Data-driven reviews: Twenty-one of the 22 agencies that reported holding in-person data-driven reviews reported that the reviews had a positive effect on the agency’s ability to hold goal leaders and others accountable for progress toward goals and milestones.
Strategic reviews: Similar to data-driven reviews, strategic reviews have the potential to promote individual accountability for organizational results.
Agencies Are Missing Opportunities to Strengthen
Alignment of Individual Performance and Results
Page 30
• GAO and others have found problems with agency oversight
and accountability.
• For example, VA used unreliable data to monitor wait times for veterans’ medical appointments. This, along with inconsistent scheduling practices, may have resulted in increased wait times and delays.
Agencies Are Missing Opportunities to Strengthen
Alignment of Individual Performance and Results
• Performance appraisal systems, including performance plans,
are a powerful mechanism for promoting alignment with and
accountability for organizational goals.
• We found that agencies are missing opportunities to use
performance plans to support alignment and accountability:
Page 31
Goal leader performance plans: The goal leader performance plans we reviewed generally did not reflect responsibility for goal achievement.
SES ratings (GAO-15-189): The five agencies we reviewed linked SES performance plans with agency goals, but we found disparities in ratings distributions and pay.
Examples of Difficulties Agencies Face in
Measuring Performance by Program Type
Program Type – Example of Problems Measuring Performance
• Government contracts: Agencies have not systematically reviewed portfolios of energy savings performance contracts (GAO-15-432)
• Direct services: TSA has not fully measured the effectiveness of its Secure Flight screening program (GAO-14-531)
• Grants: DOJ lacks information to evaluate grants targeting child abuse (GAO-15-351)
• Regulations: USDA needs additional performance measures for its efforts to reduce contamination in poultry (GAO-14-744)
• Research and development: DHS had difficulties evaluating outcomes of research and development for radiation detection technology (GAO-15-263)
• Tax expenditures: The generating capacity of renewable energy projects financed through tax expenditures is unknown because IRS is not required to collect or report project-level data (GAO-15-302)
Page 32
Examples of Agency Problems Measuring
Customer Service
• Our examination of customer service standards for six federal
programs (at five agencies) found that three of the programs
did not have standards that met key elements of customer
service standards.
Page 33
Agency – Standards include targets or goals for performance / Standards include performance measures
• Customs and Border Protection: No / No
• Forest Service: No / No
• Federal Student Aid: Yes / Yes
• National Park Service: No / No
• VBA – Disability Compensation: Yes / Yes
• VBA – Veterans’ Group Life Insurance: Yes / Yes
Source: GAO analysis of agency documentation | GAO-15-84.
OMB and Agencies Have Not Clearly
Communicated Key Performance Information
• Congress, the administration, and federal managers must have ready access
to reliable and complete financial and performance information in order to
address the federal government’s fiscal and performance challenges.
• We identified the communication of performance information as a challenge
for federal agencies in our 2013 report, and our work since then has shown
that the challenge has persisted.
• Our work has identified areas in which agencies have not clearly reported
information:
Page 34
Issue Area – Example of Transparency Problems
• USA Spending website: Agencies did not properly report information on assistance awards totaling $619 billion (GAO-14-476)
• Spending on broadband: USDA did not report on the effects of about $3 billion in spending intended to increase broadband availability (GAO-14-511)
More Effective Implementation of GPRAMA
Requirements Would Improve Transparency
Program Inventories
• Effective implementation of GPRAMA’s program inventory
provisions and the DATA Act—especially the ability to crosswalk
spending data to individual programs—could provide vital
information to decision makers.
• OMB could merge the two laws’ requirements by exploring
ways to improve the comparability of program data, such as
tagging or similar approaches.
Page 35
More Effective Implementation of GPRAMA
Requirements Would Improve Transparency
Major Management Challenges
• We found that agencies did not generally report this information
in their plans in a transparent manner. For example:
• Some agencies told us they had internal plans for addressing
their major management challenges, but 12 of 24 agencies
did not publicly report their planned actions to address these
challenges.
• Reasons why agencies did not report this information varied,
but agencies told us OMB guidance appeared to give them
flexibility in the information they needed to report.
Page 36
More Effective Implementation of GPRAMA
Requirements Would Improve Transparency
CAP Goals
• We found that quarterly updates for the 14 interim CAP goals did
not always provide a complete picture of progress, with some
missing information like targets or milestone due dates.
• The incomplete information provided a limited basis for ensuring
accountability for progress toward targets and milestones, so we
recommended steps to ensure missing information was reported.
• For the current CAP goals, OMB has taken actions to address
our recommendations:
• For example, reporting templates now direct CAP goal teams
to identify targets and milestone due dates.
Page 37
More Effective Implementation of GPRAMA
Requirements Would Improve Transparency
Quality of Performance Information
• The agencies we reviewed generally did not publicly report on how they ensured the
accuracy and reliability of the performance information used to measure progress on their
APGs. GAO made recommendations to the agencies, and in response to GAO’s review,
OMB revised Circular A-11 to require data quality reporting on Performance.gov.
Page 38
Agency – Described how it ensured performance information quality overall / Number of APGs for FY14 and FY15 / Number of APGs with a description of how each met GPRAMA performance information quality requirements
• Agriculture: Yes / 3 / 0
• Defense: Yes / 4 / 0
• Homeland Security: Yes / 3 / 3
• Interior: Yes / 6 / 0
• Labor: Yes / 3 / 0
• NASA: Yes / 4 / 0
Source: GAO analysis of selected agencies’ performance plans and reports. | GAO-15-788
Recent Related GAO Reports
• Managing for Results: Greater Transparency Needed in Public Reporting on the Quality of Performance Information for Selected Agencies’ Priority Goals, GAO-15-788, Sept. 2015.
• Managing for Results: Practices for Effective Agency Strategic Reviews, GAO-15-602, July 2015.
• Managing for Results: Agencies Report Positive Effects of Data-Driven Reviews on Performance but Some Should Strengthen Practices, GAO-15-579, July 2015.
• Program Evaluation: Some Agencies Reported that Networking, Hiring, and Involving Program Staff Help Build Capacity, GAO-15-25, Nov. 2014.
• Government Efficiency and Effectiveness: Inconsistent Definitions and Information Limit the Usefulness of Federal Program Inventories, GAO-15-83, Oct. 2014.
• Managing for Results: Selected Agencies Need to Take Additional Efforts to Improve Customer Service, GAO-15-84, Oct. 2014.
• Managing for Results: Agencies’ Trends in the Use of Performance Information to Make Decisions, GAO-14-747, Sept. 2014.
• Managing for Results: Enhanced Goal Leader Accountability and Collaboration Could Further Improve Agency Performance, GAO-14-639, July 2014.
• Managing for Results: OMB Should Strengthen Reviews of Cross-Agency Goals, GAO-14-526, June 2014.
• Managing for Results: Implementation Approaches Used to Enhance Collaboration in Interagency Groups, GAO-14-220, Feb. 2014.
Page 39
Page 40
GAO on the Web
Web site: http://www.gao.gov/
Congressional Relations
Katherine Siggerud, Managing Director, siggerudk@gao.gov
(202) 512-4400, U.S. Government Accountability Office
441 G Street, NW, Room 7125, Washington, DC 20548
Public Affairs
Chuck Young, Managing Director, youngc1@gao.gov
(202) 512-4800, U.S. Government Accountability Office
441 G Street, NW, Room 7149, Washington, DC 20548
Copyright
This is a work of the U.S. government and is not subject to copyright protection in the United States.
The published product may be reproduced and distributed in its entirety without further permission
from GAO. However, because this work may contain copyrighted images or other material,
permission from the copyright holder may be necessary if you wish to reproduce this material
separately.

More Related Content

PPTX
Challenges and Solutions to Conducting High Quality Contract Evaluations for ...
PPT
GAO GPRA Modernization Act Overview
PDF
Lessons from the US Perfromance Management System by Donald Moynihan
PDF
GPRA Modernization Act Overview (GAO-Benjamin Licht)
PPTX
ASPA Presentation - Performance Budgeting at the State Level (3-2014)
PPTX
APO Lecture: Best Practices II
PDF
Implementing Strategic Reviews by Mark Bussow
PDF
Evolution of budgeting system in malaysia (10 page)
Challenges and Solutions to Conducting High Quality Contract Evaluations for ...
GAO GPRA Modernization Act Overview
Lessons from the US Perfromance Management System by Donald Moynihan
GPRA Modernization Act Overview (GAO-Benjamin Licht)
ASPA Presentation - Performance Budgeting at the State Level (3-2014)
APO Lecture: Best Practices II
Implementing Strategic Reviews by Mark Bussow
Evolution of budgeting system in malaysia (10 page)

What's hot (20)

PDF
OECD best practices for performance budgeting - Jon BLÖNDAL, OECD
PPT
Organizing and Managing Program Evaluation
PDF
Integrated Development Performance Monitoring and Evaluation System in Indone...
PDF
Best practices for performance budgeting - Ivor BEAZLEY, OECD
PPTX
Result based management
PDF
Performance Budgeting - Key Performance Indicators -- Wojciech ZIELINSKI, OECD
PPT
Log Frames and Indicators for Result Based Management (IWC5 Presentation)
PPT
Rbm for improved dev results
PPTX
Altus Alliance 2016 - Performance-based Budgeting with Questica
PDF
360 Policy Implementation Presentation and Understanding.
DOCX
Example pol 501601 budget and financial management and admini
PPT
Be independent and objective
PDF
9.00 10.15am How To Initiate A Performance Framework (Pokar Khemani) English
PPTX
Overlooked Links in the Results Chain
PPTX
PIP overview presentation
PDF
Monitoring and Evaluation System for CAADP Implementation_2010
PPT
A Primer On Performance Based Budgeting For State & Local Government Agencies
PPTX
2014 Head Start Program Governance
PDF
Evolution of budgeting system in malaysia presentation (3 nov 3pm edit)
PDF
Fairfax County Economic Success Strategic Plan 2019 Update
OECD best practices for performance budgeting - Jon BLÖNDAL, OECD
Organizing and Managing Program Evaluation
Integrated Development Performance Monitoring and Evaluation System in Indone...
Best practices for performance budgeting - Ivor BEAZLEY, OECD
Result based management
Performance Budgeting - Key Performance Indicators -- Wojciech ZIELINSKI, OECD
Log Frames and Indicators for Result Based Management (IWC5 Presentation)
Rbm for improved dev results
Altus Alliance 2016 - Performance-based Budgeting with Questica
360 Policy Implementation Presentation and Understanding.
Example pol 501601 budget and financial management and admini
Be independent and objective
9.00 10.15am How To Initiate A Performance Framework (Pokar Khemani) English
Overlooked Links in the Results Chain
PIP overview presentation
Monitoring and Evaluation System for CAADP Implementation_2010
A Primer On Performance Based Budgeting For State & Local Government Agencies
2014 Head Start Program Governance
Evolution of budgeting system in malaysia presentation (3 nov 3pm edit)
Fairfax County Economic Success Strategic Plan 2019 Update
Ad

Similar to GPRAMA Implementation After Five Years (20)

DOCX
PROGRAM EVALUATION Some Agencies Reported that .docx
PPTX
Role of ICT & big data in performance budgeting - Lenora Stiles, United States
PPTX
Performance measurementpresentation
PDF
Zients Testimony
PDF
pnadw107.pdf
DOCX
Monitoring and Evaluation Policy
PDF
Aligning the centre and line ministries - Mark Bussow, United States
PPT
Monitoring and evaluation seminar_module3.ppt
PPT
formseminar_module3.ppMJKJHJKtFRGKJTYJGF
PPT
formseminar_module: Building M& E System
PPTX
Regional Development and RBM: Proposals for improvement
PDF
Designing baseline surveys for impact analysis and evaluation of progress
PPTX
Enhancing SDGs through M&E (2nd version).pptx
PPT
Chapter III of project planning-PPP (2).ppt
PDF
Strategic planguidelines
PDF
Progress, Effectiveness, and Gaps Monitoring and Evaluation Tool
PPTX
NVBIA NAIOP GOAL 3 Update
PDF
Pfm Measure 2008
PDF
Pfm Measure 2008
PDF
Pfm Measure 2008
PROGRAM EVALUATION Some Agencies Reported that .docx
Role of ICT & big data in performance budgeting - Lenora Stiles, United States
Performance measurementpresentation
Zients Testimony
pnadw107.pdf
Monitoring and Evaluation Policy
Aligning the centre and line ministries - Mark Bussow, United States
Monitoring and evaluation seminar_module3.ppt
formseminar_module3.ppMJKJHJKtFRGKJTYJGF
formseminar_module: Building M& E System
Regional Development and RBM: Proposals for improvement
Designing baseline surveys for impact analysis and evaluation of progress
Enhancing SDGs through M&E (2nd version).pptx
Chapter III of project planning-PPP (2).ppt
Strategic planguidelines
Progress, Effectiveness, and Gaps Monitoring and Evaluation Tool
NVBIA NAIOP GOAL 3 Update
Pfm Measure 2008
Pfm Measure 2008
Pfm Measure 2008
Ad

More from Washington Evaluators (20)

PPTX
2020 WE Member Engagement Survey Results - Summary
PDF
Building a Community of Practice through WE's Mentor Minutes
PPTX
George Julnes: Humility in Valuing in the Public Interest - Multiple Methods ...
PPTX
Harry Hatry: Cost-Effectiveness Basics for Evidence-Based Policymaking
PPTX
George Julnes: Humility in Valuing in the Public Interest - Multiple Methods ...
PDF
DC Consortium Student Conference 1.0
PPTX
Are Federal Managers Using Evidence in Decision Making?
PDF
Causal Knowledge Mapping for More Useful Evaluation
PDF
Partnerships for Transformative Change in Challenging Political Contexts w/ D...
PPTX
@WashEval: Facilitating Evaluation Collaboration for 30+ Years
PPTX
Transitioning from School to Work: Preparing Evaluation Students and New Eval...
PPTX
The Importance of Systematic Reviews
PPT
Junge wb bb presentation 06 17-15 final
PPTX
Building Program Evaluation Capacity in Central Asia, Part 1
PPTX
Building Program Evaluation Capacity in Central Asia, Part 1
PDF
Washington Evaluators 2014 Annual Report
PPTX
Sustaining an Evaluator Community of Practice
PPTX
Visualizing Evaluation Results
PDF
Influencing Evaluation Policy and Practice: The American Evaluation Associati...
PDF
Emerging directions and challenges in survey methods
2020 WE Member Engagement Survey Results - Summary
Building a Community of Practice through WE's Mentor Minutes
George Julnes: Humility in Valuing in the Public Interest - Multiple Methods ...
Harry Hatry: Cost-Effectiveness Basics for Evidence-Based Policymaking
George Julnes: Humility in Valuing in the Public Interest - Multiple Methods ...
DC Consortium Student Conference 1.0
Are Federal Managers Using Evidence in Decision Making?
Causal Knowledge Mapping for More Useful Evaluation
Partnerships for Transformative Change in Challenging Political Contexts w/ D...
@WashEval: Facilitating Evaluation Collaboration for 30+ Years
Transitioning from School to Work: Preparing Evaluation Students and New Eval...
The Importance of Systematic Reviews
Junge wb bb presentation 06 17-15 final
Building Program Evaluation Capacity in Central Asia, Part 1
Building Program Evaluation Capacity in Central Asia, Part 1
Washington Evaluators 2014 Annual Report
Sustaining an Evaluator Community of Practice
Visualizing Evaluation Results
Influencing Evaluation Policy and Practice: The American Evaluation Associati...
Emerging directions and challenges in survey methods

Recently uploaded (20)

PDF
ESG Alignment in Action - The Abhay Bhutada Foundation
PDF
CXPA Finland Webinar: Rated 5 Stars - Delivering Service That Customers Truly...
PPTX
Presentatio koos kokos koko ossssn5.pptx
PDF
PPT Item # 5 - 5307 Broadway St (Final Review).pdf
PPTX
11Sept2023_LTIA-Cluster-Training-Presentation.pptx
PPTX
BHARATIYA NAGARIKA SURAKSHA SAHMITA^J2023 (1).pptx
PDF
Item # 4 -- 328 Albany St. compt. review
PPTX
True Fruits_ reportcccccccccccccccc.pptx
PPTX
SUKANYA SAMRIDDHI YOJANA RESEARCH REPORT AIMS OBJECTIVES ITS PROVISION AND IM...
PPT
The Central Civil Services (Leave Travel Concession) Rules, 1988, govern the ...
PPTX
Weekly Report 17-10-2024_cybersecutity.pptx
PDF
UNEP/ UNEA Plastic Treaty Negotiations Report of Inc 5.2 Geneva
PDF
eVerify Overview and Detailed Instructions to Set up an account
PDF
Population Estimates 2025 Regional Snapshot 08.11.25
PPTX
Part I CSO Conference and AVP Overview.pptx
PDF
Item # 5 - 5307 Broadway St final review
PDF
Item # 3 - 934 Patterson Final Review.pdf
PPTX
SOMANJAN PRAMANIK_3500032 2042.pptx
PDF
Abhay Bhutada Foundation’s ESG Compliant Initiatives
PPTX
Workshop-Session-1-LGU-WFP-Formulation.pptx
ESG Alignment in Action - The Abhay Bhutada Foundation
CXPA Finland Webinar: Rated 5 Stars - Delivering Service That Customers Truly...
Presentatio koos kokos koko ossssn5.pptx
PPT Item # 5 - 5307 Broadway St (Final Review).pdf
11Sept2023_LTIA-Cluster-Training-Presentation.pptx
BHARATIYA NAGARIKA SURAKSHA SAHMITA^J2023 (1).pptx
Item # 4 -- 328 Albany St. compt. review
True Fruits_ reportcccccccccccccccc.pptx
SUKANYA SAMRIDDHI YOJANA RESEARCH REPORT AIMS OBJECTIVES ITS PROVISION AND IM...
The Central Civil Services (Leave Travel Concession) Rules, 1988, govern the ...
Weekly Report 17-10-2024_cybersecutity.pptx
UNEP/ UNEA Plastic Treaty Negotiations Report of Inc 5.2 Geneva
eVerify Overview and Detailed Instructions to Set up an account
Population Estimates 2025 Regional Snapshot 08.11.25
Part I CSO Conference and AVP Overview.pptx
Item # 5 - 5307 Broadway St final review
Item # 3 - 934 Patterson Final Review.pdf
SOMANJAN PRAMANIK_3500032 2042.pptx
Abhay Bhutada Foundation’s ESG Compliant Initiatives
Workshop-Session-1-LGU-WFP-Formulation.pptx

GPRAMA Implementation After Five Years

  • 1. GPRAMA Implementation After Five Years GWU/Washington Evaluators Presentation October 27, 2015 GAO’s Strategic Issues Team For more information, contact Kathleen Padulchick (padulchickk@gao.gov). Page 1
  • 2. Overview • The GPRA Modernization Act (GPRAMA) includes a provision for GAO to, by September 30, 2015, evaluate and report on: • how implementation of the act is affecting performance management at the CFO Act agencies; and • crosscutting goal implementation. • GAO’s assessment focused on four key areas: (1) Addressing crosscutting efforts; (2) Ensuring performance information is useful and used; (3) Aligning daily operations with results; and (4) Communicating performance information. Page 2GAO’s findings are reported in: Managing for Results: Implementation of GPRA Modernization Act Yielded Mixed Progress in Addressing Pressing Governance Challenges, GAO-15-819 (Washington, D.C.: September 30, 2015).
  • 3. Background • GPRAMA provides important tools that can help decision makers address challenges facing the federal government. • Full and effective implementation of GPRAMA will also be instrumental in addressing these pressing governance issues in anticipation of the transition to the next presidential administration in 2017. Page 3
  • 4. Background – Priority Goals and Objectives • GPRAMA includes requirements that OMB and agencies establish different types of government-wide and agency-level performance goals. • Government-wide: • Cross-agency priority (CAP) goals - outcome-oriented goals covering a limited number of policy areas as well as goals for management improvements needed across the government • Agency-level: • Agency priority goals (APG) - reflect the highest priorities of each of these agencies, and to be informed by the CAP goals Page 4
  • 5. Background – Leadership Positions and Councils • GPRAMA provided a statutory basis for selected senior leadership positions that had been created by executive orders, presidential memorandums, or OMB guidance. • Chief operating officer (COO) • Performance improvement officer (PIO) • Goal leaders • Performance Improvement Council (PIC) Page 5
  • 6. Background – Performance Reviews • GPRAMA and related OMB guidance require the regular review of progress in achieving goals and objectives through performance reviews. • Strategic reviews: leadership-driven, annual reviews of their progress toward achieving each strategic objective • Data-driven reviews: regularly scheduled structured meetings used by organizational leaders and managers to review and analyze data on progress toward key performance goals and other management-improvement priorities Page 6
  • 7. Background – Transparency and Public Reporting • GPRAMA includes several provisions related to transparency and public reporting of performance information: • Performance.gov • Program inventory • Performance information quality • Major management challenges Page 7
  • 8. Background – Key GPRAMA Requirements and Their Frequency Page 8 Notes: *This figure addresses GPRAMA requirements and OMB implementation guidance. Strategic and annual performance plans and performance reports were required under the Government Performance and Results Act of 1993.
  • 9. Implementation Status of GAO’s Recommendations Made under GPRAMA, from 2012 through September 2015 Page 9
  • 10. Crosscutting Issues • Many of the meaningful results that the federal government seeks to achieve—such as those related to protecting food and agriculture, providing homeland security, and ensuring a well-trained and educated workforce—require the coordinated efforts of more than one federal agency and often more than one sector and level of government. • GPRAMA takes a crosscutting and integrated approach to achieving results and improving agency performance. Page 10
  • 11. Collaboration is Key to Address Challenge Areas Across the Federal Government • GAO’s High Risk List – Federal Food Safety: The safety and quality of the U.S. food supply is governed by a highly complex system stemming from at least 30 laws related to food safety that are collectively administered by 15 federal agencies. There is a need for a government-wide food safety performance plan and for formalizing the Food Safety Working Group in statute to help ensure sustained leadership across food safety agencies over time. • Fragmentation, Overlap, and Duplication – Nonemergency Medical Transportation: 42 programs across six federal departments provide funding for nonemergency medical transportation. Lead agencies are working to develop a new 2-year strategy to coordinate. Page 11
  • 12. Collaboration is Key to Address Challenge Areas Across the Federal Government Issue Area Example of Progress Being Made GAO’s High Risk List – Sharing and Managing Terrorism-Related Information In December 2012, the President signed the National Strategy for Information Sharing and Safeguarding. In response to the strategy, the Strategic Implementation Plan was released in 2013. The plan: • Assigns stewards to coordinate each priority objective • Provides time frames and milestones for achieving the outcomes in each objective Page 12
  • 13. GPRA Modernization Act Has the Potential to Address Crosscutting Issues 1. Cross-agency Priority (CAP) Goals – GAO-14-526 2. Strategic Reviews – GAO-15-602 3. Data-Driven Reviews – GAO-15-579 4. Government-wide Program Inventory – GAO-15-83 Page 13
  • 16. OMB Has Increased Emphasis on Crosscutting Issues through CAP Goal Guidance and Governance • OMB is taking steps to enhance governance and implementation • Capacity building – 2+ CAP goal leaders • OMB and PIC provide ongoing support • New, improved reporting tools • Initial progress, but challenges remain in measuring progress • 5 of the 7 CAP goals we reviewed still had indicators under development for some of their goals. • More work is needed to establish targets Page 16
  • 17. Effective Implementation of Strategic Reviews Could Help Address Crosscutting Issues • In July 2015 we reported on seven practices that can help ensure agencies conduct effective strategic reviews, such as: • Identifying the various strategies that influence outcomes and determining which are most important • Identifying key stakeholders to participate in the review • Assessing effectiveness in achieving strategic objectives • Identifying actions to improve implementation and impact Page 17
  • 18. Data-Driven Reviews Should Be Collaborative • Treasury officials said their reviews allowed different functional management groups and program areas within their agencies to collaborate and identify strategies that led to performance improvements. Page 18
  • 19. Data-Driven Reviews Have Had a Positive Effect on Collaboration, but Agencies Are Still Missing Opportunities • 21 of the 22 agencies we surveyed that reported holding in-person data-driven reviews said that the reviews have had a positive effect on collaboration. But… • Agencies are still missing opportunities to include stakeholders from other federal agencies Page 19
  • 20. Issues with Program Inventories Limit Their Usefulness • Inconsistent approaches in defining programs across agencies • We were unable to find a large majority of the programs previously identified in our work: • Of those 179 programs, only 59 were found in the inventory • Plans for updating the inventories are on hold • We recommended that OMB accelerate efforts to produce a federal program inventory Page 20
  • 21. Ensuring Performance Information Is Useful and Used by Managers Remains a Challenge • Agencies can use performance information to: • Identify performance improvement opportunities • Improve program implementation and organizational processes • Inform management and resource allocation decisions. • Agencies continue to have problems effectively using performance information. Page 21
  • 22. Ensuring Performance Information Is Useful and Used by Managers Remains a Challenge Federal Agencies’ Average Scores on Use of Performance Information Index—2007 and 2013 Page 22
  • 23. Agency Performance Reviews Should Improve the Use of Performance Information for Decision Making • Data-Driven Reviews. Agency Performance Improvement Officers (PIO) reported that data-driven reviews have had positive effects on the use of performance information in their agencies. For example, PIOs from nearly all CFO Act agencies reported: • Their agencies always or often use data-driven reviews to assess progress on APGs, and to identify goals at risk and strategies for improvement. • Reviews have had a positive effect on APG progress and their ability to identify and mitigate risks to goal achievement. Page 23
  • 24. Agency Performance Reviews Should Improve the Use of Performance Information for Decision Making • Strategic Reviews. Agency officials should use relevant performance information and evidence during the strategic review process to: • Assess whether strategies are being implemented as planned and whether they are having the desired effect. • Identify areas where action is needed to improve implementation and impact. • Identify evidence gaps or areas where additional analyses of performance data are needed. Page 24
  • 25. Page 25 Practices and Related Federal Managers Survey Questions Statistically and Positively Related to the Use of Performance Information Index
  • 26. Other Tools with the Potential to Increase Use of Performance Information – Program Evaluation Page 26 Agencies Reporting Changes since 2010 in Citing Evaluations as Supporting Evidence in Decisions
  • 27. Other Tools with the Potential to Increase Use of Performance Information – Pay-for-Success Types of Organizations Most Commonly Involved in Pay for Success Projects and the Roles They Play Page 27
  • 28. Agencies Continue to Face Challenges Linking Individual and Agency Performance to Results • Our previous work has highlighted the importance of creating a “line of sight” showing how unit and individual performance can contribute to overall organizational goals. • GPRAMA and related requirements that support this alignment: • Goal setting • Goal leader designation • Data-driven reviews • Strategic reviews Page 28
  • 29. Goal Leader Designation and Performance Review Meetings Provide Accountability • Goal leader designation (GAO-14-639): Most of the 46 goal leaders we interviewed told us that the designation provides accountability. They also cited other positive effects, such as greater visibility for the goal. • Data-driven reviews: Twenty-one of the 22 agencies that reported holding in-person data-driven reviews reported that the reviews had a positive effect on the agency’s ability to hold goal leaders and others accountable for progress toward goals and milestones. • Strategic reviews: Similar to data-driven reviews, strategic reviews have the potential to promote individual accountability for organizational results. Page 29
  • 30. Agencies Are Missing Opportunities to Strengthen Alignment of Individual Performance and Results Page 30 • GAO and others have found problems with agency oversight and accountability. • For example, VA used unreliable data to monitor wait times for veterans’ medical appointments. This, along with inconsistent scheduling practices, may have resulted in increased wait times and delays.
  • 31. Agencies Are Missing Opportunities to Strengthen Alignment of Individual Performance and Results • Performance appraisal systems, including performance plans, are a powerful mechanism for promoting alignment with and accountability for organizational goals. • We found that agencies are missing opportunities to use performance plans to support alignment and accountability: • Goal leader performance plans: the plans we reviewed generally did not reflect responsibility for goal achievement. • SES ratings (GAO-15-189): the five agencies we reviewed linked SES performance plans with agency goals, but we found disparities in ratings distributions and pay. Page 31
  • 32. Examples of Difficulties Agencies Face in Measuring Performance by Program Type • Government contracts: Agencies have not systematically reviewed portfolios of energy savings performance contracts (GAO-15-432) • Direct services: TSA has not fully measured the effectiveness of its Secure Flight screening program (GAO-14-531) • Grants: DOJ lacks information to evaluate grants targeting child abuse (GAO-15-351) • Regulations: USDA needs additional performance measures for its efforts to reduce contamination in poultry (GAO-14-744) • Research and development: DHS had difficulties evaluating outcomes of research and development for radiation detection technology (GAO-15-263) • Tax expenditures: The generating capacity of renewable energy projects financed through tax expenditures is unknown because IRS is not required to collect or report project-level data (GAO-15-302) Page 32
  • 33. Examples of Agency Problems Measuring Customer Service • Our examination of customer service standards for six federal programs (at five agencies) found that three of the programs did not have standards that met key elements of customer service standards. • Programs whose standards included neither targets or goals for performance nor performance measures: Customs and Border Protection, Forest Service, National Park Service • Programs whose standards included both: Federal Student Aid, VBA – Disability Compensation, VBA – Veterans’ Group Life Insurance Source: GAO analysis of agency documentation | GAO-15-84. Page 33
  • 34. OMB and Agencies Have Not Clearly Communicated Key Performance Information • Congress, the administration, and federal managers must have ready access to reliable and complete financial and performance information in order to address the federal government’s fiscal and performance challenges. • We identified the communication of performance information as a challenge for federal agencies in our 2013 report, and our work since then has shown that the challenge has persisted. • Our work has identified areas in which agencies have not clearly reported information: • USAspending.gov website: Agencies did not properly report information on assistance awards totaling $619 billion (GAO-14-476) • Spending on broadband: USDA did not report on the effects of ~$3 billion in spending intended to increase broadband availability (GAO-14-511) Page 34
  • 35. More Effective Implementation of GPRAMA Requirements Would Improve Transparency Program Inventories • Effective implementation of GPRAMA’s program inventory provisions and the DATA Act—especially the ability to crosswalk spending data to individual programs—could provide vital information to decision makers. • OMB could approach merging the two laws’ requirements by exploring tagging or similar approaches to improve the comparability of program data. Page 35
  • 36. More Effective Implementation of GPRAMA Requirements Would Improve Transparency Major Management Challenges • We found that agencies generally did not report this information in their plans in a transparent manner. For example: • Some agencies told us they had internal plans for addressing their major management challenges, but 12 of 24 agencies did not publicly report their planned actions to address these challenges. • Reasons why agencies did not report this information varied, but agencies told us OMB guidance appeared to give them flexibility on the information they needed to report. Page 36
  • 37. More Effective Implementation of GPRAMA Requirements Would Improve Transparency CAP Goals • We found that quarterly updates for the 14 interim CAP goals did not always provide a complete picture of progress, with some missing information like targets or milestone due dates. • The incomplete information provided a limited basis for ensuring accountability for progress toward targets and milestones, so we recommended steps to ensure missing information was reported. • For the current CAP goals, OMB has taken actions to address our recommendations: • For example, reporting templates now direct CAP goal teams to identify targets and milestone due dates. Page 37
  • 38. More Effective Implementation of GPRAMA Requirements Would Improve Transparency Quality of Performance Information • The agencies reviewed generally did not publicly report on how they ensured the accuracy and reliability of performance information used to measure progress on their APGs. GAO made recommendations to the agencies, and in response to GAO’s review, OMB changed Circular A-11 to require data quality reporting on Performance.gov. • All six agencies reviewed described how they ensured performance information quality overall, but only Homeland Security described how each of its APGs met GPRAMA performance information quality requirements: Agriculture (3 APGs for FY14 and FY15, 0 described), Defense (4, 0), Homeland Security (3, 3), Interior (6, 0), Labor (3, 0), NASA (4, 0). Source: GAO analysis of selected agencies’ performance plans and reports. | GAO-15-788 Page 38
  • 39. Recent Related GAO Reports • Managing for Results: Greater Transparency Needed in Public Reporting on the Quality of Performance Information for Selected Agencies’ Priority Goals, GAO-15-788, Sept 2015. • Managing for Results: Practices for Effective Agency Strategic Reviews, GAO-15-602, July 2015. • Managing for Results: Agencies Report Positive Effects of Data-Driven Reviews on Performance but Some Should Strengthen Practices, GAO-15-579, July 2015. • Program Evaluation: Some Agencies Reported that Networking, Hiring, and Involving Program Staff Help Build Capacity, GAO-15-25, Nov 2014. • Government Efficiency and Effectiveness: Inconsistent Definitions and Information Limit the Usefulness of Federal Program Inventories, GAO-15-83, Oct 2014. • Managing for Results: Selected Agencies Need to Take Additional Efforts to Improve Customer Service, GAO-15-84, Oct 2014. • Managing for Results: Agencies’ Trends in the Use of Performance Information to Make Decisions, GAO-14-747, Sept 2014. • Managing for Results: Enhanced Goal Leader Accountability and Collaboration Could Further Improve Agency Performance, GAO-14-639, July 2014. • Managing for Results: OMB Should Strengthen Reviews of Cross-Agency Goals, GAO-14-526, June 2014. • Managing for Results: Implementation Approaches Used to Enhance Collaboration in Interagency Groups, GAO-14-220, Feb 2014. Page 39
  • 40. Page 40 GAO on the Web Web site: http://www.gao.gov/ Congressional Relations Katherine Siggerud, Managing Director, siggerudk@gao.gov (202) 512-4400, U.S. Government Accountability Office 441 G Street, NW, Room 7125, Washington, DC 20548 Public Affairs Chuck Young, Managing Director, youngc1@gao.gov (202) 512-4800, U.S. Government Accountability Office 441 G Street, NW, Room 7149, Washington, DC 20548 Copyright This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

Editor's Notes

  • #6: Chief operating officer (COO): overall responsibility for improving agency management and performance. Performance improvement officer (PIO): assists the agency head and COO with performance management activities. Goal leaders: designated for CAP goals, strategic objectives, and APGs. Performance Improvement Council (PIC): assists OMB in improving the performance of the federal government and achieving the CAP goals, facilitates the exchange of useful performance improvement practices, and works to resolve government-wide or crosscutting performance issues.
  • #8: GPRAMA includes several provisions related to reporting of performance information. Performance.gov: a single, government-wide performance website to communicate government-wide and agency performance information. Program inventory: a publicly available list of all federal programs identified by agencies, along with related budget and performance information. Performance information quality: requirements for agencies to describe how they are ensuring the accuracy and reliability of the data used to measure progress toward APGs and performance goals. Major management challenges: challenges addressed in performance plans, which may include programs or management functions that have greater vulnerability to fraud, waste, abuse, and mismanagement.
  • #10: OMB and Agencies Generally Agreed with GAO’s Prior Recommendations to Improve GPRAMA Implementation, but Most Have Not Yet Been Implemented Since GPRAMA was enacted in January 2011, we have made a total of 69 recommendations to OMB and agencies aimed at improving its implementation. OMB and the agencies have generally agreed with the recommendations we have made thus far, and have implemented some of them. However, of the 69 recommendations we have made, 55 (about 80 percent) have not yet been implemented, while 14 recommendations (about 20 percent) have been implemented. OMB, which has been the focus of most of our recommendations, has implemented just over one-third (14) of the 38 recommendations we have made to it.
  • #12: Food safety: HHS and USDA vary in the amount of detail they provide on their crosscutting food safety efforts, and they do not include several relevant crosscutting efforts in their strategic and performance planning documents. HHS and USDA have mechanisms in place to facilitate interagency coordination on food safety that focus on specific issues, but they do not provide for broad-based, centralized collaboration. Centralized mechanisms are needed. NEMT: An interagency coordinating council was developed to enhance federal, state, and local coordination activities, and it has taken some actions to address program coordination. Leadership has been limited: the council has not convened since 2008.
  • #18: Managing for Results: Practices for Effective Agency Strategic Reviews, GAO-15-602. Characteristics of strategic reviews: leadership-driven; held annually; assessment of an agency’s progress toward its strategic objectives; identification of strategies for performance improvement, such as strengthening collaboration to better address crosscutting challenges and using evidence to identify and implement more effective program designs.
  • #19: GPRAMA requires data-driven reviews for agencies’ APGs. Characteristics of data-driven reviews: regularly scheduled, structured meetings; include key players; leaders and managers review and analyze data on progress toward agency performance goals; used to identify at-risk goals and strategies to improve performance; participants engage in rigorous and sustained follow-up.
  • #20: Most agencies reported that relevant contributors from other federal agencies did not participate in their reviews. As we previously reported in 2013, failing to include all goal contributors may lead to missed opportunities to have all the relevant parties apply their knowledge of the issues and participate in developing solutions to performance problems. As a result, in that 2013 report, we recommended that OMB work with the PIC and other relevant groups to identify and share promising practices to help agencies extend their performance reviews to include, as relevant, representatives from outside organizations that contribute to achieving their agency performance goals. OMB staff said that while agencies have found that at times it is useful to engage external stakeholders in improving program delivery, officials view data-driven reviews as internal agency management meetings and believe it would not always be appropriate to regularly include external representatives.
  • #21: According to OMB staff, agencies used different approaches for valid and legitimate reasons and a one-size-fits-all approach would not work for all agency inventories. While this may be true, OMB could do more to direct agencies to find common ground on similar programs. One of OMB’s stated purposes for the inventories is to facilitate coordination across programs that contribute to similar outcomes. However, OMB put the plans for updating the inventories on indefinite hold and agencies have not published updated inventories with program-level budget information, in part due to the enactment of the DATA Act. OMB staff told us that they are considering how implementation of DATA Act requirements can be tied to the program inventories.
  • #23: Comparing reported use of performance information by federal managers in 2007 and 2013, only 2 agencies experienced a statistically significant improvement, while 4 agencies experienced a statistically significant decline.
  • #24: Reviews are designed to shift agency practices from the passive collection and reporting of performance information to a model where agency leaders actively use this data to assess performance, diagnose problems, and decide on next steps to improve performance. Data-driven reviews. GPRAMA and OMB require that in-person reviews of progress of agency priority goals be held at least quarterly, be led by the agency head and/or COO, and be used to assess progress toward each goal and develop strategies to improve performance where necessary.
  • #25: Strategic reviews. OMB has directed agencies to conduct strategic reviews, which are leadership-driven, annual reviews of progress on an agency’s strategic objectives (the outcome or impact the agency intends to achieve through its various programs and initiatives).
  • #26: Other Practices Have the Potential to Increase the Use of Performance Information Agencies can adopt leading practices to enhance the use of performance information for policy and program decisions aimed at improving results: Aligning agency-wide goals, objectives, and measures Improving the usefulness of performance information Developing agency capacity to use performance information Demonstrating management commitment Communicating performance information frequently and effectively.
  • #27: Evidence-based tools that can increase use of performance data include: Program evaluations. Systematic studies of program performance can influence program management and policy decisions if leaders support using evaluations for program improvement, there is a strong body of evidence, and stakeholders are engaged throughout the process.
  • #28: Evidence-based tools that can increase use of performance data include Pay for Success. Under PFS, government contracts for specific performance outcomes and requires that a program’s impact be independently evaluated. Contract provisions can also require service providers and stakeholders to regularly review performance data.
  • #29: The third area in which we evaluated GPRAMA’s impact was in aligning individual and agency performance with results. GAO’s prior work on performance management has emphasized the importance of practices that create a clear linkage – or “line of sight” – between individual performance and organizational success (GAO-03-488). These practices include: Aligning individual performance expectations with organizational goals, and Making meaningful distinctions in performance. At the organizational level, aligning performance with results involves setting meaningful goals for performance and measuring progress against these goals. The GPRAMA requirements listed on the slide support this alignment, by emphasizing goal setting, accountability for goals, and review of goal progress.
  • #30: Goal Leader – For our July 2014 report on the role of the agency priority goal leader, we interviewed 46 goal leaders. Most of them said that the goal leader designation had positive effects, including that it provided accountability for goal achievement. Data-driven reviews – As part of our July 2015 report on data-driven reviews, we surveyed performance improvement officers at the 23 agencies that have APGs. Of the 22 agencies that reported holding regular, in-person data-driven reviews, 21 reported that the reviews have had a positive effect on their agency’s ability to hold goal leaders and other officials accountable for progress toward goals and milestones. Strategic reviews – Similar to data-driven reviews, these reviews can promote accountability for results. In our July 2015 report, in which we identified and illustrated practices that facilitate effective strategic reviews, we reported that accountability for results is one of the key features of the reviews. Leaders should hold strategic objective leaders and other responsible managers accountable for managing progress on strategic objectives.
  • #31: Our work has found problems with agencies’ efforts to link performance to results. One of the most well-known of these problems was illustrated in the VA scandal in which veterans were left waiting for or never received care, while schedulers were pressured to use practices to make wait time data appear more favorable than it actually was. The VA’s goal of providing timely care was reflected both in 1) performance contracts for senior leaders, and 2) the agency’s annual budget submissions and performance and accountability reports. On the surface, this is a good thing! But… the wait time data on which goal progress was measured was not reliable. VA staff had used gaming strategies to manipulate it to appear more favorable. This manipulation may have resulted in increased wait times and delays in providing care.
  • #32: Performance appraisal systems, including individuals’ performance plans, can promote alignment and accountability. Goal leader performance plans – we reviewed agency priority goal leaders’ and deputy goal leaders’ performance plans as part of our July 2014 report on the role of the goal leader. We reviewed plans for leaders/deputies for the 47 APGs in our sample (just about half of all APGs for 2012/2013): 32 goal leader plans and 35 deputy goal leader plans. Very few plans referenced the APGs. Only one goal leader plan and one deputy goal leader plan linked performance standards to goal outcomes. Going back to key practices associated with line of sight, this is a failure to align individual performance expectations with organizational goals. Senior Executive Ratings: Our January 2015 report on SES ratings looked at SES performance plans and ratings distributions among agencies. The 5 agencies we studied all linked performance plans with organizational goals, BUT we found wide variation in ratings distributions. For example, DOD rated ~30% of its SES employees at the highest rating level; DOJ rated ~74% of its SES at this level. Going back to key practices associated with line of sight, this raises questions about whether agencies are making meaningful distinctions in performance. We made recommendations related to each of these findings (see source reports).
  • #33: Agencies Have Long-standing Difficulties in Measuring Performance of Selected Program Types A critical element in an organization’s efforts to manage for results is its ability to set meaningful goals and to measure progress toward these goals. GPRAMA reinforces the need to set meaningful goals by directing agencies to publish a balanced set of performance measures across program areas. However, measuring the performance of different program types—such as grants, regulations, and tax expenditures—is a significant and long-standing government-wide challenge and one we have addressed in our previous work. The table on this slide illustrates some of the examples from our work over the past two years in which we identified agency problems measuring the performance of different program types. As you can see, the related issue areas range from measuring the effectiveness of grants targeting child abuse to food safety issues. More details are in the capping report.
  • #34: A performance measurement problem we found at several federal programs was in measuring customer service. Our October 2014 report on customer service examined customer service standards of six federal programs (at five agencies). We compared these programs’ standards to key elements of customer service standards (identified based on our review of GPRAMA and executive orders). We found that three of the six were not effectively measuring customer service. Specifically, these programs’ standards did not: - Include targets or goals for performance - Include performance measures For example, we reported that the National Park Service did not have performance goals or measures directly linked to its standards. Because of this, the agency is unable to determine the extent to which the standards are being met agency-wide or to identify strategies to close performance gaps.
  • #36: Program inventories have the potential to improve the transparency of performance information, but we identified issues that affect their usefulness: For example, we found that some inventories did not identify a program’s contribution to the agency’s goals. The effectiveness of the inventories could also be improved by their being presented in a more dynamic, web-based format.