Proceedings of Global Roundtable on Government Performance Management
*Secretary, Performance Management, Cabinet Secretariat, Government of India
INDIA
Performance Monitoring and Evaluation
System (PMES)
Prajapati Trivedi*
1. Introduction
To improve the performance of any organisation we need a multidimensional
effort. Experts believe that the following three systems are necessary for
improving performance of any organisation: (a) Performance Information
System, (b) Performance Evaluation System, and (c) Performance Incentive
System.
A performance information system ensures that appropriate information, in a
useful format, is available in a timely manner to stakeholders. A performance
evaluation system is meant to convert, distill and arrange this information
in a format that allows stakeholders to assess the true effectiveness of the
organisation. Finally, no matter how sophisticated the information system
and how accurate the evaluation system, performance of any organisation
can improve in a sustainable manner only if it has a performance incentive
system. A performance incentive system links the performance of the
organisation to the welfare of its employees. This allows the employees to
achieve organisational objectives in their own self-interest.
These three sub-systems are as relevant for the public sector as they are
for the private sector. Within the public sector, these systems are equally
important for Government Departments and state-owned enterprises
or public enterprises. While the focus of this paper is on the performance
management of Government Departments, occasional references and
comparisons will be made to the public enterprise sector as well.
While a truly effective Government Performance Management (GPM) system must include all three sub-systems, most countries tend to focus on only one or two of them. Even when countries take actions covering all three sub-systems, these sub-systems are often not adequately developed or organically connected to each other to yield the desired results.
This was also the case in India till September 2009. There was a plethora of actions and policies in all three areas, but they remained sporadic and fragmented efforts. In the area of performance information, the following
initiatives come to mind immediately: the enactment of the Right to
Information (RTI), publication of annual reports by departments, suo moto
disclosure on departmental websites, creation of an independent Statistical
Commission to bolster an already robust statistical tradition, reports of the
Planning Commission, certified accounts of Government Departments by
Controller General of Accounts, audit reports by Comptroller and Auditor
General, free media and its reports on departments, outcome budgets
and, finally, reports of the departmental standing committees of the
Indian Parliament. One could say that there was an overwhelming amount of information available on Government Departments. Similarly, all the above sources of information provided multiple, though often conflicting, narratives on the performance evaluation of Government Departments.
Similarly, a proposal for a performance incentive for Central Government
employees has been around ever since the Fourth Pay Commission
recommended introducing a performance-related incentive scheme (PRIS) for Central Government employees, a recommendation that was accepted by the Government of India in 1987. This recommendation for PRIS was reiterated by the Fifth and Sixth Pay Commissions and, both times, accepted by the then
Governments in power. As of 2009, however, very little was done to implement a performance-related incentive scheme in the Central Government.
At the end of 2008, two major reports provided the impetus for action on
this front. The 10th Report of the Second Administrative Reforms Commission (2nd ARC) argued for the introduction of a Performance Management System in general and Performance Agreements in particular. The Sixth Pay Commission, as mentioned earlier, submitted its report in 2008 urging the introduction of a Performance Related Incentive Scheme (PRIS).
After the election in 2009, the new Government decided to take action on
both reports. Through the President’s Address to the Parliament in June
2009, the new Government made a commitment to: 'Establish mechanisms
for performance monitoring and performance evaluation in government on
a regular basis'.
Pursuant to the above commitment made in the President’s address to both
Houses of the Parliament on June 4, 2009, the Prime Minister approved the
outline of the Performance Monitoring and Evaluation System (PMES) for
Government Departments on September 11, 2009. With the introduction of
PMES, a concerted effort was made to refine and bring together the three
sub-systems. Before elaborating the details of PMES in the subsequent
sections, it is worth outlining the key features of the System. The essence of
PMES is as follows.
According to PMES, at the beginning of each financial year, with the approval of the Minister concerned, each Department is required to prepare a Results-Framework Document (RFD). The RFD includes the priorities set out by the Ministry concerned, the agenda as spelt out in the manifesto, if any, the President's Address, and announcements/agenda as spelt out by the Government from time to time. The Minister in charge is expected to decide the inter se priority among the departmental objectives.
After six months, the achievements of each Ministry/Department are reviewed by the High Power Committee on Government Performance, chaired by the Cabinet Secretary, and the goals are reset, taking into account the priorities of the Government at that point in time. This enables the Government to factor in unforeseen circumstances such as drought conditions, natural calamities or epidemics.
At the end of the year, all Ministries/Departments review and prepare a
report listing the achievements of their Ministry/Department against the
agreed results in the prescribed formats. This report is finalised by the 1st of May each year and submitted for approval to the High Power Committee, before the results are forwarded to the Prime Minister.
2. Origin of Performance Monitoring and
Evaluation System (PMES)
The immediate origins of PMES can be traced to the 10th Report of the Second Administrative Reforms Commission, finalised in 2008. In Chapter 11 (see Box 1), the Report goes on to say:
'Performance agreement is the most common accountability
mechanism in most countries that have reformed their public
administration systems. This has been done in many forms - from
explicit contracts to less formal negotiated agreements to more
generally applicable principles. At the core of such agreements are
the objectives to be achieved, the resources provided to achieve
them, the accountability and control measures, and the autonomy
and flexibilities that the civil servants will be given'.
The Prime Minister’s order of September 11, 2009, mandating PMES was
based on this basic recommendation. However, the Government of India
preferred to use the term Results-Framework Document (RFD) rather than
Performance Agreement, which is the commonly used generic term for such
policy instruments. Indeed, the RFD is the centrepiece of PMES.
Box 1
Excerpts from the 10th Report of the Second Administrative Reforms Commission
(Chapter 11 on Performance Management), November 2008

14. Performance Agreements
1. Performance agreement is the most common accountability
mechanism in most countries that have reformed their public administration
systems. This has been done in many forms - from explicit contracts to less
formal negotiated agreements to more generally applicable principles. At
the core of such agreements are the objectives to be achieved, the resources
provided to achieve them, the accountability and control measures, and the
autonomy and flexibilities that the civil servants will be given.
2. In New Zealand, for example, the Public Finance Act of 1989 provided for a performance agreement to be signed between the chief executive and the concerned minister every year. The performance agreement
describes the key result areas that
require the personal attention of the
chief executive. The expected results
are expressed in verifiable terms, and
include output-related tasks. The chief
executive’s performance is assessed
every year with reference to the
performance agreement. The system
provides for bonuses to be earned for
good performance and removal for poor performance. The assessment is
done by a third party - the State Services Commission. Due consideration
is given to the views of the departmental Minister. A written performance
appraisal is prepared. The chief executive concerned is given an opportunity to comment, and his/her comments form part of the appraisal.
3. The Centres de Responsabilité in France is another example. Since 1990, many State services at both central and devolved levels have been established as Responsibility Centres in France. A contract with their Ministry gives the Directors greater management flexibility in operational matters in exchange for a commitment to achieve agreed objectives. It also stipulates a method for evaluating results. Contracts, negotiated case by case, are for three years.
4. Reforms in these countries are instructive in the way accountabilities were
clarified as a necessary first step. The important part of this clarifying
process was that it was done by law. As a result of legal clarification of
accountabilities, the civil servant in charge of a department became directly
accountable to the departmental Minister through the annual performance
agreement that was defined in advance and used as a benchmark for
measuring end-of-the-period performance. In India, a provision in the
proposed Public Services Law could be incorporated, specifying that the heads of the line departments, or of the executive agencies whenever they are set up, should sign annual performance agreements with the departmental Minister.
5. The performance agreements should be signed between the
departmental Minister and the Secretary of the Ministry as also between
the departmental Minister and heads of Department, well before the
financial year. The annual performance agreement should provide
physical and verifiable details of the work to be done by the Secretary/
Head of the Department during the financial year. The performance of
the Secretary/Head of the Department should be assessed by a third
party – say, the Central Public Services Authority with reference to the
annual performance agreement. The details of the annual performance
agreements and the results of the assessment by the third party should
be provided to the legislature as a part of the Performance Budget/
Outcome Budget.
This recommendation of the Second Administrative Reforms Commission (2nd ARC) was, in turn, building on the recommendation of the L. K. Jha Commission on Economic Administration Reforms (1982). The L. K. Jha
Commission had recommended the concept of Action Plans for Government
Departments. The Government of India accepted this recommendation and
implemented it for a few years. Action Plans were found to be ineffective as
they suffered from two fatal flaws. The actions listed were not prioritised and
there was no agreement on how to measure deviations from targets. As this
paper will reveal later, the Performance Monitoring and Evaluation System
(PMES) overcame these flaws in the Results-Framework Documents (RFDs)
that replaced the instrument of Action Plans.
The real inspiration for the recommendations of the 2nd ARC regarding Performance Agreements comes from the 1984 Arjun Sengupta report on Public Enterprises, which recommended the Memorandum of Understanding (MOU) for public enterprises. In concept and design, the MOU and the RFD are mirror images of each other, as will be discussed shortly.
3. Attributes of the Performance Monitoring and
Evaluation System (PMES)
This is a system to both 'evaluate' and 'monitor' the performance of
Government Departments. Evaluation involves comparing the actual
achievements of a department against the annual targets at the end
of the year. In doing so, an evaluation exercise judges the ability of the
department to deliver results on a scale ranging from excellent to poor.
Monitoring involves keeping a tab on the progress made by departments
towards achieving their annual targets during the year. So while the focus of
‘evaluation’ is on achieving the ultimate ‘ends', the focus of ‘monitoring’ is on
‘means'. They are complements to each other and not substitutes.
To be even more accurate, PMES is not merely a ‘performance evaluation’ exercise; rather, it is a ‘performance management’ exercise. The former
becomes the latter when accountability is assigned to a person for the results
of the entity managed by that person. In the absence of consequences, an
evaluation exercise remains an academic exercise. This explains why a large
amount of effort on Monitoring & Evaluation (M&E) does not necessarily
translate into either results or accountability.
Second, PMES takes a comprehensive view of departmental performance by
measuring performance of all schemes and projects (iconic and non-iconic)
and all relevant aspects of expected departmental deliverables such as:
financial, physical, quantitative, qualitative, static efficiency (short-run) and
dynamic efficiency (long-run). As a result of this comprehensive evaluation covering all aspects of citizens' welfare, this system provides a unified and single view of departmental performance.
Third, by focusing on areas that are within the control of the department,
PMES also ensures ‘fairness’ and, hence, high levels of motivation for
departmental managers.
These attributes will be discussed in detail in the subsequent sections of this paper.
4. How does PMES work?
The working of the PMES can be divided into the following three distinct
stages of the fiscal year:
a. Beginning of the Year (by April 1): Design of the Results-Framework Document
b. During the Year (after six months - October 1): Monitor progress against agreed targets
c. End of the Year (March 31): Evaluate performance against agreed targets
1. Beginning of Year - Prepare RFD (April 1)
2. During the Year - Monitor Progress (October 1)
3. End of Year - Evaluate Performance (June 1)
4.1 Beginning of the Year (by April 1): Design of Results-Framework Document
As mentioned earlier, at the beginning of each financial year, with the
approval of the minister concerned, each department prepares a Results-
Framework Document (RFD) consisting of the priorities set out by the
Minister, the agenda as spelt out in the party manifesto, if any, the President's Address, and announcements/agenda as spelt out by the Government from time to time. The Minister in charge approves the inter se priority among the departmental objectives.
To achieve results commensurate with the priorities listed in the Results-
Framework Document, the Minister approves the proposed activities
and schemes for the ministry/department. The Minister also approves
the corresponding success indicators (Key Result Indicators - KRIs or Key
Performance Indicators -KPIs) and time-bound targets to measure progress
in achieving these objectives.
The Results-Framework Document (RFD) prepared by each department
seeks to address three basic questions:
a. What are the department’s main objectives for the year?
b. What actions are proposed to achieve these objectives?
c. How does one determine the progress made in implementing these actions?
RFD is simply a codification of answers to these questions in a uniform and
meaningful format. All RFD documents consist of the six sections depicted
in Figure 2:
Figure 1: How does RFD work? (The Process)
Section 1: Ministry's Vision, Mission, Objectives and Functions
Section 2: Inter se priorities among key objectives, success indicators and targets
Section 3: Trend values of the success indicators
Section 4: Description and definition of success indicators and proposed measurement methodology
Section 5: Specific performance requirements from other departments that are critical for delivering agreed results
Section 6: Outcome/Impact of activities of Department/Ministry
Figure 2: Six Sections of the Results-Framework Document (RFD)
A typical example of an actual RFD is enclosed at Annex D. In what follows we will briefly describe each of the six sections of the RFD.
Section 1: Ministry’s Vision, Mission, Objectives and Functions
This section provides the context and the background for the Results-
Framework Document. Creating a vision and mission for a department is a
significant enterprise. Ideally, vision and mission should be a by-product of
a strategic planning exercise undertaken by the department. Both concepts
are interrelated and much has been written about them in management
literature.
A vision is an idealised state for the Ministry/Department. It is the big
picture of what the leadership wants the Ministry/Department to look
like in the future. Vision is a long-term statement and typically generic
and grand. Therefore a vision statement does not change from year to
year unless the Ministry/Department is dramatically restructured and is
expected to undertake very different tasks in the future. Vision should
never carry the ‘how’ part since the ‘how’ part of the vision may keep on
changing with time.
The Ministry’s/Department’s mission is the nuts and bolts of the vision.
Mission is the ‘who, what and why’ of the Ministry's/Department's
31
ProceedingsofGlobalRoundtableon
GovernmentPerformanceManagement
existence. The vision represents the big picture and the mission represents
the necessary work.
Objectives represent the developmental requirements to be achieved by the department in a particular sector through a selected set of policies and programmes over a specific period of time (short/medium/long). For example, the objectives of the Ministry of Health & Family Welfare could include: (a) reducing the rate of infant mortality for children below five years; and (b) reducing the rate of maternal death by 30% by the end of the development plan.
Objectives could be of two types: (a) Outcome Objectives specify ends
to achieve, and (b) Process Objectives specify the means to achieve the
objectives. As far as possible, the department should focus on Outcome
Objectives.
Objectives should be directly related to attainment and support of the
relevant national objectives stated in the relevant Five Year Plan, National
Flagship Schemes, Outcome Budget and relevant sector and departmental
priorities and strategies, President’s Address, the manifesto, and
announcement/agenda as spelt out by the Government from time to time.
Objectives should be linked and derived from the departmental vision and
mission statements and should remain stable over time. Objectives cannot be added or deleted without a rigorous evidence-based justification. In particular, a department should not delete an objective simply because it is hard to achieve. Nor can it add an objective simply because it is easy to achieve. There must be a logical connection between vision, mission and objectives.
The functions of the department should also be listed in this section. These functions should be consistent with the Allocation of Business Rules for the Department/Ministry; unless those Rules change, the functions cannot be changed in the RFD. This section is supposed to reflect the legal/administrative reality as it exists, and not a wish list.
Section 2: Inter se priorities among key objectives, success indicators and targets
This section is the heart of the RFD. Table 1 contains the key elements of
Section 2 and in what follows we describe each column of this Table.
Column 1: Select Key Departmental Objectives
From the list of all objectives, departments are expected to select those key objectives that would be the focus of the current RFD. It is important to be selective and focus on the most important and relevant objectives only.
Table 1: Extract of Section 2 from the 2013-14 RFD of the Department of Agriculture and Cooperation

Section 2: Inter se Priorities among Key Objectives, Success Indicators and Targets. The columns are: Objective (Column 1), Weight (Column 2), Action (Column 3), Success Indicator (Column 4), Unit (Column 5), Weight (Column 6) and Target/Criteria Value (Column 7), with criteria values given for Excellent (100%), Very Good (90%), Good (80%), Fair (70%) and Poor (60%).

Objective (1): Increasing crop production and productivity, thereby ensuring food security and enhanced income levels for farmers [weight 15.50]
- Action (1.1): Preparation of tentative allocations & approval of State Action Plans for 2013-14
  Success Indicator (1.1.1): Approval by 31.05.2013 [weight 3.50; targets 31/05/2013, 05/06/2013, 10/06/2013, 15/06/2013, 20/06/2013]
- Action (1.2): Release of funds to States/Institutions
  Success Indicator (1.2.1): % of funds (R.E.) released by 31.03.2014 [weight 3.00; targets 95, 85, 75, 65, 60]
- Action (1.3): Additional foodgrain production
  Success Indicator (1.3.1): Additional production of 5 million tons over year (5-year moving average) [weight 3.00; targets 5.0, 4.8, 4.6, 4.4, 4.2]
- Action (1.4): Area expansion of pulses by promoting pulse cultivation in rice fallows, as well as intercrops and summer crops
  Success Indicator (1.4.1): Increase in area by 1.5 lakh ha over 5-year moving average [weight 2.00; targets 1.5, 1.3, 1.25, 1.10, 1.00]
- Action (1.5): NFSM Impact Evaluation Studies
  Success Indicator (1.5.1): Completion of studies by CDDs and submission of report [weight 2.00; targets 30/11/2013, 30/12/2013, 01/01/2014, 28/02/2014, 31/03/2014]
- Action (1.6): Monitoring and review of BGREI programme in all 7 States
  Success Indicator (1.6.1): State visits twice a year [weight 2.00; targets 14, 12, 10, 08, 06]

Objective (2): Incentivising states to enhance public investment in agriculture & allied sectors to sustain and maintain capital formation and agriculture infrastructure [weight 9.00]
- Action (2.1): Incentivise states to make additional allocation in agriculture & allied sectors
  Success Indicator (2.1.1): Increase in percentage points of States' plan expenditure in agriculture and allied sectors as per Point No. 3 of Annexure-II of RKVY Guidelines [unit: % points; weight 4.00; targets 0.25, 0.20, 0.15, 0.10, 0.05]
The objectives are derived from the Five Year Plan, departmental strategies, the party manifesto, and the President's Address to the Parliament. As depicted in Figure 3, this is required to ensure vertical alignment between the National Vision as articulated in the National Five Year Plan and departmental objectives.
The objective of the departmental strategy is to outline the path for reaching
the vision. It usually covers five years and needs to be updated as the
circumstances change. Ideally, one should have the departmental strategy in
place before preparing an RFD. However, the RFD itself can be used to motivate departments to prepare a strategy. This is what was done in our case in India.
Figure 3: Vertical Alignment of Five Year Plans with RFDs (Vision, Long-Term Strategy, Five-Year Development Plan, Results-Framework Document, Objectives, Policies, Projects/Schemes)
Column 2: Assign Relative Weights to Objectives
Objectives in the RFD are required to be ranked in a descending order of
priority according to the degree of significance, and specific weights should
be attached to these objectives. In the ultimate analysis, the concerned
minister has the prerogative to decide the inter se priorities among
departmental objectives and all weights must add up to 100. Clearly, the
process starts with the departmental secretary suggesting a set of priorities
in her best technical judgment. However, the Minister has the final word as
she represents the will of the people in our form of government.
The logic for attaching specific weights, all adding up to 100%, is straightforward. For instance, suppose a department has 15 objectives and, at the end of the year, the secretary of the department goes to the Minister and says 'I have achieved 12 out of the 15 objectives'. How is the Minister to judge the secretary's performance? The answer depends on which three objectives were missed. If the important core objectives of the department were among them, then this does not reflect good performance.
In fact, any evaluation system that does not prioritise objectives is a non-starter. We know that not all aspects of departmental operations are equally important. When we have a shared understanding of departmental priorities, it creates a much greater chance of getting the important things done.
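To make the arithmetic concrete, here is a minimal sketch in Python (the weights and the split between core and minor objectives are hypothetical, chosen only to illustrate the point) of how a weighted score separates "many objectives achieved" from "the important objectives achieved":

```python
# Hypothetical illustration: a raw count of "objectives achieved" can look
# good while the weighted score, which reflects inter se priorities, is poor.

def weighted_score(objectives):
    """objectives: list of (weight, achieved) pairs; weights sum to 100."""
    total = sum(w for w, _ in objectives)
    assert abs(total - 100) < 1e-6, "inter se weights must add up to 100"
    return sum(w for w, achieved in objectives if achieved)

# 3 core objectives (weight 15 each) missed; 12 minor ones (sharing 55) achieved.
core = [(15, False)] * 3
minor = [(55 / 12, True)] * 12

print(round(weighted_score(core + minor)))  # 55 -- despite "12 out of 15"
```

On an unweighted count the secretary reports 80% success; the weighted score of 55 out of 100 tells the Minister what actually happened.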
Column 3: Identify Means (Actions) for Achieving Departmental Objectives
For each objective, the department must specify the required policies,
programmes, schemes and projects. These also have to be approved by the
concerned minister. Often, an objective has one or more policies associated
with it. An objective represents the desired 'end' and associated policies,
programmes and projects represent the desired 'means'. The latter are listed as
'actions' under each objective.
Column 4: Define Success Indicators
For each 'action' specified in Column 3, the department must specify one or
more 'success indicators'. They are also known as 'Key Performance Indicators
(KPIs)'or 'Key Result Indicators (KRIs)'. A success indicator provides a means to
evaluate progress in achieving the policy, programme, scheme and project
objectives/targets. Sometimes more than one success indicator may be
required to tell the entire story. If there are multiple actions associated with
an objective, the weight assigned to a particular objective should be spread
across the relevant success indicators.
The choice of appropriate success indicators is as important as the choice of
objectives of the department. It is the success indicators that are most useful
at the operational level. They provide a clear signal as to what is expected
from the department.
Success indicators are important management tools for driving improvements
in departmental performance. They should represent the main business of
the organisation and should also aid accountability. Success indicators should
consider both qualitative and quantitative aspects of departmental performance.
In selecting success indicators, any duplication should be avoided. For example, the usual chain for delivering results and performance is depicted in Figure 4. An example of this results chain is depicted in Figure 5.

Figure 4: Typical Results Chain (Results-Based Management)
- Goals (Impacts): long-term, widespread improvement in society
- Outcomes: intermediate effects of outputs on clients
- Outputs: products and services produced
- Activities: tasks personnel undertake to transform inputs into outputs
- Inputs: financial, human and material resources

Figure 5: An Example of a Results Chain (Results-Based Management: Adult Literacy)
- Goals (Impacts): higher income levels; increased access to higher-skill jobs
- Outcomes: increased literacy skills; more employment opportunities
- Outputs: number of adults completing literacy courses
- Activities: literacy training courses
- Inputs: facilities, trainers, materials

If we use an Outcome (increased literacy) as a success indicator, then it would be duplicative to also use inputs and activities as additional success indicators. Ideally, one should have success indicators that measure Outcomes and Impacts. However, sometimes, for lack of data, one is able to measure only activities or outputs. The common definitions of these terms are as follows:

1. Inputs: The financial, human and material resources used for the development intervention.
2. Activity: Actions taken or work performed through which inputs, such as funds, technical assistance and other types of resources, are mobilised to produce specific outputs.
3. Outputs: The products, capital goods and services that result from a development intervention; may also include changes resulting from the intervention which are relevant to the achievement of outcomes. Sometimes, 'Outputs' are divided into two sub-categories - internal and external outputs. 'Internal' outputs consist of those outputs over which managers have full administrative control. For example, printing a brochure is considered an internal output, as it involves spending budgeted funds on hiring a printer and giving orders to print a given number of brochures. All actions required to print a brochure are fully within the manager's control and, hence, this action is considered an 'internal' output. However, having these brochures picked up by the targeted groups and, consequently, making the desired impact on the target audience would be an example of an external output. Thus, actions that exert influence beyond the boundaries of an organisation are termed 'external' outputs.
4. Outcome: The likely or achieved short-term and medium-term effects/impact of an intervention's Outputs.
Departments are required to classify success indicators (SIs) into the following categories: (1) Input; (2) Activity; (3) Internal Output; (4) External Output; (5) Outcome; (6) Measures Qualitative Aspects.

While the categories numbered 1-5 are mutually exclusive, a Success Indicator can also measure qualitative aspects of performance. As can be seen from Figure 6, management begins where we do not have full control. Up until that point, we consider it to be the realm of administration.

Figure 6: Administration versus Management (Selecting Success Indicators)
- Goals (Impacts): Programme Evaluation
- Outcomes and External Outputs: Results Management
- Internal Outputs, Activities and Inputs: Administration
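As an illustrative sketch only, the hypothetical indicators below (loosely based on the brochure and literacy examples given earlier) show how such a six-way classification might be recorded:

```python
# Hypothetical sketch: tagging success indicators with the six RFD
# categories described above. The indicator names are invented examples.

CATEGORIES = ("Input", "Activity", "Internal Output",
              "External Output", "Outcome", "Measures Qualitative Aspects")

indicators = {
    "Budgeted funds utilised (%)": "Input",
    "Literacy training courses conducted": "Activity",
    "Brochures printed": "Internal Output",
    "Brochures picked up by target groups": "External Output",
    "Increase in literacy rate": "Outcome",
}

# Every indicator must carry exactly one of the recognised categories.
assert set(indicators.values()) <= set(CATEGORIES)
```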
Column 5: Assign Relative Weights to Success Indicators
If we have more than one action associated with an objective, each action
should have one or more success indicators to measure progress in
implementing these actions. In this case we will need to split the weight
for the objective among various success indicators associated with the
objective. The rationale for using relative weights has already been given in
the context of the relative weights for objectives. The same logic applies in
this context as well.
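A quick sketch of this split, using the weights from Table 1 for the first objective (15.50 distributed across six success indicators), shows the consistency check a department could apply:

```python
# Check, using the numbers from Table 1, that the weight of an objective
# equals the sum of the weights assigned to its success indicators.

objective_weight = 15.50      # Objective (1): increasing crop production
indicator_weights = [3.50, 3.00, 3.00, 2.00, 2.00, 2.00]  # SIs 1.1.1 - 1.6.1

assert abs(sum(indicator_weights) - objective_weight) < 1e-9
print("weights consistent:", sum(indicator_weights))  # 15.5
```

The same check, applied across all objectives, verifies that the RFD as a whole adds up to 100.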
Column 6: Set Targets for Success Indicators
The next step in designing an RFD is to choose a target for each success
indicator. Targets are tools for driving performance improvements. Target
levels should, therefore, contain an element of stretch and ambition.
However, they must also be achievable. It is possible that targets for radical
improvement may generate a level of discomfort associated with change,
but excessively demanding or unrealistic targets may have a longer-term
demoralising effect. The target should be presented as per the five-point scale given below:

Excellent: 100%; Very Good: 90%; Good: 80%; Fair: 70%; Poor: 60%
The logic for using a five-point scale can be illustrated with the following
example. Let us say a Minister (the principal) gives the Secretary (the agent)
a target to build 7,000 km of road. However, at the end of the year, if the Secretary reports that only 6,850 km of roads could be built, then how is the Minister to evaluate the Secretary's performance? The reality is that, under
the present circumstances, a lot would depend on the relationship between
the Minister and the Secretary. If the Minister likes the Secretary, he is likely
to overlook this shortfall in achievement. If, however, the Minister is unhappy
with the Secretary for some reason, then the Minister is likely to make it
a big issue. This potential for subjectivity is the bane of most problems in
the government. The five-point scale addresses this problem effectively. By
having an ex-ante agreement on the scale, the performance evaluation at
the end is automatic and fair. Incidentally, it could be a five-, seven- or even a nine-point scale. If an evaluation system does not use a scale concept for ex-ante targets, it is a non-starter.
It is expected that, in general, budgetary targets would be placed at 90%
(Very Good) column. There are only two exceptions: (a) When the budget
requires a very precise quantity to be delivered. For example, if the budget
provides money for one bridge to be built, clearly we cannot expect the
department to build two bridges or 1.25 of a bridge.(b) When there is a legal
mandate for a certain target and any deviation may be considered a legal
breach. In these cases, and only in these cases, the targets can be placed
Excellent Very Good Good Fair Poor
100 % 90% 80% 70 % 60 %
ProceedingsofGlobalRoundtableon
GovernmentPerformanceManagement
under 100%. For any performance below 60%, the department would get a
score of 0 in the relevant success indicator.
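To make the mechanics of the scale concrete, here is a sketch of the scoring arithmetic. Only the 7000 KMs (Very Good) figure comes from the example above; the remaining column values are our own illustration, and achievements falling between two columns are interpolated linearly, the convention used in the end-of-year evaluation example later in the paper.

```python
# Illustrative scoring on the RFD five-point scale. Targets descend from
# Excellent to Poor; an achievement between two columns is interpolated
# linearly, anything at or above Excellent scores 100, and anything below
# the Poor column scores 0. (Column values other than 7000 are invented.)

SCALE = [100, 90, 80, 70, 60]             # Excellent ... Poor, in percent
TARGETS = [7500, 7000, 6500, 6000, 5500]  # KMs of road for each column

def score(achievement):
    if achievement >= TARGETS[0]:
        return 100.0                      # at or above the Excellent target
    if achievement < TARGETS[-1]:
        return 0.0                        # below the Poor column scores zero
    for i in range(4):                    # find the bracket and interpolate
        hi, lo = TARGETS[i], TARGETS[i + 1]
        if lo <= achievement <= hi:
            frac = (achievement - lo) / (hi - lo)
            return SCALE[i + 1] + frac * (SCALE[i] - SCALE[i + 1])

print(round(score(6850), 1))  # 87.0, between Very Good (90) and Good (80)
```

With the scale agreed ex ante, the 6850 KMs achievement maps mechanically to a score, leaving no room for the subjectivity described above.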
The RFD targets should be aligned with Plan priorities and be consistent
with the departmental budget as well as the outcome budget. A well-framed
RFD document should be able to account for the majority of the budget.
Towards this end, departments must ensure that all major schemes, relevant
mission-mode projects and the Prime Minister's flagship programmes are
reflected in the RFD.
Team targets

In some cases, the performance of a department is dependent on the
performance of one or more other departments in the government. For
example, to produce power, the Ministry of Power is dependent on the
performance of the following: (a) Ministry of Coal, (b) Ministry of
Railways, (c) Ministry of Environment and Forests, and (d) Ministry of
Heavy Industry (e.g. for power equipment from BHEL). Therefore, in order
to achieve the desired result, it is necessary to work as a team and not
as individuals. Hence the need for team targets for all five Departments
and Ministries.
For example, if the Planning Commission fixes 920 BU as the target for
power generation, then two consequences follow. First, the RFDs of all five
departments will have to include this as a 'team target'. Second, if this
'team target' is not achieved, all five departments will lose some points
at the time of evaluation of their RFDs. The relative loss of points will
depend on the weight of the team target in the respective RFDs. To
illustrate, let us imagine that the RFD for the Ministry of Coal has two
types of targets: one deals with coal production and the other with the
'team target for power generation'. They carry weights of 15% and 2%
respectively. Now, if the target of 920 BU for power generation is not
achieved, then even if the target for coal production is achieved, the
Ministry of Coal will still lose 2%. An actual example of a team target in
the 2013-14 RFD of the Ministry of Coal is reproduced in Table 2 below.

[Figure 7: Horizontal Alignment among Departments - the RFDs of
Departments 1, 2, 3, ... each derive their objectives and projects/schemes
from a shared Vision, Long-Term Strategy and Five-Year Development Plan.]

Table 2: Example of Team Target in the 2013-14 RFD of Ministry of Coal
(Section 2: Inter se Priorities among Key Objectives, Success Indicators
and Targets; target columns run Excellent 100% / Very Good 90% / Good 80% /
Fair 70% / Poor 60%)

Objective (29): GPS based tracking of transportation of coal [weight 1.00]
  Action (29.1): Installation of GPS by 31.03.2014
    SI (29.1.1): All companies have floated tenders for installing GPS
    Unit: No. of subsidiaries; weight 1.00
    Targets: 7 / 6 / 5 / 4 / 3

Objective (30): Joint responsibility for power generation [TEAM TARGET;
weight 5.00]
  Action (30.1): Give necessary support and clearance
    SI (30.1.1): Additional capacity installed
    Unit: MW; weight 3.00
    Targets: 18500 / 18000 / 17000 / 16000 / 15000
    SI (30.1.2): Total power generated
    Unit: BU; weight 2.00
    Targets: 1030 / 1000 / 950 / 920 / 900

* Efficient functioning of the RFD System [weight 3.00]
  Action: Timely submission of draft RFD 2014-15 for approval
    SI: On-time submission; unit Date; weight 2.00
    Targets: 05/03/2014 / 06/03/2014 / 07/03/2014 / 08/03/2014 / 11/03/2014
  Action: Timely submission of results for 2012-13
    SI: On-time submission; unit Date; weight 1.00
    Targets: 01/05/2013 / 02/05/2013 / 03/05/2013 / 06/05/2013 / 07/05/2013

* Transparency/Service delivery of Ministry/Department [weight 3.00]
  Action: Independent audit of implementation of Citizens'/Clients' Charter
  (CCC)
    SI: % of implementation; unit %; weight 2.00
    Targets: 100 / 95 / 90 / 85 / 80
  Action: Independent audit of implementation of Public Grievance Redressal
  System
    SI: % of implementation; unit %; weight 1.00
    Targets: 100 / 95 / 90 / 85 / 80

* Administrative reforms [weight 6.00]
  Action: Implement mitigating strategies for reducing potential risk of
  corruption
    SI: % of implementation; unit %; weight 1.00
    Targets: 100 / 95 / 90 / 85 / 80
  Action: Implement ISO 9001 as per the approved action plan
    SI: % of implementation; unit %; weight 2.00
    Targets: 100 / 95 / 90 / 85 / 80
  Action: Implement Innovation Action Plan (IAP)
    SI: % of milestones achieved; unit %; weight 2.00
    Targets: 100 / 95 / 90 / 85 / 80
  Action: Identification of core and non-core activities of the Ministry/
  Department as per 2nd ARC recommendations
    SI: Timely submission; unit Date; weight 1.00
    Targets: 27/01/2014 / 28/01/2014 / 29/01/2014 / 30/01/2014 / 31/01/2014
The logic is that all team members must ensure (like relay race runners) that
the entire chain works efficiently. To borrow an analogy from cricket, there
is no consolation in a member of the team scoring a double century if the
team ends up losing the match. That is, the departments included for team
targets will be responsible for achieving the targets jointly. This is one of the
ways in which RFDs try to ensure horizontal alignment and break away from
the silo mentality.
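The evaluation consequence of a missed team target can be sketched in a few lines (a hypothetical illustration using the 15% and 2% weights from the Ministry of Coal example above; the function and variable names are ours):

```python
# Sketch: points lost when a team target is missed. Weights are expressed
# as a share of the overall RFD composite score (out of 100); a fully
# missed indicator gets a raw score of 0 on the five-point scale.

def points(weight_pct, raw_score):
    """Points contributed to the composite score by one indicator."""
    return weight_pct * raw_score / 100.0

coal_points = points(15, 100)  # own coal-production target fully achieved
team_points = points(2, 0)     # 920 BU power-generation team target missed

# The ministry forfeits the full 2 points carried by the team target,
# even though its own target was met:
print((15 - coal_points) + (2 - team_points))  # 2.0
```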
Section 3: Trend values of the success indicators

For every success indicator and the corresponding target, the RFD must
provide actual values for the past two years and projected values for two
years into the future, as given in Table 3.
Table 3: Extract of Section 3 from the 2013-14 RFD of Department of
Agriculture and Cooperation

(Each success-indicator row lists: Unit | Actual FY 11/12 | Actual FY 12/13
| Target FY 13/14 | Projected FY 14/15 | Projected FY 15/16.)

Objective (1): Increasing crop production and productivity, thereby
ensuring food security and enhanced income levels for farmers
  Action (1.1): Preparation of tentative allocations & approval of State
  Action Plans for 2013-14
    SI (1.1.1) Approval by 31.05.2013
      Date | 31/05/2011 | 18/05/2012 | 31/05/2013 | 31/05/2014 | 31/05/2015
  Action (1.2): Release of funds to States/Institutions
    SI (1.2.1) % of funds (R.E.) released by 31.03.2014
      % | 98 | - | 90 | 90 | 90
  Action (1.3): Additional foodgrain production
    SI (1.3.1) Additional production of 5 million tons over year (5-year
    moving average)
      Million Tons | .. | 12.59 | 5.0 | 5.2 | 5.4
  Action (1.4): Area expansion of pulses by promoting pulse cultivation in
  rice fallows, as well as intercrops and summer crops
    SI (1.4.1) Increase in area by 1.5 lakh ha. over 5-year moving average
      Area | .. | .. | 1.5 | 1.75 | 2.00
  Action (1.5): NFSM Impact Evaluation Studies
    SI (1.5.1) Completion of studies by CDDs and submission of report
      Date | .. | .. | 30/11/2013 | .. | ..
  Action (1.6): Monitoring and review of BGREI programme in all 7 States
    SI (1.6.1) State visits twice a year
      No. of visits | 10 | 12 | 14 | 14 | 15

Objective (2): Incentivising states to enhance public investment in
agriculture & allied sectors to sustain and maintain capital formation and
agriculture infrastructure
  Action (2.1): Incentivise states to make additional allocation in
  agriculture & allied sectors
    SI (2.1.1) Increase in percentage points of states' plan expenditure in
    agriculture and allied sectors as per Point No. 3 of Annexure-II of
    RKVY Guidelines
      % Points | 0.9 | - | 0.25 | 0.25 | 0.25

Section 4: Description and definition of success indicators and proposed
measurement methodology

The RFD contains a section with detailed definitions of the various success
indicators and the proposed measurement methodology. Wherever possible, the
rationale for using the proposed success indicators may be provided.
Abbreviations/acronyms of policies, programmes and schemes used may also be
elaborated in this section.
Section 5: Specific performance requirements from other departments that
are critical for delivering agreed results

This section should contain expectations from other departments that
impact the department's performance. These expectations should be
mentioned in quantifiable, specific and measurable terms.
Table 4: Extract of Section 5 from the 2013-14 RFD of Ministry of Coal

Organisation: Ministry of Environment and Forests (Central Government,
Ministry)
  Relevant Success Indicator (27.1.1): Holding quarterly meetings
  Requirement: MoEF is required to review the status of EC and FC proposals
  pending at their level for expediting the same. Similarly, the time taken
  in conveying approvals after they are recommended by EAC/FAC needs to be
  reviewed to avoid delays in communicating administrative approvals.
  Further, MoEF is required to give priority to online processing of EC and
  FC proposals. A flow chart of the steps involved in the EC process is
  enclosed as Annexure-IV.
  Justification: In the absence of EC and FC clearance, projects will not
  progress further.
  Quantification: All the proposals pending with MoEF.
  If the requirement is not met: Commencement of subsequent activities will
  not start, leading to delay in projects.

Organisation: Ministry of Law and Justice (Central Government, Ministry)
  Relevant Success Indicator (1.3.2.1): Vetting of notifications under
  sections 4(1), 7(1), 9(1) and 11(1) of the CBA (A&D) Act, 1957 will be
  done within the prescribed time on receipt of each
  Requirement: Ministry of Law and Justice is required to vet notifications
  within the prescribed time.
  Justification: Vetting by Ministry of Law and Justice is necessary before
  processing the cases further.
  Quantification: All the cases related to land acquisition.
  If the requirement is not met: Notification cannot be issued within the
  prescribed time without vetting.

The purpose of this section is to promote horizontal alignment among
departments and overcome the tendency of working in silos. It allows us to
know, ex ante, the requirements and expectations of departments from each
other. This section complements the concept of 'team targets' discussed
above.
Section 6: Outcome/Impact of activities of Department/Ministry

This section should contain the broad outcomes and the expected impact the
Department/Ministry has on national welfare. It should capture the very
purpose for which the Department/Ministry exists and the rationale for
undertaking the RFD exercise.

The Department's evaluation will be done against the targets mentioned in
Section 2 of the RFD. The whole point of Section 6 is to ensure that
Departments/Ministries serve the purpose for which they were created in the
first place.
Table 5: Extract of Section 6 from the 2013-14 RFD of Department of AIDS
Control

(Each row lists: Outcome/Impact | department(s)/ministry(ies) jointly
responsible for influencing this outcome/impact | Success Indicator | Unit
| FY 11/12 | FY 12/13 | FY 13/14 | FY 14/15 | FY 15/16.)

1. Survival of AIDS patients on ART | - | % of adults and children with HIV
   known to be on treatment at 24 months after initiation of antiretroviral
   therapy at select ART centres | % | NA | NA | NA | NA | NA
2. Reduction in estimated AIDS related deaths | - | Estimated number of
   annual AIDS related deaths | No. | 1,47,729 | NA | NA | NA | NA
3. Reduction in estimated new HIV infections | - | Estimated number of
   annual new HIV infections | No. | 1,16,456 | NA | NA | NA | NA
4. Improved Prevention of Parent to Child Transmission | Department of
   Health and Family Welfare | - | - | NA | NA | NA | NA
5. Improved prevention of AIDS in High Risk Groups (HRG) | - |
   Coverage of HRG through Targeted Interventions (TIs) - Female Sex
   Workers | % | 81 | NA | NA | NA | NA
   Coverage of HRG through TIs - Males having Sex with Males (MSM)
   (including Transgenders) | % | 64 | NA | NA | NA | NA
   Coverage of HRG through TIs - Injecting Drug Users (IDUs) | % | 80 | NA
   | NA | NA | NA
6. Improved health seeking behaviour of HRGs | - | % HRGs who received HIV
   test | % | 40 | NA | NA | NA | NA

*NA = Not Available

The required information under Section 6 should be entered in Table 5.
Column 1 of Table 5 lists the expected outcomes and impacts. It is possible
that these are also mentioned in other sections of the RFD; even then, they
should be mentioned here for clarity and ease of reference. For example,
the purpose of the Department of AIDS Control would be to 'control the
spread of AIDS'. Now it is possible that AIDS control may require
collaboration between several departments, such as Health and Family
Welfare, Information and Broadcasting, etc. In Column 2, all
Departments/Ministries jointly responsible for achieving national goals are
to be mentioned. In Column 3, the Department/Ministry is expected to
mention the success indicator(s) used to measure its outcome or impact. In
the case mentioned, the success indicator could be 'percentage of Indians
infected with AIDS'. Columns 5 to 9 give the trend values for the various
success indicators.
4. RFD Design Process

Once the RFD has been prepared and approved by the concerned minister, it
goes through the cycle depicted in Figure 8 and explained below:
Figure 8: Quality Assurance Process for RFD

  Minister approves RFD -> Departments send RFD to Cabinet Secretariat ->
  RFDs reviewed by PMD and ATF -> Departments incorporate PMD/ATF
  suggestions -> RFDs approved by HPC on Government Performance ->
  Departments place RFDs on departmental websites

Step 1: Minister approves the RFD
The process starts with the secretary preparing a draft RFD and proposing
it for the approval of the concerned minister. It is primarily the
responsibility of the minister to ensure that the RFD includes all the
important priorities of the government.

Step 2: Draft RFD sent to Cabinet Secretariat
The Performance Management Division (PMD), Cabinet Secretariat, examines
the drafts for quality and consistency with the RFD Guidelines. Critiques
of all drafts are prepared based on the RFD Evaluation Methodology (REM).
This methodology allows us to quantify the quality of an RFD based on
agreed criteria (Annex E).

Step 3: Review by PMD and Ad-Hoc Task Force (ATF)
The ATF consists of distinguished academics, former Secretaries to the
Government of India, former chiefs of large public enterprises and
private-sector domain experts (Annex F). This non-government body reviews
the critiques and vets the draft RFDs. The comments are conveyed to
departments during meetings with departmental secretaries.

Step 4: Departments incorporate ATF comments and revise RFD drafts
Based on the minutes of the meetings with ATF members, departments modify
their draft RFDs and resubmit them for approval by the High Power Committee
(HPC) on Government Performance, chaired by the Cabinet Secretary. For
details of the HPC, see Annex G.

Step 5: Approval by HPC on Government Performance
The HPC consists of Secretary Finance, Secretary Expenditure and Secretary
Planning. They are responsible for ensuring that all main targets in RFDs
are consistent with targets in the budget as well as the Five-Year Plan.
This is also an occasion to resolve any differences between the ATF and
departments.

Step 6: RFDs placed on departmental websites
RFDs approved by the HPC are placed on the respective departmental
websites. In addition, they are also placed on the PMD website
(www.performance.gov.in).

An RFD approved by the HPC for the year 2012-13, for the Department of
Agriculture and Cooperation, is enclosed at Annex D.
2. During the year (after six months - October 1): Monitor progress against
targets

After six months, the RFD as well as the achievements of each
Ministry/Department against the performance goals laid down may have to be
reviewed and the goals reset, taking into account the priorities at that
point of time. This enables the Government to factor in unforeseen
circumstances such as drought conditions, natural calamities or epidemics.
3. End of the year (March 31): Evaluation of performance against agreed
targets

At the end of the year, we look at the achievements of the government
departments, compare them with the targets, and determine the composite
score. Table 6 provides an example from the health sector. For simplicity,
we have taken only one objective to illustrate the evaluation methodology.
The Raw Score for Achievement in Column 7 of Table 6 is obtained by
comparing the achievement with the agreed target values. For example, the
achievement for the first success indicator (percentage increase in primary
health care centres) is 15%. This achievement lies between the 80% (Good)
and 70% (Fair) targets, and hence the raw score is 75%.

The Weighted Raw Score for Achievement in Column 8 is obtained by
multiplying the raw score (Column 7) by the relative weight (Column 4).
Thus, for the first success indicator, the weighted raw score is 75%
multiplied by 0.50, which gives 37.5%. Finally, the composite score is
calculated by adding up all the weighted raw scores (Column 8). In Table 6,
the composite score works out to 84.5%.
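The arithmetic just described can be sketched in a few lines (an illustration only; the weights and raw scores are those of Table 6, and the variable names are ours):

```python
# Composite score = sum over indicators of (weight x raw score).
weights    = [0.50, 0.30, 0.20]    # Column 4 of Table 6
raw_scores = [75.0, 90.0, 100.0]   # Column 7 of Table 6, in percent

weighted = [w * r for w, r in zip(weights, raw_scores)]
print([round(x, 1) for x in weighted])  # [37.5, 27.0, 20.0]
print(round(sum(weighted), 1))          # 84.5
```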
The composite score shows the degree to which the Government Department in
question was able to meet its objective. The fact that it got a score of
84.5% in our hypothetical example implies that the Department's performance
vis-a-vis this objective was rated 'Good' on the rating scale below (the
'Very Good' band begins at 86%).
Departmental Rating    Value of Composite Score
Excellent              100% - 96%
Very Good              95% - 86%
Good                   85% - 76%
Fair                   75% - 66%
Poor                   65% and below

Table 6: Example of Performance Evaluation at the End of the Year
Objective: Better Rural Health; Action: Improve Access to Primary Health
Care
(Target columns run Excellent 100% / Very Good 90% / Good 80% / Fair 70% /
Poor 60%.)

Criteria/Success Indicator | Unit | Weight | 100% | 90% | 80% | 70% | 60% |
Achievement | Raw Score | Weighted Raw Score
1. % increase in number of primary health care centres | % | 0.50 | 30 |
   25 | 20 | 10 | 5 | 15 | 75% | 37.5%
2. % increase in number of people with access to a primary health centre
   within 20 KMs | % | 0.30 | 20 | 18 | 16 | 14 | 12 | 18 | 90% | 27%
3. Number of hospitals with ISO 9000 certification by December 31, 2009 |
   No. | 0.20 | 500 | 450 | 400 | 300 | 250 | 600 | 100% | 20%
Composite Score = 84.5%

The methodology outlined above is universal in its application. Various
Government Departments will have a diverse set of objectives
and corresponding success indicators. Yet, at the end of the year every
Department will be able to compute its composite score for the past year.
This composite score will reflect the degree to which the Department was
able to achieve the promised results.
The actual achievement and the composite score for the sample RFD of the
Department of Agriculture and Cooperation given in Annex D are enclosed at
Annex H. Today, all Departments are also required to include the RFD and
the corresponding end-of-year results in their Annual Reports. These annual
reports of individual Departments are placed before the Parliament every
year.
The results for the year 2011-12 are summarised in the pie chart below:

Figure 9: Results for the year 2011-12

  Excellent (100% - 96%):     8%
  Very Good (95% - 86%):     37%
  Good      (85% - 76%):     28%
  Fair      (75% - 66%):     18%
  Poor      (65% and below):  9%

From the above it is clear that the system is stabilising, as the results
were distributed normally.

Of all the state governments, Kerala has taken the lead in completing two
full cycles of RFD and declaring its results widely through a Government
Order, as shown in Figure 10.

5. Why is PMES required?

Systems prior to the introduction of PMES suffered from several
limitations. The Government examined these limitations and designed PMES to
overcome them. Some examples of these limitations follow:
1. There is fragmentation of institutional responsibility for performance
management
Departments are required to report to multiple principals who often
have multiple objectives that are not always consistent with each
other. A Department could be reporting to the Ministry of Statistics and
Programme Implementation on important programmes and projects;
Department of Public Enterprises on the performance of PSUs under
it; Department of Expenditure on performance in relation to Outcome
Budgets; Planning Commission on plan targets; CAG regarding the
procedures, processes, and even performance; Cabinet Secretariat on
cross cutting issues and issues of national importance; minister in-charge
on his priorities; Standing Committee of the Parliament on its annual
report and other political issues; etc.
2. Fragmented responsibility for implementation

Similarly, several important initiatives have fractured responsibilities
for implementation, and hence accountability for results is diluted. For
example, e-governance initiatives are being led by the Department of
Electronics and Information Technology, the Department of Administrative
Reforms and Public Grievances, NIC, as well as individual ministries.
3. Selective coverage with time-lag in reporting
Some of the systems are selective in their coverage and report on
performance with a significant time-lag. The comprehensive Performance
Audit reports of the CAG are restricted to a small group of schemes and
institutions (only 14 such reports were laid before the Parliament in 2008)
and come out with a substantial lag. Often, by the time these reports are
produced, both the management and the issues facing the institutions
change. The reports of enquiry commissions and special committees set-
up to examine performance of Government Departments, schemes and
programmes suffer from similar limitations.
4. Most performance management systems are conceptually flawed

As mentioned earlier, an effective performance evaluation system is at the
heart of an effective performance management system. Typically, performance
evaluation systems in India suffer from two major conceptual flaws. First,
they list a large number of targets that are not prioritised.
Figure 10: Public declaration of Results by Government of Kerala

GOVERNMENT OF KERALA
Abstract
Planning & Economic Affairs (CPMU) Department - Performance Monitoring and
Evaluation System - Results-Framework Document Evaluation Report (2012-13)
of 35 Administrative Departments - Approved - Orders issued.

Planning & Economic Affairs (CPMU) Department
GO(MS) No.42/2013/Plg. Dated, Thiruvananthapuram: 07.08.2013
Read: GO(MS) No.24/13/Plg dtd 27.03.2013.

ORDER

Results-Framework Documents is a part of the Performance Monitoring and
Evaluation System (PMES) to monitor and evaluate the performance of the
Government Departments. RFD includes the agreed objectives, policies,
programmes and projects along with the success indicators and targets to
measure the performance in implementing them. The document is to be
prepared by each department at the beginning of every financial year.

Vide paper read above, Govt. have approved the RFD 2012-13 of 35
Administrative Departments. As per the guidelines of Results-Framework
Documents, the concerned Administrative Departments have carried out the
evaluation of the achievement of targets mentioned in their
Results-Framework Documents for the year 2012-13 and submitted the
evaluation report online to the Planning and Economic Affairs Department.

The department-wise composite scores are as follows.

Sl. No.  Name of Department                        Composite Score
1        Agriculture                               69.83
2        Animal Husbandry                          70.98
3        Co-operation                              78.14
4        Cultural Affairs                          86.5
5        Environment                               30.94
6        Excise                                    85.28
7        Finance                                   78.09
8        Fisheries                                 75.44
9        Food, Civil Supplies & Consumer Affairs   53.39
10       Forest                                    68.83
11       General Administration                    68.27
12       General Education                         59.91
13       Health & Family Welfare                   87.05
14       Higher Education                          68.88
15       Housing                                   42.27
16       Industries & Commerce                     71.55
17       Information & Public Relations            65.77
18       Information Technology                    71.35
19       Labour & Rehabilitation                   51.23
20       LSGD                                      61.08
21       NORKA                                     62.54
22       P&ARD                                     51.17
23       Planning & Economic Affairs               76.35
24       Ports                                     40.54
25       Power                                     63.03
26       PWD                                       73.76
27       Registration                              76.37
28       Revenue                                   51.6
29       SC/ST Development Department              66.2
30       Social Welfare                            30.5
31       Sports & Youth Affairs                    52.33
32       Taxes                                     76.75
33       Tourism                                   67.09
34       Transport                                 23.75
35       Water Resources                           55.52

Government, after examining in detail the Evaluation Report of
Results-Framework Documents 2012-13 of each Administrative Department, are
pleased to approve the scores as mentioned above.

Government have approved in principle the use of the concept of
Results-Framework Documents to improve the performance of departments and
not to grade them. Further, it is not indicative of the level of
performance.

(By Order of the Governor)
Rachna Shah,
Secretary (Planning)

To
All Additional Chief Secretaries, Principal Secretaries and Secretaries
Dr. Prajapati Trivedi, Secretary, PMD, Cabinet Secretariat, Government of
India (with C/L)
Performance Management Division, Cabinet Secretariat, Govt. of India
All Heads of Departments
All District Collectors
Private Secretary to Hon'ble Chief Minister
Private Secretary to all Ministers

Copy to
Additional Secretary to Chief Secretary
PA to Principal Secretary to Govt. (Planning)
CA to Additional Secretary & Director, (CPMU)
Stock file/OC.

Forwarded / By Order
Section Officer
Hence, at the end of the year it is difficult to ascertain performance. For
example, simply claiming that 14 out of 20 targets were met is not enough.
It is possible that the six targets that were not met were in the most
important areas of the department's core mandate. This is the logic for
using weights in RFDs.
Similarly, most performance evaluation systems in the Government use
single-point targets rather than a scale. This is the second major conceptual
flaw and it makes it difficult to judge deviations from the agreed target. For
example, how are we to judge the performance of the department if the
target for rural roads for a particular year is 15000 KMs and the achievement
is 14500 KMs?
In the absence of explicit weights attached to each target and a specific
scale of deviations, it is impossible to do a proper evaluation. This is
why a five-point scale and weights are used in RFDs.
As can be seen from the example in Table 6, the evaluation methodology
embedded in RFDs is a significant improvement over previous approaches.
Once we are able to prioritise the various success indicators based on the
government's prevailing priorities and agree on how to measure deviation
from the target, it is easy to calculate the composite score at the end of
the year. In the above hypothetical example, this composite score is 84.5%.
The ability to compute a composite score for each department at the end of
the year is the most important conceptual contribution of the RFD
methodology and makes it a forerunner amongst its peers.
This conceptual approach brings the Monitoring and Evaluation of
Government Departments in India into a distinctly modern era of new public
management. It creates benchmark competition amongst Government
Departments and, as we know, competition is the source of all efficiency.
The
composite score of departments measures the ability of the departments
to meet their commitments. While the commitments of the Department of
Road Transport are very different from those of the Department of School
Education, we can still compare their managerial ability to achieve their
agreed targets.
In the absence of the evaluation methodology embedded in RFDs,
Government Departments were often at the mercy of the most powerful
individuals in the government. Different individuals in the government
could always find something redeeming for those departments they
favoured and something amiss for those departments not in their good
books. Thus, depending on the perspective, departments could be
simultaneously good, bad or ugly. Even when the evaluators were being
objective, others could easily allege subjectivity in evaluation because the
pre-RFD evaluation methodology was severely flawed.
The result was that, as in many governments around the world, the views of
the most powerful person prevailed in the end. This kind of subjective,
personalised approach often seems to work in the short run because of the
so-called 'audit effect': compared to no system, even a flawed M&E system
can, upon introduction, have a temporary beneficial effect on behaviour,
because officials are mindful that they are being audited (watched).
However, officials are as clever as anyone else, and they soon realise
that the evaluation system is subjective, selective and non-scientific.
Once this realisation dawns upon them, they go back to their old habits.
Thus, labour-intensive M&E that hauls departments up for endless
presentations is often a counter-productive strategy in the long run. It
leads to M&E fatigue and eventually a calculated disregard for such M&E
systems in government.
RFDs also differ from previous efforts in another fundamental way. Compared
to previous approaches, RFDs represent the most comprehensive and holistic
evaluation of government departments. To understand this point, let us look
at Figure 11. As can be seen from the Figure, there is a fundamental
difference between Monitoring and Evaluation; because the two are referred
to together as M&E, the distinction is often lost.
Figure 11: Evolution of Evaluation Instruments in Government

  Budget              covers financial inputs only
  Performance Budget  adds activities and outputs
  Outcome Budget      adds outcomes
  RFD                 adds non-financial outcomes
  (moving along the M&E spectrum from pure monitoring towards evaluation)

The process of evaluation helps us arrive at the bottom line for the
evaluated organisations. A sound evaluation exercise should inform us
whether the performance of an entity is good, bad or ugly. Monitoring, on
the other hand, allows an entity to determine whether it is on track to
achieving its bottom line.

For example, as a passenger travelling from point A to point B, we care
about the on-time departure and arrival of the plane, the experience of
cabin service during the flight, the cost of the journey, etc. If these
parameters are to our
satisfaction, we come to the conclusion that we had a 'good' flight.
However, to achieve this result, the captain of the flight has to monitor a
large number of diverse parameters: headwinds, tailwinds, outside
temperature, inside temperature, fuel levels, fuel distribution, weight
distribution, etc.
Monitoring and Evaluation require different perspectives and skills.
'Monitoring' is concerned with the means to achieve a desirable end,
whereas 'Evaluation' is concerned with the end itself. In government, we
often confuse the two and end up doing neither particularly well. By making
a clear distinction between the two, the RFD approach has contributed to a
more effective performance evaluation of Government Departments for the
first time since independence.
This is not to say that 'evaluation' of Government Departments did not happen in the Government of India before the introduction of RFD. As in most other countries, the budget was the main instrument for evaluating Government Departments. The bottom line was the size of the budget, and a department's performance was judged by its ability to remain within budget. Dissatisfaction with this narrow focus on 'financial inputs' as a success indicator, however, led to the adoption of 'Performance Budgets', which broadened the scope of the evaluation exercise to include 'activities' and 'outputs' in addition to financial inputs. With further advances in evaluation, experts began to focus on outcomes, and hence in 2006 the Government of India adopted the 'Outcome Budget'. In 2009, RFD further expanded the scope of departmental evaluation to include non-financial outcomes in addition to everything covered by the Outcome Budget. Thus, RFD represents the most comprehensive definition of departmental performance. It covers static and dynamic aspects of departmental performance; long-term and short-term aspects; financial and non-financial aspects; as well as quantitative and qualitative aspects. That is to say, while RFD still belongs to the genre of approaches that fall under the rubric of Management by Objectives (MBO), it has the most sophisticated evaluation methodology and the most comprehensive scope compared to its predecessors.
This is not an insignificant point. Many governments have tried a selective approach, focusing on a few key aspects of departmental performance. This approach leads to the famous 'water-bed' effect: those areas of the department under scrutiny may improve, but the rest of the department slackens and eventually the whole department suffers. Even if a government wants to focus on a few items of particular interest to it, it is best to give these items (action points) higher weights in the RFD rather than take them out of the RFD for special monitoring.
In the next section we will see that by adopting the RFD approach for managing departmental performance, India has moved into a very distinguished league of reformers and, indeed, represents current international best practice.
6. What is the international experience in this area?

6.1 Similar policies used widely in developed and developing countries
The inspiration for this policy is derived from the recommendations of the Second Administrative Reforms Commission (ARC II). As mentioned before, in the words of ARC II:

'Performance agreement is the most common accountability mechanism in most countries that have reformed their public administration systems.'

'At the core of such agreements are the objectives to be achieved, the resources provided to achieve them, the accountability and control measures, and the autonomy and flexibilities that the civil servants will be given.'
Similar policies are being used in most OECD countries. The leading examples of this policy come from New Zealand, the United Kingdom and the USA. In the USA, the US Congress passed a law in 1993 called the Government Performance and Results Act. Under this law, the US President is obliged to sign a Performance Agreement with his Cabinet members. In the UK, this policy is called the Public Service Agreement. In developing countries, the best examples come from Malaysia and Kenya.
The following table summarises the international experience with approaches similar to India's policy of the Results-Framework Document (RFD):

Table 7: Summary of international experiences in GPM

Country | Brief description of the system
Australia

All line departments in Australia operate in the agency mode. The Public Service Act of 1999 includes a range of initiatives that provide for improving public accountability for performance, increasing competitiveness and enhancing leadership in the agencies. These initiatives include:
a. public performance agreements (similar to RFDs) for Agency Heads
b. replacement of out-of-date hierarchical controls with more contemporary team-based arrangements
c. greater devolved responsibility to the agency level
d. giving agencies flexibility to decide on their own systems for rewarding high performance
e. streamlined administrative procedures
f. a strategic approach to the systematic management of risk.

The Financial Management and Accountability Act 1997 provides the accountability and accounting framework for the agencies. Under this Act, Agency Heads are given greater flexibility and autonomy in their financial management. The Act requires Agency Heads to manage resources in an efficient, effective and ethical manner.
Brazil

In Brazil, the most advanced form of performance management is found at the State level. In the state of Minas Gerais, heads of Government Departments are required to have a Results Agreement (RA). While similar to RFDs, these RAs run for a period of four years, with yearly reviews, whereas the Indian RFDs are negotiated for one year at a time. The achievement scores against commitments made in Results Agreements are published and available to the public through the official websites in Minas Gerais. Results Agreements in Minas Gerais also extend to city administrations and cover the quality of expenditure by the secretariat. In addition, these Results Agreements contain provisions for performance-related pay.

The 2007 innovation in RAs was their cascading into two levels: the first level between the State Governor and the heads of State Secretariats and agencies, focused on results of impact for society; and the second level between the heads of agencies and their respective teams, identifying clearly and objectively the contribution of each staff member to the achievement of results.
Bhutan

Of all the developing countries, the recent adoption of Performance Agreements (PAs) in Bhutan is the most impressive. Performance Agreements in Bhutan are signed between the Prime Minister and the respective ministers. A sample PA from Bhutan is enclosed at Flag C. Bhutan examined the international experience, including the Indian experience with RFDs, and decided to improve on all previous approaches. The Harvard-educated Prime Minister is credited with this policy. We have compared the quality of Performance Agreements in Bhutan with the quality of RFDs in India and found the Bhutanese PAs to be ahead of us in India. That is why I have included them in this list.
Canada

Canada has a long tradition of Government Performance Management. It introduced a basic performance management system as far back as 1969 and now has a very sophisticated system. In addition to Performance Contracts (PCs) with departmental heads, which are similar to RFDs, it has a Management Accountability Framework (MAF) that also measures the quality of management and leadership of the department. Both systems are linked to the Performance Measurement Framework (PMF) and, unlike in India, have statutory backing.
Denmark

In Denmark, the contract management approach is seen as a major contribution to performance management. Four-year contracts (similar to RFDs) incorporating agreed performance targets are negotiated between specified agencies and parent Ministries and are monitored annually by the parent Ministry and the Ministry of Finance.
France

The Centres de Responsabilité in France are another example. Since 1990, many State services at both central and devolved levels have been established as Responsibility Centres. A contract with their Ministry gives the Directors greater management flexibility in operational matters in exchange for a commitment to achieve agreed objectives. It also stipulates a method for evaluating results. Contracts, negotiated case by case, are for three years.
Indonesia

Indonesia implements a system of Performance Contracts (PCs) for all Ministries. However, while in India these PCs, in the form of RFDs, are signed between the Secretary and the Minister, in Indonesia they are signed by the concerned Minister with the President. In both cases, the purpose is to translate vision into reality. While one ministry is in charge of a programme, other Ministries whose support is needed for complementary inputs are also identified. These PCs monitor inputs, activities, outputs and outcomes.

The central responsibility for drawing up PCs rests with the President's Delivery Unit (UKP4), a delivery unit under the President of Indonesia. Some provinces in Indonesia have opted for the system followed by the President's Delivery Unit and are allowed access to its online system.
Kenya

The Performance Contract System (PCS) in the Public Service was introduced in Kenya in 2004 as part of the Economic Recovery Strategy for Wealth and Employment Creation, 2003-07. The system was given a fillip by the new government in response to the widely held perception of bureaucratic delays, inefficiency, emphasis on processes rather than on results, lack of transparency and accountability, inadequate trust, and instances of huge surrenders of funds.

The Performance Contracting System of Kenya has received a Public Service Award from UNDP and an Innovation Award from the Kennedy School of Government, Harvard University. Considered among the most sophisticated government performance management systems, it draws on the experience of leading practitioners of New Public Management (Australia, UK, New Zealand, etc.). Its coverage is also impressive: it covers almost all entities receiving support from the treasury.
Malaysia

The Programme Agreement system designed under Prime Minister Mahathir Mohamad was a pioneering effort in the developing world. Every public entity had to have a programme agreement that specified the purpose of the agency's existence and its annual targets in terms of outputs and outcomes. Many experts credit this government performance system with transforming Malaysia from marshy land into an almost developed country.
New Zealand

New Zealand was one of the first countries to introduce an annual performance agreement (similar to RFDs) between Ministers and permanent secretaries (renamed chief executives) who, as in India, are directly responsible to the minister.

The Public Finance Act of 1989 provided for a performance agreement to be signed between the chief executive and the concerned Minister every year. The Performance Agreement describes the key result areas that require the personal attention of the chief executive. The expected results are expressed in verifiable terms and include output-related tasks. The chief executive's performance is assessed every year with reference to the performance agreement. The system provides for bonuses to be earned for good performance and removal for poor performance. The assessment is done by a third party, the State Services Commission, with due consideration given to the views of the departmental Minister. A written performance appraisal is prepared; the chief executive concerned is given an opportunity to comment, and his/her comments form part of the appraisal.
The annual purchase agreement for outputs between the minister and the department or agency complements Performance Agreements. The distinction between service delivery (outputs) and policy (outcomes) clarifies accountability: departments or agencies are accountable for outputs, while Ministers are accountable for outcomes. Purchase agreements specify the outputs to be bought, as well as the terms and conditions surrounding the purchase, such as the procedures for monitoring, amending and reporting.
United Kingdom

In the UK, a framework agreement specifies each agency's general mission and the responsibilities of the Minister and chief executive. A complementary annual performance agreement between the Minister and chief executive sets out performance targets for the agency. Setting targets is the responsibility of the Minister. Agencies are held accountable through quarterly reports to the Minister.

Under Tony Blair, the UK went on to implement Public Service Agreements (PSAs), which detail the aims and objectives of UK Government Departments for a three-year period. Such agreements also "describe how targets will be achieved and how performances against these targets will be measured." The agreement may consist of a departmental aim, a set of objectives and targets, and details of who is responsible for delivery. The main elements of a PSA are as follows:
1. An introduction, setting out the Minister or Ministers accountable for delivering the commitments, together with the coverage of the PSA, as some cover other departments and agencies for which the relevant Secretary of State is accountable;
2. The aims and objectives of the department or cross-cutting area;
3. The resources which have been allocated to it in the CSR;
4. Key performance targets for the delivery of its services, together with, in some cases, a list of key policy initiatives to be delivered;
5. A statement about how the department will increase the productivity of its operations.
United States of America

In 1993 the US Congress enacted the Government Performance and Results Act (GPRA). Under this Act, the President of the United States is required to sign Performance Agreements (similar to RFDs) with Secretaries. These Performance Agreements include the departmental Vision, Mission, Objectives and annual targets, and the achievements against these are to be placed before Congress. An example is enclosed at Flag B. The Secretaries in turn sign Performance Agreements with Assistant Secretaries, and the accountability for results and performance eventually trickles down to the lowest levels.
All countries discussed in this volume have already adopted some variant of
this policy.
6.2 Importance of Management Systems

As depicted in Figure 12, management experts widely hold that around 80% of the performance of any organisation depends on the quality of the systems used. That is why the focus of PMES is on improving management control systems within the Government.
Figure 12: Importance of Management Systems

[Determinants of Performance: the figure contrasts the common attribution of performance (80% to the leader, 20% to the rest) with the systems view (80% to the system, 20% to the people).]
6.3 Shift in Focus from 'Reducing Quantity of Government' to 'Increasing Quality of Government'

The figure below depicts a distinct worldwide trend in managing government performance. In response to perceived dissatisfaction with the performance of government agencies, governments around the world have taken steps that can be divided into two broad categories: (a) reduction in the quantity of government, and (b) increase in the quality of government. Over time, most governments have reduced their focus on the former and increased their focus on the latter. Reducing the quantity of government is represented by traditional methods of reform such as golden handshakes, cutting the size of Government Departments, and the sale of public assets through privatisation.
Figure 13: Quality vs. Quantity of Government

[Starting from the observation that government agencies have not delivered what was expected of them, the figure branches into two responses: reducing the quantity of government (privatisation, traditional civil service reforms) and increasing the quality of government (trickle-down approach, direct approach).]
The policies undertaken by various governments to increase the quality of government can be further classified into two broad approaches: (a) the trickle-down approach, and (b) the direct approach.
Figure 14: Increasing Quality of Government

[The figure divides the instruments for increasing the quality of government into the trickle-down approach (Performance Agreement, enabling environment) and the direct approach (Client Charter, Quality Mark, E-Government, E-Procurement, ISO 9000, peer reviews, knowledge management).]
PMES falls under the trickle-down approach: it holds the top accountable, and accountability for results eventually trickles down to the lowest echelons of management. It creates a sustainable environment for implementing all reforms. The generic name for PMES is Performance Agreement. Such approaches have a sustainable impact on all aspects of performance in the long run. The direct approach, on the other hand, consists of many instruments of performance management that have a direct impact on some aspect of performance. The two approaches are thus complements, not substitutes. In fact, PMES also makes use of direct approaches by making citizens' charters and grievance redress systems a mandatory requirement for all Government Departments in their RFDs.
7. What has been the progress in implementation?

Today, PMES covers about 80 Departments of the Government of India and 800 responsibility centres (Attached Offices, Subordinate Offices and Autonomous Bodies) under these Departments (Annex I). In addition, 18 states of the Indian Union are at various stages of implementing the RFD system at the state level. In Punjab, the Government has also experimented with RFDs at the district level, while in Assam RFDs have been implemented at the level of Responsibility Centres.
The following figure describes the progress in the implementation of RFD thus far:

Figure 15: Current Coverage of RFD Policy

As can be seen from Figure 15, the RFD policy has stabilised in terms of coverage of departments since 2011. The following 18 states are at various stages of implementing the RFD policy:
Figure 16: State-Level Implementation of RFD Policy

Where Implementation Has Begun
1. Maharashtra
2. Punjab
3. Karnataka
4. Kerala
5. Himachal Pradesh
6. Assam
7. Haryana
8. Chhattisgarh
9. Tripura
10. Rajasthan
11. Andhra Pradesh
12. Mizoram
13. Jammu & Kashmir
14. Meghalaya
15. Odisha
16. UP (request)
17. Puducherry (request)
18. Tamil Nadu (request)
[Figure 15 data: 2009-10, RFDs for 59 Departments; 2010-11, 62 Departments; 2011-14, RFDs for 74 of the total 80 Departments, with RFDs for the Responsibility Centres of the remaining 6; coverage now extends to 800 Responsibility Centres and 18 States.]
8. Implementation of Key Administrative Reforms through RFD
Essentially, an RFD represents a mechanism to implement policies, programmes and projects. These policies, programmes and projects can be divided into two categories: one relates only to a Department (or, more specifically, the departmental mandate), and the other applies across all departments. The latter category is included in Section 2 of the RFD as 'Mandatory Objectives'. Over the past five years, several key administrative reform initiatives have been implemented using the RFD mechanism. These are not new initiatives, but they were not being implemented effectively by the Government due to a lack of accountability and the absence of a follow-up system. RFD filled these two voids and ramped up implementation. The following figure summarises some of the key initiatives implemented via the RFD mechanism.
Figure 17: Scope of RFD

Citizens'/Clients' Charter
Grievance Redress Mechanism
ISO 9001 in Government
Corruption Mitigation Strategies
Innovation in Government
Implementing RTI in Government
Compliance with CAG Audit
While it is not possible to describe the rich implementation experience with regard to each of the initiatives mentioned above, readers are encouraged to visit the PMD website (www.performance.gov.in) for detailed guidelines and progress in implementation.
9. Use of G2G Software to Manage RFD Implementation

In collaboration with NIC and PMD, the Cabinet Secretariat has developed powerful software that allows all transactions related to the implementation of RFD to be conducted online. This software is called the Results-Framework Management System (RFMS) and enables the following activities to be done online:
• preparation of the Results-Framework Document (RFD)
• preparation and annual monitoring of the Clients'/Citizens' Charters (CCC)
• M&E of departmental performance on an annual and monthly basis
It is mandatory for all departments to use the RFMS software for preparing and submitting RFDs. RFMS has already reduced the use of paper and is expected eventually to lead to a paperless environment for implementing RFDs.
Figure 18: Screenshot of RFMS website (www.rfms.nic.in)
10. Impact of PMES/RFD

Any system takes a long time to implement and reach its full potential, so the impact of such systems cannot be judged in the short term. If we evaluated the impact of systems in the short term, policy makers would stop taking a long-term perspective.
In addition, it is important to keep in mind that the system has not been fully implemented; some of its key features are still disabled. For example, not all departments are covered by the RFD system. Some departments argue that what is good for the goose is also good for the gander: unless finance and planning are brought under this common accountability framework, they will remain the weak links in the chain of accountability. Similarly, it is argued that the Government needs to demonstrate consequences for performance, or the lack thereof. As the old saying goes:

'If you are not rewarding success, you are probably rewarding failure.'
In spite of incomplete implementation, it is encouraging to note the preliminary evidence that is beginning to trickle in. As can be seen from the
figure below (Figure 19), before the introduction of a success indicator for measuring performance with respect to Grievance Redress in the RFD, the difference between grievances received and disposed of was significant. In 2009, only about 50% of grievances registered on the computerised grievance redress system were disposed of. In 2013, after three years of RFD implementation, the disposal rate for grievances filed electronically was 100%. The data for this statistic are generated and owned by another department in the Government of India, the Department of Administrative Reforms and Public Grievances (DARPG), so there is no conflict of interest in presenting them as evidence on behalf of PMD. According to DARPG staff, before Grievance Redress became a mandatory performance requirement in RFDs, it was widely ignored. If RFD has modified behaviour in this area, it is reasonable to believe that it has had an impact in other areas as well.
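Reading the receipts and disposals series off Figure 19 (the pairing of values to years is as we read it from the chart), the disposal rates quoted above can be recomputed directly:

```python
# Disposal rates computed from the Figure 19 data: grievances received
# vs. disposed of on the computerised system, 2009-2013.
receipts  = {2009: 107961, 2010: 139240, 2011: 172520, 2012: 201197, 2013: 113896}
disposals = {2009:  53075, 2010: 117612, 2011: 147027, 2012: 168308, 2013: 113151}

for year in receipts:
    rate = 100 * disposals[year] / receipts[year]
    print(f"{year}: {rate:.0f}% of grievances disposed of")

# 2009 works out to roughly 49% (the "about 50%" in the text) and 2013 to
# roughly 99%, consistent with the text's rounded claim of 100%.
```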
Similarly, at the request of the Ministry of Finance, mandatory indicators dealing with the timely disposal of CAG paras were introduced in the 2011-12 RFDs. At the time of introduction, the total pendency of paras was 4,216. As can be seen from Figure 20, by 2013-14 the pendency of CAG paras had come down to 533.
Figure 19: Impact of RFD on Grievance Redress Mechanism

Year | Receipts | Disposals
2009 | 107,961 |  53,075
2010 | 139,240 | 117,612
2011 | 172,520 | 147,027
2012 | 201,197 | 168,308
2013 | 113,896 | 113,151

Figure 20: Impact of RFD on reduction in pendency of CAG Paras in GOI

Pending CAG paras fell from 4,216 in June 2010 to 533 in March 2014.
A quick glance at RFD data yields innumerable examples of dramatic turnarounds following the introduction of RFD. In the following figures we give a few examples showing the before-and-after impact on several important economic and social indicators. The figures are self-explanatory.
[Before-and-after RFD comparisons, as read off the charts:
- Coverage of SC students for post-matric scholarship: 28.13 (average 2005-08) to 47.26 (average 2009-14)
- Coverage of OBC students for post-matric scholarship: a comparable rise (exact values not legible in the source)
- Rural Teledensity, average annual growth rate (Department of Telecommunications): 3.19 (pre-RFD, 2005-10) to 7.15 (post-RFD, 2009-14)
- Reduction in Infant Mortality Rate (IMR) per 1,000 live births (Ministry of Welfare): 55.75 (average 2005-08) down to 43.8 (average 2009-14)
- Fresh Capacity Addition of Power, MW (Ministry of Power): annual series from 1997-98 to 2012-13, with a marked jump after the introduction of RFD
- Milk production, MMT (Department of Animal Husbandry, Dairying and Fisheries): average annual production of 104 (pre-RFD, 2005-09) to 125 (post-RFD, 2009-14)]
Another piece of early evidence comes from a Ph.D. thesis submitted to the Department of Management Studies, Indian Institute of Technology (IIT) Delhi. This research study was undertaken with the objective of providing an integrated perspective on the Results-Framework Document (RFD) in the emerging technological and socio-economic context. The study analysed the structure and processes of the ministries and their impact on the effectiveness of the RFD process. The effort was to identify the gaps and overlaps that exist in implementing the initiative under the Central Government dispensation.
The study based its analyses on the perceptions of executives of the All India Services, Central Services and Central Secretariat Services with a seniority of Under Secretary and above, obtained through a structured questionnaire-based survey. A sample size of 117 officers was taken. The findings were corroborated and refined on the basis of semi-structured interviews with senior civil servants.
The research revealed that the RFD process is perceived to have contributed significantly in the following areas of governance:
i. Objective assessment of schemes and programmes being implemented by the Ministries in the Government of India
ii. Development of a template to assess the performance of ministries objectively
iii. Facilitating objective performance appraisal of civil servants
iv. Inculcating performance orientation in civil servants by channelising their efforts towards meeting organisational objectives
v. Facilitating a critical review of the schemes, programmes and internal organisational processes for bringing in required reforms
vi. Facilitating policy makers to relook at and redefine the ministry's vision, mission and objectives
Another robust source of information comes from practitioners who have seen both systems, old and new. For example, Mr. J. N. L. Srivastava, Former Secretary to the Government of India, outlined the following advantages of the RFD system:
i) Timelines as Success Indicators have accelerated the process of decision making, the issue of sanctions, the release of funds, etc.
ii) In the past, monitoring of various programmes was ad hoc and often unattended. The RFD system has helped in the development and adoption of better and more regular systems of monitoring and the faster introduction of IT-based monitoring systems.
iii) With a focus on RFDs for the Responsibility Centres that are directly involved in implementing the schemes, the implementation of the programmes and its monitoring have improved.
Similarly, Mr. S. P. Jakhanwal, Former Secretary (Coordination) in the Cabinet Secretariat, says: "The impact of the RFD system may not be limited to figures of achievements against targets. In the last five years, the RFD system has impacted the performance of the ministries/departments of the central government in many ways:
i) New initiatives (left out earlier) were identified in consultation with the Ministries
ii) Larger outputs / more efficient delivery systems / reduced time in the delivery of services
iii) Schemes were made more self-supporting with higher generation of revenues
iv) Realistic targets (as against soft or unrealistic targets) were agreed to during discussions with the ministries
v) Some of the ways in which ministries assigned to Syndicate 6 benefited in improving their performance by adopting the RFD system are listed in Annex I."
He goes on to give several examples in each of the above categories. His entire comments are available on www.performance.gov.in or www.gopempal.org.
Another very persuasive argument in support of PMES/RFD comes from one of the world's top three credit rating agencies, Fitch (India Ratings and Research). Known for its objectivity and credibility, Fitch states:
"PMES – A Step in the Right Direction: India Ratings & Research (Ind-Ra) believes that the 'Performance Monitoring and Evaluation System' (PMES) is an opportune step to improve public governance and deliver better public goods/services in India." "Ind-Ra believes that PMES is … in line with international best practices. The overall objectives of the PMES are in sync with the new union government's focus on 'minimum government and maximum governance.' Ind-Ra believes that the PMES introduced by the previous government should not only continue, but also be strengthened with time."
The entire Fitch report, fleshing out the above argument in a systematic way, is available at: http://www.indiaratings.co.in/upload/research/specialReports/2014/6/27/indra27GoI.pdf. A copy of the report is also annexed (Annexure J).
Finally, as we shall see in the next section, a policy similar to RFD has had a dramatic impact on the performance of public enterprises in India. We hope that in a few years' time, the impact of RFDs will be judged to be as dramatic as that of the MOUs.
11. RFD versus Memorandum of Understanding (MOU)
11.1 About MOU
The Memorandum of Understanding (MOU) is a negotiated document between the Government, acting as the owner of a Central Public Sector Enterprise (CPSE), and the corporate management of the CPSE. It contains the intentions, obligations and mutual responsibilities of the Government and the CPSE, and is directed towards strengthening CPSE management by results and objectives rather than management by controls and procedures.
The beginnings of the MOU system in India can be traced to the recommendation of the Arjun Sengupta Committee on Public Enterprises in 1984. The first set of MOUs was signed by four Central Public Sector Enterprises for the year 1987-88. Over a period of time, an increasing number of CPSEs was brought within the MOU system. Further impetus to extend the MOU system was provided by the Industrial Policy Resolution of 1991, which observed that CPSEs "will be provided a much greater degree of management autonomy through the system of Memorandum of Undertaking."
Today, as many as 200 CPSEs (including subsidiaries and CPSEs under construction) have been brought into the fold of the MOU system. During this period, considerable modifications and improvements in the structure of, and procedures for, preparing and finalising the MOUs have been effected. These changes have been brought about on the basis of experience in the working of the MOU system, supported by studies carried out from time to time by expert committees on specific aspects of the system.
Broadly speaking, the obligations undertaken by CPSEs under the MOU are reflected in three types of parameters: (a) financial, (b) physical and (c) dynamic. Considering the very diverse nature of the activities in which CPSEs are engaged, it is obviously not possible to have a uniform set of physical parameters. These vary from enterprise to enterprise and are determined for each enterprise separately during discussions held by the Task Force (TF) with the Administrative Ministries and the CPSEs. Similarly, depending on the corporate plans and long-term objectives of the CPSEs, the dynamic criteria are also identified on an enterprise-specific basis.
11.2 Comparison and Contrast with RFD
The RFD and MOU systems are essentially the same in their conceptual origins
and design; the differences are mostly cosmetic. A large part of the success
of the RFD system can be traced to the successful experience of implementing
MOUs in the public enterprise sector.
Similarities between RFD and MOU systems
1. Both represent an agreement between a 'principal' and an 'agent'.
   a. The RFD is between the Minister (principal) and the Departmental
      Secretary (agent).
   b. The MOU is between the Departmental Secretary (principal) and the
      Chief Executive of the public enterprise (agent).
2. Both have similar institutional arrangements.
   a. The quality of both is reviewed by the ATF (a non-government body
      of experts).
   b. Both are approved by High Powered Committees chaired by the
      Cabinet Secretary.
3. Both use a Composite Score to summarise overall performance, and the
   methodology for calculating the Composite Score is the same.
   a. The RFD Composite Score reflects the performance of a Government
      Department.
   b. The MOU Composite Score reflects the performance of a public
      enterprise.
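The shared composite-score arithmetic can be illustrated with a small sketch. This is illustrative only: the weights and raw scores below are invented, and the linear conversion from the RFD 0%-100% scale to an MOU-style 1-5 scale is an assumption made for exposition, not the official MOU grading rule.

```python
def composite_score(indicators):
    """Weighted composite on the RFD 0-100% scale.

    indicators: list of (weight, achievement_pct) pairs, where
    achievement_pct is the indicator's raw score on a 0-100 scale
    and the weights are assumed to sum to 100, as in a typical RFD.
    """
    return sum(w * pct for w, pct in indicators) / 100.0

def to_mou_scale(composite_pct):
    """Illustrative linear conversion of a 0-100% composite to an
    MOU-style 1-5 scale, where 1 is best and 5 is worst.
    (Assumption: the actual MOU grading uses its own criteria bands.)
    """
    return 5.0 - 4.0 * (composite_pct / 100.0)

# Three hypothetical indicators: (weight, raw score %)
scores = [(50, 90.0), (30, 80.0), (20, 100.0)]
c = composite_score(scores)   # 0.5*90 + 0.3*80 + 0.2*100 = 89.0
print(c, to_mou_scale(c))
```

On this sketch, a composite of 89% maps to roughly 1.4 on the linearly converted 1-5 scale; the point is that the two scales carry the same information about overall performance.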
Differences between RFD and MOU
1. One significant and substantive difference is with regard to incentives
   for performance. The MOU score is linked to a financial incentive for
   public enterprise employees, whereas the RFD score is not yet linked to
   financial incentives.
2. Cosmetically, RFD and MOU differ in terms of the scale used. RFD uses a
   0%-100% scale, whereas the MOU has a five-point scale from 1 to 5.
The MOU system has had a major impact on the performance of public
enterprises. While this is true with regard to a wide range of parameters,
we present here the impact of the MOU system on the surplus generated for
the exchequer and the trend in net profits for Central Public Sector
Enterprises (CPSEs).
As mentioned earlier, the distinctly positive impact of the MOU on public
enterprise performance is one of the reasons for the Government's optimism
about this contractual approach to performance management.
12. In Conclusion: A SWOT Analysis of PMES/RFD
While no management system is perfect, only an objective analysis of a
management system can lead to improvements. In that spirit, we briefly
summarise a SWOT analysis of the system.
Figure 21: Contribution of CPSEs to National Exchequer and their Net Profit earnings

Contribution of CPSEs to National Exchequer (Rs in crore):
1998-99: 46,934; 1999-00: 56,157; 2000-01: 61,037; 2001-02: 62,753; 2002-03: 81,867; 2003-04: 89,036; 2004-05: 1,10,599; 2005-06: 1,25,384; 2006-07: 1,48,783; 2007-08: 1,65,994; 2008-09: 1,51,728; 2009-10: 1,39,918; 2010-11: 1,56,124; 2011-12: 1,62,402; 2012-13: 1,62,762

Net Profit of CPSEs (Rs in crore):
1990-91: 2272; 1991-92: 2356; 1992-93: 3271; 1993-94: 4545; 1994-95: 7187; 1995-96: 9574.32; 1996-97: 9991.82; 1997-98: 13719.96; 1998-99: 13203; 1999-00: 14331; 2000-01: 15653; 2001-02: 25978; 2002-03: 32344; 2003-04: 53084; 2004-05: 65429; 2005-06: 69536; 2006-07: 81055; 2007-08: 81274; 2008-09: 83867; 2009-10: 92203; 2010-11: 92129; 2011-12: 98245; 2012-13: 115299
Strengths
• PMES has stabilised and is widely understood and accepted.
• The quality of the evaluation system has improved over the past five
  years.
• It is considered a state-of-the-art evaluation methodology by
  independent and informed observers. This was re-affirmed by international
  experts at a Global Roundtable organised by the Performance Management
  Division, Cabinet Secretariat.
• 18 States, cutting across political lines, have initiated implementation
  of PMES; hence there will be no political opposition to a revamped PMES.
Weaknesses
• In the absence of a performance-related incentive scheme (PRIS), it is
  difficult to sustain motivation. Implementation of PRIS has been
  recommended since the 4th Central Pay Commission in 1987, and it is time
  to implement it.
• The current system of Performance Appraisal Reports in Government can be
  improved further and linked to the RFD. Today officers get almost perfect
  scores while the department gets only a modest score. This disconnect has
  to be eliminated.
• While the results of RFD are placed before Parliament, they need to be
  publicised more widely to have a greater impact on the behaviour of
  officials.
• The connection between performance and career advancement should become
  more apparent.
Opportunities
• There is a clear hunger in the world, as well as in India, for good
  governance and accountability through rigorous evaluation.
• This is a very normal way to manage any organisation (public or private),
  and citizens expect it to happen in Government.
• The entire machinery for accountability for results through rigorous
  evaluation is in place and can become truly effective with strong
  political leadership.
• This is one of the easiest aspects of governance for citizens to
  comprehend.
Threats
• Any delay in further enhancing the effectiveness of PMES and RFD will
  lead to disillusionment, as it did with previous efforts like the
  Performance Budget and Outcome Budget.
For more information on PMES/RFD, please visit: www.performance.gov.in
For more information on the Global Roundtable and background material,
please visit: www.gopempal.org
Annexure D:
Results-Framework Document (RFD) for the
Department of Agricultural Research and Education
(2012-2013)
Results-Framework Document (RFD) for Department of Agricultural Research and Education (2012-2013)

Section 1:
Vision, Mission, Objectives and Functions

Vision
Harnessing science to ensure comprehensive and sustained physical, economic and ecological access to food and livelihood security to all Indians, through generation, assessment, refinement and adoption of appropriate technologies.

Mission
Sustainability and growth of Indian agriculture by interfacing agricultural research, higher education and front-line extension initiatives, complemented with institutional, infrastructural and policy support, that will create an efficient and effective science-harnessing tool.

Objectives
1. Improving natural resource management and input use efficiency
2. Strengthening of higher agricultural education
3. Utilizing frontier research in identified areas/programs for better genetic exploitation
4. Strengthening of frontline agricultural extension system and addressing gender issues
5. IP management and commercialization of technologies
6. Assessment and monitoring of fishery resources
7. Development of vaccines and diagnostics
8. Post harvest management, farm mechanization and value addition

Functions
1. To develop Public-Private Partnerships in developing seeds, planting materials, vaccines, feed formulations, value-added products, agricultural machinery, etc.
2. To serve as a repository in the agriculture sector and develop linkages with national and international organizations as per the needs and current trends.
3. To plan, coordinate and monitor research for enhancing production and productivity of the agriculture sector.
4. To enhance the quality of higher education in the agriculture sector.
5. Technology generation, commercialization and transfer to end users.
6. Human resource development and capacity building.
7. To assess implementation of various programmes in relation to targets set and provide mid-course correction, if required.
8. To provide technological backstopping to various line departments.
Section 2:
Inter se Priorities among Key Objectives, Success Indicators and Targets

(Targets/criteria values are shown as Poor 60% / Fair 70% / Good 80% / Very Good 90% / Excellent 100%.)

Objective [1] Improving natural resource management and input use efficiency (Weight: 17.00)

Action [1.1] Integrated nutrient management (INM)
[1.1.1] Developing GIS based district/block level soil fertility maps (Unit: Number; Weight: 2.55): 67 / 78 / 88 / 100 / 112
[1.1.2] Developing INM packages for different agro-eco regions of the country (Unit: Number; Weight: 2.55): 3 / 4 / 5 / 6 / 7
[1.1.3] Organizing training & demonstrations (Unit: Number; Weight: 1.70): 16 / 19 / 22 / 25 / 27

Action [1.2] Integrated water management (IWM)
[1.2.1] Technologies for enhancing water use efficiencies (Unit: Number; Weight: 1.70): 2 / 3 / 4 / 5 / 6
[1.2.2] Technologies for water harvesting, storage and groundwater recharge (Unit: Number; Weight: 1.70): 2 / 3 / 4 / 5 / 6
[1.2.3] Models/DSS for multiple uses of water (Unit: Number; Weight: 0.85): 0 / 1 / 2 / 3 / 4
[1.2.4] Organizing training & demonstrations (Unit: Number; Weight: 1.70): 10 / 12 / 14 / 15 / 17

Action [1.3] Climate resilient agriculture
[1.3.1] Awareness building amongst stakeholders through trainings/demonstrations (Unit: Number; Weight: 1.02): 102 / 119 / 136 / 150 / 170
[1.3.2] Human resource development and capacity building (Unit: Number; Weight: 1.02): 102 / 119 / 136 / 150 / 170
[1.3.3] Testing crop varieties for climate resilience at different locations (Unit: Number; Weight: 2.21): 7 / 8 / 9 / 10 / 12
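The five criteria values drive RFD scoring: each indicator's achievement is located on its Poor-to-Excellent band and converted into a raw score between 60% and 100%, which is then weighted into the composite. A minimal sketch follows, assuming linear interpolation between adjacent bands for a numerically increasing indicator; the exact scoring rule is specified in the RFD guidelines, and date- or percentage-type indicators are handled analogously.

```python
def raw_score(achievement, bands):
    """Map an achieved value to the RFD 0-100% scale.

    bands: the criteria values for (Poor, Fair, Good, Very Good,
    Excellent), which anchor scores of 60/70/80/90/100 percent.
    Linear interpolation between adjacent bands is an assumption
    made for illustration; values below 'Poor' floor at 60% here.
    """
    marks = [60.0, 70.0, 80.0, 90.0, 100.0]
    if achievement <= bands[0]:
        return 60.0
    if achievement >= bands[-1]:
        return 100.0
    for lo, hi, m_lo, m_hi in zip(bands, bands[1:], marks, marks[1:]):
        if lo <= achievement <= hi:
            return m_lo + (m_hi - m_lo) * (achievement - lo) / (hi - lo)

# Indicator [1.1.1] (soil fertility maps) has bands 67/78/88/100/112,
# so delivering 100 maps earns the 'Very Good' score of 90%.
print(raw_score(100, [67, 78, 88, 100, 112]))
```

That 90% raw score would then be multiplied by the indicator's weight (here 2.55) when the department's composite score is computed.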
(Targets/criteria values are shown as Poor 60% / Fair 70% / Good 80% / Very Good 90% / Excellent 100%.)

Objective [2] Strengthening of higher agricultural education (Weight: 17.00)

Action [2.1] Accreditation/extension of accreditation of agricultural universities
[2.1.1] Number of universities granted accreditation/extension of accreditation (Unit: Number; Weight: 2.55): 4 / 5 / 6 / 8 / 9

Action [2.2] Grant of ICAR International fellowships to Indian and foreign students
[2.2.1] Number of fellowships awarded (subject to availability of competent candidates) (Unit: Number; Weight: 2.55): 6 / 8 / 10 / 12 / 13

Action [2.3] Grant of JRF and SRF to students
[2.3.1] Total no. of fellowships granted every year (subject to availability of competent candidates) (Unit: Number; Weight: 4.25): 475 / 500 / 575 / 625 / 630

Action [2.4] Establishment of experiential learning units
[2.4.1] Experiential learning units established (Unit: Number; Weight: 1.70): 15 / 18 / 20 / 22 / 25

Action [2.5] Financial support and monitoring of progress
[2.5.1] Amount released (Unit: Rupees in crores; Weight: 2.55): 300 / 320 / 340 / 360 / 380

Action [2.6] Capacity building and faculty up-gradation
[2.6.1] Number of teachers trained per year (Unit: Number; Weight: 1.70): 600 / 700 / 800 / 900 / 1000
[2.6.2] Number of Summer/Winter Schools organized (Unit: Number; Weight: 1.70): 16 / 18 / 20 / 22 / 25

Objective [3] Utilizing frontier research in identified areas/programs for better genetic exploitation (Weight: 13.00)

Action [3.1] Collection, characterization and conservation of genetic resources
[3.1.1] Number of germplasm collected/characterized and conserved (other crops) (Unit: Number; Weight: 1.17): 2000 / 2500 / 3000 / 4000 / 5000
[3.1.2] Number of germplasm collected (horticultural crops) (Unit: Number; Weight: 1.17): 210 / 245 / 280 / 315 / 350
(Targets/criteria values are shown as Poor 60% / Fair 70% / Good 80% / Very Good 90% / Excellent 100%.)

Objective [3] (continued)

Action [3.2] Evaluation of genetic resources/improved varieties for suitable crop husbandry practices
[3.2.1] Number of germplasm evaluated (Unit: Number; Weight: 1.17): 1000 / 1500 / 2000 / 2500 / 3000

Action [3.3] Production of breeder seed, other seeds and planting materials
[3.3.1] Quantity of breeder seed produced (other crops) (Unit: Tonnes; Weight: 1.30): 7000 / 7500 / 8000 / 8200 / 8500
[3.3.2] Quantity of breeder seed produced (horticultural crops) (Unit: Tonnes; Weight: 1.17): 2400 / 2800 / 3200 / 3600 / 4000
[3.3.3] Quantity of planting materials produced annually (Unit: Number in lakhs; Weight: 1.17): 27 / 31.5 / 36 / 40.5 / 45

Action [3.4] Development of improved varieties suited to diverse agroecologies
[3.4.1] Number of varieties developed (other crops) (Unit: Number; Weight: 1.17): 5 / 8 / 10 / 12 / 15
[3.4.2] Number of varieties developed (pulses/oilseeds) (Unit: Number; Weight: 1.17): 7 / 9 / 11 / 13 / 17
[3.4.3] Number of varieties developed (horticultural crops) (Unit: Number; Weight: 1.17): 12 / 14 / 16 / 18 / 20

Action [3.5] Production of piglets (8-12 weeks of age)
[3.5.1] Provisioning of piglets to farmers and development agencies (Unit: Number; Weight: 1.17): 550 / 600 / 800 / 900 / 1000

Action [3.6] Production of day old as well as 6 weeks old chicks
[3.6.1] Provisioning of day old/6 weeks old chicks to farmers and development agencies (Unit: Number in lakhs; Weight: 1.17): 0.5 / 1 / 1.5 / 2 / 2.5
(Targets/criteria values are shown as Poor 60% / Fair 70% / Good 80% / Very Good 90% / Excellent 100%.)

Objective [4] Strengthening of frontline agricultural extension system and addressing gender issues (Weight: 13.00)

Action [4.1] Technology assessment through on-farm trials
[4.1.1] Number of technologies assessed (Unit: Number; Weight: 5.20): 140 / 150 / 200 / 220 / 240

Action [4.2] Capacity building through training programmes
[4.2.1] Number of training programmes organized (Unit: Number; Weight: 5.20): 14000 / 15000 / 16000 / 18000 / 20000

Action [4.3] Promotion of technologies covering gender concerns
[4.3.1] Gender-related technology promotion programs conducted (Unit: Number; Weight: 2.60): 15 / 16 / 20 / 25 / 30

Objective [5] IP management and commercialization of technologies (Weight: 9.00)

Action [5.1] Partnership development, including licensing of ICAR technologies
[5.1.1] Partners (private sector) identified (Unit: Number; Weight: 4.50): 10 / 15 / 20 / 25 / 30

Action [5.2] Patents and other IPR titles
[5.2.1] Applications filed (Unit: Number; Weight: 4.50): 60 / 70 / 80 / 90 / 95

Objective [6] Assessment and monitoring of fishery resources (Weight: 6.00)

Action [6.1] Fish resources assessment and eco-system monitoring
[6.1.1] Number of explorations/surveys carried out (Unit: Number; Weight: 3.60): 50 / 55 / 60 / 65 / 70
[6.1.2] Development of GIS based aquatic resource database (Unit: Number; Weight: 2.40): 3 / 4 / 5 / 6 / 8

Objective [7] Development of vaccines and diagnostics (Weight: 5.00)

Action [7.1] Production of diagnostic kits and field validation
[7.1.1] Diagnostic kits developed (Unit: Number; Weight: 3.00): 0 / 1 / 2 / 3 / 5

Action [7.2] Production of vaccines against important animal diseases and their validation
[7.2.1] Production of vaccines (Unit: Number; Weight: 2.00): 0 / 0 / 1 / 2 / 3

Objective [8] Post harvest management, farm mechanization and value addition (Weight: 5.00)

Action [8.1] Develop/refine equipment for crop production & processing
[8.1.1] Equipment developed/refined (Unit: Number; Weight: 1.25): 12 / 14 / 16 / 18 / 20
(Targets/criteria values are shown as Poor 60% / Fair 70% / Good 80% / Very Good 90% / Excellent 100%. An asterisk denotes a mandatory objective.)

Objective [8] (continued)

Action [8.2] Testing of commercial prototypes/technologies
[8.2.1] Commercial test reports/samples tested (Unit: Number; Weight: 1.25): 6 / 8 / 10 / 11 / 12

Action [8.3] Process protocols for product development, storage, safety and improved quality
[8.3.1] Process protocols (Unit: Number; Weight: 1.25): 5 / 6 / 8 / 10 / 11

Action [8.4] Development/refinement of products from crops, fibres, natural gums/resins, livestock/fishes
[8.4.1] Value-added products (Unit: Number; Weight: 1.25): 6 / 8 / 10 / 12 / 14

*Efficient Functioning of the RFD System (Weight: 3.00)
Timely submission of Draft for Approval: On-time submission (Unit: Date; Weight: 2.0): 09/03/2012 / 08/03/2012 / 07/03/2012 / 06/03/2012 / 05/03/2012
Timely submission of Results: On-time submission (Unit: Date; Weight: 1.0): 06/05/2012 / 05/05/2012 / 04/05/2012 / 03/05/2012 / 01/05/2012

*Administrative Reforms (Weight: 6.00)
Implement mitigating strategies for reducing potential risk of corruption: % of implementation (Unit: %; Weight: 2.0): 80 / 85 / 90 / 95 / 100
Implement ISO 9001 as per the approved action plan: Area of operations covered (Unit: %; Weight: 2.0): 80 / 85 / 90 / 95 / 100
Identify, design and implement major innovations: Implementation of identified innovations (Unit: Date; Weight: 2.0): 09/03/2013 / 08/03/2013 / 07/03/2013 / 06/03/2013 / 05/03/2013

*Improving Internal Efficiency/responsiveness/service delivery of Ministry/Department (Weight: 4.00)
Implementation of Sevottam: Independent Audit of Implementation of Citizen's Charter (Unit: %; Weight: 2.0): 80 / 85 / 90 / 95 / 100
Implementation of Sevottam: Independent Audit of implementation of public grievance redressal system (Unit: %; Weight: 2.0): 80 / 85 / 90 / 95 / 100

*Ensuring compliance to the Financial Accountability Framework (Weight: 2.00)
Timely submission of ATNs on Audit paras of C&AG: Percentage of ATNs submitted within due date (4 months) from date of presentation of Report to Parliament by CAG during the year (Unit: %; Weight: 0.5): 60 / 70 / 80 / 90 / 100
(Targets/criteria values are shown as Poor 60% / Fair 70% / Good 80% / Very Good 90% / Excellent 100%. An asterisk denotes a mandatory objective.)

*Ensuring compliance to the Financial Accountability Framework (continued)
Timely submission of ATRs to the PAC Sectt. on PAC Reports: Percentage of ATRs submitted within due date (6 months) from date of presentation of Report to Parliament by PAC during the year (Unit: %; Weight: 0.5): 60 / 70 / 80 / 90 / 100
Early disposal of pending ATNs on Audit Paras of C&AG Reports presented to Parliament before 31.3.2012: Percentage of outstanding ATNs disposed of during the year (Unit: %; Weight: 0.5): 60 / 70 / 80 / 90 / 100
Early disposal of pending ATRs on PAC Reports presented to Parliament before 31.3.2012: Percentage of outstanding ATRs disposed of during the year (Unit: %; Weight: 0.5): 60 / 70 / 80 / 90 / 100
Section 3:
Trend Values of the Success Indicators

(Values are shown as Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15; '--' indicates no value.)

Objective [1] Improving natural resource management and input use efficiency
[1.1.1] Developing GIS based district/block level soil fertility maps (Number): 10 / 15 / 100 / 71 / 15
[1.1.2] Developing INM packages for different agro-eco regions of the country (Number): 4 / 4 / 6 / 6 / 8
[1.1.3] Organizing training & demonstrations (Number): 15 / 24 / 25 / 25 / 25
[1.2.1] Technologies for enhancing water use efficiencies (Number): 4 / 9 / 5 / 2 / 2
[1.2.2] Technologies for water harvesting, storage and groundwater recharge (Number): 5 / 6 / 5 / 3 / 3
[1.2.3] Models/DSS for multiple uses of water (Number): 2 / 2 / 3 / 1 / 1
[1.2.4] Organizing training & demonstrations (Number): 10 / 49 / 15 / 15 / 16
[1.3.1] Awareness building amongst stakeholders through trainings/demonstrations (Number): 0 / 100 / 150 / 100 / 100
[1.3.2] Human resource development and capacity building (Number): 0 / 100 / 150 / 50 / 100
[1.3.3] Testing crop varieties for climate resilience at different locations (Number): 0 / 7 / 10 / 10 / 10
(Values are shown as Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15; '--' indicates no value.)

Objective [2] Strengthening of higher agricultural education
[2.1.1] Number of universities granted accreditation/extension of accreditation (Number): 8 / 8 / 8 / 10 / 10
[2.2.1] Number of fellowships awarded (subject to availability of competent candidates) (Number): 12 / 12 / 12 / 14 / 15
[2.3.1] Total no. of fellowships granted every year (subject to availability of competent candidates) (Number): 625 / 625 / 625 / 650 / 650
[2.4.1] Experiential learning units established (Number): 20 / 25 / 22 / 35 / 37
[2.5.1] Amount released (Rupees in crores): 289 / 325 / 360 / 375 / 385
[2.6.1] Number of teachers trained per year (Number): 1000 / 1000 / 900 / 1000 / 1000
[2.6.2] Number of Summer/Winter Schools organized (Number): 40 / 35 / 22 / 35 / 35

Objective [3] Utilizing frontier research in identified areas/programs for better genetic exploitation
[3.1.1] Number of germplasm collected/characterized and conserved (other crops) (Number): 2000 / 2000 / 4000 / 2200 / 2300
(Values are shown as Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15; '--' indicates no value.)

Objective [3] (continued)
[3.1.2] Number of germplasm collected (horticultural crops) (Number): 45 / 300 / 315 / 400 / 400
[3.2.1] Number of germplasm evaluated (Number): 1800 / 2000 / 2500 / 2200 / 2400
[3.3.1] Quantity of breeder seed produced (other crops) (Tonnes): 8000 / 8000 / 8200 / 8200 / 8500
[3.3.2] Quantity of breeder seed produced (horticultural crops) (Tonnes): 2250 / 3500 / 3600 / 4500 / 4800
[3.3.3] Quantity of planting materials produced annually (Number in lakhs): 13 / 40 / 40.5 / 50 / 52
[3.4.1] Number of varieties developed (other crops) (Number): 10 / 10 / 12 / 12 / 12
[3.4.2] Number of varieties developed (pulses/oilseeds) (Number): -- / -- / 13 / 13 / 13
[3.4.3] Number of varieties developed (horticultural crops) (Number): 4 / 15 / 18 / 20 / 20
[3.5.1] Provisioning of piglets to farmers and development agencies (Number): 900 / 900 / 900 / 900 / 1000
[3.6.1] Provisioning of day old/6 weeks old chicks to farmers and development agencies (Number in lakhs): 0.6 / 2 / 2 / 2 / 2.5
(Values are shown as Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15; '--' indicates no value.)

Objective [4] Strengthening of frontline agricultural extension system and addressing gender issues
[4.1.1] Number of technologies assessed (Number): -- / 200 / 220 / 260 / 280
[4.2.1] Number of training programmes organized (Number): -- / -- / 18000 / 20000 / 22000
[4.3.1] Gender-related technology promotion programs conducted (Number): -- / 20 / 25 / 30 / 35

Objective [5] IP management and commercialization of technologies
[5.1.1] Partners (private sector) identified (Number): 20 / 136 / 25 / 35 / 40
[5.2.1] Applications filed (Number): 70 / 111 / 90 / 130 / 150

Objective [6] Assessment and monitoring of fishery resources
[6.1.1] Number of explorations/surveys carried out (Number): 40 / 60 / 65 / 70 / 75
[6.1.2] Development of GIS based aquatic resource database (Number): 4 / 6 / 6 / 7 / 8

Objective [7] Development of vaccines and diagnostics
[7.1.1] Diagnostic kits developed (Number): 2 / 4 / 3 / 3 / 4
[7.2.1] Production of vaccines (Number): 2 / 2 / 2 / 3 / 3

Objective [8] Post harvest management, farm mechanization and value addition
[8.1.1] Equipment developed/refined (Number): 20 / 20 / 18 / 25 / 25
(Values are shown as Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15; '--' indicates no value. An asterisk denotes a mandatory objective.)

Objective [8] (continued)
[8.2.1] Commercial test reports/samples tested (Number): 12 / 12 / 11 / 15 / 15
[8.3.1] Process protocols (Number): 10 / 11 / 10 / 13 / 13
[8.4.1] Value-added products (Number): 12 / 14 / 12 / 18 / 18

*Efficient Functioning of the RFD System
Timely submission of Draft for Approval: On-time submission (Date): 05/03/2010 / 06/03/2011 / 06/03/2012 / -- / --
Timely submission of Results: On-time submission (Date): 27/04/2011 / -- / 03/05/2012 / -- / --

*Administrative Reforms
Implement mitigating strategies for reducing potential risk of corruption: % of implementation (%): -- / -- / 95 / -- / --
Implement ISO 9001 as per the approved action plan: Area of operations covered (%): -- / -- / 95 / -- / --
Identify, design and implement major innovations: Implementation of identified innovations (Date): -- / -- / 06/03/2012 / -- / --

*Improving Internal Efficiency/responsiveness/service delivery of Ministry/Department
Implementation of Sevottam: Independent Audit of Implementation of Citizen's Charter (%): -- / -- / 95 / -- / --
Implementation of Sevottam: Independent Audit of implementation of public grievance redressal system (%): -- / -- / 95 / -- / --
(Values are shown as Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15; '--' indicates no value. An asterisk denotes a mandatory objective.)

*Ensuring compliance to the Financial Accountability Framework
Timely submission of ATNs on Audit paras of C&AG: Percentage of ATNs submitted within due date (4 months) from date of presentation of Report to Parliament by CAG during the year (%): -- / -- / 90 / -- / --
Timely submission of ATRs to the PAC Sectt. on PAC Reports: Percentage of ATRs submitted within due date (6 months) from date of presentation of Report to Parliament by PAC during the year (%): -- / -- / 90 / -- / --
Early disposal of pending ATNs on Audit Paras of C&AG Reports presented to Parliament before 31.3.2012: Percentage of outstanding ATNs disposed of during the year (%): -- / -- / 90 / -- / --
Early disposal of pending ATRs on PAC Reports presented to Parliament before 31.3.2012: Percentage of outstanding ATRs disposed of during the year (%): -- / -- / 90 / -- / --
Results-Framework Document (RFD) for Department Of Agricultural Research and Education -(2012-
2013)
Section 4:
Description and Definition of Success Indicators
and Proposed Measurement Methodology
Objective 1. Improving natural resource management and input use efficiency
To improve natural resource management and input use efficiency with respect to soil health and water
productivity, integrated nutrient and water management are essential. The action points/success indicators for INM
cover developing GIS based soil fertility maps, macro / micro-level land use plans, developing and disseminating
integrated nutrient management packages, technologies for improving the productivity of problem soils, IFS models etc.
For facilitating IWM, enhancing water storage and ground water recharge, multiple uses of water, precision/micro-
irrigation systems, recycling of wastewater and other on-farm management issues like resource conservation
technologies, deficit irrigation, tools and models to support decision making are planned. For mitigating adverse impact
of climate change on crops, livestock, horticulture and fisheries, emphasis will specifically be on climate resilient
agriculture through identifying the vulnerable zones and mitigating measures through basic and strategic research. In
order to improve the capacity of research and developmental organizations and their staff, provision has been made for
strengthening them with state of the art technologies through training programmes / field demonstrations etc.
Objective 2. Strengthening of higher agricultural education
Success will be measured by the number of universities having developed appropriate e-learning tools
and resources. Similarly, accreditation/extension of accreditation of agricultural universities will be measured by the
number of universities granted accreditation/extension of accreditation; grant of ICAR International fellowships to
Indian and foreign students, and of JRF and SRF, as applicable, will cover the number of such fellowships awarded. However, such
numbers of grants will also depend upon the availability of competent candidates for the fellowships. Capacity building
and faculty upgradation of teachers will be measured from the number of teachers trained per year.
Objective 3. Utilizing frontier research in identified areas / programs for better genetic exploitation
The emphasis on natural resource management is laid to ensure efficient use of natural resources under the changing
situations. This can be supported by developing high yielding varieties, requiring less input like fertilizers, water and
pesticides. With respect to conservation of genetic resources for sustainable use, it is envisaged to conserve plant
genetic resources to have repository, evaluation and further utilization of resources for improving yield in a sustainable
manner. The genetic diversity of various horticultural crops will be collected from different eco-regions, characterized
and utilized to develop varieties for higher yields, quality and biotic and abiotic stresses. The action points /success
indicators include production of quality seed and planting materials.
Objective 4. Strengthening of frontline agricultural extension system and addressing gender issues
Success with respect to assessment of technology through on-farm trials (OFTs) is measured by the actual number of
technologies assessed. Capacity building and trainings organized are measured by the actual number of such
programmes/activities undertaken by the KVKs. Support for promoting gender issues is measured through the actual
number of gender-related technology promotion programmes conducted by the DRWA.
Objective 5. IP management and commercialization of technologies
With respect to commercialization of technologies and promoting public-private partnerships, it is envisaged to bring
a commercial ethos into agricultural research. Indicators for commercialization of
technologies, promotion of public-private partnership and protection of intellectual property rights will be determined by
commercialization through partnership development, including licensing of ICAR technologies. Increasing numbers
over the years would indicate greater emphasis on technology transfer through enterprises, thereby contributing to wider
adoption and improved socio-economic impact of ICAR technologies.
Objective 6. Assessment and monitoring of fishery resources
To enhance fish production and productivity on a sustainable basis from the available resources, and to address the
critical research gaps in realizing the full production potential of the fisheries and aquaculture sector, the research
activities have been consolidated and prioritized. The action points and success indicators under this objective have
been identified based on priority, the availability of resources, and the needs and requirements of stakeholders. It is
expected that undertaking these programmes will increase fish production, conserve resources, and create more
opportunities for livelihood and employment generation.
Objective 7. Development of vaccines and diagnostics
The production of diagnostic kits and vaccines would involve delineation of the process(es) involved, thereby denoting a
specific number for field testing / validation.
Objective 8. Post harvest management, farm mechanization and value addition
The action points / success indicators for development / refinement of equipment would include the intended performance
of the equipment and its commercial viability. Test results and on-farm trials will be used to judge the expected output. The
success indicators will cover technologies developed to create innovative products that are commercially acceptable in
competitive markets.
Results-Framework Document (RFD) for Department of Agricultural Research and Education (2012-2013)

Section 5: Specific Performance Requirements from other Departments
1. Strong network support would play a major role: channelizing awareness through training
programmes; inputs such as monetary support / loans, availability of germplasm, medicines, etc.;
and market access through state development agencies, KVKs and NGOs. (State AH departments,
DADF, KVKs, NGOs).
2. Development of animal disease diagnostics and vaccines requires a sound commitment to
monitoring support for the production of diagnostics and vaccines, whereas validation under field
conditions will require strong commitment and participation of state agencies. (State AH
departments, Pvt. industry for up-scaling).
3. The quantity of breeder seed produced is based on the quantity indented by Department of
Agriculture and Cooperation, which in turn collects indents from various seed agencies including
State Departments of Agriculture.
4. Technology adoption would depend upon the proactive role of development departments
namely DAC, DST, DBT, DADF, SAUs etc.
5. Regarding achievements related to technology assessment through OFTs and capacity
building through training programmes, the support of ICAR institutions and SAUs is required in
order to ensure timely technology and methodology backstopping. In addition, farmers' participation,
sponsorship of trainees from the line departments, and availability of the required demonstration plots for
conducting OFTs are some of the much-needed support from stakeholders.
6. Success with respect to promotion of technologies covering gender issues requires the
collaboration of AICRP centres, the Agricultural Engineering Division and the line departments in
generating a suitable gender database, assessing technologies from a gender perspective, and
disseminating them.
7. Popularization and commercialization of tools and equipment will require continued support
from the Department of Agriculture and Cooperation, Ministry of Agriculture, for large-scale
frontline demonstrations and capacity building of stakeholders, and a proactive role by various line
departments in promoting improved technologies.
8. The Fisheries Division works in close coordination and linkage with the Ministry of
Agriculture; Ministry of Commerce; Ministry of Science & Technology; Ministry of Environment &
Forests; Ministry of Earth Sciences; Ministry of Food Processing Industries; funding institutions;
private entrepreneurs; NGOs; and other stakeholders, through interface and participation in various
committees and meetings addressing researchable issues in fisheries and aquaculture, and in
formulating strategies and guidelines for policy interventions to facilitate increased fish
production and productivity. Support from all these agencies and organizations is essential for
achieving the mission of providing the required food, nutritional, socio-economic and livelihood
security.
9. The support of the Ministry of Finance and the Planning Commission would be crucial for realizing
the set objectives, targets and goals. Further, successful execution of the programmes would depend
on the proactive role of other line departments of states and stakeholders in technology adoption
and timely implementation of suggested strategies & guidelines.
10. Support from the concerned central / state line departments / SAUs, soil testing laboratories,
KVKs, watershed associations and Pani Panchayats will be needed for promoting adoption of developed technologies.
11. Support from associated Institutes / DUs / SAUs / line departments will be needed for promoting adoption of
developed technologies.
12. Financial support as per EFC / SFC allocation of institute under Horticulture
Division including AICRP / network projects.
13. Support from SAUs, KVKs and line departments for promotion and adoption of technologies
developed by the institutes.
14. Financial and technological support from other government departments like DAC, NMPB,
NHB, APEDA, MoRD, MoHFA, MoWR etc., State line departments and others, including foreign
collaborations.
15. The development and strengthening of the SAUs / AUs will depend upon the timely
availability of sufficient funds from the central government.
Section 6: Outcome / Impact of Department / Ministry
Outcome / Impact | Jointly responsible department(s) / ministry(ies) | Success Indicator | Unit | FY10/11 | FY11/12 | FY12/13 | FY13/14 | FY14/15
1 Enhanced agriculture productivity | DADF, DAC, Planning Commission, Ministry of Environment & Forests, Ministry of Panchayati Raj, Ministry of Rural Development and State Governments | Increase in agriculture productivity | % | 0 | 2 | 2 | 2 | 2
2 Enhanced milk, egg, meat & fish productivity | DADF, Ministry of Panchayati Raj, Ministry of Rural Development, State Governments and NGOs | Increase in milk productivity | % | 0 | 3.8 | 4 | 4 | 4
  |  | Increase in egg productivity | % | 0 | 2 | 2 | 2 | 2
  |  | Increase in meat productivity | % | 0 | 2.5 | 2.5 | 2.5 | 2.5
  |  | Increase in fish productivity | % | 0 | 4 | 4 | 4 | 4
3 Enhanced availability of quality human resources for agricultural research & development activities | SAUs, SVUs, Ministry of Panchayati Raj, Ministry of Rural Development and State Governments | Increase in Graduates / PG students passed out and capacity building | % | 0 | 5 | 5 | 5 | 5
4 Enhanced rural livelihood security | DAC, DADF, SAUs, SVUs, Ministry of Panchayati Raj, Ministry of Rural Development, Ministry of Fertilizers and State Governments | Decrease in rural poverty | % | 0 | 1 | 1 | 1 | 1
  |  | Increase in farm income | % | 0 | 3 | 2 | 2 | 2
5 Improved nutritional security | DST, DBT, ICMR, Ministry of Food Processing, Ministry of Panchayati Raj, Ministry of Rural Development and State Governments | Increase in per capita availability of agricultural products | % | 0 | 1.6 | 0.8 | 0.8 | 0.8
6 Enhancing frontier research / programmes |  | Technical papers published in recognized journals | Number | 0 | 4000 | 4500 | 5000 | 5000
Outcome / Impact | Jointly responsible department(s) / ministry(ies) | Success Indicator | Unit | FY10/11 | FY11/12 | FY12/13 | FY13/14 | FY14/15
7 Commercialization of technologies | SAU/DU | New varieties developed | Number | 0 | 60 | 70 | 80 | 80
  |  | Research converted into commercialized technology | Number | 0 | 10 | 10 | 10 | 10
Annexure E:
Results-Framework Document Evaluation Methodology (REM) Guideline

RFD Evaluation Methodology (REM)
Performance Management Division
CABINET SECRETARIAT
TABLE OF CONTENTS
I. Purpose of this document …………………………………………………... 5
II. Approach …………………………………………………………………… 5
III. Target Audience …………………………………………………………… 5
IV. Rationale and Importance of REM ………………………………………… 6
V. Calculating Overall Quality Rating of RFD ………………………………… 7
VI. Evaluation of Organization's Vision (Section 1A) ………………………… 8
VII. Evaluation of Organization's Mission (Section 1B) ……………………….. 10
VIII. Evaluation of Organization's Objectives (Section 1C) …………………….. 11
IX. Evaluation of Section 2 of RFD ……………………………………………. 13
X. Evaluation of Section 3 of RFD ……………………………………………. 21
XI. Evaluation of Section 4 of RFD ……………………………………………. 22
XII. Evaluation of Quality of Section 5 in RFD ………………………………… 24
XIII. Evaluation of Quality of Section 6 of RFD ………………………………… 25
XIV. Putting It All Together ……………………………………………………… 26
LIST OF TABLES
1. Distribution of Relative Weights and Illustrative Calculation of Overall Quality Rating …… 7
2. Distribution of Relative Weights and Illustrative Calculation of Quality Rating of Vision Statement …… 9
3. Distribution of Relative Weights and Illustrative Calculation of Quality Rating of Mission Statement …… 11
4. Distribution of Relative Weights and Illustrative Calculation of Quality Rating for List of Organizational Objectives …… 13
5. Distribution of Relative Weights and Illustrative Calculation of Quality Rating of Section 2 of the RFD …… 14
6. Calculation of Quality Scores for SIs …… 17
7. Calculation of Outcome Orientation of Success Indicators …… 17
8. Calculation of Quality Orientation of SIs …… 18
9. Rating of Quality Targets for Each SI …… 19
10. Calculating the Quality of Section 3 of the RFD - Percentage of Data Populated …… 22
11. Distribution of Weight Among Criteria and Illustrative Calculations of Quality of Section 4 of RFD …… 24
12. Distribution of Weight Among Criteria and Illustrative Calculations of Quality of Section 5 of the RFD …… 25
13. Distribution of Weight Among Criteria and Illustrative Calculations of Quality of Section 6 of the RFD …… 26
LIST OF FIGURES
1. Heuristic Equation Explaining True Performance of an Organization……... 6
2. Summary of the Six Sections of the RFD…………………………………. 7
3. Appropriate Number of Objectives………………………………………… 12
4. Format of Section 2 of RFD………………………………………………… 14
5. Typical Results Chain……………………………………………………….. 16
6. An Example of Results Chain………………………………………………. 16
7. Target of SI………………………………………………………………….. 19
8. Guidelines for Evaluating the Degree of Consistency of Targets ………….. 19
9. Section 3 of RFD - Trend Values for Success Indicators…………………… 21
10. Calculation of Percentage of Data Populated……………………………….. 22
11. Sample of Acronyms of Section 4 of RFD………………………………….. 23
12. Sample of SI Definition and Measurement Methodology of Section 4 of RFD …… 23
13. Sample Section 5 from RFD………………………………………………… 24
14. Outcome / Impact of activities of department/ ministry…………………….. 25
RFD EVALUATION METHODOLOGY (REM)
I. PURPOSE OF THIS DOCUMENT
This document outlines the methodology for evaluating the quality of a Results-Framework
Document (RFD). This methodology is based on the Guidelines for preparing RFD,
developed by the Performance Management Division (PMD), Cabinet Secretariat, and
approved by the High Power Committee on Government Performance. Hence, this
methodology is a complement to the RFD Guidelines and should be read alongside them.
The RFD evaluation methodology outlined in this document is intended to provide a
benchmark against which the design of an RFD can be evaluated. It provides an agreed
definition of “quality” in the context of designing RFD. In the absence of such a shared
understanding, there is a danger that the “quality” of RFD, like beauty, could lie in the eyes
of the beholder.
II. APPROACH
Any 'evaluation' essentially involves comparing achievement against a target. Therefore, to
evaluate the quality of an RFD we must agree on the target against which we shall judge the
quality of RFD. Since RFD is supposed to be designed as per the RFD Guidelines, it is only
logical and fair to use the RFD Guidelines as the benchmark / target for judging the quality of
an RFD. In other words, our approach is to ascertain how well the RFD Guidelines were
followed to draft the RFD that is being evaluated.
The Results-Framework Document Evaluation Methodology (REM) is a useful analytical
tool designed to assess all RFD sections across all Departments using the same methodology
and minimizing the subjectivity of the assessments.
For each section of RFD we have provided a number of assessment criteria against which a
score is assigned, using the same 5-point rating scale already in use for RFDs (from 60%
to 100%). These criteria are largely based on the RFD Guidelines document. They comprise
quantitative and qualitative criteria. Quantitative criteria aim to capture risks and limitations
in a numerical way (e.g. "percentage of data populated"); qualitative criteria are applied to
assessment areas for which a numerical analysis is not feasible but can indeed be measured
against the agreed Guidelines for preparing RFD.
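The 5-point rating scale mentioned above maps each qualitative label to a numeric score. A minimal sketch of that mapping (the names `RATING_SCALE` and `score_for` are illustrative, not part of the official methodology):

```python
# The REM 5-point rating scale (label -> numeric score), as described in
# the Approach section: scores run from 60% (Poor) to 100% (Excellent).
RATING_SCALE = {
    "Excellent": 100,
    "Very Good": 90,
    "Good": 80,
    "Fair": 70,
    "Poor": 60,
}

def score_for(label: str) -> int:
    """Return the numeric score for a rating label (KeyError if unknown)."""
    return RATING_SCALE[label]

print(score_for("Very Good"))  # 90
```

This mapping is used implicitly throughout the tables that follow, where an evaluator marks one rating column per criterion.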
III. TARGET AUDIENCE
This methodology is meant primarily for the organizations preparing RFDs, as it provides a
convenient checklist for self-audit. To ensure that all stakeholders are on the same page,
it is also meant to provide a useful platform during departmental discussions with members
of the Ad-hoc Task Force.
IV. RATIONALE AND IMPORTANCE OF REM
RFD policy is based on the following fundamental principle of management: what gets
measured gets done. This principle is transcendental in its application, and it applies in
equal measure to the 'quality' of an RFD. Unless we have an agreed yardstick for measuring the
'quality' of an RFD, we will not be able to determine whether successive drafts represent an
improvement, or indeed whether all our collective efforts are improving the 'quality' of RFDs
over time.
In addition, we believe that REM will make deliberations and discussions much more
systematic and objective. It will bring rigor and, therefore, greater credibility to our critiques
of RFDs.
Above all, we need to remember that RFD is a means towards an end and not an end in and
of itself. The purpose of RFD is to improve performance of an organization by giving the
departmental managers clear, meaningful and unambiguous targets and evaluating their
performance by comparing their achievements against these targets.
If, however, the targets are not very meaningful, then achieving them is not likely to be
very meaningful either. This is the reason for ensuring that targets in an RFD are
meaningful. For example, the meaningfulness of targets depends, among other things, on
their alignment with vision, mission and objectives. This is just another way of saying that
the quality of RFD matters.
The following heuristic equation captures the essence of the above arguments:

Figure 1: Heuristic Equation Explaining True Performance of an Organization

  Performance against RFD Targets x Quality of RFD = TRUE PERFORMANCE OF THE ORGANIZATION

For example:

  100% (RFD Composite Score) x 70% (Quality Rating for RFD) = 70%

In simple words, if the quality of your RFD is 70%, then the maximum score that you can get
is 70%. The quality of the RFD sets the upper limit on the maximum score a department can
get.
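The heuristic equation of Figure 1 can be sketched in a few lines; this is an illustrative helper (the function name `true_performance` is ours, not from the Guidelines):

```python
# Figure 1 as a function: true performance is the product of the RFD
# composite score and the quality rating of the RFD itself, both in percent.
def true_performance(rfd_composite_score: float, rfd_quality_rating: float) -> float:
    """Both inputs are percentages (0-100); returns a percentage."""
    return rfd_composite_score * rfd_quality_rating / 100.0

# The worked example from the text: a perfect composite score against
# an RFD whose quality rating is only 70%.
print(true_performance(100.0, 70.0))  # 70.0
```

The point of the multiplication is exactly the one made above: a department can never score higher than the quality of its own RFD allows.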
V. CALCULATING OVERALL QUALITY RATING OF RFD
As we know, an RFD contains the following six sections:
Section 1 Ministry's / department's Vision, Mission, Objectives and Functions
Section 2 Inter se priorities among key objectives, success indicators and targets
Section 3 Trend values of the success indicators
Section 4 Description and definition of success indicators and proposed measurement
methodology
Section 5 Specific performance requirements from other departments that are critical for
delivering agreed results
Section 6 Outcome / Impact of activities of department/ministry
Figure 2: Summary of the Six Sections of the RFD
Hence, the overall quality of an RFD depends on the quality of each section and the
relative priority of that section. Table 1 summarizes the relative weights for each of the six
sections of the RFD, along with an illustrative calculation used to arrive at the Overall Quality
Rating for the RFD.
The distribution of relative weights among various sections was decided after extensive
consultations with all stakeholders, including members of the Ad-Hoc Task Force (ATF).
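As a worked illustration (not part of the official methodology), the weighted aggregation shown in Table 1 below can be computed as follows; the section labels are shortened and the weights and raw scores are the illustrative values from the table:

```python
# Overall quality rating = sum over sections of (weight x raw score / 100),
# using the illustrative weights and raw scores from Table 1.
sections = {  # section: (weight, raw score for the section)
    "1A Vision": (5, 90.0),
    "1B Mission": (5, 90.0),
    "1C Objectives": (5, 97.0),
    "2 Inter se priorities": (40, 87.9),
    "3 Trend values": (15, 90.0),
    "4 SI definitions": (5, 86.0),
    "5 Requirements from other departments": (5, 85.0),
    "6 Outcome / Impact": (20, 88.0),
}
overall = sum(w * raw / 100 for w, raw in sections.values())
# Note: Table 1 itself shows Section 2's weighted score as 35.00 (rather
# than the unrounded 35.16) and therefore reports an overall rating of 88.5.
print(round(overall, 2))  # 88.66
```

The same pattern, a weight times a raw score summed across rows, recurs in Tables 2 through 5 below, only with different weights and criteria.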
Table 1
Distribution of Relative Weights and Illustrative Calculation of Overall Quality Rating

Section of RFD | Section Description | Weight | Raw Score for the Section | Weighted Raw Score | Source of Data
1 (A) | Vision | 5 | 90.0 | 4.50 | Table 2
1 (B) | Mission | 5 | 90.0 | 4.50 | Table 3
1 (C) | Objectives | 5 | 97.0 | 4.85 | Table 4
2 | Inter se priorities among key objectives, success indicators and targets | 40 | 87.9 | 35.00 | Table 5
3 | Trend values of the success indicators | 15 | 90.0 | 13.50 | Table 10
4 | Description and definition of success indicators and proposed measurement methodology | 5 | 86.0 | 4.30 | Table 11
5 | Specific performance requirements from other departments that are critical for delivering agreed results | 5 | 85.0 | 4.25 | Table 12
6 | Outcome / Impact of activities of department/ministry | 20 | 88.0 | 17.60 | Table 13
Total Weight = 100 | Overall Quality Rating for RFD = 88.5

In the following sections we explain the criteria and their relative weights used in evaluating
the quality of each section of RFD.

VI. EVALUATION OF ORGANIZATION'S VISION (SECTION 1A)

According to the RFD Guidelines, Vision is an idealized state for the department. It is the big
picture of what the leadership wants the department to look like in the future.

Vision is a symbol and a cause to which we want to bond the stakeholders (mostly
employees, and sometimes other stakeholders). As they say, people work better for a cause
than for a goal. Vision provides them that cause.

Vision is a long-term statement and is typically generic and grand. Therefore, a vision
statement does not change from year to year unless the department is dramatically
restructured and is expected to undertake very different tasks in the future.

A vision statement should never carry the 'how' part. For example, 'To be the most admired
brand in the Aviation Industry' is a fine vision statement, which can be spoiled by extending it to
'To be the most admired brand in the Aviation Industry by providing world-class in-flight
services.' The reason for not including the 'how' is that the 'how' part of the vision may keep
changing with time.

Writing a Vision statement is not difficult; the problem is to get employees engaged
with it. Many a time, terms like vision, mission and strategy become a subject of scorn
rather than inspiration. This is primarily because leaders fail to make a
connection between the vision and mission and employees' everyday work. Too often, employees
see a gap between the vision and mission and their own goals and priorities. Even if there is a
valid tactical reason for this mismatch, it is not explained. The leadership of the ministry
(Minister and Secretary) should therefore consult a wide cross-section of employees and
come up with a Vision that can be owned by the employees of the ministry/department.

Vision should have a time horizon of 10-15 years. If it is less than that, it becomes tactical. If
it has a horizon of 20+ years, it becomes difficult for the strategy to relate to the vision.
Features of a good vision statement:
 Easy to read and understand.
 Compact and crisp – leaves some things to people's imagination.
 Gives the destination and not the road-map.
 Is meaningful and not too open-ended and far-fetched.
 Excites people and makes them feel energized.
 Provides a motivating force, even in hard times.
 Is perceived as achievable and at the same time is challenging and compelling,
stretching us beyond what is comfortable.
The entire process, starting from the Vision down to the objectives, is highly iterative. The
question is: where should we start? We strongly recommend that the vision and mission
statements be made first, without being colored by constraints, capabilities and
environment. This is akin to the vision of several armed forces: 'Keeping the country safe and
secure from external threats.' That vision is non-negotiable, and it drives the organization to
find ways and means to achieve it by overcoming constraints on capabilities and
resources. Vision should be a stake in the ground, a position, a dream, which should be
prudent but non-negotiable barring a few rare circumstances.
From the above guidance on Vision we have culled out the following key criteria for
evaluating the quality of a Vision statement included in an RFD. A Vision statement should:
1 deal with "what" the organization wants to achieve, not "how" it intends to
achieve it
2 be forward-looking, focusing on the destination and not on past achievements
3 be succinct and clear
4 be inspiring and engaging
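These four criteria combine as a weighted average, as Table 2 below illustrates. A minimal sketch of that calculation, using the illustrative weights (0.25 each) and ratings from the table (the variable names are ours):

```python
# Weighted quality rating for a Vision statement, per the REM criteria.
# Weights and raw scores follow the illustrative example in Table 2.
WEIGHTS = {
    "what_not_how": 0.25,
    "forward_looking": 0.25,
    "succinct_clear": 0.25,
    "inspiring_engaging": 0.25,
}
ratings = {  # raw scores on the 5-point scale (60-100)
    "what_not_how": 90,
    "forward_looking": 90,
    "succinct_clear": 100,
    "inspiring_engaging": 80,
}
quality_rating = sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)
print(quality_rating)  # 90.0
```

The Mission (Table 3) and Objectives (Table 4) ratings later in this document follow the same weighted-average shape, only with different criteria and weights.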
Table 2 below shows the distribution of weight across these criteria and an illustrative
calculation of the quality rating for a Vision statement.

Table 2
Distribution of Relative Weights and
Illustrative Calculation of Quality Rating for Vision Statement

(Rating scale: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%)

Criteria to evaluate quality of a Vision Statement | Weight | Rating | Raw Score | Weighted Raw Score | Source of Data
1 The "What", not the "How" | 0.25 | Very Good | 90 | 22.5 | See above for guidance
2 Forward looking | 0.25 | Very Good | 90 | 22.5 | See above for guidance
3 Succinct and clear | 0.25 | Excellent | 100 | 25.0 | See above for guidance
4 Inspiring and engaging | 0.25 | Good | 80 | 20.0 | See above for guidance
Quality Rating for Vision Statement = 90.0

When a person evaluating the RFD gives less than 100% for any of the criteria, the
person must provide an explanation for arriving at this conclusion. Clearly, these four criteria
require judgment. But by narrowing down the criteria, we believe the variation between
experts evaluating an RFD will be minimized, if not eliminated. Where we find that, using the
same criteria, experts come to very different and divergent ratings, we may have to fine-tune
the criteria and weights.

It is important to note that a flawed Vision can have an exponentially distorting effect on the
quality of RFD. If the Mission and Objectives are aligned to a flawed Vision, the document
takes us in a completely different direction. Hence, the importance of a well-crafted Vision
cannot be overstated. Ideally, the 'Total Raw Score' of Vision, Mission and Objectives
would be derived as a multiplicative score rather than an additive score. However, in this
version of REM, we use additive scores and have not explicitly incorporated this source of
potential distortion.

VII. EVALUATION OF ORGANIZATION'S MISSION (SECTION 1B)

An organization's Mission is the nuts and bolts of the vision. Mission is the 'who, what, and
why' of the department's existence.

We strongly recommend that the mission should follow the vision. This is because the purpose of
the organization could change in order to achieve its vision. The vision represents the big picture
and the mission represents the necessary work.

The Mission of the department is the purpose for which the department exists. It is one of the
ways to achieve the vision.

The famous management expert Mintzberg defines a mission as follows:
"A mission describes the organization's basic function in society, in terms of the products
and services it produces for its customers."

Vision and Mission are part of the strategic planning exercise. To see the relation between the
two, consider the following definitions:

 Vision: outlines what the organization wants to be, or how it wants the world in
which it operates to be (an "idealised" view of the world). It is a long-term view and
concentrates on the future. It can be emotive and is a source of inspiration. For
example, a charity working with the poor might have a vision statement which reads
"A World without Poverty."
 Mission: defines the fundamental purpose of an organization or an enterprise,
succinctly describing why it exists and what it does to achieve its vision. For example,
the charity above might have the mission statement "Providing jobs for the homeless
and unemployed."
To evaluate the quality of a Mission Statement in an RFD we have agreed to use the
following criteria:
1 Is the Mission aligned with Vision (follows the level of Vision and is long-term)?
2 Does the Mission deal with “how” Vision will be achieved but at higher levels of
conceptualization than Objectives?
3 Is Mission Statement succinct and clear?
Table 3 below shows the distribution of weight across these criteria and an illustrative
calculation of the quality rating for a Mission statement.
Table 3
Distribution of Relative Weights and
Illustrative Calculation of Quality Rating for a Mission Statement

(Rating scale: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%)

Criteria to evaluate quality of a Mission Statement | Weight | Rating | Raw Score | Weighted Raw Score | Source of Data
1 Aligned with Vision (follows the level of Vision and is long-term) | 0.4 | Very Good | 90 | 36 | See above for guidance
2 The "How" (at higher levels than Objectives) | 0.3 | Very Good | 90 | 27 | See above for guidance
3 Succinct & clear | 0.3 | Very Good | 90 | 27 | See above for guidance
Quality Rating for Mission Statement = 90

VIII. EVALUATION OF ORGANIZATION'S OBJECTIVES (SECTION 1C)

Objectives represent the developmental requirements to be achieved by the department in a
particular sector by a selected set of policies and programmes over a specific period of time
(short, medium or long). For example, objectives of the Ministry of Health and Family Welfare
could include: (a) reducing the rate of mortality for children below five years; and (b)
reducing the rate of maternal deaths by the end of the development plan.
Objectives could be of two types: (a) Outcome Objectives, which address the ends to achieve, and (b)
Process Objectives, which specify the means to achieve them. As far as possible, the
department should focus on Outcome Objectives.1

Objectives should be directly related to the attainment and support of the relevant national
objectives stated in the relevant Five Year Plan, National Flagship Schemes, Outcome Budget,
relevant sector and departmental priorities and strategies, the President's Address, the
manifesto, and announcements/agenda as spelt out by the Government from time to time.
Objectives should be linked to and derived from the Departmental Vision and Mission
statements.
In view of the above, we believe that the quality of the objectives should be judged by the
following four criteria:

1. Alignment with Mission and Vision

Here we should ask ourselves whether achievement of the objectives specified would lead us
to achieve the departmental vision and mission. This is not an exact science, and judgment is
required. For example, if the Vision of a department is "Healthy Nation," then
"Reducing Child Mortality" would be an objective that could be considered aligned
with the departmental vision.

2. Results-driven (at the level of programs rather than actions)

If a department's vision includes "Safer Roads," then an objective of "increasing awareness
about road safety" would be considered well aligned and focused at the program level, as it
corresponds to a "Road Safety Awareness Program."

However, if the department were to include an objective such as "conducting road safety
awareness programs," it would still be aligned with the departmental Vision of 'Safer Roads,' but it
would be more at the level of action than program.
3. Appropriate number of objectives

Management experts generally recommend that the number of objectives for a normal
organization should not exceed eight. Of course, large organizations will tend to
have more objectives and smaller ones fewer. We propose that the following
guidelines be used for determining the appropriate number of objectives:
Figure 3: Appropriate Number of Objectives

Excellent: 8-10 | Very Good: 7 or 11 | Good: 6 or 12 | Fair: 5 or 13 | Poor: ≤4 or ≥14

1 Often a distinction is also made between "Goals" and "Objectives." The former is supposed to be more general
and the latter more specific and measurable. The Vision and Mission statements are expected to capture the general
direction and future expected outcomes for the department. Hence, only the inclusion of objectives is required in
Section 1.
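The Figure 3 guideline reduces to a simple lookup; a sketch (the function name `rate_objective_count` is ours, and, as the text below notes, the guideline is meant to be applied judiciously, not mechanically):

```python
# Rating the number of objectives per Figure 3:
# 8-10 Excellent; 7 or 11 Very Good; 6 or 12 Good; 5 or 13 Fair;
# 4 or fewer, or 14 or more, Poor.
def rate_objective_count(n: int) -> str:
    if 8 <= n <= 10:
        return "Excellent"
    if n in (7, 11):
        return "Very Good"
    if n in (6, 12):
        return "Good"
    if n in (5, 13):
        return "Fair"
    return "Poor"  # n <= 4 or n >= 14

print(rate_objective_count(9))  # Excellent
```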
These are only guidelines and should, like any other guideline, be used judiciously and not
mechanically.

4. Non-duplication, non-redundancy and absence of overt conflicts in stated objectives

It is also important to make objectives crisp and non-duplicative. We should not include
redundant statements and generalities as objectives. Even more importantly, we should not
have explicitly contradictory or overtly conflicting objectives.

Table 4 below shows the distribution of weight across these four criteria and an
illustrative calculation of the quality rating for the section dealing with 'Objectives.'
Table 4
Distribution of Relative Weights and
Illustrative Calculation of Quality Rating for list of Organizational Objectives
(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%)

    #  Criterion to evaluate quality of objectives    Weight  Rating            Raw score  Weighted raw score  Source of data
    1  Aligned with Mission                           0.3     Excellent (100%)  100        30                  See above for guidance
    2  Results-driven (at the level of programs
       rather than actions)                           0.3     Very Good (90%)   90         27                  See above for guidance
    3  Appropriate number of objectives               0.2     Excellent (100%)  100        20                  See above for guidance
    4  Non-duplication, non-redundancy and absence
       of overt conflicts in stated objectives        0.2     Excellent (100%)  100        20                  See above for guidance

    Quality Rating for Objectives = 97

IX. EVALUATION OF SECTION 2 OF RFD
The heart of any RFD is Section 2, and the heart of Section 2 is Figure 4. That is why this section carries a weight of 40% in the overall rating of the RFD. The description of each column is given in the Guidelines for RFD.
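Every rating table in this methodology, beginning with Table 4 above, uses the same weighted-average arithmetic: each criterion's raw score on the 100/90/80/70/60 scale is multiplied by its weight and the products are summed. A minimal sketch using the Table 4 figures:

```python
# (criterion, weight, raw score) taken from the illustrative Table 4.
criteria = [
    ("Aligned with Mission",                 0.3, 100),
    ("Results-driven",                       0.3,  90),
    ("Appropriate number of objectives",     0.2, 100),
    ("Non-duplication / no overt conflicts", 0.2, 100),
]

# Weighted raw scores: 30 + 27 + 20 + 20.
quality_rating = sum(weight * raw for _, weight, raw in criteria)
print(round(quality_rating, 1))  # 97.0
```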
Figure 4: Format of Section 2 of RFD

    Column 1     Column 2  Column 3  Column 4    Column 5  Column 6  Target / Criteria Value
    Objective    Weight    Actions   Success     Unit      Weight    Excellent  Very Good  Good  Fair  Poor
                                     Indicator                       100%       90%        80%   70%   60%
    Objective 1            Action 1
                           Action 2
                           Action 3
    Objective 2            Action 1
                           Action 2
                           Action 3
    Objective 3            Action 1
                           Action 2
                           Action 3

The following is the summary table for the evaluation of Section 2 of the RFD:

Table 5
Distribution of Weight and Sample Calculation of Quality Rating for Section 2

(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%)

    #  Criterion                                          Weight  Rating            Raw score  Weighted raw score  Source of data
    1  Extent to which actions (in Column 3 of RFD)
       adequately capture objectives                      15      Very Good (90%)   90         13.5                See below for guidance
    2  Extent to which success indicators (Column 4
       of RFD) adequately capture actions                 15      Excellent (100%)  100        15.0                See below for guidance
    3  Quality / nature of success indicators (SIs)       40      --                81         32.4                Table 6
    4  Appropriateness of distribution of weight
       among objectives                                   15      Very Good (90%)   90         13.5                See below for guidance
    5  Quality of targets for respective success
       indicators in RFD                                  15      Very Good (90%)   90         13.5                Table 9

    Quality Rating for Section 2 = 87.9

Brief guidance on Table 5:
The following five criteria are proposed for assessing the quality of the elements of Section 2 of the RFD given in Table 5 above:
1. Do actions (in Column 3) adequately capture objectives?
For each objective, the department must specify the required policies, programmes, schemes and projects. Often, an objective has one or more policies associated with it. An objective represents the desired "end," and the associated policies, programmes and projects represent the desired "means." The latter are listed as "actions" under each objective.
Assessors and evaluators should use their domain knowledge and knowledge of the department to ensure that all key actions are listed under the various objectives. Often, departments omit some key schemes under actions simply because they fear they may not be able to achieve the expected targets for such important schemes.
Ideally, all actions taken together should cover close to 100% of plan funds. But money is not everything: evaluators must ensure that actions that may not require money are also adequately covered.
In evaluating this aspect, we should also examine whether actions from previous years have been dropped for valid reasons.
2. Do success indicators (Column 4) adequately capture actions?
For each "action" specified in Column 3, the department must specify one or more "success indicators." These are also known as "Key Performance Indicators (KPIs)" or "Key Result Indicators (KRIs)." A success indicator provides a means to evaluate progress in
implementing the policy, programme, scheme or project. Sometimes more than one success
indicator may be required to tell the entire story.
Success indicators should consider both "qualitative" and "quantitative" aspects of departmental performance.
3. Quality / Nature of Success Indicators (SIs)
In selecting success indicators, any duplication should be avoided. The usual chain for delivering results and performance is depicted in Figure 5, and an example of this results chain is depicted in Figure 6.

Figure 5: Typical Results Chain (Results-Based Management)

    Inputs          Financial, human, and material resources       \
    Activities      Tasks personnel undertake to transform          | Implementation
                    inputs into outputs                            /
    Outputs         Products and services produced                 \
    Outcomes        Intermediate effects of outputs on clients      | Results
    Goal (Impacts)  Long-term, widespread improvement in society   /

Figure 6: An Example of a Results Chain (Results-Based Management: Adult Literacy)

    Inputs          Facilities, trainers, materials
    Activities      Literacy training courses
    Outputs         Number of adults completing literacy courses
    Outcomes        Increased literacy skill; more employment opportunities
    Goal (Impacts)  Higher income levels; increased access to higher-skill jobs

If we use an outcome (increased literacy) as a success indicator, then it would be duplicative to also use inputs and activities as additional success indicators.
Ideally, one should have success indicators that measure outcomes and impacts. However, sometimes, due to lack of data, one is able to measure only activities or outputs.
The common definitions of these terms are as follows:
i. Inputs: The financial, human, and material resources used for the development intervention.
ii. Activities: Actions taken or work performed through which inputs, such as funds, technical assistance and other types of resources, are mobilized to produce specific outputs.
iii. Outputs: The products, capital goods and services that result from a development intervention; these may also include changes resulting from the intervention that are relevant to the achievement of outcomes. Sometimes outputs are divided into two sub-categories: internal and external. Internal outputs are those over which managers have full administrative control. For example, printing a brochure is considered an internal output: it involves spending budgeted funds to hire a printer and to order a given number of brochures, and all actions required to print the brochure are fully within the manager's control. However, having these brochures picked up by the targeted groups and, consequently, making the desired impact on the target audience would be an example of an external output. Thus, outputs that exert influence beyond the boundaries of the organization are termed external outputs.
iv. Outcomes: The likely or achieved short-term and medium-term effects/impact of an intervention's outputs.
The quality score for SIs is calculated as shown in Table 6 below:

Table 6
Calculation of Quality Score for SIs

(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%)

    #  Criterion                                   Weight  Rating           Raw score  Weighted raw score  Source of data
    1  Outcome-orientation of success indicators   0.90    Good (80%)       80         72.0                From Table 7
    2  Quality-orientation of success indicators   0.10    Very Good (90%)  90         9.0                 From Table 8

    Quality Rating for SIs = 81.0

The outcome-orientation of success indicators is calculated as shown in Table 7:

Table 7
Calculation of Outcome-Orientation of Success Indicators

Here the criteria values correspond to each success indicator's position in the results chain: outcome = Excellent (100%), external output = Very Good (90%), internal output = Good (80%), activity = Fair (70%), input = Poor (60%). The data flow from the REM table.

    #  Success indicator      Weight  Classification          Raw score  Weighted raw score
    1  Success Indicator 1    0.3     Internal output (80%)   80         24
    2  Success Indicator 2    0.3     Outcome (100%)          100        30
    3  Success Indicator 3    0.2     Activity (70%)          70         14
    4  Success Indicator 'N'  0.2     Input (60%)             60         12

    Quality Rating for Outcome-Orientation of Success Indicators = 80

The quality-orientation of success indicators is calculated as follows:

Table 8
Calculation of Quality-Orientation of Success Indicators

(Criteria values: 5 SIs = Excellent (100%), 4 SIs = Very Good (90%), 3 SIs = Good (80%), 2 SIs = Fair (70%), 1 SI = Poor (60%))

    #  Criterion                                              Weight  Rating        Raw score  Weighted raw score
    1  Number of indicators explicitly measuring quality
       of Government performance                              100     4 SIs (90%)   90         90

    Rating for Quality-Orientation of Success Indicators = 90

4. Is the distribution of weight among objectives appropriate to capture the relative emphases required for achieving the Mission and Vision of the organization?
Objectives in the RFD (Column 1 of Figure 4) should be ranked in descending order of priority according to their degree of significance, and specific weights should be attached to them. The Minister in charge will ultimately have the prerogative to decide the inter se priorities among departmental objectives and all weights.
If there are multiple actions associated with an objective, the weight assigned to that objective should be spread across the relevant success indicators.
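Returning briefly to the SI-level scores: Tables 6, 7 and 8 above compose as follows. The classification-to-score mapping and the illustrative weights are the ones shown in those tables; the score of 0 for zero quality-measuring SIs is my assumption, since the Table 8 scale starts at 1 SI.

```python
# Table 7: each SI's raw score depends on its position in the results chain.
CHAIN_SCORE = {"Outcome": 100, "External Output": 90, "Internal Output": 80,
               "Activity": 70, "Input": 60}

# (weight, classification) for each SI, as in the illustrative Table 7.
indicators = [(0.3, "Internal Output"), (0.3, "Outcome"),
              (0.2, "Activity"), (0.2, "Input")]
outcome_orientation = sum(w * CHAIN_SCORE[c] for w, c in indicators)
print(round(outcome_orientation, 1))  # 80.0

# Table 8: the score depends on how many SIs explicitly measure quality.
def quality_orientation(num_quality_sis: int) -> int:
    return {5: 100, 4: 90, 3: 80, 2: 70, 1: 60}.get(min(num_quality_sis, 5), 0)

# Table 6 combines the two sub-scores with weights 0.90 and 0.10.
quality_score_sis = 0.90 * outcome_orientation + 0.10 * quality_orientation(4)
print(round(quality_score_sis, 1))  # 81.0
```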
5. What is the quality of targets for respective Success Indicators in RFD?
Targets are tools for driving performance improvements. Target levels should, therefore,
contain an element of stretch and ambition. However, they must also be achievable. It is
possible that targets for radical improvement may generate a level of discomfort associated
with change, but excessively demanding or unrealistic targets may have a longer-term
demoralizing effect.
The target for each SI is presented on the five-point scale given below:

Figure 7: Target of SI

    Excellent   Very Good   Good   Fair   Poor
    100%        90%         80%    70%    60%

It is expected that budgetary targets would be placed in the 90% (Very Good) column. For any performance below 60%, the department would get a score of 0%.
Table 9 summarizes the criteria for judging the quality of targets:

Table 9
Rating for Quality of Targets for Each SI

(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%)

    #  Criterion                                     Weight  Rating            Raw score  Weighted raw score  Source of data
    1  Consistency with Planning Commission /
       MOF targets                                   70      Very Good (90%)   90         63                  See Figure 8
    2  Degree of stretch                             30      Excellent (100%)  100        30                  See Figure 8

    Rating for Quality of Targets = 93
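The text above specifies only that targets sit on the five-point scale and that performance below the 60% column scores zero; how intermediate achievement maps to a score is not spelled out here. The sketch below is therefore illustrative: it assumes higher actuals are better and interpolates linearly between adjacent columns, with hypothetical target values.

```python
def achievement_score(actual: float, scale: dict[int, float]) -> float:
    """Score an achieved value against a five-point target scale.

    `scale` maps the column percentages (100, 90, 80, 70, 60) to the
    target values set for one SI. Linear interpolation between columns
    is an illustrative assumption, not taken from the text.
    """
    cols = sorted(scale.items())  # [(60, poor target), ..., (100, excellent)]
    if actual < cols[0][1]:
        return 0.0                # performance below the 60% column scores 0
    score = float(cols[0][0])
    for (lo_pct, lo_val), (hi_pct, hi_val) in zip(cols, cols[1:]):
        if actual >= hi_val:
            score = float(hi_pct)
        elif actual > lo_val:
            score = lo_pct + (hi_pct - lo_pct) * (actual - lo_val) / (hi_val - lo_val)
    return score

# Hypothetical SI with targets of 50/60/70/80/90 units for Poor..Excellent.
scale = {100: 90, 90: 80, 80: 70, 70: 60, 60: 50}
print(achievement_score(80, scale))  # 90.0 (meets the Very Good target)
print(achievement_score(40, scale))  # 0.0  (below the Poor column)
```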
Figure 8 below provides guidelines for evaluating the degree of consistency of targets and for establishing the degree of stretch (challenge) built into them.

Figure 8: Guidelines for Evaluating the Degree of Consistency of Targets

    Criterion                      Excellent (100%)   Very Good (90%)   Good (80%)        Fair (70%)        Poor (60%)
    1 Consistency with Planning    100% of targets    90% of targets    80% of targets    70% of targets    60% of targets
      Commission / MOF targets     are consistent     are consistent    are consistent    are consistent    are consistent
    2 Degree of stretch            100% of targets    90% of targets    80% of targets    70% of targets    60% of targets
                                   are challenging    are challenging   are challenging   are challenging   are challenging

To establish whether targets are consistent with Planning Commission / MOF targets, we will need the departments to show evidence. For major targets, this can be ascertained from the Annual Plan, the 12th Plan documents, the approved Demand for Grants, and the Outcome Budget.
To determine whether the targets are challenging, one has to use one's judgment and look at many sources of information. Clearly, information from Section 3 would be among the most useful sources for this purpose.
The summary table for the calculation of the quality of Section 2 of the RFD is reproduced below again for reference.

Table 5
Distribution of Weight and Sample Calculation of Quality Rating for Section 2
(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%)

    #  Criterion                                          Weight  Rating            Raw score  Weighted raw score  Source of data
    1  Extent to which actions (in Column 3 of RFD)
       adequately capture objectives                      15      Very Good (90%)   90         13.5                See above for guidance
    2  Extent to which success indicators (Column 4
       of RFD) adequately capture actions                 15      Excellent (100%)  100        15.0                See above for guidance
    3  Quality / nature of success indicators (SIs)       40      --                81         32.4                Table 6
    4  Appropriateness of distribution of weight
       among objectives                                   15      Very Good (90%)   90         13.5                See above for guidance
    5  Quality of targets for respective success
       indicators in RFD                                  15      Very Good (90%)   90         13.5                Table 9

    Quality Rating for Section 2 = 87.9

X. EVALUATION OF SECTION 3 OF RFD
For every success indicator and the corresponding target, Section 3 of the RFD provides target values and actual values for the past two years, as well as projected values for the two years ahead (reproduced below). The inclusion of target values for the past two years vis-a-vis the actual values is expected to help in assessing the robustness of the target value for the current year. However, one cannot begin to evaluate this robustness without the data in Section 3. Therefore, Table 10 measures the degree to which the data for Section 3 have been provided in the RFD.

Figure 9: Section 3 of the RFD - Trend Values for Success Indicators

    Objective    Actions   Success    Unit   Actual     Actual         Target     Projected  Projected
                           Indicator         value for  value for      value for  value for  value for
                                             FY 11/12   FY 12/13       FY 13/14   FY 14/15   FY 15/16
                                                        (anticipated)
    Objective 1  Action 1
                 Action 2
                 Action 3
    Objective 2  Action 1
                 Action 2
                 Action 3
    Objective 3  Action 1
                 Action 2
                 Action 3

To evaluate the quality of Section 3 of the RFD, we have agreed to use the following criterion:
Table 10
Calculation of Quality Rating of Section 3 of the RFD

(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%)

    #  Criterion                      Weight  Rating           Raw score  Weighted raw score  Source of data
    1  Percentage of data populated   100     Very Good (90%)  90         90                  See below for guidance

    Quality Rating for Section 3 = 90

Percentage of data populated
This is a basic requirement for any management effort. Unless the trend values for the previous and future years are available, it is difficult to assess whether the targets in the RFD are challenging or not.
To assess the value under this criterion, we need to count the total number of cells that contain data. For each success indicator there should be 5 data values: the data for the previous two years, the current-year target value, and the data for the two future years.

Figure 10: Formula for Calculating the Percentage of Data Populated

    Percentage of data populated = Number of cells containing data / (Total number of SIs x 5)

There are only two legitimate reasons for not having data in a particular cell of Section 3:
1. The success indicator is being used for the first time, and no records were maintained in this regard in the past.
2. The values are dates and hence do not represent a trend, so using them is not meaningful.

XI. EVALUATION OF SECTION 4 OF RFD
The RFD is a public document and hence must be easily understood by a well-informed average stakeholder. Towards this end, the RFD contains a section giving detailed definitions of the various success indicators and the proposed measurement methodology. Abbreviations/acronyms and other details of the relevant schemes are also expected to be listed in this section.
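Returning briefly to the Section 3 criterion: the Figure 10 formula reduces to a short count over the trend-value cells. A minimal sketch, in which None marks an empty cell:

```python
def percent_data_populated(rows: list[list]) -> float:
    """Each row holds the five values expected for one SI: two past
    years, the current-year target, and two projected years."""
    total_cells = len(rows) * 5
    filled = sum(1 for row in rows for cell in row if cell is not None)
    return 100 * filled / total_cells

# Two SIs: the first fully populated; the second missing its two past
# years (e.g., an indicator being used for the first time).
rows = [[12, 14, 15, 16, 18],
        [None, None, 20, 22, 25]]
print(percent_data_populated(rows))  # 80.0
```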
Wherever appropriate and possible, the rationale for using the proposed success indicators should be provided as per the format in the RFMS. Figures 11 and 12 give samples of Section 4 data from the Department of AYUSH (Ayurveda, Yoga and Naturopathy, Unani, Siddha and Homoeopathy).
Figure 11: Sample of Select Acronyms from Section 4 of the Department of AYUSH RFD

    Acronym   Description
    ASU       Ayurveda, Siddha and Unani
    ASUDCC    Ayurveda, Siddha and Unani Drugs Consultative Committee
    ASUDTAB   Ayurveda, Siddha and Unani Drugs Technical Advisory Board
    AYUSH     Ayurveda, Yoga and Naturopathy, Unani, Siddha and Homoeopathy
    CHCs      Community Health Centres
    COE       Centre of Excellence
    D and C   Drugs and Cosmetics

Figure 12: Sample SI Definition and Measurement Methodology from Section 4 of the AYUSH RFD

    Success Indicator: [1.1.1] Primary Health Centres / Community Health Centres / District Hospitals covered
    Description:       Completion of infrastructure, equipment, furniture and provision of medicines for the co-located AYUSH units of Primary Health Centres (PHCs), Community Health Centres (CHCs) and District Hospitals (DHs).
    Definition:        Co-located AYUSH health care units at PHCs, CHCs and DHs imply facilities for the provision of AYUSH health services alongside allopathic health services.
    Measurement:       Number of units.
    General comments:  As per approved norms, assessment of needs will be measured through appraisal of the Programme Implementation Plan (PIP) of the State Governments, and the outcomes shall be monitored through progress reports and periodic reviews.

To evaluate Section 4 of the RFD, we have agreed to use the following criteria:
1. Have all acronyms used in the body of the RFD been explained in simple layman's terms?
2. Have necessary explanations and justifications been given for using a particular type of success indicator, where required?
3. If so, what is the quality of the explanations?
Table 11 below shows the distribution of weight across these criteria and an illustrative calculation.

Table 11
Distribution of Weight among Criteria and Illustrative Calculation of Quality of Section 4 of the RFD

(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%)

    #  Criterion                                      Weight  Rating            Raw score  Weighted raw score  Source of data
    1  All acronyms have been explained               0.1     Excellent (100%)  100        10                  See above for guidance
    2  Necessary explanations have been given
       for SIs                                        0.5     Very Good (90%)   90         45                  See above for guidance
    3  Quality of explanations                        0.4     Good (80%)        80         32                  See above for guidance

    Quality Rating for Section 4 = 87

XII. EVALUATION OF QUALITY OF SECTION 5 IN RFD
Section 5 of the RFD should contain expectations from other departments that impact the department's performance and are critical for the achievement of the selected success indicators. These expectations should be stated in quantifiable, specific, and measurable terms. Care should be taken while recording expectations, as they will be communicated to the relevant Ministry/Department and should not be vague or general in nature. They should be given as per the new format incorporated in the RFMS.

Figure 13: Sample Section 5 from RFD

    Location   State   Organization   Organization   Relevant    What is your       Justification   Please quantify    What happens if
    type               type           name           success     requirement from   for this        your requirement   your requirement
                                                     indicator   this organization  requirement     from this          is not met
                                                                                                    organization

    (Data for this table will be provided by the department; it is not to be entered for the REM.)

To evaluate Section 5 of the RFD, we have agreed to use the following criteria:
1. Are the claims of dependencies/requirements from other departments appropriate?
2. Are the requirements from other departments / claims of dependencies specific?
Thank You !

More Related Content

PPTX
Difference between monitoring and evaluation
PPTX
Measures of central tendency and dispersion
PPTX
Experimental Evaluation Methods
PPTX
Environment impact assessment
PPT
Spearman Rank Correlation Presentation
PPTX
Measure of Central Tendency
PPT
Accountability & transparency and good governance 28 08-2011
PPTX
budget cycle ppt.pptx
Difference between monitoring and evaluation
Measures of central tendency and dispersion
Experimental Evaluation Methods
Environment impact assessment
Spearman Rank Correlation Presentation
Measure of Central Tendency
Accountability & transparency and good governance 28 08-2011
budget cycle ppt.pptx

Similar to India performance monitoring and evaluation systems (20)

PDF
Companies-in-Government
PDF
Final draft PFMR Strategy Zambia 2019 -2022.pdf
PPTX
Ethiopian Government Accounting System.pptx
PDF
PFM Pocketbook of Financial management Final version.pdf
PDF
GSTN Report
PPT
Australian Performance Based Budgeting (PBB) – an outline of the steps taken ...
PPTX
ethiopiangovernmentaccountingsystem-230625142338-47240c36.pptx
PDF
Government of Kenya Documents Reviewed During the Study
DOC
Risk Analysis in Audit Plannig.doc
DOCX
Public financial management reforms
PPT
The budget making process and monitoring lecture kmtc
PPT
The budget making process and monitoring lecture kmtc
PPTX
Finance
PPTX
GOODS AND SERVICE TAX NETWORK (GSTN)
PDF
Cambodia; assessing progress in public finance management reform july 2011
PPT
Rakesh verma public audit english
PDF
PPP Reference Guide Version 3 - PPP Framework.pdf
PDF
The Effect Of Profitability, Company Size, Auditor Reputation, And Leverage O...
PPTX
What is pfm df ppt aim v4 final
PDF
Material de Apoio | Pay flexibility country overview
Companies-in-Government
Final draft PFMR Strategy Zambia 2019 -2022.pdf
Ethiopian Government Accounting System.pptx
PFM Pocketbook of Financial management Final version.pdf
GSTN Report
Australian Performance Based Budgeting (PBB) – an outline of the steps taken ...
ethiopiangovernmentaccountingsystem-230625142338-47240c36.pptx
Government of Kenya Documents Reviewed During the Study
Risk Analysis in Audit Plannig.doc
Public financial management reforms
The budget making process and monitoring lecture kmtc
The budget making process and monitoring lecture kmtc
Finance
GOODS AND SERVICE TAX NETWORK (GSTN)
Cambodia; assessing progress in public finance management reform july 2011
Rakesh verma public audit english
PPP Reference Guide Version 3 - PPP Framework.pdf
The Effect Of Profitability, Company Size, Auditor Reputation, And Leverage O...
What is pfm df ppt aim v4 final
Material de Apoio | Pay flexibility country overview
Ad

More from Asian Productivity Organization (20)

PDF
Future Cashless Society
PDF
Driving Change in the Public Sector
PDF
Leading with Vision
PDF
Apo leadership workshop
PDF
Case Studies on Performance Management II
PDF
Case Studies on Performance Management I
PDF
Performance Reporting Framing and Conveying the Performance Story
PDF
Develop People Guidance Questions
PDF
Performance Management and Measurement Best Practices and Recent Initiatives:...
PDF
Apo workshop on performance management
PPTX
Apo leardership workshop ii
PPTX
Ethical Leadership
PPT
Apo Leadership workshop II
PPTX
Apo leadership workshop
PPTX
Leadership & Performance Management System
PPTX
APO Ethical Leadership
PPTX
APO Lecture: Best Practices II
PPTX
APO Lecture: PM Elements
PPTX
APO Lecture: Best Practices I
Future Cashless Society
Driving Change in the Public Sector
Leading with Vision
Apo leadership workshop
Case Studies on Performance Management II
Case Studies on Performance Management I
Performance Reporting Framing and Conveying the Performance Story
Develop People Guidance Questions
Performance Management and Measurement Best Practices and Recent Initiatives:...
Apo workshop on performance management
Apo leardership workshop ii
Ethical Leadership
Apo Leadership workshop II
Apo leadership workshop
Leadership & Performance Management System
APO Ethical Leadership
APO Lecture: Best Practices II
APO Lecture: PM Elements
APO Lecture: Best Practices I
Ad

Recently uploaded (20)

PPTX
Portland FPDR Oregon Legislature 2025.pptx
PDF
Concept_Note_-_GoAP_Primary_Sector_-_The_Great_Rural_Reset_-_Updated_18_June_...
PPTX
SOMANJAN PRAMANIK_3500032 2042.pptx
PPTX
BHARATIYA NAGARIKA SURAKSHA SAHMITA^J2023 (1).pptx
PPTX
Empowering Teens with Essential Life Skills 🚀
PPTX
DFARS Part 252 - Clauses - Defense Regulations
PPTX
Chapter 1: Philippines constitution laws
PDF
PPT Item # 9 - FY 2025-26 Proposed Budget.pdf
PDF
4_Key Concepts Structure and Governance plus UN.pdf okay
PPTX
cpgram enivaran cpgram enivaran cpgram enivaran
PDF
CXPA Finland Webinar: Rated 5 Stars - Delivering Service That Customers Truly...
PPTX
ANALYSIS OF THE PROCLAMATION OF THE PHILIPPHINE INDEPENDENCE.pptx
PPT
The Central Civil Services (Leave Travel Concession) Rules, 1988, govern the ...
PPTX
The DFARS - Part 251 - Use of Government Sources By Contractors
PDF
ESG Alignment in Action - The Abhay Bhutada Foundation
PPTX
Presentatio koos kokos koko ossssn5.pptx
PPTX
Parliamentary procedure in meeting that can be use
PDF
Item # 5 - 5307 Broadway St final review
PPTX
LUNG CANCER PREDICTION MODELING USING ARTIFICIAL NEURAL NETWORK.pptx
PDF
CXPA Finland Webinar - Modern Components of Service Quality - Alec Dalton - ...
Portland FPDR Oregon Legislature 2025.pptx
Concept_Note_-_GoAP_Primary_Sector_-_The_Great_Rural_Reset_-_Updated_18_June_...
SOMANJAN PRAMANIK_3500032 2042.pptx
BHARATIYA NAGARIKA SURAKSHA SAHMITA^J2023 (1).pptx
Empowering Teens with Essential Life Skills 🚀
DFARS Part 252 - Clauses - Defense Regulations
Chapter 1: Philippines constitution laws
PPT Item # 9 - FY 2025-26 Proposed Budget.pdf
4_Key Concepts Structure and Governance plus UN.pdf okay
cpgram enivaran cpgram enivaran cpgram enivaran
CXPA Finland Webinar: Rated 5 Stars - Delivering Service That Customers Truly...
ANALYSIS OF THE PROCLAMATION OF THE PHILIPPHINE INDEPENDENCE.pptx
The Central Civil Services (Leave Travel Concession) Rules, 1988, govern the ...
The DFARS - Part 251 - Use of Government Sources By Contractors
ESG Alignment in Action - The Abhay Bhutada Foundation
Presentatio koos kokos koko ossssn5.pptx
Parliamentary procedure in meeting that can be use
Item # 5 - 5307 Broadway St final review
LUNG CANCER PREDICTION MODELING USING ARTIFICIAL NEURAL NETWORK.pptx
CXPA Finland Webinar - Modern Components of Service Quality - Alec Dalton - ...

India performance monitoring and evaluation systems

  • 1. India Performance Monitoring And Evaluation Systems ( PMES )
  • 2. 23 ProceedingsofGlobalRoundtableon GovernmentPerformanceManagement While a truly effective Government Performance Management (GPM) system must include all three sub-systems, more often than not, most countries tend to focuson only one or two of the sub-systems mentioned above.Even *Secretary,Performance Management,Cabinet Secretariat,Government of India INDIA Performance Monitoring and Evaluation System (PMES) PrajapatiTrivedi* 1.Introduction To improve performance of any organisation we need a multidimensional effort. Experts believe that the following three systems are necessary for improving performance of any organisation: (a) Performance Information System, (b) Performance Evaluation System, and (c) Performance Incentive System. A performance information system ensures that appropriate information, in a useful format, is available in a timely manner to stakeholders. A performance evaluation system is meant to convert, distill and arrange this information in a format that allows stakeholders to assess the true effectiveness of the organisation. Finally, no matter how sophisticated the information system and how accurate the evaluation system, performance of any organisation can improve in a sustainable manner only if it has a performance incentive system. A performance incentive system links the performance of the organisation to the welfare of its employees. This allows the employees to achieve organisation objectivesin their own self-interest. These three sub-systems are as relevant for the public sector as they are for the private sector. Within the public sector, these systems are equally important for Government Departments and state-owned enterprises or public enterprises. While the focus of this paper is on the performance management of Government Departments, occasional references and comparisons will be made to the public enterprise sector as well.
  • 3. 24 ProceedingsofGlobalRoundtableon GovernmentPerformanceManagement when countries take actions covering all three sub-systems, often these sub-systems are not adequately dealt with or organically connected to each other for yielding desired results. This was also the case in India till September 2009. There were a plethora of actions and policies in all three areas but they remained sporadic and fragmented efforts. In the area of performance information the following initiatives come to mind immediately: the enactment of the Right to Information (RTI), publication of annual reports by departments, suo moto disclosure on departmental websites, creation of an independent Statistical Commission to bolster an already robust statistical tradition, reports of the Planning Commission, certified accounts of Government Departments by Controller General of Accounts, audit reports by Comptroller and Auditor General, free media and its reports on departments, outcome budgets and, finally, reports of the departmental standing committees of the Indian Parliament. One could say that there was overwhelming amount of information available on Government Departments. Similarly, all the above sources of information provide multiple, though often conflicting, narrative on performanceevaluation of GovernmentDepartments. Similarly, a proposal for a performance incentive for Central Government employees has been around ever since the Fourth Pay Commission recommended introducing a performance related incentive scheme (PRIS) for Central Government employees and was accepted by the Government of India in 1987. This recommendation for PRIS was once again reiterated by the Fifth and Sixth Pay Commissions and both times, accepted by the then Governmentsinpower.Asof 2009,however,verylittlewasdonetoimplement a performance-related incentivescheme in the Central Government. At the end of 2008, two major reports provided the impetus for action on this front. 
The 10th Report of the Second Administrative Reform Commission (2nd ARC) argued for introduction of a Performance Management System in general and Performance Agreements in particular. The Sixth Pay Commission, as mentioned earlier, submitted its report in 2008 urging for introduction of a PerformanceRelated Incentive Scheme (PRIS). After the election in 2009, the new Government decided to take action on both reports. Through the President’s Address to the Parliament in June 2009, the new Government made a commitment to: 'Establish mechanisms for performance monitoring and performance evaluation in government on a regular basis'. Pursuant to the above commitment made in the President’s address to both Houses of the Parliament on June 4, 2009, the Prime Minister approved the outline of the Performance Monitoring and Evaluation System (PMES) for GovernmentDepartments on September 11, 2009.With the introduction of
  • 4. 25 ProceedingsofGlobalRoundtableon GovernmentPerformanceManagement PMES, a concerted effort was made to refine and bring together the three sub-systems. Before elaborating the details of PMES in the subsequent sections, it is worth outlining the key features of the System. The essence of PMES is as follows. According to PMES, at the beginning of each financial year,with the approval of the Minister concerned, each Department is required to prepare a Results- Framework Document (RFD). The RFD includes the priorities set out by the Ministry concerned, agenda as spelt out in the manifesto, if any, President’s Address, and announcements/agenda as spelt out by the Government from time to time.The Minister in-charge is expected to decide the inter-se priority among the departmental objectives. Aftersixmonths,theachievementsofeach Ministry/Departmentarereviewed by the High Power Committee on Government Performance, chaired by the Cabinet Secretary, and the goals reset, taking into account the priorities of the Government at that point of time.This enables the Government to factor in unforeseen circumstances such as drought conditions, natural calamities or epidemics. At the end of the year, all Ministries/Departments review and prepare a report listing the achievements of their Ministry/Department against the agreed results in the prescribed formats. This report is finalised by the 1st of May each year and submitted for approval by the High Power Committee, beforeforwarding the results to the Prime Minister. 2.Origin of Performance Monitoring and Evaluation System (PMES) The immediate origins of PMES can be traced to the 10th Report of the Second Administrative Reform Commission finalised in 2008. In Chapter 11 (see Box 1),the Reportgoes on to say: 'Performance agreement is the most common accountability mechanism in most countries that have reformed their public administration systems. 
This has been done in many forms - from explicit contracts to less formal negotiated agreements to more generally applicable principles. At the core of such agreements are the objectives to be achieved, the resources provided to achieve them, the accountability and control measures, and the autonomy and flexibilities that the civil servants will be given'. The Prime Minister’s order of September 11, 2009, mandating PMES was based on this basic recommendation. However, the Government of India preferred to use the term Results-Framework Document (RFD) rather than Performance Agreement, which is the commonly used generic term for such policy instruments. Indeed, RFD is the centrepiece of PMES.
Box 1: Excerpts from the 10th Report of the Second Administrative Reforms Commission (Chapter 11 on Performance Management), November 2008

14. Performance Agreements

1. Performance agreement is the most common accountability mechanism in most countries that have reformed their public administration systems. This has been done in many forms - from explicit contracts to less formal negotiated agreements to more generally applicable principles. At the core of such agreements are the objectives to be achieved, the resources provided to achieve them, the accountability and control measures, and the autonomy and flexibilities that the civil servants will be given.

2. In New Zealand, for example, the Public Finance Act of 1989 provided for a performance agreement to be signed between the chief executive and the concerned minister every year. The performance agreement describes the key result areas that require the personal attention of the chief executive. The expected results are expressed in verifiable terms, and include output-related tasks. The chief executive’s performance is assessed every year with reference to the performance agreement. The system provides for bonuses to be earned for good performance and removal for poor performance. The assessment is done by a third party - the State Services Commission. Due consideration is given to the views of the departmental Minister. A written performance appraisal is prepared. The chief executive concerned is given an opportunity to comment, and his/her comments form part of the appraisal.

3. The Centres de Responsabilite in France is another example. Since 1990, many State services at both central and devolved levels have been established as Responsibility Centres in France. A contract with their Ministry gives the Directors greater management flexibility in operational matters in exchange for a commitment to achieve agreed objectives.
It also stipulates a method for evaluating results. Contracts, negotiated case by case, are for three years.
4. Reforms in these countries are instructive in the way accountabilities were clarified as a necessary first step. The important part of this clarifying process was that it was done by law. As a result of legal clarification of accountabilities, the civil servant in charge of a department became directly accountable to the departmental Minister through the annual performance agreement that was defined in advance and used as a benchmark for measuring end-of-the-period performance. In India, a provision could be incorporated in the proposed Public Services Law specifying that the heads of the line departments, or of the executive agencies whenever they are set up, should sign annual performance agreements with the departmental Minister.

5. The performance agreements should be signed between the departmental Minister and the Secretary of the Ministry, as also between the departmental Minister and heads of Department, well before the financial year. The annual performance agreement should provide physical and verifiable details of the work to be done by the Secretary/Head of the Department during the financial year. The performance of the Secretary/Head of the Department should be assessed by a third party - say, the Central Public Services Authority - with reference to the annual performance agreement. The details of the annual performance agreements and the results of the assessment by the third party should be provided to the legislature as a part of the Performance Budget/Outcome Budget.

This recommendation of the Second Administrative Reforms Commission (2nd ARC) was, in turn, building on the recommendation of the L. K. Jha Commission on Economic Administration Reforms (1982). The L. K. Jha Commission had recommended the concept of Action Plans for Government Departments. The Government of India accepted this recommendation and implemented it for a few years.
Action Plans were found to be ineffective as they suffered from two fatal flaws: the actions listed were not prioritised, and there was no agreement on how to measure deviations from targets. As this paper will reveal later, the Performance Monitoring and Evaluation System (PMES) overcame these flaws in the Results-Framework Documents (RFDs) that replaced the instrument of Action Plans.

The real inspiration for the recommendations of the 2nd ARC regarding Performance Agreements comes from the 1984 report of Arjun Sengupta for Public Enterprises, which recommended a Memorandum of Understanding (MOU) for public enterprises. In concept and design, MOU and RFD are mirror images of each other, as will be discussed shortly.
3. Attributes of the Performance Monitoring and Evaluation System (PMES)

This is a system to both 'evaluate' and 'monitor' the performance of Government Departments. Evaluation involves comparing the actual achievements of a department against the annual targets at the end of the year. In doing so, an evaluation exercise judges the ability of the department to deliver results on a scale ranging from excellent to poor. Monitoring involves keeping a tab on the progress made by departments towards achieving their annual targets during the year. So while the focus of 'evaluation' is on achieving the ultimate 'ends', the focus of 'monitoring' is on 'means'. They are complements to each other and not substitutes.

To be even more accurate, PMES is not merely a 'performance evaluation' exercise; instead, it is a 'performance management' exercise. The former becomes the latter when accountability is assigned to a person for the results of the entity managed by that person. In the absence of consequences, an evaluation exercise remains an academic exercise. This explains why a large amount of effort on Monitoring & Evaluation (M&E) does not necessarily translate into either results or accountability.

Second, PMES takes a comprehensive view of departmental performance by measuring the performance of all schemes and projects (iconic and non-iconic) and all relevant aspects of expected departmental deliverables, such as: financial, physical, quantitative, qualitative, static efficiency (short-run) and dynamic efficiency (long-run). As a result of this comprehensive evaluation covering all aspects of citizens' welfare, this system provides a unified and single view of departmental performance.

Third, by focusing on areas that are within the control of the department, PMES also ensures 'fairness' and, hence, high levels of motivation for departmental managers.
These attributes will be discussed in detail in the subsequent sections of this paper.

4. How does PMES work?

The working of the PMES can be divided into the following three distinct stages of the fiscal year:

a. Beginning of the Year (by April 1): Design of Results-Framework Document
b. During the Year (after six months - October 1): Monitor progress against agreed targets
c. End of the Year (March 31): Evaluate performance against agreed targets
Figure 1: How does RFD work? (The Process) - (1) Beginning of Year: Prepare RFD (April 1); (2) During the Year: Monitor Progress (October 1); (3) End of Year: Evaluate Performance (June 1)

4.1 Beginning of the Year (by April 1): Design of Results-Framework Document

As mentioned earlier, at the beginning of each financial year, with the approval of the minister concerned, each department prepares a Results-Framework Document (RFD) consisting of the priorities set out by the Minister, the agenda as spelt out in the party manifesto, if any, the President’s Address, and announcements/agenda as spelt out by the Government from time to time. The Minister in-charge approves the inter se priority among the departmental objectives.

To achieve results commensurate with the priorities listed in the Results-Framework Document, the Minister approves the proposed activities and schemes for the ministry/department. The Minister also approves the corresponding success indicators (Key Result Indicators - KRIs, or Key Performance Indicators - KPIs) and time-bound targets to measure progress in achieving these objectives.

The Results-Framework Document (RFD) prepared by each department seeks to address three basic questions:

a. What are the department’s main objectives for the year?
b. What actions are proposed to achieve these objectives?
c. How to determine progress made in implementing these actions?

RFD is simply a codification of answers to these questions in a uniform and meaningful format. All RFD documents consist of the six sections depicted in Figure 2.
Figure 2: Six Sections of Results-Framework Document (RFD)
Section 1: Ministry’s Vision, Mission, Objectives and Functions
Section 2: Inter se priorities among key objectives, success indicators and targets
Section 3: Trend values of the success indicators
Section 4: Description and definition of success indicators and proposed measurement methodology
Section 5: Specific performance requirements from other departments that are critical for delivering agreed results
Section 6: Outcome/Impact of activities of Department/Ministry

A typical example of an actual RFD is enclosed at Annex D. In what follows, we will briefly describe each of the six sections of RFD.

Section 1: Ministry’s Vision, Mission, Objectives and Functions

This section provides the context and the background for the Results-Framework Document. Creating a vision and mission for a department is a significant enterprise. Ideally, vision and mission should be a by-product of a strategic planning exercise undertaken by the department. Both concepts are interrelated and much has been written about them in management literature.

A vision is an idealised state for the Ministry/Department. It is the big picture of what the leadership wants the Ministry/Department to look like in the future. Vision is a long-term statement, and typically generic and grand. Therefore, a vision statement does not change from year to year unless the Ministry/Department is dramatically restructured and is expected to undertake very different tasks in the future. Vision should never carry the ‘how’ part, since the ‘how’ part of the vision may keep on changing with time.

The Ministry’s/Department’s mission is the nuts and bolts of the vision. Mission is the ‘who, what and why’ of the Ministry’s/Department’s
existence. The vision represents the big picture and the mission represents the necessary work.

Objectives represent the developmental requirements to be achieved by the department in a particular sector by a selected set of policies and programmes over a specific period of time (short/medium/long). For example, objectives of the Ministry of Health & Family Welfare could include: (a) reducing the rate of mortality for children below five years; and (b) reducing the rate of maternal deaths by 30% by the end of the development plan.

Objectives could be of two types: (a) Outcome Objectives specify the ends to achieve, and (b) Process Objectives specify the means to achieve the objectives. As far as possible, the department should focus on Outcome Objectives. Objectives should be directly related to the attainment and support of the relevant national objectives stated in the relevant Five Year Plan, National Flagship Schemes, Outcome Budget and relevant sector and departmental priorities and strategies, President’s Address, the manifesto, and announcements/agenda as spelt out by the Government from time to time.

Objectives should be linked to and derived from the departmental vision and mission statements and should remain stable over time. Objectives cannot be added or deleted without a rigorous evidence-based justification. In particular, a department should not delete an objective simply because it is hard to achieve. Nor can it add an objective simply because it is easy to achieve. There must be a logical connection between vision, mission and objectives.

The functions of the department should also be listed in this section. These functions should be consistent with the Allocation of Business Rules for the Department/Ministry. Unless those Rules change, the functions cannot be changed in the RFD. This section is supposed to reflect the legal/administrative reality as it exists, and not a wish list.
Section 2: Inter se priorities among key objectives, success indicators and targets

This section is the heart of the RFD. Table 1 contains the key elements of Section 2, and in what follows we describe each column of this Table.

Column 1: Select Key Departmental Objectives

From the list of all objectives, departments are expected to select those key objectives that would be the focus for the current RFD. It is important to be selective and focus on the most important and relevant objectives only.
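To make the column structure of Section 2 concrete, the sketch below models Columns 1-6 as a small Python data structure. This is purely an illustrative assumption on our part - the class names, fields and validation rule are not part of the official RFD format - and the sample figures are taken from the Table 1 extract, with the remaining weight collapsed into a placeholder.

```python
# Hypothetical data model for Section 2 of an RFD (Columns 1-6). The class
# names, fields and validation rule are illustrative assumptions, not part
# of the official RFD format.
from dataclasses import dataclass, field

@dataclass
class SuccessIndicator:                          # Columns 4-6
    name: str
    weight: float                                # share of the parent objective's weight
    targets: dict = field(default_factory=dict)  # five-point scale values

@dataclass
class Action:                                    # Column 3
    name: str
    indicators: list

@dataclass
class Objective:                                 # Columns 1-2
    name: str
    weight: float
    actions: list

def validate_weights(objectives):
    """Section 2 requires all objective weights to add up to 100."""
    return abs(sum(o.weight for o in objectives) - 100) < 1e-9

# Fragment modelled on the Table 1 extract; the remaining objectives are
# collapsed into a single placeholder so the weights total 100.
obj1 = Objective(
    "Increasing crop production and productivity", 15.50,
    [Action("Release of funds to States/Institutions",
            [SuccessIndicator("% of funds (R.E.) released", 3.00,
                              {100: 95, 90: 85, 80: 75, 70: 65, 60: 60})])])
rest = Objective("All other objectives (placeholder)", 84.50, [])
print(validate_weights([obj1, rest]))  # True
```

The nesting mirrors the document's own hierarchy: each objective carries a weight, each action belongs to one objective, and each success indicator carries its own slice of the objective's weight plus a five-point target scale.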
Table 1: Extract of Section 2 from the 2013-14 RFD of Department of Agriculture and Cooperation

Section 2: Inter se Priorities among Key Objectives, Success Indicators and Targets (Columns 1-4: Objective, Weight, Action, Success Indicator)

Objective (1): Increasing crop production and productivity, thereby ensuring food security and enhanced income level to farmers [Weight: 15.50]
- (1.1) Preparation of tentative allocations & approval of State Action Plans for 2013-14 → (1.1.1) Approval by 31.05.2013
- (1.2) Release of funds to States/Institutions → (1.2.1) % of funds (R.E.) released by 31.03.2014
- (1.3) Additional foodgrain production → (1.3.1) Additional production of 5 million tons over year (5-year moving average)
- (1.4) Area expansion of pulses by promoting pulse cultivation in rice fallows, as well as intercrops and summer crops → (1.4.1) Increase in area by 1.5 lakh ha. over 5-year moving average
- (1.5) NFSM Impact Evaluation Studies → (1.5.1) Completion of studies by CDDs and submission of report
- (1.6) Monitoring and review of BGREI programme in all 7 States → (1.6.1) State visits twice a year

Objective (2): Incentivising states to enhance public investment in agriculture & allied sectors to sustain and maintain capital formation and agriculture infrastructure [Weight: 9.00]
- (2.1) Incentivise states to make additional allocation in agriculture & allied sectors → (2.1.1) Increase in percentage points of States’ plan expenditure in agriculture and allied sectors as per Point No. 3 of Annexure-II of RKVY Guidelines
Table 1 (continued), Columns 5-7: Unit, Weight and Target/Criteria Value on the five-point scale (Excellent 100% / Very Good 90% / Good 80% / Fair 70% / Poor 60%)
- (1.1.1) Weight 3.50: 31/05/2013 / 05/06/2013 / 10/06/2013 / 15/06/2013 / 20/06/2013
- (1.2.1) Weight 3.00: 95 / 85 / 75 / 65 / 60
- (1.3.1) Weight 3.00: 5.0 / 4.8 / 4.6 / 4.4 / 4.2
- (1.4.1) Weight 2.00: 1.5 / 1.3 / 1.25 / 1.10 / 1.00
- (1.5.1) Weight 2.00: 30/11/2013 / 30/12/2013 / 01/01/2014 / 28/02/2014 / 31/03/2014
- (1.6.1) Weight 2.00: 14 / 12 / 10 / 08 / 06
- (2.1.1) [% points] Weight 4.00: 0.25 / 0.20 / 0.15 / 0.10 / 0.05
The objectives are derived from the Five Year Plan, departmental strategies, party manifesto, and President’s Address to the Parliament. As depicted in Figure 3, this is required to ensure vertical alignment between the National Vision, as articulated in the National Five Year Plan, and departmental objectives. The objective of the departmental strategy is to outline the path for reaching the vision. It usually covers five years and needs to be updated as the circumstances change. Ideally, one should have the departmental strategy in place before preparing an RFD. However, RFD itself can be used to motivate departments to prepare a strategy. This is what was done in our case in India.

Figure 3: Vertical Alignment of Five Year Plans with RFDs (Vision → Long-Term Strategy → Five-Year Development Plan → Results-Framework Document: Objectives, Policies, Projects/Schemes)

Column 2: Assign Relative Weights to Objectives

Objectives in the RFD are required to be ranked in a descending order of priority according to the degree of significance, and specific weights should be attached to these objectives. In the ultimate analysis, the concerned minister has the prerogative to decide the inter se priorities among departmental objectives, and all weights must add up to 100. Clearly, the
process starts with the departmental secretary suggesting a set of priorities in her best technical judgment. However, the Minister has the final word, as she represents the will of the people in our form of government.

The logic for attaching specific weights, all adding up to 100%, is straightforward. For instance, suppose a department has 15 objectives and, at the end of the year, the secretary of the department goes to the Minister and says, 'I have achieved 12 out of the 15 objectives'. How is the Minister to judge the secretary’s performance? The answer depends on which three objectives were not achieved. If the important core objectives of the department were among them, then this does not reflect good performance.

In fact, any evaluation system that does not prioritise objectives is a non-starter. We know that all aspects of departmental operations are not equally important. When we have a shared understanding of departmental priorities, it creates a much greater chance of getting the important things done.

Column 3: Identify Means (Actions) for Achieving Departmental Objectives

For each objective, the department must specify the required policies, programmes, schemes and projects. These also have to be approved by the concerned minister. Often, an objective has one or more policies associated with it. An objective represents the desired 'end', and associated policies, programmes and projects represent the desired 'means'. The latter are listed as 'actions' under each objective.

Column 4: Define Success Indicators

For each 'action' specified in Column 3, the department must specify one or more 'success indicators'. They are also known as 'Key Performance Indicators (KPIs)' or 'Key Result Indicators (KRIs)'. A success indicator provides a means to evaluate progress in achieving the policy, programme, scheme and project objectives/targets.
Sometimes more than one success indicator may be required to tell the entire story. If there are multiple actions associated with an objective, the weight assigned to that objective should be spread across the relevant success indicators. The choice of appropriate success indicators is as important as the choice of objectives of the department. It is the success indicators that are most useful at the operational level. They provide a clear signal as to what is expected from the department. Success indicators are important management tools for driving improvements in departmental performance. They should represent the main business of the organisation and should also aid accountability. Success indicators should consider both qualitative and quantitative aspects of departmental performance.
In selecting success indicators, any duplication should be avoided. For example, the usual chain for delivering results and performance is depicted in Figure 4. An example of this results chain is depicted in Figure 5. If we use an Outcome (increased literacy) as a success indicator, then it would be duplicative to also use inputs and activities as additional success indicators. Ideally, one should have success indicators that measure Outcomes and Impacts. However, sometimes, due to lack of data, one is able to measure only activities or outputs.

Figure 4: Typical Results Chain (Results-Based Management): Inputs (financial, human and material resources) → Activities (tasks personnel undertake to transform inputs to outputs) → Outputs (products and services produced) → Outcomes (intermediate effects of outputs on clients) → Goal/Impacts (long-term, widespread improvement in society)

Figure 5: An Example of Results Chain (Results-Based Management: Adult Literacy): Inputs (facilities, trainers, materials) → Activities (literacy training courses) → Outputs (number of adults completing literacy courses) → Outcomes (increased literacy skills; more employment opportunities) → Goal/Impacts (higher income levels; increased access to higher-skill jobs)

The common definitions of these terms are as follows:

1. Inputs: The financial, human, and material resources used for the development intervention.
2. Activities: Actions taken or work performed through which inputs, such as funds, technical assistance and other types of resources, are mobilised to produce specific outputs.

3. Outputs: The products, capital goods and services that result from a development intervention; may also include changes resulting from the intervention which are relevant to the achievement of outcomes. Sometimes, ‘Outputs’ are divided into two sub-categories - internal and external outputs. ‘Internal’ outputs consist of those outputs over which managers have full administrative control. For example, printing a brochure is considered an internal output, as it involves spending budgeted funds on hiring a printer and giving orders to print a given number of brochures. All actions required to print a brochure are fully within the manager’s control and, hence, this action is considered an ‘internal’ output. However, having these brochures picked up by the targeted groups and, consequently, making the desired impact on the target audience would be an example of an external output. Thus, actions that exert influence beyond the boundaries of an organisation are termed ‘external’ outputs.

4. Outcomes: The likely or achieved short-term and medium-term effects/impact of an intervention’s Outputs.

Departments are required to classify SIs into the categories shown in the figure given below. While categories numbered 1-5 are mutually exclusive, a Success Indicator can also measure qualitative aspects of performance. As can be seen from the figure, management begins where we do not have full control. Up until that point, we consider it to be the realm of administration.
Figure 6: Administration versus Management (Selecting Success Indicators) - Inputs, Activities and Internal Outputs fall within the realm of Administration; External Outputs and Outcomes within the realm of Results Management; and Goal (Impacts) within the realm of Programme Evaluation. The corresponding measurement categories are: (1) Input, (2) Activity, (3) Internal Output, (4) External Output, (5) Outcome, (6) Qualitative Aspects.
Column 5: Assign Relative Weights to Success Indicators

If we have more than one action associated with an objective, each action should have one or more success indicators to measure progress in implementing these actions. In this case, we will need to split the weight for the objective among the various success indicators associated with it. The rationale for using relative weights has already been given in the context of the relative weights for objectives; the same logic applies here as well.

Column 6: Set Targets for Success Indicators

The next step in designing an RFD is to choose a target for each success indicator. Targets are tools for driving performance improvements. Target levels should, therefore, contain an element of stretch and ambition. However, they must also be achievable. It is possible that targets for radical improvement may generate a level of discomfort associated with change, but excessively demanding or unrealistic targets may have a longer-term demoralising effect. The target should be presented as per the five-point scale given below.

The logic for using a five-point scale can be illustrated with the following example. Let us say a Minister (the principal) gives the Secretary (the agent) a target to build 7000 KMs of road. However, at the end of the year, if the Secretary reports that only 6850 KMs of roads could be built, then how is the Minister to evaluate the Secretary’s performance? The reality is that, under the present circumstances, a lot would depend on the relationship between the Minister and the Secretary. If the Minister likes the Secretary, he is likely to overlook this shortfall in achievement. If, however, the Minister is unhappy with the Secretary for some reason, then the Minister is likely to make it a big issue. This potential for subjectivity is the root of many problems in government.
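The mechanics of scoring against such an ex-ante scale can be sketched in code. The snippet below is a purely illustrative Python sketch: the paper does not prescribe how an achievement falling between two scale columns is scored, so linear interpolation between adjacent columns is an assumption made here, and the full road-target scale values are invented for the example.

```python
# Purely illustrative sketch of scoring an achievement against the RFD
# five-point scale (Excellent/Very Good/Good/Fair/Poor = 100/90/80/70/60).
# Linear interpolation between adjacent columns is an assumption here,
# not something the paper prescribes.

def score_against_scale(achievement, scale):
    """Map an actual achievement to a 0-100 raw score.

    `scale` lists the target values for the 100%, 90%, 80%, 70% and 60%
    columns, in that order; higher achievement is assumed to be better.
    """
    percents = [100, 90, 80, 70, 60]
    if achievement >= scale[0]:   # at or above the Excellent column
        return 100
    if achievement < scale[-1]:   # below the Poor column scores 0
        return 0
    for i in range(len(scale) - 1):
        hi, lo = scale[i], scale[i + 1]
        if lo <= achievement < hi:
            frac = (achievement - lo) / (hi - lo)
            return percents[i + 1] + frac * (percents[i] - percents[i + 1])

# The road example from the text (6850 km built against a 7000 km target);
# the scale 7000/6900/6800/6700/6600 km is invented for illustration.
print(score_against_scale(6850, [7000, 6900, 6800, 6700, 6600]))  # 85.0
```

Under this sketch, the Secretary's 6850 km would earn a raw score of 85, falling between the 'Good' and 'Very Good' columns, leaving no room for the subjective judgment described above.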
The five-point scale addresses this problem effectively. By having an ex-ante agreement on the scale, the performance evaluation at the end is automatic and fair. Incidentally, it could be a five-, seven- or even a nine-point scale. If an evaluation system does not use a scale concept for ex-ante targets, it is a non-starter.

The five-point scale: Excellent (100%), Very Good (90%), Good (80%), Fair (70%), Poor (60%)

It is expected that, in general, budgetary targets would be placed in the 90% (Very Good) column. There are only two exceptions: (a) when the budget requires a very precise quantity to be delivered - for example, if the budget provides money for one bridge to be built, clearly we cannot expect the department to build two bridges or 1.25 bridges; (b) when there is a legal mandate for a certain target and any deviation may be considered a legal breach. In these cases, and only in these cases, the targets can be placed
under 100%. For any performance below 60%, the department would get a score of 0 on the relevant success indicator.

The RFD targets should be aligned with Plan priorities and be consistent with the departmental budget as well as the outcome budget. A well-framed RFD document should be able to account for the majority of the budget. Towards this end, departments must ensure that all major schemes, relevant mission mode projects and the Prime Minister’s flagship programmes are reflected in the RFD.

Team targets

In some cases, the performance of a department is dependent on the performance of one or more other departments in the government. For example, to produce power, the Ministry of Power is dependent on the performance of the following: (a) Ministry of Coal, (b) Ministry of Railways, (c) Ministry of Environment and Forests, and (d) Ministry of Heavy Industry (e.g. for power equipment from BHEL). Therefore, in order to achieve the desired result, it is necessary to work as a team and not as individuals. Hence, the need for team targets for all five Departments and Ministries.

Figure 7: Horizontal Alignment among Departments - a common Vision, Long-Term Strategy and Five-Year Development Plan cascade into the objectives and projects/schemes of the RFDs of Department 1, Department 2, Department 3, Department 4, and so on.

For example, if the Planning Commission fixes 920 BU as the target for power generation, then two consequences will follow. First, the RFDs of all five departments will have to include this as a ‘team target'. Second, if this
‘team target’ is not achieved, all five departments will lose some points at the time of evaluation of RFDs. The relative loss of points will depend on the weight of the team target in the respective RFDs.

Table 2: Example of Team Target in the 2013-14 RFD of Ministry of Coal
Section 2: Inter se Priorities among Key Objectives, Success Indicators and Targets (Columns 1-5: Objective, Weight, Action, Success Indicator, Unit)

Objective (29): GPS based tracking of transportation of coal [Weight: 1.00]
- (29.1) Installation of GPS by 31.03.2014 → (29.1.1) All companies have floated tenders for installing GPS [No. of subsidiaries]

Objective (30): Joint responsibility for power generation [TEAM TARGET] [Weight: 5.00]
- (30.1) Give necessary support and clearance → (30.1.1) Additional capacity installed [MW]; (30.1.2) Total power generated [BU]

* Efficient functioning of the RFD System [Weight: 3.00]
- Timely submission of draft RFD 2014-15 for approval → On-time submission [Date]
- Timely submission of results for 2012-13 → On-time submission [Date]

* Transparency/Service delivery of Ministry/Department [Weight: 3.00]
- Independent audit of implementation of Citizens’/Clients’ Charter (CCC) → % of implementation [%]
- Independent audit of implementation of Public Grievance Redressal System → % of implementation [%]

* Administrative reforms [Weight: 6.00]
- Implement mitigating strategies for reducing potential risk of corruption → % of implementation [%]
- Implement ISO 9001 as per the approved action plan → % of implementation [%]
- Implement Innovation Action Plan (IAP) → % of milestones achieved [%]
- Identification of core and non-core activities of the Ministry/Department as per 2nd ARC recommendations → Timely submission [Date]

To illustrate, let us imagine the following. The RFD for the Ministry of Coal has two types of targets: one deals with coal production and the other with the ‘team target for
power generation'. They have a weight of 15% and 2%, respectively. Now, if the target of 920 BU for power generation is not achieved, then even if the target for coal production is achieved, the Ministry of Coal will still lose 2%. The target values from the actual Team Target example in the 2013-14 RFD of the Ministry of Coal are reproduced below:

Table 2 (continued), Columns 6-7: Weight and Target/Criteria Value on the five-point scale (Excellent 100% / Very Good 90% / Good 80% / Fair 70% / Poor 60%)
- (29.1.1) Weight 1.00: 7 / 6 / 5 / 4 / 3
- (30.1.1) Weight 3.00: 18500 / 18000 / 17000 / 16000 / 15000
- (30.1.2) Weight 2.00: 1030 / 1000 / 950 / 920 / 900
- Draft RFD 2014-15 submission, Weight 2.00: 05/03/2014 / 06/03/2014 / 07/03/2014 / 08/03/2014 / 11/03/2014
- Results for 2012-13 submission, Weight 1.00: 01/05/2013 / 02/05/2013 / 03/05/2013 / 06/05/2013 / 07/05/2013
- CCC audit, Weight 2.00: 100 / 95 / 90 / 85 / 80
- Grievance Redressal audit, Weight 1.00: 100 / 95 / 90 / 85 / 80
- Corruption-risk mitigation, Weight 1.00: 100 / 95 / 90 / 85 / 80
- ISO 9001, Weight 2.00: 100 / 95 / 90 / 85 / 80
- Innovation Action Plan, Weight 2.00: 100 / 95 / 90 / 85 / 80
- Core/non-core identification, Weight 1.00: 27/01/2014 / 28/01/2014 / 29/01/2014 / 30/01/2014 / 31/01/2014
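The weighted aggregation described above, including the 2% loss from a missed team target, can be sketched as follows. This is an illustrative assumption of how a composite score might be computed; only the 15%/2% weight split and the fully-achieved coal target are taken from the text, and the remaining weight is a placeholder.

```python
# Illustrative composite RFD score: the weighted sum of raw success-
# indicator scores. Weights are percentages that must total 100.

def composite_score(indicators):
    """indicators: list of (weight_percent, raw_score_0_to_100) pairs."""
    assert abs(sum(w for w, _ in indicators) - 100) < 1e-9
    return sum(w * s / 100 for w, s in indicators)

# Ministry of Coal illustration from the text: coal production (weight 15)
# fully achieved, the 920 BU power team target (weight 2) missed entirely,
# and the remaining weight lumped into one hypothetical indicator scored
# at 100 for simplicity -- so missing the team target costs exactly 2 points.
print(composite_score([(15, 100), (2, 0), (83, 100)]))  # 98.0
```

The 2-point drop falls out mechanically: a team target's weight caps exactly how much a department can lose when the joint result is missed, regardless of how well its own targets were met.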
The logic is that all team members must ensure (like relay race runners) that the entire chain works efficiently. To borrow an analogy from cricket, there is no consolation in a member of the team scoring a double century if the team ends up losing the match. That is, the departments included in team targets will be responsible for achieving the targets jointly. This is one of the ways in which RFDs try to ensure horizontal alignment and break away from the silo mentality.

Section 3: Trend values of the success indicators

For every success indicator and the corresponding target, the RFD must provide actual values for the past two years and also projected values for two years in the future, as given in Table 3.

Table 3: Extract of Section 3 from the 2013-14 RFD of Department of Agriculture and Cooperation
Section 3: Trend Values of the Success Indicators (Columns 1-3: Objective, Action, Success Indicator)

Objective (1): Increasing crop production and productivity, thereby ensuring food security and enhanced income level to farmers
- (1.1) Preparation of tentative allocations & approval of State Action Plans for 2013-14 → (1.1.1) Approval by 31.05.2013
- (1.2) Release of funds to States/Institutions → (1.2.1) % of funds (R.E.)
released by 31.03.2014
- (1.3) Additional foodgrain production → (1.3.1) Additional production of 5 million tons over year (5-year moving average)
- (1.4) Area expansion of pulses by promoting pulse cultivation in rice fallows, as well as intercrops and summer crops → (1.4.1) Increase in area by 1.5 lakh ha. over 5-year moving average
- (1.5) NFSM Impact Evaluation Studies → (1.5.1) Completion of studies by CDDs and submission of report
- (1.6) Monitoring and review of BGREI programme in all 7 States → (1.6.1) State visits twice a year

Objective (2): Incentivising states to enhance public investment in agriculture & allied sectors to sustain and maintain capital formation and agriculture infrastructure
- (2.1) Incentivise states to make additional allocation in agriculture & allied sectors → (2.1.1) Increase in percentage points of states’ plan expenditure in agriculture and allied sectors as per Point No. 3 of Annexure-II of RKVY Guidelines
  • 22. 43 ProceedingsofGlobalRoundtableon GovernmentPerformanceManagement Section 4:Description and definition of success indicators and proposedmeasurement methodology RFD contains a section with detailed definitions of various success indicators and the proposed measurement methodology. Wherever possible, the rationale for using the proposed success indicators may be provided. Abbreviations/acronyms of policies, programmes, schemes used may also be elaborated in this section. Column 4 Column 5 Column 6 Column 7 Column 8 Column 9 Unit Actual Value for FY 11/12 Actual Value for FY 12/13 Actual Value for FY 13/14 Actual Value for FY 14/15 Actual Value for FY 15/16 Date 31/05/2011 18/05/2012 31/05/2013 31/05/2014 31/05/2015 % 98 - 90 90 90 Million Tons .. 12.59 5.0 5.2 5.4 Area .. .. 1.5 1.75 2.00 Date .. .. 30/11/2013 .. .. No. of visits 10 12 14 14 15 % Points 0.9 - 0.25 0.25 0.25
  • 23. 44 ProceedingsofGlobalRoundtableon GovernmentPerformanceManagement Section 5:Specific performance requirements from other departments that are critical for delivering agreed results This section should contain expectations from other departments that impact on the department’s performance. These expectations should be mentioned in quantifiable,specific,and measurable terms. Table4:Extract of Section5 from the2013-14RFDof Ministryof Coal Section 5: Specific Performance Requirements from other Departments Column 1 Column 2 Column 3 Column 4 Column 5 Location Type State Organisation Organisation Relevant Success Type Name Indicator Central Government Ministry Ministry of Environment and Forests (27.1.1) Holding quarterly meetings Ministry of Law and Justice (1.3.2.1) Vetting of notifications under section 4(1),7(1),9(1) and 11(1) of the CBA (A&D) Act, 1957 will be done within the prescribed time on receipt of each
  • 24. 45 ProceedingsofGlobalRoundtableon GovernmentPerformanceManagement The purpose of this section is to promote horizontal alignment among departments and overcome the tendency of working in silos. This section allows us to know, ex-ante, the requirements and expectations of departments from each other. This section is a complement to the concept of‘teamtargets’discussed above. Column 6 Column 7 Column 8 Column 9 What is your Requirement from the Organisation Justification for this Requirement Please Quantify your Requirement from this Organisation What Happens if your Requirement is not Met. MoEF is required to review the status of providing EC and FC proposals at their level for expediting the same.Similarly time taken in conveying the approvals after the same are recommended by EAC/FAC needs to be reviewed for avoiding delays in communicating administrative approvals. Further, MoEF is required to give priority for online processing of EC and FC proposals. Flow chart for steps involved in EC process is enclosed as per Annexure-IV. In the absence of EC and FC clearance, projects will not progress further. All the proposals pending with MoEF Commencement of subsequent activities will not start leading to delay in projects. Ministry of Law and Justice is required to vet notifications within prescribed time Vetting by Ministry of Law and justice is necessary before processing the cases further. All the cased related to land acquisition. Notification cannot be issued within prescribed time without vetting.
  • 25. 46 ProceedingsofGlobalRoundtableon GovernmentPerformanceManagement Section 6:Outcome/Impact of activities of Department/Ministry This section should contain the broad outcomes and the expected impact the Department/Ministry has on national welfare. It should capture the very purpose for which the Department/Ministry exists and the rationale for undertaking the RFD exercise. The Department’s evaluation will be done against the targets mentioned in Section 2 in RFD. The whole point of Section 6 in RFD is to ensure that Departments/Ministries serve the purpose for which they were created in the firstplace. *NA = Not Available Table5:Extract of Section6 from the2013-14RFD of Department of AIDSControl Section 6: Outcome/Impact of Department/Ministry Column 1 Column 2 Column 3 Jointly responsible for Outcome/Impact of influencing this outcome/ Department/Ministry impact with the following Success Indicator department (s)/ ministry (ies) 1.Survival of AIDS patients on ART % of adults and children with HIV known to be on treatment at 24 months after initiation of antiretroviral therapy at select ART centres 2.Reduction in estimated AIDS related deaths Estimated number of annual AIDS related 3.Reduction in estimated new HIV Infections Estimated number of annual new HIV infections 4.Improved Prevention of Parent of Child Transmission Department of Health and Family Welfare 5. Improved prevention of AIDS in High Risk Group (HRG) Coverage of HRG through Targeted Interventions (TIs) - Female Sex Worker Coverage of HRG through TIs - Males having Sex with Males (MSM) (including Transgenders) Coverage of HRG through TIs - Injecting Drug Users (IDUs) 6.Improved health seeking behaviour of HRGs % HRGs who received HIV test
  • 26. 47 ProceedingsofGlobalRoundtableon GovernmentPerformanceManagement The required information under Section 6 should be entered in Table 5. Column 1 of Table 5 is supposed to list the expected outcomes and impacts. It is possible that these are also mentioned in the other sections of the RFD. Even then they should be mentioned here for clarity and ease of reference. For example, the purpose of Department of AIDS Control would be to ‘control the spread of AIDS’. Now it is possible that AIDS Control may require collaboration between several departments like Health and Family Welfare, Information and Broadcasting, etc. In Column 2, all Departments/Ministries jointly responsible for achieving national goals are required to be mentioned. In Column 3, the Department/Ministry is expected to mention the success indicator(s) to measure the Department’s outcome or impact. In the case mentioned, the success indicator could be ‘Percentage of Indians infected with AIDS’. Columns 5 to 9 give the expected trend valuesfor various success indicators. Column 4 Column 5 Column 6 Column 7 Column 8 Column 9 Unit FY 11/12 FY 12/13 FY 13/14 FY 14/15 FY 15/16 % NA NA NA NA NA No. 1,47,729 NA NA NA NA No. 1,16,456 NA NA NA NA NA NA NA NA % 81 NA NA NA NA % 64 NA NA NA NA % 80 NA NA NA NA % 40 NA NA NA NA
  • 27. 48 ProceedingsofGlobalRoundtableon GovernmentPerformanceManagement 4.RFDDesignProcess Once the RFDhas been preparedand approvedby the concerned minister,it goes through the cycle depicted in Figure 8 and explainedbelow: Figure8:QualityAssurance Processfor RFD Minister approves RFD Departments send RFDto Cabinet Secretariat RFDsreviewed by PMD and ATF Departments incorporate PMD/ ATFsuggestions RFDsapproved by HPC on Government Performance Departments place RFDson Departmental Websites 1 Step 1: Minister Approves the RFD The process starts with the secretarymaking a draft RFD and proposing for approval of the concernedminister.It is primarily the responsibilityof the minister to ensure that RFD includes all the importantprioritiesof the government. 2 Step 2: Draft RFD sent to Cabinet Secretariat The PerformanceManagement Division(PMD),Cabinet Secretariat,examinesthe draft for their quality and consistencywith RFD Guidelines.Thecritiquesof all drafts are preparedbased on the RFD Evaluationmethodology (REM).Thismethodologyallowsus to quantify the quality of an RFD based on agreed criteria(Annex E).
  • 28. 49 ProceedingsofGlobalRoundtableon GovernmentPerformanceManagement An RFD approved by HPC, for the year 2012-13, for the department of Agriculture and Cooperation,is enclosed at Annex D. 3 Step 3: Review by PMD and Ad-Hoc Task Force (ATF) ATF consists of distinguished academicians,former Secretaries to Governmentof India,former chiefs of large public enterprises and private sector domain experts (Annex F).This is a non- governmentbody that reviews the critiquesand vets the draft RFDs.These commentsare conveyed to departments during meetings with departmental secretaries. 4 Step 4: Departments incorporateATF comments and revise RFD drafts Based on the minutes of the meetings with ATF members, departments modifydraft RFDs and resubmit them for approval by the High Power Committee (HPC) on Government Performancechaired by the Cabinet Secretary.For details of HPC see Annex G. 5 Step 5: Approval by HPC on Government Performance The HPC consists of Secretary Finance,Secretary Expenditure, and Secretary Planning.They are responsiblefor ensuring that all main targets in RFDs are consistent with targets in the budget as well as the FiveYear Plan.This is also an occasion to resolve any differencesbetween ATF and departments. 6 Step 6: RFDs placed on departmental websites RFDs approved by HPC are placed on the respective departmental websites.In addition,they are also placed on the PMD website (www.performance.gov.in).
2. During the year (after six months, October 1): Monitor progress against targets

After six months, the RFD, as well as the achievements of each Ministry/Department against the performance goals laid down, may have to be reviewed and the goals reset, taking into account the priorities at that point of time. This enables the Government to factor in unforeseen circumstances such as drought conditions, natural calamities or epidemics.

3. End of the year (March 31): Evaluation of performance against agreed targets

At the end of the year, we look at the achievements of the government departments, compare them with the targets, and determine the composite score. Table 6 provides an example from the health sector. For simplicity, we have taken only one objective to illustrate the evaluation methodology.

The Raw Score for Achievement in Column 7 of Table 6 is obtained by comparing the achievement with the agreed target values. For example, the achievement for the first success indicator (percentage increase in primary health care centres) is 15%. This achievement lies between the 80% (Good) and 70% (Fair) target values, and hence the raw score is 75%.

The Weighted Raw Score for Achievement in Column 8 is obtained by multiplying the raw score (Column 7) by the relative weight (Column 4). Thus, for the first success indicator, the weighted raw score is obtained by multiplying 75% by 0.50, which gives a weighted score of 37.5%.

Finally, the composite score is calculated by adding up all the weighted raw scores (Column 8) for the achievements. In Table 6, the composite score works out to 84.5%. The composite score shows the degree to which the Government Department in question was able to meet its objective. The fact that it got a score of 84.5% in our hypothetical example implies that the Department's performance vis-a-vis this objective was rated as 'Good'.
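The arithmetic above can be sketched in a few lines. The following is a minimal illustration, not the official RFD software: only the first indicator and its 0.50 weight come from the text, while the other two indicators, their weights and their raw scores are assumptions chosen so that the weighted scores sum to the 84.5% of the hypothetical example.

```python
# Minimal sketch (hypothetical data) of the RFD composite-score arithmetic.
# Only the first indicator and its 0.50 weight appear in the text; the other
# two indicators and all raw scores beyond 75% are assumptions.
indicators = [
    # (success indicator, relative weight, raw score %)
    ("% increase in primary health care centres", 0.50, 75.0),
    ("% increase in immunisation coverage",       0.25, 95.0),  # assumed
    ("% reduction in infant mortality",           0.25, 93.0),  # assumed
]

# Weighted raw score = raw score (Column 7) x relative weight (Column 4)
weighted_scores = [weight * raw for _, weight, raw in indicators]

# Composite score = sum of the weighted raw scores (Column 8)
composite = sum(weighted_scores)  # 37.5 + 23.75 + 23.25 = 84.5

def rating(score):
    """Map a composite score to the departmental rating bands."""
    if score >= 96:
        return "Excellent"
    if score >= 86:
        return "Very Good"
    if score >= 76:
        return "Good"
    if score >= 66:
        return "Fair"
    return "Poor"

print(f"Composite score: {composite}% -> {rating(composite)}")
```

Because the relative weights sum to 1, the composite score stays on the same 0-100% scale as the raw scores, which is what lets it be read directly against the rating bands.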
The methodology outlined above is universal in its application. The composite score translates into a departmental rating as follows:

Departmental Rating | Value of Composite Score
Excellent | 96% - 100%
Very Good | 86% - 95%
Good | 76% - 85%
Fair | 66% - 75%
Poor | 65% and below

Various Government Departments will have a diverse set of objectives
and corresponding success indicators. Yet, at the end of the year, every Department will be able to compute its composite score for the past year. This composite score will reflect the degree to which the Department was able to achieve the promised results.

The actual achievement and the composite score for the sample RFD of the Department of Agriculture and Cooperation given in Annex D are enclosed at Annex H. Today, all Departments are also required to include the RFD and the corresponding end-of-year results in their Annual Reports. These annual reports of individual Departments are placed before Parliament every year. The results for the year 2011-12 are summarised in the pie chart below:

Figure 9: Results for the year 2011-12
Excellent (96% - 100%): 8%
Very Good (86% - 95%): 37%
Good (76% - 85%): 28%
Fair (66% - 75%): 18%
Poor (65% and below): 9%

From the above it is clear that the system is stabilising, as the results were distributed normally.

Of all the state governments, Kerala has taken the lead in completing two full cycles of RFD and declaring its results widely through a Government Order, as shown in Figure 10.

5. Why is PMES required?

Systems prior to the introduction of PMES suffered from several limitations. The Government examined these limitations and designed PMES to overcome them. Some examples of these limitations follow:
1. There is fragmentation of institutional responsibility for performance management

Departments are required to report to multiple principals who often have multiple objectives that are not always consistent with each other. A Department could be reporting to the Ministry of Statistics and Programme Implementation on important programmes and projects; the Department of Public Enterprises on the performance of PSUs under it; the Department of Expenditure on performance in relation to Outcome Budgets; the Planning Commission on plan targets; the CAG regarding procedures, processes and even performance; the Cabinet Secretariat on cross-cutting issues and issues of national importance; the minister in-charge on his priorities; the Standing Committee of Parliament on its annual report and other political issues; and so on.

2. Fragmented responsibility for implementation

Similarly, several important initiatives have fractured responsibilities for implementation, and hence accountability for results is diluted. For example, e-governance initiatives are being led by the Department of Electronics and Information Technology, the Department of Administrative Reforms and Public Grievances, NIC, as well as individual ministries.

3. Selective coverage with time-lag in reporting

Some of the systems are selective in their coverage and report on performance with a significant time-lag. The comprehensive Performance Audit reports of the CAG are restricted to a small group of schemes and institutions (only 14 such reports were laid before Parliament in 2008) and come out with a substantial lag. Often, by the time these reports are produced, both the management and the issues facing the institutions have changed. The reports of enquiry commissions and special committees set up to examine the performance of Government Departments, schemes and programmes suffer from similar limitations.

4.
Most performance management systems are conceptually flawed

As mentioned earlier, an effective performance evaluation system is at the heart of an effective performance management system. Typically, performance evaluation systems in India suffer from two major conceptual flaws. First, they list a large number of targets that are not prioritised.
Figure 10: Public declaration of results by the Government of Kerala

GOVERNMENT OF KERALA
Abstract
Planning & Economic Affairs (CPMU) Department - Performance Monitoring and Evaluation System - Results-Framework Document Evaluation Report (2012-13) of 35 Administrative Departments - Approved - Orders issued.

Planning & Economic Affairs (CPMU) Department
GO(MS) No. 42/2013/Plg. Dated, Thiruvananthapuram, 07.08.2013
Read: GO(MS) No. 24/13/Plg dtd 27.03.2013

ORDER
The Results-Framework Document is a part of the Performance Monitoring and Evaluation System (PMES) to monitor and evaluate the performance of Government Departments. The RFD includes the agreed objectives, policies, programmes and projects, along with the success indicators and targets to measure the performance in implementing them. The document is to be prepared by each department at the beginning of every financial year. Vide the paper read above, Government have approved the RFD 2012-13 of 35 Administrative Departments.

As per the guidelines of Results-Framework Documents, the concerned Administrative Departments have carried out the evaluation of the achievement of targets mentioned in their Results-Framework Documents for the year 2012-13 and submitted the evaluation report online to the Planning and Economic Affairs Department. The department-wise composite scores are as follows:

SL. No. | Name of Department | Composite Score
1 | Agriculture | 69.83
2 | Animal Husbandry | 70.98
3 | Co-operation | 78.14
4 | Cultural Affairs | 86.5
5 | Environment | 30.94
6 | Excise | 85.28
7 | Finance | 78.09
8 | Fisheries | 75.44
9 | Food, Civil Supplies & Consumer Affairs | 53.39
10 | Forest | 68.83
11 | General Administration | 68.27
12 | General Education | 59.91
13 | Health & Family Welfare | 87.05
14 | Higher Education | 68.88
15 | Housing | 42.27
16 | Industries & Commerce | 71.55
17 | Information & Public Relations | 65.77
18 | Information Technology | 71.35
19 | Labour & Rehabilitation | 51.23
20 | LSGD | 61.08
21 | NORKA | 62.54
22 | P&ARD | 51.17
23 | Planning & Economic Affairs | 76.35
24 | Ports | 40.54
25 | Power | 63.03
26 | PWD | 73.76
27 | Registration | 76.37
28 | Revenue | 51.6
29 | SC/ST Development Department | 66.2
30 | Social Welfare | 30.5
31 | Sports & Youth Affairs | 52.33
32 | Taxes | 76.75
33 | Tourism | 67.09
34 | Transport | 23.75
35 | Water Resources | 55.52

Government, after examining in detail the Evaluation Report of the Results-Framework Documents 2012-13 of each Administrative Department, are pleased to approve the scores as mentioned above. Government have approved in principle the use of the concept of Results-Framework Documents to improve the performance of departments and not to grade them. Further, it is not indicative of the level of performance.

(By Order of the Governor)
Rachna Shah, Secretary (Planning)

To: All Additional Chief Secretaries, Principal Secretaries and Secretaries; Dr. Prajapati Trivedi, Secretary, PMD, Cabinet Secretariat, Government of India (with C/L); Performance Management Division, Cabinet Secretariat, Govt. of India; All Heads of Departments; All District Collectors; Private Secretary to Hon'ble Chief Minister; Private Secretaries to all Ministers.
Copy to: Additional Secretary to Chief Secretary; PA to Principal Secretary to Govt. (Planning); CA to Additional Secretary & Director (CPMU); Stock file/OC.
Forwarded / By Order
Section Officer
Hence, at the end of the year it is difficult to ascertain performance. For example, simply claiming that 14 out of 20 targets were met is not enough. It is possible that the six targets that were not met were in the areas most central to the department's core mandate. This is the logic for using weights in RFDs.

Similarly, most performance evaluation systems in the Government use single-point targets rather than a scale. This is the second major conceptual flaw, and it makes it difficult to judge deviations from the agreed target. For example, how are we to judge the performance of a department if the target for rural roads for a particular year is 15,000 km and the achievement is 14,500 km? In the absence of explicit weights attached to each target and a specific scale of deviations, it is impossible to do a proper evaluation. This is the reason why a five-point scale and weights are used in RFDs.

As can be seen from the example in Table 6, the evaluation methodology embedded in RFDs is a significant improvement over previous approaches. Once we are able to prioritise the various success indicators based on the government's prevailing priorities and agree on how to measure deviation from the target, it is easy to calculate the composite score at the end of the year. In the hypothetical example above, this composite score is 84.5%. The ability to compute a composite score for each department at the end of the year is the most important conceptual contribution of the RFD methodology and makes it a forerunner amongst its peers.

This conceptual approach brings the monitoring and evaluation of Government Departments in India into a distinctly modern era of new public management. It creates benchmark competition amongst Government Departments and, as we know, competition is the source of all efficiency.
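The rural-roads example can be made concrete with a small sketch contrasting the two approaches. Only the 15,000 km target comes from the text; the other four scale values below are invented for illustration, and the linear interpolation between scale points is one plausible way to grade intermediate achievements.

```python
# Hypothetical illustration: a single-point target versus a five-point scale
# for the rural-roads example. Only the 15,000 km target is from the text;
# the remaining scale values and the interpolation rule are assumptions.
TARGET_KM = 15000
achieved_km = 14500

# Single-point target: all we learn is that the target was missed.
target_met = achieved_km >= TARGET_KM  # False, but missed by how much?

# Five-point scale: raw-score percentage -> target value in km (assumed).
scale = {100: 16000, 90: 15000, 80: 14000, 70: 13000, 60: 12000}

def raw_score(value, scale):
    """Grade an achievement by interpolating linearly between scale points."""
    points = sorted(scale.items())  # ascending: [(60, 12000), ..., (100, 16000)]
    if value <= points[0][1]:       # at or below the Poor value
        return 60.0
    if value >= points[-1][1]:      # at or above the Excellent value
        return 100.0
    for (p0, v0), (p1, v1) in zip(points, points[1:]):
        if v0 <= value <= v1:
            return p0 + (p1 - p0) * (value - v0) / (v1 - v0)

score = raw_score(achieved_km, scale)  # 85.0: halfway between Good and Very Good
```

Under the single-point target the department simply "failed"; under the scale, the same achievement earns a graded raw score that can then be weighted into the composite score.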
The composite score of departments measures the ability of the departments to meet their commitments. While the commitments of the Department of Road Transport are very different from those of the Department of School Education, we can still compare their managerial ability to achieve their agreed targets.

In the absence of the evaluation methodology embedded in RFDs, Government Departments were often at the mercy of the most powerful individuals in the government. Different individuals could always find something redeeming in the departments they favoured and something amiss in the departments not in their good books. Thus, depending on the perspective, departments could be simultaneously good, bad or ugly. Even when the evaluators were being objective, others could easily allege subjectivity in evaluation, because the pre-RFD evaluation methodology was severely flawed.
The result was that, as in many governments around the world, the views of the most powerful person prevailed in the end. This kind of subjective, personalised approach often seems to work in the short run because of the so-called 'audit effect': compared to no system, even a flawed M&E system can, upon introduction, have a temporary beneficial effect on behaviour, because officials become mindful that they are being audited (watched). However, officials are as clever as anyone else, and they soon realise that the evaluation system is subjective, selective and non-scientific. Once this realisation dawns upon them, they go back to their old habits. Thus, labour-intensive M&E, conducted by hauling up departments for endless presentations, is often a counter-productive strategy in the long run. It leads to M&E fatigue and eventually a calculated disregard for such M&E systems in government.

RFDs also differ from previous efforts in another fundamental way. Compared to previous approaches, RFDs represent the most comprehensive and holistic evaluation of government departments. To understand this point, let us look at Figure 11. As can be seen from this figure, there is a fundamental difference between monitoring and evaluation; because the two are referred to together as 'M&E', the distinction is often lost.

Figure 11: Evolution of Evaluation Instruments in Government (from monitoring towards evaluation)
Budget: Financial Inputs
Performance Budget: Financial Inputs, Activities, Outputs
Outcome Budget: Financial Inputs, Activities, Outputs, Outcomes
RFD: Financial Inputs, Activities, Outputs, Outcomes, Non-financial Outcomes

The process of evaluation helps us arrive at the bottom line for the evaluated organisations. A sound evaluation exercise should inform us whether the performance of an entity is good, bad or ugly. Monitoring, on the other hand, allows an entity to determine whether it is on track to achieving its bottom line. For example, as a passenger travelling from point A to point B, we care about the on-time departure and arrival of the plane, the experience of cabin service during the flight, the cost of the journey, etc. If these parameters are to our
satisfaction, we come to the conclusion that we had a 'good' flight. However, to achieve this result, the captain of the flight has to monitor a large number of diverse parameters: headwinds, tailwinds, outside temperature, inside temperature, fuel levels, fuel distribution, weight distribution, etc.

Monitoring and evaluation require different perspectives and skills. 'Monitoring' is concerned with the means to achieve a desirable end, whereas 'evaluation' is concerned with the end itself. In government, we often confuse the two and end up doing neither particularly well. By making a clear distinction between the two, the RFD approach has contributed to a more effective performance evaluation of Government Departments for the first time since independence.

This is not to say that 'evaluation' of Government Departments did not happen in the Government of India before the introduction of RFD. As in most other countries, the 'budget' was the main instrument for evaluating Government Departments. The bottom line was the size of the budget, and a department's performance was determined by its ability to remain within budget. Dissatisfaction with the narrow focus on 'financial inputs' as a success indicator, however, led to the adoption of 'Performance Budgets', which broadened the scope of the evaluation exercise to include 'activities' and 'outputs' in addition to financial inputs. With further advances in evaluation, experts began to focus on outcomes, and hence in 2006 the Government of India adopted the 'Outcome Budget'. In 2009, RFD further expanded the scope of departmental evaluation to include non-financial outcomes, in addition to everything in the Outcome Budget. Thus, RFD represents the most comprehensive definition of departmental performance.
It includes static and dynamic aspects of departmental performance; long-term and short-term aspects; financial and non-financial aspects; as well as quantitative and qualitative aspects. That is to say, while RFD still belongs to the genre of approaches that fall under the rubric of Management by Objectives (MBO), it has the most sophisticated evaluation methodology and the most comprehensive scope compared to its predecessors.

This is not an insignificant point. Many governments have tried a selective approach, focusing on a few key aspects of departmental performance. This approach leads to the famous 'water-bed' effect: those areas of the department that are under scrutiny may improve, but the rest of the department slackens and eventually the whole department suffers. Even if a government wants to focus on a few items of particular interest to it, it is best to give these items (action points) higher weights in the RFD rather than take them out of the RFD for special monitoring.

In the next section we will see that, by adopting the RFD approach for managing departmental performance, India has moved into a very distinguished league of reformers and, indeed, represents current international best practice.
6. What is the international experience in this area?

6.1 Similar policies used widely in developed and developing countries

The inspiration for this policy is derived from the recommendations of the Second Administrative Reforms Commission (ARC II). As mentioned before, in the words of ARC II: 'Performance agreement is the most common accountability mechanism in most countries that have reformed their public administration systems. At the core of such agreements are the objectives to be achieved, the resources provided to achieve them, the accountability and control measures, and the autonomy and flexibilities that the civil servants will be given.'

Similar policies are being used in most OECD countries. The leading examples of this policy come from New Zealand, the United Kingdom and the USA. In the USA, Congress passed the Government Performance and Results Act in 1993; under this law, the US President is obliged to sign a Performance Agreement with his Cabinet members. In the UK, this policy is called the Public Service Agreement. In developing countries, the best examples come from Malaysia and Kenya. Table 7 summarises the international experience with approaches similar to India's policy of Results-Framework Documents (RFD):

Table 7: Summary of international experiences in GPM

Australia: All line departments in Australia operate in the agency mode. The Public Service Act of 1999 includes a range of initiatives that provide for improving public accountability for performance, increasing competitiveness and enhancing leadership in the agencies. These initiatives include:
a. public performance agreements (similar to RFDs) for Agency Heads
b. replacement of out-of-date hierarchical controls with more contemporary team-based arrangements
c. greater devolved responsibility to the agency level
d.
giving agencies flexibility to decide on their own systems for rewarding high performance
e. streamlined administrative procedures
f. a strategic approach to the systematic management of risk.

The Financial Management and Accountability Act, 1997 provides the accountability and accounting framework for the agencies. Under this Act, Agency Heads are given greater flexibility and autonomy in their financial management. The Act requires Agency Heads to manage resources in an efficient, effective and ethical manner.
Brazil: In Brazil, the most advanced form of performance management is found at the state level. In the state of Minas Gerais, heads of Government Departments are required to have a Results Agreement (RA). While similar to RFDs, these RAs are for a period of four years, with yearly reviews, whereas Indian RFDs are negotiated one year at a time. The achievement scores against the commitments made in Results Agreements are published documents, available to the public through the official websites in Minas Gerais. Results Agreements in Minas Gerais also extend to city administrations and cover the quality of expenditure by the secretariat. In addition, these Results Agreements contain provisions for performance-related pay. The 2007 innovation in RAs was their cascading into two levels: the first level between the State Governor and the heads of State Secretariats and agencies, focused on results of impact for society; and the second level between the heads of agencies and their respective teams, identifying clearly and objectively the contribution of each staff member to the achievement of results.

Bhutan: Of all the developing countries, the recent adoption of Performance Agreements (PA) in Bhutan is the most impressive. Performance Agreements in Bhutan are signed between the Prime Minister and the respective ministers. A sample PA from Bhutan is enclosed at Flag C. Bhutan examined the international experience, including the Indian experience with RFDs, and decided to improve on all previous approaches. The Harvard-educated Prime Minister is credited with this policy. We have compared the quality of Performance Agreements in Bhutan with the quality of RFDs in India and found the Bhutanese PAs to be ahead of us in India. That is why I have included them in this list.
Canada: Canada has a long tradition of Government Performance Management. It introduced a basic performance management system as far back as 1969 and has a very sophisticated system of performance management. In addition to Performance Contracts (PCs) with departmental heads, which are similar to RFDs, it has a Management Accountability Framework (MAF) that also measures the quality of management and leadership of the department. Both systems are linked to the Performance Measurement Framework (PMF) and, unlike in India, have statutory backing.

Denmark: In Denmark, the contract management approach is seen as a major contribution to performance management. Four-year contracts (similar to RFDs) incorporating agreed performance targets are negotiated between specified agencies and parent Ministries and are monitored annually by the parent Ministry and the Ministry of Finance.

France: The Centres de Responsabilité in France are another example. Since 1990, many State services at both central and devolved levels have been established as Responsibility Centres. A contract with their Ministry gives the Directors greater management flexibility in operational matters in exchange for a commitment to achieve agreed objectives. It also stipulates a method for evaluating results. Contracts, negotiated case by case, are for three years.
Indonesia
Indonesia implements a system of Performance Contracts (PCs) for all Ministries. However, while in India these PCs, in the form of RFDs, are signed between the Secretary and the Minister, in Indonesia they are signed by the concerned Minister with the President. In both cases, the purpose is to translate vision into reality. While one Ministry is in charge of a programme, other Ministries whose support is needed in providing complementary inputs are also identified. These PCs monitor inputs, activities, outputs and outcomes. The central responsibility for drawing up PCs rests with the President's Delivery Unit (UKP4), a Delivery Unit under the President of Indonesia. Some provinces in Indonesia have opted for the system followed by the President's Delivery Unit and are allowed access to its online system.

Kenya
The Performance Contract System (PCS) in the Public Service was introduced in Kenya in 2004 as part of the Economic Recovery Strategy for Wealth and Employment Creation: 2003-07. The system got a fillip from the new government amid the widely-held perception of bureaucratic delays, inefficiency, emphasis on processes rather than on results, lack of transparency and accountability, inadequate trust, and instances of huge surrender of funds. The Performance Contracting System of Kenya has received a Public Service Award from UNDP and an Innovation Award from the Kennedy School of Government, Harvard University. Considered among the most sophisticated government performance management systems, it draws on the experience of leading practitioners of New Public Management (Australia, UK, New Zealand, etc.). Its coverage is also impressive: it covers almost all entities receiving support from the treasury.
Malaysia
The Programme Agreement system designed by Prime Minister Mahathir Mohammed was a pioneering effort in the developing world. Every public entity had to have a programme agreement that specified the purpose of the existence of that agency and its annual targets in terms of outputs and outcomes. This government performance system is credited by many experts with turning Malaysia from a marshy land into almost a developed country.

New Zealand
New Zealand was one of the first countries to introduce an annual performance agreement (similar to RFDs) between Ministers and permanent secretaries (renamed chief executives) who, as in India, are directly responsible to the Minister. The Public Finance Act of 1989 provided for a performance agreement to be signed between the chief executive and the concerned Minister every year. The Performance Agreement describes the key result areas that require the personal attention of the chief executive. The expected results are expressed in verifiable terms and include output-related tasks. The chief executive's performance is assessed every year with reference to the performance agreement. The system provides for bonuses to be earned for good performance and removal for poor performance. The assessment is done by a third party, the State Services Commission, with due consideration given to the views of the departmental Minister. A written performance appraisal is prepared; the chief executive concerned is given an opportunity to comment, and his/her comments form part of the appraisal.
The annual purchase agreement for outputs between the Minister and the department or agency complements Performance Agreements. The distinction between service delivery (outputs) and policy (outcomes) clarifies accountability: departments or agencies are accountable for outputs; Ministers are accountable for outcomes. Purchase agreements specify the outputs to be bought, as well as the terms and conditions surrounding the purchase, such as the procedure for monitoring, amending and reporting.

United Kingdom
In the UK, a framework agreement specifies each agency's general mission and the responsibilities of the Minister and chief executive. A complementary annual performance agreement between the Minister and chief executive sets out performance targets for the agency. Setting targets is the responsibility of the Minister. Agencies are held accountable through quarterly reports to the Minister. Under Tony Blair, the UK went on to implement Public Service Agreements (PSAs), which detail the aims and objectives of UK Government Departments for a three-year period. Such agreements also "describe how targets will be achieved and how performances against these targets will be measured." The agreement may consist of a departmental aim, a set of objectives and targets, and details of who is responsible for delivery. The main elements of a PSA are as follows:
1. An introduction, setting out the Minister or Ministers accountable for delivering the commitments, together with the coverage of the PSA, as some cover other departments and agencies for which the relevant Secretary of State is accountable;
2. The aims and objectives of the department or cross-cutting area;
3. The resources which have been allocated to it in the CSR;
4. Key performance targets for the delivery of its services, together with, in some cases, a list of key policy initiatives to be delivered;
5. A statement about how the department will increase the productivity of its operations.

United States of America
In 1993 the US Congress enacted the Government Performance and Results Act (GPRA). Under this Act, the President of the United States is required to sign Performance Agreements (similar to RFDs) with Secretaries. These Performance Agreements include the departmental Vision, Mission, Objectives, and annual targets. The achievements against these are to be placed before the Congress. An example is enclosed at Flag B. The Secretaries in turn sign Performance Agreements with Assistant Secretaries, and the accountability for results and performance eventually trickles down to the lowest levels.

All countries discussed in this volume have already adopted some variant of this policy.

6.2 Importance of Management Systems
As depicted in Figure 12, all management experts agree that around 80% of the performance of any organisation depends on the quality of the systems used. That is why the focus of PMES is on improving management control systems within the Government.
[Figure 12: Importance of Management Systems. Determinants of Performance: System 80% / Rest 20%; Leader 80% / People 20%.]

6.3 Shift in Focus from 'Reducing Quantity of Government' to 'Increasing Quality of Government'
Figure 13 depicts a distinct worldwide trend in managing government performance. In response to perceived dissatisfaction with the performance of government agencies, governments around the world have taken certain steps. These steps can be divided into two broad categories: (a) reduction in the quantity of government, and (b) increase in the quality of government. Over time, most governments have reduced their focus on reducing the quantity of government and increased their focus on improving the quality of government. The former is represented by traditional methods of government reform such as golden handshakes, cutting the size of Government Departments, and sale of public assets through privatisation.

[Figure 13: Quality vs. Quantity of Government. Premise: Government agencies have not delivered what was expected of them. Responses: Reduce Quantity of Government (Privatisation; Traditional Civil Service Reforms) and Increase Quality of Government (Trickle-down Approach; Direct Approach).]
The policies undertaken by various governments to increase the quality of government can be further classified into two broad approaches: (a) the trickle-down approach, and (b) the direct approach.

[Figure 14: Increasing Quality of Government. Trickle-down Approach: Performance Agreement, Enabling Environment. Direct Approach: Client Charter, Quality Mark, E-Government, E-Procurement, ISO 9000, Peer Reviews, Knowledge Management.]

PMES falls under the category of the trickle-down approach, as it holds the top accountable and the accountability for results eventually trickles down to the lowest echelons of management. It creates a sustainable environment for implementing all reforms. The generic name of PMES is Performance Agreement. These approaches have a sustainable impact on all aspects of performance in the long run. The direct approach, on the other hand, consists of many instruments of performance management that have a direct impact on some aspect of performance. Thus these approaches are complementary and not substitutes for each other. In fact, PMES also makes use of these direct approaches by making citizens' charters and grievance redress systems a mandatory requirement for all Government Departments in their RFDs.

7. What has been the progress in implementation?
Today, PMES covers about 80 Departments of the Government of India and 800 Responsibility Centres (Attached Offices, Subordinate Offices and Autonomous Bodies) under these Departments (Annex I). In addition, 18 states of the Indian Union are at various stages of implementing the RFD system at the state level. In Punjab, the Government has also experimented with RFDs at the district level, whereas in Assam, RFDs have been implemented at the level of Responsibility Centres.
The following figure describes the progress in the implementation of RFD thus far:

[Figure 15: Current Coverage of RFD Policy. 2009-2010: RFDs for 59 Departments; 2010-2011: RFDs for 62 Departments; 2011-2014: RFDs for 74 Departments out of a total of 80, plus RFDs for the Responsibility Centres of the remaining 6 Departments; 800 Responsibility Centres; 18 States.]

As can be seen from Figure 15, the RFD policy has stabilised in terms of coverage of departments since 2011. The following 18 states are at various stages of implementing the RFD policy:

Figure 16: State-Level Implementation of RFD Policy (Where Implementation Has Begun)
1. Maharashtra
2. Punjab
3. Karnataka
4. Kerala
5. Himachal Pradesh
6. Assam
7. Haryana
8. Chhattisgarh
9. Tripura
10. Rajasthan
11. Andhra Pradesh
12. Mizoram
13. Jammu & Kashmir
14. Meghalaya
15. Odisha
16. UP (request)
17. Puducherry (request)
18. Tamil Nadu (request)
8. Implementation of Key Administrative Reforms through RFD
Essentially, an RFD represents a mechanism to implement policies, programmes and projects. These policies, programmes and projects can be divided into two categories: one relates only to a Department (or, more specifically, the departmental mandate), and the other applies across all departments. The latter category is included in Section 2 of the RFD as 'Mandatory Objectives'. Over the past five years, several key administrative reform initiatives have been implemented using the RFD mechanism. These are not new initiatives, but they were not being implemented effectively by the Government due to lack of accountability and the absence of a follow-up system. RFD filled these two voids and ramped up implementation. The following figure summarises some of the key initiatives implemented via the RFD mechanism.

Figure 17: Scope of RFD (2010-2014)
- Citizens'/Clients' Charter
- Grievance Redress Mechanism
- ISO 9001 in Government
- Corruption Mitigation Strategies
- Innovation in Government
- Implementing RTI in Government
- Compliance with CAG Audit

While it is not possible to describe the rich implementation experience with regard to each of the initiatives mentioned above, readers are encouraged to visit the PMD website (www.performance.gov.in) for detailed guidelines and progress in implementation.

9. Use of G2G Software to manage RFD implementation
In collaboration with NIC and PMD, the Cabinet Secretariat has developed powerful software that allows all transactions related to the implementation of RFD to be conducted online. This software is called the Results-Framework Management System (RFMS), and it enables the following activities to be done online:
- preparation of the Results-Framework Document (RFD)
- preparation and annual monitoring of the Clients'/Citizens' Charters (CCC)
- M&E of departmental performance on an annual and monthly basis

It is mandatory for all departments to use the RFMS software for preparing and submitting RFDs. RFMS has already led to the use of less paper, and it is expected to eventually lead to a paperless environment for implementing RFDs.

[Figure 18: Screenshot of the RFMS website. Results-Framework Management System, http://www.rfms.nic.in]

10. Impact of PMES/RFD
Any system takes a long time to implement and reach its full potential; thus the impact of systems cannot be evaluated in the short term. If we looked at the impact of systems in the short term, policy makers would stop taking a long-term perspective. In addition, it is important to keep in mind that the system has not been fully implemented. Some of the key features are still in a disabled mode. For example, not all departments are covered by the RFD system. It is argued by some departments that what is good for the goose is also good for the gander. They argue that unless finance and planning are brought under this common accountability framework, they will remain the weak links in the chain of accountability. Similarly, it is argued that Government needs to demonstrate consequences for performance or the lack thereof. As the old saying goes: 'If you are not rewarding success, you are probably rewarding failure.'

In spite of incomplete implementation, it is encouraging to note the preliminary evidence that is beginning to trickle in. As can be seen from Figure 19, before the introduction of a success indicator for measuring performance with respect to Grievance Redress in the RFD, the difference between grievances received and disposed of was significant. In 2009, only about 50% of grievances registered on the computerised grievance redress system were disposed of. In 2013, after three years of RFD implementation, the disposal rate for grievances filed electronically is effectively 100%. The data for this statistic is generated and owned by another department in the Government of India, the Department of Administrative Reforms and Public Grievances (DARPG), so there is no conflict of interest in presenting it as evidence on behalf of PMD. According to staff of DARPG, before Grievance Redress became a mandatory performance requirement in RFDs, it was widely ignored. If RFD has modified behaviour in this area, it is reasonable to believe that it has had an impact in other areas as well.

Figure 19: Impact of RFD on Grievance Redress Mechanism (Receipts / Disposals)
2009: 107961 / 53075
2010: 139240 / 117612
2011: 172520 / 147027
2012: 201197 / 168308
2013: 113896 / 113151

Similarly, at the request of the Ministry of Finance, mandatory indicators dealing with the timely disposal of CAG paras were introduced in the 2011-12 RFDs. At the time of introduction, the total pendency of paras was 4216. As can be seen from Figure 20, by 2013-14 the pendency of CAG paras had come down to 533.

Figure 20: Impact of RFD on reduction in pendency of CAG Paras in GOI
June 2010: 4216 pending paras; March 2014: 533 pending paras
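The disposal rates implied by the Figure 19 data can be verified with a few lines of Python. The figures below are taken directly from the chart; the function name is illustrative and not part of any official system:

```python
# Grievance receipts and disposals per year, from Figure 19 (DARPG data).
FIGURE_19 = {
    2009: (107961, 53075),
    2010: (139240, 117612),
    2011: (172520, 147027),
    2012: (201197, 168308),
    2013: (113896, 113151),
}

def disposal_rates(data):
    """Return the fraction of received grievances disposed of, per year."""
    return {year: disposed / received
            for year, (received, disposed) in data.items()}

rates = disposal_rates(FIGURE_19)
# The 2009 rate is just under 50%, while the 2013 rate exceeds 99%,
# consistent with the before/after claim in the text.
```

This confirms the roughly 50% disposal rate cited for 2009 and shows the 2013 rate at just over 99%, which the text rounds to 100%.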
A quick glance at RFD data yields innumerable examples of dramatic turnaround as a result of the introduction of RFD. The following figures give a few examples to show the before-and-after impact on several important economic and social indicators. The figures are self-explanatory.

[Before-and-after RFD comparisons:
- Coverage of OBC students for Post-matric scholarship (Ministry of Welfare): average 2005-08: 5; average 2009-14: 17
- Coverage of SC students for Post-matric scholarship (Ministry of Welfare): average 2005-08: 28.13; average 2009-14: 47.26
- Rural Teledensity, average annual growth rate (Department of Telecommunications): pre-RFD 2005-10: 3.19; post-RFD 2009-14: 7.15
- Reduction in Infant Mortality Rate (IMR) per 1000 live births: average 2005-08: 55.75; average 2009-14: 43.8
- Increase in Milk Production (Department of Animal Husbandry, Dairying and Fisheries), average annual milk production (MMT): pre-RFD 2005-09: 104; post-RFD 2009-14: 125
- Fresh Capacity Addition of Power (Ministry of Power): annual fresh capacity addition (MW) charted from 1997-98 to 2012-13]
Another piece of early evidence comes from a Ph.D. thesis submitted to the Department of Management Studies, Indian Institute of Technology (IIT), Delhi. This research study was undertaken with the objective of providing an integrated perspective on the Results-Framework Document (RFD) in the emerging technological and socio-economic context. The study analysed the structure and processes of the ministries and their impact on the effectiveness of the RFD process. The effort was to identify the gaps and overlaps that exist while implementing the initiative under the Central Government dispensation.

The study based its analyses on the perceptions of executives belonging to the All India Services / Central Services / Central Secretariat Services with a seniority of Under Secretary and above, obtained through a structured questionnaire-based survey. A sample size of 117 officers was taken. The findings were corroborated and refined based on semi-structured interviews of senior civil servants.

The research revealed that the RFD process has been perceived to contribute significantly in the following areas of governance:
i. Objective assessment of schemes and programmes being implemented by the Ministries in the Government of India;
ii. Development of a template to assess the performance of ministries objectively;
iii. Facilitating objective performance appraisal of civil servants;
iv. Inculcating performance orientation in civil servants by channelising their efforts towards meeting organisational objectives;
v. Facilitating a critical review of the schemes, programmes and internal organisational processes for bringing in required reforms;
vi. Facilitating the policy makers to relook at and redefine the ministry's vision, mission and objectives.

Another robust source of information comes from practitioners who have seen both systems, old and new. For example, Mr. J. N. L. Srivastava, Former Secretary to the Government of India, outlined the following advantages of the RFD system:
i) The timeline as a Success Indicator has accelerated the process of decision making, issue of sanctions, release of funds, etc.
ii) In the past, monitoring of various programmes has been ad hoc and many times unattended. The RFD system has helped in the development and adoption of better and regular systems of monitoring, and in the faster introduction of IT-based monitoring systems.
iii) With a focus on RFDs for the Responsibility Centres, which are directly involved in the implementation of the schemes, the implementation of the programmes and their monitoring has improved.
Similarly, Mr. S. P. Jakhanwal, Former Secretary (Coordination) in the Cabinet Secretariat, says that the impact of the RFD system should not be seen only from the perspective of figures of achievements against targets. In the last five years, the RFD system has impacted the performance of the ministries/departments of the central government in many ways:
i) New initiatives (left out earlier) were identified in consultation with the Ministries;
ii) Larger outputs, a more efficient delivery system, and reduced time in the delivery of services;
iii) Schemes were made more self-supporting, with higher generation of revenues;
iv) Realistic targets (as against soft or unrealistic targets) were agreed to during discussions with the ministries;
v) Some of the ways in which ministries assigned to Syndicate 6 benefited in improving their performance by adopting the RFD system are listed in Annex I.

He goes on to give several examples in each of the above categories. His entire comments are available on www.performance.gov.in or www.gopempal.org.

Another very persuasive argument in support of PMES/RFD comes from one of the world's top three credit rating agencies, Fitch India Ratings and Research. Known for their objectivity and credibility, Fitch goes on to say: "PMES – A Step in the Right Direction: India Ratings & Research (Ind-Ra) believes that the 'Performance Monitoring and Evaluation System' (PMES) is an opportune step to improve public governance and deliver better public goods/services in India." "Ind-Ra believes that PMES is … in line with international best practices.
The overall objectives of the PMES are in sync with the new union government's focus on 'minimum government and maximum governance.' Ind-Ra believes that the PMES introduced by the previous government should not only continue, but also be strengthened with time."

The entire Fitch report, fleshing out the above argument in a systematic way, is available at: http://www.indiaratings.co.in/upload/research/specialReports/2014/6/27/indra27GoI.pdf. A copy of the report is also annexed (Annexure J).

Finally, as we shall see in the next section, a policy similar to RFD has had a dramatic impact on the performance of public enterprises in India. We hope that in a few years' time, the impact of RFDs will be judged to be as dramatic as it has been in the case of MOUs.
11. RFD versus Memorandum of Understanding (MOU)
11.1 About MOU
The Memorandum of Understanding (MOU) is a negotiated document between the Government, acting as the owner of a Central Public Sector Enterprise (CPSE), and the corporate management of the CPSE. It contains the intentions, obligations and mutual responsibilities of the Government and the CPSE, and is directed towards strengthening CPSE management by results and objectives rather than management by controls and procedures.

The beginnings of the introduction of the MOU system in India can be traced to the recommendation of the Arjun Sengupta Committee on Public Enterprises in 1984. The first set of MOUs was signed by four Central Public Sector Enterprises for the year 1987-88. Over a period of time, an increasing number of CPSEs was brought within the MOU system. Further impetus to extend the MOU system was provided by the Industrial Policy Resolution of 1991, which observed that CPSEs "will be provided a much greater degree of management autonomy through the system of Memorandum of Understanding." Today, as many as 200 CPSEs (including subsidiaries and CPSEs under construction) have been brought into the fold of the MOU system.

During this period, considerable modifications and improvements in the structure of, and procedures for preparing and finalising, the MOUs have been effected. These changes have been brought about on the basis of experience in the working of the MOU system, supported by studies carried out from time to time by expert committees on specific aspects of the MOU system.

Broadly speaking, the obligations undertaken by CPSEs under the MOU are reflected by three types of parameters: (a) financial, (b) physical and (c) dynamic. Considering the very diverse nature of activities in which CPSEs are engaged, it is obviously not possible to have a uniform set of physical parameters.
These would vary from enterprise to enterprise and are determined for each enterprise separately during discussions held by the Task Force (TF) with the Administrative Ministries and the CPSEs. Similarly, depending on the corporate plans and long-term objectives of the CPSEs, the dynamic criteria are also identified on an enterprise-specific basis.
11.2 Comparison and Contrast with RFD
The RFD and MOU systems are exactly the same in their conceptual origins and design; the differences are mostly cosmetic. A large part of the success of the RFD system can be traced to the successful experience of implementing MOUs in the public enterprise sector.

Similarities between the RFD and MOU systems
1. Both represent an agreement between a 'principal' and an 'agent'.
   a. The RFD is between the Minister (principal) and the Departmental Secretary (agent).
   b. The MOU is between the departmental Secretary (principal) and the Chief Executive of the public enterprise (agent).
2. Both have similar institutional arrangements.
   a. The quality of both is reviewed by the ATF (a non-government body of experts).
   b. Both are approved by High Powered Committees chaired by the Cabinet Secretary.
3. Both use a Composite Score to summarise overall performance, and the methodology for calculating the Composite Score is the same.
   a. The RFD Composite Score reflects the performance of a Government Department.
   b. The MOU Composite Score reflects the performance of a public enterprise.

Differences between RFD and MOU
1. One significant and substantive difference is with regard to incentives for performance. The MOU score is linked to a financial incentive for public enterprise employees, whereas the RFD score is not yet linked to financial incentives.
2. Cosmetically, RFD and MOU differ in terms of the scale used: RFD uses a 0%-100% scale, whereas MOU has a five-point scale from 1 to 5.

The MOU system has had a major impact on the performance of public enterprises. While this is true with regard to a wide range of parameters, we present here the impact of the MOU system on the surplus generated for the exchequer and on the trend in net profits of Central Public Sector Enterprises (CPSEs).
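The Composite Score methodology shared by the two systems can be illustrated with a short sketch. This is a hedged, illustrative reading of the approach: it assumes each success indicator carries a weight and a five-column target scale (from Excellent at 100% down to Poor at 60%), with linear interpolation between columns and higher-is-better indicators. The function names and sample targets are ours, not part of any official guideline:

```python
def indicator_score(achievement, targets):
    """Map a raw achievement onto a 60-100 score via a five-column scale.

    `targets` maps score levels (100, 90, 80, 70, 60) to target values,
    assumed to decrease as the level falls (higher achievement is better).
    """
    levels = sorted(targets, reverse=True)          # [100, 90, 80, 70, 60]
    if achievement >= targets[levels[0]]:
        return 100.0                                # at or above the 100% column
    if achievement <= targets[levels[-1]]:
        return float(levels[-1])                    # at or below the 60% column
    for hi, lo in zip(levels, levels[1:]):
        t_hi, t_lo = targets[hi], targets[lo]
        if t_lo <= achievement <= t_hi:
            # Linear interpolation between the two bracketing columns.
            return lo + (achievement - t_lo) / (t_hi - t_lo) * (hi - lo)

def composite_score(indicators):
    """Weighted sum of indicator scores; weights are assumed to sum to 1."""
    return sum(w * indicator_score(a, t) for w, a, t in indicators)

# Illustrative targets for one indicator (e.g. % of grievances disposed of).
targets = {100: 95, 90: 90, 80: 85, 70: 80, 60: 75}
score = composite_score([(0.6, 96, targets), (0.4, 82.5, targets)])
```

In this sketch, an achievement of 96 against the targets above scores 100, an achievement of 82.5 interpolates to 75, and the weighted composite is 90.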
Figure 21: Contribution of CPSEs to the National Exchequer and their Net Profit earnings

Contribution of CPSEs to the National Exchequer (Rs crore): 1998-99: 46,934; 1999-00: 56,157; 2000-01: 61,037; 2001-02: 62,753; 2002-03: 81,867; 2003-04: 89,036; 2004-05: 1,10,599; 2005-06: 1,25,384; 2006-07: 1,48,783; 2007-08: 1,65,994; 2008-09: 1,51,728; 2009-10: 1,39,918; 2010-11: 1,56,124; 2011-12: 1,62,402; 2012-13: 1,62,762

Net Profit of CPSEs (Rs crore): 1990-91: 2272; 1991-92: 2356; 1992-93: 3271; 1993-94: 4545; 1994-95: 7187; 1995-96: 9574.32; 1996-97: 9991.82; 1997-98: 13719.96; 1998-99: 13203; 1999-00: 14331; 2000-01: 15653; 2001-02: 25978; 2002-03: 32344; 2003-04: 53084; 2004-05: 65429; 2005-06: 69536; 2006-07: 81055; 2007-08: 81274; 2008-09: 83867; 2009-10: 92203; 2010-11: 92129; 2011-12: 98245; 2012-13: 115299

As mentioned earlier, the distinctly positive impact of the MOU on public enterprise performance is one of the reasons for the Government's optimism about this contractual approach to performance management.

12. In Conclusion: A SWOT Analysis of PMES/RFD
While no management system is perfect, only an objective analysis of a system can lead to improvements. In that spirit, we briefly summarise a SWOT analysis of the system.
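As a rough check on the net-profit trend in Figure 21, the compound annual growth rate between the first and last data points can be computed directly. The two figures come from the chart above; the helper name is illustrative:

```python
def cagr(first, last, years):
    """Compound annual growth rate between two values `years` apart."""
    return (last / first) ** (1 / years) - 1

# CPSE net profit (Rs crore) from Figure 21:
# 2,272 in 1990-91 rising to 1,15,299 in 2012-13, i.e. over 22 years.
growth = cagr(2272, 115299, 22)
```

The result is a growth rate of roughly 19-20% a year sustained over two decades, which is what the chart's steep upward slope conveys visually.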
Strengths
- PMES has stabilised and is widely understood and accepted.
- The quality of the evaluation system has improved over the past five years.
- It is considered to be a state-of-the-art evaluation methodology by independent and informed observers. This was re-affirmed by international experts at a Global Roundtable organised by the Performance Management Division, Cabinet Secretariat.
- 18 states cutting across political lines have initiated implementation of PMES, and hence there will be no political opposition to a revamped PMES.

Weaknesses
- In the absence of a performance-related incentive scheme (PRIS), it is difficult to sustain motivation. Implementation of PRIS has been recommended since the 4th Central Pay Commission in 1987, and it is time to implement it.
- The current system of Performance Appraisal Reports in Government can be improved further and linked to the RFD. Today, officers get almost perfect scores while their department gets only a modest score. This disconnect has to be eliminated.
- While the results of the RFD are placed before Parliament, they need to be publicised more widely to have a greater impact on the behaviour of officials.
- The connection between performance and career advancement should become more apparent.

Opportunities
- There is a clear hunger in the world, as well as in India, for good governance and accountability through rigorous evaluation.
- This is a very normal way to manage any organisation (public or private), and citizens expect this to happen in Government.
- The entire machinery for accountability for results through rigorous evaluation is in place and can become truly effective with strong political leadership.
- This is one of the easiest aspects of governance for citizens to comprehend.
Threats
- Any delay in further enhancing the effectiveness of PMES and RFD will lead to disillusionment, as it did with previous efforts like the Performance Budget and the Outcome Budget.
For more information on PMES/RFD, please visit: www.performance.gov.in
For more information on the Global Roundtable and background material, please visit: www.gopempal.org
Bibliographical References

Kamensky, John M. 2013. "What Our Government Can Learn from India." Government Executive blog, October 3, 2013. Available from: http://www.govexec.com/excellence/promising-practices/2013/10/what-our-government-can-learn-india/71296/

Sinha, Sunil Kumar and Devendra Kumar Pant. 2014. "GoI's Performance Management & Evaluation System Covers 80 Departments and Ministries of the Government of India: Special Report." Fitch India Ratings and Research - Micro Research, June 27, 2014. Available from: http://www.indiaratings.co.in/upload/research/specialReports/2014/6/27/indra27GoI.pdf

Trivedi, Prajapati. 1990. "Memorandum of Understanding and Other Performance Improvement Systems: A Comparison." The Indian Journal of Public Administration, Vol. XXXVI, No. 2, April-June 1990.

— 1995. "Improving Government Performance: What Gets Measured, Gets Done." Economic and Political Weekly, Vol. 29, No. 35, 27 August 1995, pp. M109-M114.

— 2003. "Performance Agreements in US Government: Lessons for Developing Countries." Economic and Political Weekly, Vol. 38, No. 46, 15 November 2003, pp. 4859-4864.

— 2005. "Designing and Implementing Mechanisms to Enhance Accountability for State-Owned Enterprises." In Proceedings of the Expert Group Meeting on Re-inventing Public Enterprise and its Management, organised by UNDP, October 27-28, 2005, United Nations Building, New York.

— 2008. "Performance Contracts in Kenya." In Splendour in the Grass: Innovation in Administration. New Delhi: Penguin Books, pp. 211-235.

— 2010 (a). "Programme Agreements in Malaysia: Instruments for Enhancing Government Performance and Accountability." The Administrator: Journal of Lal Bahadur Shastri National Academy of Administration, Vol. 51, No. 1, pp. 110-138.
— 2010 (b). "Performance Monitoring and Evaluation System for Government Departments." The Journal of Governance, Vol. 1, No. 1, pp. 4-19.

— 2011. "India: Indian Experience with the Performance Monitoring and Evaluation System for Government Departments." In Proceedings from the Second International Conference on National Evaluation Capacities, organised by UNDP, September 12-14, 2011, Johannesburg, South Africa.
Annexure D: Results-Framework Document (RFD) for D/o Agriculture and Cooperation, 2012-13

RFD (Results-Framework Document) for the Department of Agricultural Research and Education (2012-2013)
Section 1: Vision, Mission, Objectives and Functions

Vision
Harnessing science to ensure comprehensive and sustained physical, economic and ecological access to food and livelihood security to all Indians, through generation, assessment, refinement and adoption of appropriate technologies.

Mission
Sustainability and growth of Indian agriculture by interfacing agricultural research, higher education and front-line extension initiatives complemented with institutional, infrastructural and policy support that will create an efficient and effective science-harnessing tool.

Objectives
1. Improving natural resource management and input use efficiency
2. Strengthening of higher agricultural education
3. Utilizing frontier research in identified areas/programs for better genetic exploitation
4. Strengthening of frontline agricultural extension system and addressing gender issues
5. IP management and commercialization of technologies
6. Assessment and monitoring of fishery resources
7. Development of vaccines and diagnostics
8. Post harvest management, farm mechanization and value addition

Functions
1. To develop Public-Private Partnerships in developing seeds, planting materials, vaccines, feed formulations, value added products, agricultural machinery etc. To serve as a repository in the agriculture sector and develop linkages with national and international organizations as per the needs and current trends.
2. To plan, coordinate and monitor research for enhancing production and productivity of the agriculture sector.
3. To enhance quality of higher education in the agriculture sector.
4. Technology generation, commercialization and transfer to end users.
5. Human resource development and capacity building.
6. To assess implementation of various programmes in relation to targets set and provide mid-course correction, if required.
8. To provide technological backstopping to various line departments.
Section 2: Inter se Priorities among Key Objectives, Success Indicators and Targets

(Target/Criteria Values correspond to Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70% and Poor = 60%.)

Objective [1] Improving natural resource management and input use efficiency (Weight: 17.00)

Action [1.1] Integrated nutrient management (INM)
- [1.1.1] Developing GIS based district/block level soil fertility maps (Number, weight 2.55): Excellent 112, Very Good 100, Good 88, Fair 78, Poor 67
- [1.1.2] Developing INM packages for different agro-eco regions of the country (Number, weight 2.55): Excellent 7, Very Good 6, Good 5, Fair 4, Poor 3
- [1.1.3] Organizing training & demonstrations (Number, weight 1.70): Excellent 27, Very Good 25, Good 22, Fair 19, Poor 16

Action [1.2] Integrated water management (IWM)
- [1.2.1] Technologies for enhancing water use efficiencies (Number, weight 1.70): Excellent 6, Very Good 5, Good 4, Fair 3, Poor 2
- [1.2.2] Technologies for water harvesting, storage and groundwater recharge (Number, weight 1.70): Excellent 6, Very Good 5, Good 4, Fair 3, Poor 2
- [1.2.3] Models/DSS for multiple uses of water (Number, weight 0.85): Excellent 4, Very Good 3, Good 2, Fair 1, Poor 0
- [1.2.4] Organizing training & demonstrations (Number, weight 1.70): Excellent 17, Very Good 15, Good 14, Fair 12, Poor 10

Action [1.3] Climate resilient agriculture
- [1.3.1] Awareness building amongst stakeholders through trainings/demonstrations (Number, weight 1.02): Excellent 170, Very Good 150, Good 136, Fair 119, Poor 102
- [1.3.2] Human resource development and capacity building (Number, weight 1.02): Excellent 170, Very Good 150, Good 136, Fair 119, Poor 102
- [1.3.3] Testing crop varieties for climate resilience at different locations (Number, weight 2.21): Excellent 12, Very Good 10, Good 9, Fair 8, Poor 7
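Under the RFD methodology, each indicator's achievement is rated against the five Target/Criteria columns (Excellent = 100% down to Poor = 60%), and the weighted scores are then aggregated into the department's composite score. The sketch below illustrates the shape of that calculation using the [1.1.1] row above; the linear interpolation between adjacent columns and the `raw_score` helper are illustrative assumptions, not the official REM formula.

```python
# Illustrative sketch: map an achievement to a percent score on the
# Excellent(100)/Very Good(90)/Good(80)/Fair(70)/Poor(60) scale, then
# apply the indicator's weight. Linear interpolation between adjacent
# target columns is an assumption; the official rule may differ.

def raw_score(achievement, targets):
    """targets: percent level -> target value (higher achievement is
    better), e.g. {100: 112, 90: 100, 80: 88, 70: 78, 60: 67}."""
    levels = sorted(targets)                  # [60, 70, 80, 90, 100]
    if achievement >= targets[100]:
        return 100.0
    if achievement <= targets[60]:
        return 60.0
    for lo, hi in zip(levels, levels[1:]):
        t_lo, t_hi = targets[lo], targets[hi]
        if t_lo <= achievement <= t_hi:       # interpolate in this band
            return lo + (achievement - t_lo) / (t_hi - t_lo) * (hi - lo)

# Success indicator [1.1.1] (GIS based soil fertility maps), weight 2.55:
targets_111 = {100: 112, 90: 100, 80: 88, 70: 78, 60: 67}
score = raw_score(94, targets_111)            # 94 maps between Good and Very Good
weighted = 2.55 / 100 * score                 # contribution to the composite score
print(f"raw score {score}, weighted contribution {weighted:.2f}")
```

Summing such weighted contributions over all success indicators of a department yields the overall composite score for the year.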
Objective [2] Strengthening of higher agricultural education (Weight: 17.00)

Action [2.1] Accreditation/Extension of accreditation of agricultural universities
- [2.1.1] Number of universities granted accreditation/extension of accreditation (Number, weight 2.55): Excellent 9, Very Good 8, Good 6, Fair 5, Poor 4

Action [2.2] Grant of ICAR International fellowships to Indian and foreign students
- [2.2.1] Number of fellowships awarded, subject to availability of competent candidates (Number, weight 2.55): Excellent 13, Very Good 12, Good 10, Fair 8, Poor 6

Action [2.3] Grant of JRF and SRF to students
- [2.3.1] Total no. of fellowships granted every year, subject to availability of competent candidates (Number, weight 4.25): Excellent 630, Very Good 625, Good 575, Fair 500, Poor 475

Action [2.4] Establishment of experiential learning units
- [2.4.1] Experiential learning units established (Number, weight 1.70): Excellent 25, Very Good 22, Good 20, Fair 18, Poor 15

Action [2.5] Financial support and monitoring of progress
- [2.5.1] Amount released (Rupees in crores, weight 2.55): Excellent 380, Very Good 360, Good 340, Fair 320, Poor 300

Action [2.6] Capacity building and faculty up-gradation
- [2.6.1] Number of teachers trained per year (Number, weight 1.70): Excellent 1000, Very Good 900, Good 800, Fair 700, Poor 600
- [2.6.2] Number of Summer/Winter Schools organized (Number, weight 1.70): Excellent 25, Very Good 22, Good 20, Fair 18, Poor 16

Objective [3] Utilizing frontier research in identified areas/programs for better genetic exploitation (Weight: 13.00)

Action [3.1] Collection, characterization and conservation of genetic resources
- [3.1.1] Number of germplasm collected/characterized and conserved (other crops) (Number, weight 1.17): Excellent 5000, Very Good 4000, Good 3000, Fair 2500, Poor 2000
- [3.1.2] Number of germplasm collected (horticultural crops) (Number, weight 1.17): Excellent 350, Very Good 315, Good 280, Fair 245, Poor 210
Action [3.2] Evaluation of genetic resources/improved varieties for suitable crop husbandry practices
- [3.2.1] Number of germplasm evaluated (Number, weight 1.17): Excellent 3000, Very Good 2500, Good 2000, Fair 1500, Poor 1000

Action [3.3] Production of breeder seed, other seeds and planting materials
- [3.3.1] Quantity of breeder seed produced (other crops) (Tonnes, weight 1.30): Excellent 8500, Very Good 8200, Good 8000, Fair 7500, Poor 7000
- [3.3.2] Quantity of breeder seed produced (horticultural crops) (Tonnes, weight 1.17): Excellent 4000, Very Good 3600, Good 3200, Fair 2800, Poor 2400
- [3.3.3] Quantity of planting materials produced annually (Number in lakhs, weight 1.17): Excellent 45, Very Good 40.5, Good 36, Fair 31.5, Poor 27

Action [3.4] Development of improved varieties suited to diverse agroecologies
- [3.4.1] Number of varieties developed (other crops) (Number, weight 1.17): Excellent 15, Very Good 12, Good 10, Fair 8, Poor 5
- [3.4.2] Number of varieties developed (pulses/oilseeds) (Number, weight 1.17): Excellent 17, Very Good 13, Good 11, Fair 9, Poor 7
- [3.4.3] Number of varieties developed (horticultural crops) (Number, weight 1.17): Excellent 20, Very Good 18, Good 16, Fair 14, Poor 12

Action [3.5] Production of piglets (8-12 weeks of age)
- [3.5.1] Provisioning of piglets to farmers and development agencies (Number, weight 1.17): Excellent 1000, Very Good 900, Good 800, Fair 600, Poor 550

Action [3.6] Production of day old as well as 6 weeks old chicks
- [3.6.1] Provisioning of day old/6 weeks old chicks to farmers and development agencies (Number in lakhs, weight 1.17): Excellent 2.5, Very Good 2, Good 1.5, Fair 1, Poor 0.5
Objective [4] Strengthening of frontline agricultural extension system and addressing gender issues (Weight: 13.00)

Action [4.1] Technology assessment through on-farm trials
- [4.1.1] Number of technologies assessed (Number, weight 5.20): Excellent 240, Very Good 220, Good 200, Fair 150, Poor 140

Action [4.2] Capacity building through training programmes
- [4.2.1] Number of training programmes organized (Number, weight 5.20): Excellent 20000, Very Good 18000, Good 16000, Fair 15000, Poor 14000

Action [4.3] Promotion of technologies covering gender concerns
- [4.3.1] Gender-related technology promotion programs conducted (Number, weight 2.60): Excellent 30, Very Good 25, Good 20, Fair 16, Poor 15

Objective [5] IP management and commercialization of technologies (Weight: 9.00)

Action [5.1] Partnership development, including licensing of ICAR technologies
- [5.1.1] Partners (private sector) identified (Number, weight 4.50): Excellent 30, Very Good 25, Good 20, Fair 15, Poor 10

Action [5.2] Patents and other IPR titles
- [5.2.1] Applications filed (Number, weight 4.50): Excellent 95, Very Good 90, Good 80, Fair 70, Poor 60

Objective [6] Assessment and monitoring of fishery resources (Weight: 6.00)

Action [6.1] Fish resources assessment and eco-system monitoring
- [6.1.1] Number of explorations/surveys carried out (Number, weight 3.60): Excellent 70, Very Good 65, Good 60, Fair 55, Poor 50
- [6.1.2] Development of GIS based aquatic resource database (Number, weight 2.40): Excellent 8, Very Good 6, Good 5, Fair 4, Poor 3

Objective [7] Development of vaccines and diagnostics (Weight: 5.00)

Action [7.1] Production of diagnostic kits and field validation
- [7.1.1] Diagnostic kits developed (Number, weight 3.00): Excellent 5, Very Good 3, Good 2, Fair 1, Poor 0

Action [7.2] Production of vaccines against important animal diseases and their validation
- [7.2.1] Production of vaccines (Number, weight 2.00): Excellent 3, Very Good 2, Good 1, Fair 0, Poor 0

Objective [8] Post harvest management, farm mechanization and value addition (Weight: 5.00)

Action [8.1] Develop/refine equipment for crop production & processing
- [8.1.1] Equipment developed/refined (Number, weight 1.25): Excellent 20, Very Good 18, Good 16, Fair 14, Poor 12
Action [8.2] Testing of commercial prototypes/technologies
- [8.2.1] Commercial test reports/samples tested (Number, weight 1.25): Excellent 12, Very Good 11, Good 10, Fair 8, Poor 6

Action [8.3] Process protocols for product development, storage, safety and improved quality
- [8.3.1] Process protocols (Number, weight 1.25): Excellent 11, Very Good 10, Good 8, Fair 6, Poor 5

Action [8.4] Development/refinement of products from crops, fibres, natural gums/resins, livestock/fishes
- [8.4.1] Value-added products (Number, weight 1.25): Excellent 14, Very Good 12, Good 10, Fair 8, Poor 6

*Mandatory Objective: Efficient Functioning of the RFD System (Weight: 3.00)

Action: Timely submission of Draft for Approval
- On-time submission (Date, weight 2.0): Excellent 05/03/2012, Very Good 06/03/2012, Good 07/03/2012, Fair 08/03/2012, Poor 09/03/2012

Action: Timely submission of Results
- On-time submission (Date, weight 1.0): Excellent 01/05/2012, Very Good 03/05/2012, Good 04/05/2012, Fair 05/05/2012, Poor 06/05/2012

*Mandatory Objective: Administrative Reforms (Weight: 6.00)

Action: Implement mitigating strategies for reducing potential risk of corruption
- % of implementation (%, weight 2.0): Excellent 100, Very Good 95, Good 90, Fair 85, Poor 80

Action: Implement ISO 9001 as per the approved action plan
- Area of operations covered (%, weight 2.0): Excellent 100, Very Good 95, Good 90, Fair 85, Poor 80

Action: Identify, design and implement major innovations
- Implementation of identified innovations (Date, weight 2.0): Excellent 05/03/2013, Very Good 06/03/2013, Good 07/03/2013, Fair 08/03/2013, Poor 09/03/2013

*Mandatory Objective: Improving Internal Efficiency/responsiveness/service delivery of Ministry/Department (Weight: 4.00)

Action: Implementation of Sevottam
- Independent Audit of Implementation of Citizen's Charter (%, weight 2.0): Excellent 100, Very Good 95, Good 90, Fair 85, Poor 80
- Independent Audit of implementation of public grievance redressal system (%, weight 2.0): Excellent 100, Very Good 95, Good 90, Fair 85, Poor 80

*Mandatory Objective: Ensuring compliance to the Financial Accountability Framework (Weight: 2.00)

Action: Timely submission of ATNs on Audit paras of C&AG
- Percentage of ATNs submitted within due date (4 months) from date of presentation of Report to Parliament by CAG during the year (%, weight 0.5): Excellent 100, Very Good 90, Good 80, Fair 70, Poor 60
Action: Timely submission of ATRs to the PAC Sectt. on PAC Reports
- Percentage of ATRs submitted within due date (6 months) from date of presentation of Report to Parliament by PAC during the year (%, weight 0.5): Excellent 100, Very Good 90, Good 80, Fair 70, Poor 60

Action: Early disposal of pending ATNs on Audit Paras of C&AG Reports presented to Parliament before 31.3.2012
- Percentage of outstanding ATNs disposed of during the year (%, weight 0.5): Excellent 100, Very Good 90, Good 80, Fair 70, Poor 60

Action: Early disposal of pending ATRs on PAC Reports presented to Parliament before 31.3.2012
- Percentage of outstanding ATRs disposed of during the year (%, weight 0.5): Excellent 100, Very Good 90, Good 80, Fair 70, Poor 60
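Several of the mandatory indicators above carry Date-valued targets (for example, on-time submission of the draft RFD: Excellent 05/03/2012 down to Poor 09/03/2012, in DD/MM/YYYY format). One way to score these, assuming the same graded scale applies to dates as to numeric targets (an assumption for illustration, not a statement of the official REM rule, which may score dates stepwise), is to compare the actual date to the deadline ladder and interpolate on day offsets:

```python
# Hedged sketch: scoring a Date-type success indicator by interpolating
# between the deadline ladder's columns on the 60-100 scale.
# The interpolation rule is an assumption; REM may score dates stepwise.
from datetime import date

def score_date(actual, targets):
    """targets: percent level -> deadline date; earlier dates earn more,
    e.g. {100: date(2012, 3, 5), ..., 60: date(2012, 3, 9)}."""
    if actual <= targets[100]:
        return 100.0
    if actual >= targets[60]:
        return 60.0
    levels = sorted(targets, reverse=True)    # [100, 90, 80, 70, 60]
    for hi, lo in zip(levels, levels[1:]):
        d_hi, d_lo = targets[hi], targets[lo]
        if d_hi <= actual <= d_lo:            # actual falls in this band
            frac = (actual - d_hi).days / (d_lo - d_hi).days
            return hi - frac * (hi - lo)

# Draft-RFD submission targets from the mandatory-objective table above:
draft_targets = {100: date(2012, 3, 5), 90: date(2012, 3, 6),
                 80: date(2012, 3, 7), 70: date(2012, 3, 8),
                 60: date(2012, 3, 9)}
print(score_date(date(2012, 3, 6), draft_targets))
```

Submitting on the Very Good deadline thus earns 90, and any submission after the Poor deadline floors at 60.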
Section 3: Trend Values of the Success Indicators

(Trend values are listed as: Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15.)

Objective [1] Improving natural resource management and input use efficiency

Action [1.1] Integrated nutrient management (INM)
- [1.1.1] Developing GIS based district/block level soil fertility maps (Number): 10 / 15 / 100 / 71 / 15
- [1.1.2] Developing INM packages for different agro-eco regions of the country (Number): 4 / 4 / 6 / 6 / 8
- [1.1.3] Organizing training & demonstrations (Number): 15 / 24 / 25 / 25 / 25

Action [1.2] Integrated water management (IWM)
- [1.2.1] Technologies for enhancing water use efficiencies (Number): 4 / 9 / 5 / 2 / 2
- [1.2.2] Technologies for water harvesting, storage and groundwater recharge (Number): 5 / 6 / 5 / 3 / 3
- [1.2.3] Models/DSS for multiple uses of water (Number): 2 / 2 / 3 / 1 / 1
- [1.2.4] Organizing training & demonstrations (Number): 10 / 49 / 15 / 15 / 16

Action [1.3] Climate resilient agriculture
- [1.3.1] Awareness building amongst stakeholders through trainings/demonstrations (Number): 0 / 100 / 150 / 100 / 100
- [1.3.2] Human resource development and capacity building (Number): 0 / 100 / 150 / 50 / 100
- [1.3.3] Testing crop varieties for climate resilience at different locations (Number): 0 / 7 / 10 / 10 / 10
(Trend values: Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15)

Objective [2] Strengthening of higher agricultural education
- [2.1.1] Number of universities granted accreditation/extension of accreditation (Number): 8 / 8 / 8 / 10 / 10
- [2.2.1] Number of fellowships awarded, subject to availability of competent candidates (Number): 12 / 12 / 12 / 14 / 15
- [2.3.1] Total no. of fellowships granted every year, subject to availability of competent candidates (Number): 625 / 625 / 625 / 650 / 650
- [2.4.1] Experiential learning units established (Number): 20 / 25 / 22 / 35 / 37
- [2.5.1] Amount released (Rupees in crores): 289 / 325 / 360 / 375 / 385
- [2.6.1] Number of teachers trained per year (Number): 1000 / 1000 / 900 / 1000 / 1000
- [2.6.2] Number of Summer/Winter Schools organized (Number): 40 / 35 / 22 / 35 / 35

Objective [3] Utilizing frontier research in identified areas/programs for better genetic exploitation
- [3.1.1] Number of germplasm collected/characterized and conserved (other crops) (Number): 2000 / 2000 / 4000 / 2200 / 2300
(Trend values: Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15)

- [3.1.2] Number of germplasm collected (horticultural crops) (Number): 45 / 300 / 315 / 400 / 400
- [3.2.1] Number of germplasm evaluated (Number): 1800 / 2000 / 2500 / 2200 / 2400
- [3.3.1] Quantity of breeder seed produced (other crops) (Tonnes): 8000 / 8000 / 8200 / 8200 / 8500
- [3.3.2] Quantity of breeder seed produced (horticultural crops) (Tonnes): 2250 / 3500 / 3600 / 4500 / 4800
- [3.3.3] Quantity of planting materials produced annually (Number in lakhs): 13 / 40 / 40.5 / 50 / 52
- [3.4.1] Number of varieties developed (other crops) (Number): 10 / 10 / 12 / 12 / 12
- [3.4.2] Number of varieties developed (pulses/oilseeds) (Number): -- / -- / 13 / 13 / 13
- [3.4.3] Number of varieties developed (horticultural crops) (Number): 4 / 15 / 18 / 20 / 20
- [3.5.1] Provisioning of piglets to farmers and development agencies (Number): 900 / 900 / 900 / 900 / 1000
- [3.6.1] Provisioning of day old/6 weeks old chicks to farmers and development agencies (Number in lakhs): 0.6 / 2 / 2 / 2 / 2.5
(Trend values: Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15)

Objective [4] Strengthening of frontline agricultural extension system and addressing gender issues
- [4.1.1] Number of technologies assessed (Number): -- / 200 / 220 / 260 / 280
- [4.2.1] Number of training programmes organized (Number): -- / -- / 18000 / 20000 / 22000
- [4.3.1] Gender-related technology promotion programs conducted (Number): -- / 20 / 25 / 30 / 35

Objective [5] IP management and commercialization of technologies
- [5.1.1] Partners (private sector) identified (Number): 20 / 136 / 25 / 35 / 40
- [5.2.1] Applications filed (Number): 70 / 111 / 90 / 130 / 150

Objective [6] Assessment and monitoring of fishery resources
- [6.1.1] Number of explorations/surveys carried out (Number): 40 / 60 / 65 / 70 / 75
- [6.1.2] Development of GIS based aquatic resource database (Number): 4 / 6 / 6 / 7 / 8

Objective [7] Development of vaccines and diagnostics
- [7.1.1] Diagnostic kits developed (Number): 2 / 4 / 3 / 3 / 4
- [7.2.1] Production of vaccines (Number): 2 / 2 / 2 / 3 / 3

Objective [8] Post harvest management, farm mechanization and value addition
- [8.1.1] Equipment developed/refined (Number): 20 / 20 / 18 / 25 / 25
(Trend values: Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15)

- [8.2.1] Commercial test reports/samples tested (Number): 12 / 12 / 11 / 15 / 15
- [8.3.1] Process protocols (Number): 10 / 11 / 10 / 13 / 13
- [8.4.1] Value-added products (Number): 12 / 14 / 12 / 18 / 18

*Mandatory Objective: Efficient Functioning of the RFD System
- On-time submission of Draft for Approval (Date): 05/03/2010 / 06/03/2011 / 06/03/2012 / -- / --
- On-time submission of Results (Date): 27/04/2011 / -- / 03/05/2012 / -- / --

*Mandatory Objective: Administrative Reforms
- % of implementation of mitigating strategies for reducing potential risk of corruption (%): -- / -- / 95 / -- / --
- Area of operations covered under ISO 9001 (%): -- / -- / 95 / -- / --
- Implementation of identified innovations (Date): -- / -- / 06/03/2012 / -- / --

*Mandatory Objective: Improving Internal Efficiency/responsiveness/service delivery of Ministry/Department (Implementation of Sevottam)
- Independent Audit of Implementation of Citizen's Charter (%): -- / -- / 95 / -- / --
- Independent Audit of implementation of public grievance redressal system (%): -- / -- / 95 / -- / --
(Trend values: Actual FY10/11 / Actual FY11/12 / Target FY12/13 / Projected FY13/14 / Projected FY14/15)

*Mandatory Objective: Ensuring compliance to the Financial Accountability Framework
- Percentage of ATNs submitted within due date (4 months) from date of presentation of Report to Parliament by CAG during the year (%): -- / -- / 90 / -- / --
- Percentage of ATRs submitted within due date (6 months) from date of presentation of Report to Parliament by PAC during the year (%): -- / -- / 90 / -- / --
- Percentage of outstanding ATNs disposed of during the year (%): -- / -- / 90 / -- / --
- Percentage of outstanding ATRs disposed of during the year (%): -- / -- / 90 / -- / --
Section 4: Description and Definition of Success Indicators and Proposed Measurement Methodology

Objective 1. Improving natural resource management and input use efficiency
Improving natural resource management and input use efficiency, with respect to improving soil health and water productivity, makes integrated nutrient and water management essential. The action points/success indicators for INM cover developing GIS based soil fertility maps, macro/micro-level land use plans, developing and disseminating integrated nutrient management packages, technologies for improving the productivity of problem soils, IFS models, etc. For facilitating IWM, work is planned on enhancing water storage and groundwater recharge, multiple uses of water, precision/micro-irrigation systems, recycling of wastewater, and other on-farm management issues such as resource conservation technologies, deficit irrigation, and tools and models to support decision making. For mitigating the adverse impact of climate change on crops, livestock, horticulture and fisheries, the emphasis will specifically be on climate resilient agriculture, identifying vulnerable zones and mitigating measures through basic and strategic research. To improve the capacity of research and development organizations and their staff, provision has been made for strengthening them with state-of-the-art technologies through training programmes, field demonstrations, etc.

Objective 2. Strengthening of higher agricultural education
Success will be measured by the number of universities that have developed appropriate e-learning tools and resources.
Similarly, accreditation/extension of accreditation of agricultural universities will be measured by the number of universities granted accreditation or extension of accreditation. Grant of ICAR International fellowships to Indian and foreign students, and of JRF and SRF, will be measured by the number of such fellowships awarded; these numbers will also depend upon the availability of competent candidates. Capacity building and faculty upgradation will be measured by the number of teachers trained per year.

Objective 3. Utilizing frontier research in identified areas/programs for better genetic exploitation
The emphasis on natural resource management is laid to ensure efficient use of natural resources under changing situations. This can be supported by developing high-yielding varieties requiring fewer inputs such as fertilizers, water and pesticides. With respect to conservation of genetic resources for sustainable use, it is envisaged to conserve plant genetic resources as a repository, and to evaluate and further utilize these resources for improving yield in a sustainable manner. The genetic diversity of various horticultural crops will be collected from different eco-regions, characterized, and utilized to develop varieties for higher yields, quality, and tolerance to biotic and abiotic stresses. The action points/success indicators include production of quality seed and planting materials.

Objective 4. Strengthening of frontline agricultural extension system and addressing gender issues
Assessment of technology through OFTs is measured by the actual number of technologies assessed by conducting on-farm trials. Capacity building and trainings organized are measured by the actual number of such programmes/activities undertaken by the KVKs. Support for promoting gender issues is measured through the success indicator of the actual number of gender-related technology promotion programmes conducted by the DRWA.

Objective 5. IP management and commercialization of technologies
With respect to commercialization of technologies and promoting public-private partnership, it is envisaged to bring commercial ethos into agricultural research. Indicators for commercialization of
technologies, promoting public-private partnership, and protection of intellectual property rights will be determined by commercialization through partnership development, including licensing of ICAR technologies. Increasing numbers over the years would indicate a higher emphasis on technology transfer through enterprises, thereby contributing to wider adoption and improved socioeconomic impact of ICAR technologies.

Objective 6. Assessment and monitoring of fishery resources
To enhance fish production and productivity on a sustainable basis from the available resources, and to address the critical research gaps in realizing the full production potential of the fisheries and aquaculture sector, the research activities have been consolidated and prioritized. The action points and success indicators under this objective have been identified based on priority, availability of resources, and the needs and requirements of stakeholders. It is expected that these programmes will increase fish production, conserve resources, and create more opportunities for livelihood and employment generation.

Objective 7. Development of vaccines and diagnostics
The production of diagnostic kits and vaccines would involve delineation of the process (or processes), thereby denoting a specific number for field testing/validation.

Objective 8. Post harvest management, farm mechanization and value addition
The action points/success indicators for development/refinement of equipment would include intended performance of the equipment and its commercial viability. Test results and on-farm trials will be used to judge the expected output. The success indicators will cover technologies developed to create innovative products that are commercially acceptable in competitive markets.
Section 5: Specific Performance Requirements from other Departments

1. A strong network support for channelizing awareness through training programmes, inputs such as monetary support/loans, availability of germplasm, medicines, etc., and market access through state development agencies, KVKs and NGOs would play a major role. (State AH departments, DADF, KVKs, NGOs)
2. Development of animal disease diagnostics and vaccines requires sound commitment for monitoring support for production of diagnostics and vaccines, whereas validation under field conditions requires a strong commitment and participation of state agencies. (State AH departments, private industry for up-scaling)
3. The quantity of breeder seed produced is based on the quantity indented by the Department of Agriculture and Cooperation, which in turn collects indents from various seed agencies, including State Departments of Agriculture.
4. Technology adoption would depend upon the proactive role of development departments, namely DAC, DST, DBT, DADF, SAUs, etc.
5. Achievements related to technology assessment through OFTs and capacity building through training programmes require the support of ICAR institutions and SAUs to ensure timely technology and methodology backstopping. In addition, farmers' participation, sponsorship of trainees from the line departments, and availability of the required demonstration plots for conducting OFT trials are some of the much-needed support from stakeholders.
6. Success with respect to promotion of technologies covering gender issues requires the collaboration of AICRP centres, the Agricultural Engineering Division and the line departments in generating a suitable gender database, assessing technologies from a gender perspective, and disseminating them.
7. Popularization and commercialization of tools and equipment will require continued support of the Department of Agriculture and Cooperation, Ministry of Agriculture, for large-scale frontline demonstrations and capacity building of stakeholders, and a proactive role by various line departments in promoting improved technologies.
8. The Fisheries Division works in close coordination and linkage with the Ministry of Agriculture; Ministry of Commerce; Ministry of Science & Technology; Ministry of Environment & Forest; Ministry of Earth Sciences; Ministry of Food Processing Industries; funding institutions; private entrepreneurs; NGOs; and other stakeholders, through participation in various committees and meetings addressing researchable issues in fisheries and aquaculture and formulating strategies and guidelines for policy interventions to increase fish production and productivity. Support from all these agencies and organizations is essential for achieving the mission of providing the required food, nutritional, socio-economic and livelihood security.
9. The support of the Ministry of Finance and the Planning Commission would be crucial for realizing the set objectives, targets and goals. Further, successful execution of the programmes would depend on the proactive role of other line departments of states and stakeholders for technology adoption and timely implementation of suggested strategies and guidelines.
10. Support from the concerned central/state line departments/SAUs, soil testing laboratories, KVKs, watershed associations and Pani Panchayats for promoting adoption of developed technologies.
11. Support from associated Institutes/DUs/SAUs/line departments for promoting adoption of developed technologies.
12. Financial support as per EFC/SFC allocation of the institute under the Horticulture Division, including AICRP/network projects.
13. Support from SAUs, KVKs and line departments for promotion and adoption of technologies developed by the institutes.
14. Financial and technological support from other government departments such as DAC, NMPB, NHB, APEDA, MoRD, MoHFA, MoWR, etc., State line departments and others, including foreign collaborations.
15. The development and strengthening of the SAUs/AUs will depend upon the support/timely availability of sufficient funds from the central government.
Section 6: Outcome/Impact of Department/Ministry

(Values are listed as: FY10/11 / FY11/12 / FY12/13 / FY13/14 / FY14/15.)

Outcome/Impact 1: Enhanced agriculture productivity (jointly with DADF, DAC, Planning Commission, Ministry of Environment & Forests, Ministry of Panchayati Raj, Ministry of Rural Development and State Governments)
- Increase in agriculture productivity (%): 0 / 2 / 2 / 2 / 2

Outcome/Impact 2: Enhanced milk, egg, meat & fish productivity (jointly with DADF, Ministry of Panchayati Raj, Ministry of Rural Development, State Governments and NGOs)
- Increase in milk productivity (%): 0 / 3.8 / 4 / 4 / 4
- Increase in egg productivity (%): 0 / 2 / 2 / 2 / 2
- Increase in meat productivity (%): 0 / 2.5 / 2.5 / 2.5 / 2.5
- Increase in fish productivity (%): 0 / 4 / 4 / 4 / 4

Outcome/Impact 3: Enhanced availability of quality human resources for agricultural research & development activities (jointly with SAUs, SVUs, Ministry of Panchayati Raj, Ministry of Rural Development and State Governments)
- Increase in Graduates/PG students passed out and capacity building (%): 0 / 5 / 5 / 5 / 5

Outcome/Impact 4: Enhanced rural livelihood security (jointly with DAC, DADF, SAUs, SVUs, Ministry of Panchayati Raj, Ministry of Rural Development, Ministry of Fertilizers and State Governments)
- Decrease in rural poverty (%): 0 / 1 / 1 / 1 / 1
- Increase in farm income (%): 0 / 3 / 2 / 2 / 2

Outcome/Impact 5: Improved nutritional security (jointly with DST, DBT, ICMR, Ministry of Food Processing, Ministry of Panchayati Raj, Ministry of Rural Development and State Governments)
- Increase in per capita availability of agricultural products (%): 0 / 1.6 / 0.8 / 0.8 / 0.8

Outcome/Impact 6: Enhancing frontier research/programmes
- Technical papers published in recognized journals (Number): 0 / 4000 / 4500 / 5000 / 5000
Section 6: Outcome/Impact of Department/Ministry (continued)
7. Commercialization of technologies - jointly with SAU/DU

Success Indicator | Unit | FY10/11 | FY11/12 | FY12/13 | FY13/14 | FY14/15
New varieties developed | Number | 0 | 60 | 70 | 80 | 80
Research converted into commercialized technology | Number | 0 | 10 | 10 | 10 | 10
Annexure E: Results-Framework Document Evaluation Methodology (REM) Guideline
RFD Evaluation Methodology (REM)
Performance Management Division, Cabinet Secretariat
TABLE OF CONTENTS
I. Purpose of this Document
II. Approach
III. Target Audience
IV. Rationale and Importance of REM
V. Calculating Overall Quality Rating of RFD
VI. Evaluation of Organization's Vision (Section 1A)
VII. Evaluation of Organization's Mission (Section 1B)
VIII. Evaluation of Organization's Objectives (Section 1C)
IX. Evaluation of Section 2 of RFD
X. Evaluation of Section 3 of RFD
XI. Evaluation of Section 4 of RFD
XII. Evaluation of Quality of Section 5 in RFD
XIII. Evaluation of Quality of Section 6 of RFD
XIV. Putting It All Together
LIST OF TABLES
1. Distribution of Relative Weights and Illustrative Calculation of Overall Quality Rating
2. Distribution of Relative Weights and Illustrative Calculation of Quality Rating of Vision Statement
3. Distribution of Relative Weights and Illustrative Calculation of Quality Rating of Mission Statement
4. Distribution of Relative Weights and Illustrative Calculation of Quality Rating for List of Organizational Objectives
5. Distribution of Relative Weights and Illustrative Calculation of Quality Rating of Section 2 of the RFD
6. Calculation of Quality Scores for SIs
7. Calculation of Outcome-Orientation of Success Indicators
8. Calculation of Quality-Orientation of SIs
9. Rating of Quality of Targets for Each SI
10. Calculating the Quality of Section 3 of the RFD - Percentage of Data Populated
11. Distribution of Weight Among Criteria and Illustrative Calculations of Quality of Section 4 of RFD
12. Distribution of Weight Among Criteria and Illustrative Calculations of Quality of Section 5 of the RFD
13. Distribution of Weight Among Criteria and Illustrative Calculations of Quality of Section 6 of the RFD
LIST OF FIGURES
1. Heuristic Equation Explaining True Performance of an Organization
2. Summary of the Six Sections of the RFD
3. Appropriate Number of Objectives
4. Format of Section 2 of RFD
5. Typical Results Chain
6. An Example of a Results Chain
7. Target of SI
8. Guidelines for Evaluating the Degree of Consistency of Targets
9. Section 3 of RFD - Trend Values for Success Indicators
10. Calculation of Percentage of Data Populated
11. Sample of Acronyms of Section 4 of RFD
12. Sample of SI Definition and Measurement Methodology of Section 4 of RFD
13. Sample Section 5 from RFD
14. Outcome / Impact of Activities of Department/Ministry
RFD EVALUATION METHODOLOGY (REM)

I. PURPOSE OF THIS DOCUMENT
This document outlines the methodology for evaluating the quality of a Results-Framework Document (RFD). The methodology is based on the Guidelines for preparing RFD, developed by the Performance Management Division (PMD), Cabinet Secretariat, and approved by the High Power Committee on Government Performance. Hence, this methodology is a complement to the RFD Guidelines and should be read along with them. The evaluation methodology outlined here is intended to provide a benchmark against which the design of an RFD can be evaluated. It provides an agreed definition of "quality" in the context of designing an RFD. In the absence of such a shared understanding, there is a danger that the "quality" of an RFD, like beauty, could lie in the eyes of the beholder.

II. APPROACH
Any evaluation essentially involves comparing achievement against a target. Therefore, to evaluate the quality of an RFD we must agree on the target against which we shall judge it. Since an RFD is supposed to be designed as per the RFD Guidelines, it is only logical and fair to use the RFD Guidelines as the benchmark for judging its quality. In other words, our approach is to ascertain how well the RFD Guidelines were followed in drafting the RFD being evaluated. The Results-Framework Document Evaluation Methodology (REM) is an analytical tool designed to assess all RFD sections across all Departments using the same methodology, minimizing the subjectivity of the assessments. For each section of the RFD we provide a number of assessment criteria against which a score is assigned, using the same five-point rating scale already in use for the RFDs (from 60% to 100%). These criteria are largely based on the RFD Guidelines document. They comprise quantitative and qualitative criteria.
Quantitative criteria capture risks and limitations numerically (e.g. "percentage of data populated"); qualitative criteria apply to assessment areas where numerical analysis is not feasible but performance can still be judged against the agreed Guidelines for preparing RFD.

III. TARGET AUDIENCE
This methodology is meant primarily for the organizations preparing RFDs; it provides a convenient checklist for self-audit. To ensure that all stakeholders are on the same page, it is also meant to serve as a common platform during departmental discussions with members of the Ad-hoc Task Force.
IV. RATIONALE AND IMPORTANCE OF REM
RFD policy is based on a fundamental principle of management: what gets measured gets done. This principle is transcendental in its application, and it applies in equal measure to the quality of the RFD itself. Unless we have an agreed yardstick for measuring the quality of an RFD, we will not be able to determine whether successive drafts represent an improvement, nor whether our collective efforts are improving the quality of RFDs over time. In addition, we believe that with such a yardstick the quality of deliberations and discussions will be much more systematic and objective. It will bring rigor and, therefore, greater credibility to our critiques of RFDs. Above all, we need to remember that the RFD is a means towards an end, not an end in and of itself. The purpose of the RFD is to improve the performance of an organization by giving departmental managers clear, meaningful and unambiguous targets and evaluating their performance by comparing achievements against those targets. If, however, the targets themselves are not meaningful, then achieving them will not be meaningful either. This is the reason for ensuring that targets in an RFD are meaningful. The meaningfulness of targets depends, among other things, on their alignment with the vision, mission and objectives. This is just another way of saying that the quality of the RFD matters. The following heuristic equation captures the essence of the above arguments:
Figure 1: Heuristic Equation Explaining True Performance of an Organization
Performance against RFD Targets (RFD Composite Score) × Quality of RFD = TRUE PERFORMANCE OF THE ORGANIZATION
For example: 100% (RFD Composite Score) × 70% (Quality Rating for RFD) = 70%
In simple words, if the quality of your RFD is 70%, then the maximum score you can get is 70%. The quality of the RFD provides the upper limit on the maximum score a department can get.
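The arithmetic of Figure 1 can be sketched in a few lines of Python (the function name is illustrative, not part of the REM guideline itself):

```python
def true_performance(rfd_composite_score: float, quality_rating: float) -> float:
    """Figure 1: performance against RFD targets x quality of RFD.

    Both arguments are percentages (0-100); the result is also a percentage.
    """
    return rfd_composite_score * quality_rating / 100.0

# The worked example from Figure 1: a perfect composite score (100%)
# against an RFD of 70% quality yields a true performance of only 70%.
print(true_performance(100, 70))  # 70.0
```

This makes the capping effect explicit: no matter how high the composite score, true performance cannot exceed the RFD's quality rating.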
V. CALCULATING OVERALL QUALITY RATING OF RFD
As we know, an RFD contains the following six sections:
Section 1: Ministry's/department's Vision, Mission, Objectives and Functions
Section 2: Inter se priorities among key objectives, success indicators and targets
Section 3: Trend values of the success indicators
Section 4: Description and definition of success indicators and proposed measurement methodology
Section 5: Specific performance requirements from other departments that are critical for delivering agreed results
Section 6: Outcome / Impact of activities of department/ministry
Figure 2: Summary of the Six Sections of the RFD
Hence, the overall quality of an RFD depends on the quality of each section and the relative priority of that section. Table 1 summarizes the relative weights for each of the six sections of the RFD, along with the illustrative calculations used to arrive at the Overall Quality Rating. The distribution of relative weights among the various sections was decided after extensive consultations with all stakeholders, including members of the Ad-Hoc Task Force (ATF).
Table 1: Distribution of Relative Weights and Illustrative Calculation of Overall Quality Rating
Section of RFD | Section Description | Weight | Raw Score for the Section | Weighted Raw Score for the Section | Source of Data
1 (A) | Vision | 5 | 90.0 | 4.50 | Table 2
1 (B) | Mission | 5 | 90.0 | 4.50 | Table 3
1 (C) | Objectives | 5 | 97.0 | 4.85 | Table 4
2 | Inter se priorities among key objectives, success indicators and targets | 40 | 87.9 | 35.00 | Table 5
3 | Trend values of the success indicators | 15 | 90.0 | 13.50 | Table 10
4 | Description and definition of success indicators and proposed measurement methodology | 5 | 86.0 | 4.30 | Table 11
(Table 1 is continued on the next page.)
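The aggregation in Table 1 is a plain weighted average. A minimal Python sketch, using the table's sample raw scores (the section keys and function name are this sketch's own, not the guideline's):

```python
# Section weights from Table 1; they sum to 100.
SECTION_WEIGHTS = {"1A": 5, "1B": 5, "1C": 5, "2": 40, "3": 15, "4": 5, "5": 5, "6": 20}

def overall_quality_rating(raw_scores):
    """Overall Quality Rating = sum over sections of (weight x raw score) / 100."""
    assert sum(SECTION_WEIGHTS.values()) == 100
    return sum(SECTION_WEIGHTS[s] * raw_scores[s] for s in SECTION_WEIGHTS) / 100.0

raw = {"1A": 90.0, "1B": 90.0, "1C": 97.0, "2": 87.9, "3": 90.0,
       "4": 86.0, "5": 85.0, "6": 88.0}
# Close to the table's illustrative total of 88.5; the table appears to
# round Section 2's weighted score (40 x 87.9 / 100 = 35.16) to 35.00.
print(round(overall_quality_rating(raw), 2))
```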
In the following sections we explain the criteria, and their relative weights, used in evaluating the quality of each section of the RFD.

VI. EVALUATION OF ORGANIZATION'S VISION (SECTION 1A)
According to the RFD Guidelines, the Vision is an idealized state for the department. It is the big picture of what the leadership wants the department to look like in the future. A vision is a symbol, and a cause, to which we want to bond the stakeholders (mostly employees, and sometimes other stakeholders). As they say, people work best when they are working for a cause rather than merely for a goal; the vision provides them that cause. A vision is a long-term statement and is typically generic and grand. Therefore, a vision statement does not change from year to year unless the department is dramatically restructured and is expected to undertake very different tasks in the future. A vision should never carry the "how" part. For example, "To be the most admired brand in the Aviation Industry" is a fine vision statement, which can be spoiled by extending it to "To be the most admired brand in the Aviation Industry by providing world-class in-flight services." The reason for not including the "how" is that it may keep changing with time. Writing a vision statement is not difficult; the problem is to get employees engaged with it. Many a time, terms like vision, mission and strategy become objects of scorn rather than inspiration. This is primarily because leaders fail to make a connection between the vision and mission and the employees' everyday work. Too often, employees see a gap between the vision and mission on the one hand and their goals and priorities on the other. Even if there is a valid tactical reason for this mismatch, it is not explained.
The leadership of the ministry (Minister and Secretary) should therefore consult a wide cross-section of employees and come up with a vision that can be owned by the employees of the ministry/department. A vision should have a time horizon of 10-15 years. If it is less than that, it becomes tactical; if it has a horizon of 20+ years, it becomes difficult for the strategy to relate to the vision.
Table 1 (concluded from the previous page):
Section of RFD | Section Description | Weight | Raw Score for the Section | Weighted Raw Score for the Section | Source of Data
5 | Specific performance requirements from other departments that are critical for delivering agreed results | 5 | 85.0 | 4.25 | Table 12
6 | Outcome / Impact of activities of department/ministry | 20 | 88.0 | 17.6 | Table 13
Total Weight = 100; Overall Quality Rating for RFD = 88.5
Features of a good vision statement:
- Easy to read and understand.
- Compact and crisp - leaves some things to people's imagination.
- Gives the destination, not the road-map.
- Is meaningful, and not too open-ended or far-fetched.
- Excites people and makes them feel energized.
- Provides a motivating force, even in hard times.
- Is perceived as achievable and at the same time is challenging and compelling, stretching us beyond what is comfortable.
The entire process, from the Vision down to the Objectives, is highly iterative. Where should we start? We strongly recommend that the vision and mission statements be drafted first, without being colored by constraints, capabilities and environment. This is akin to the vision of several armed forces: "Keeping the country safe and secure from external threats." That vision is non-negotiable, and it drives the organization to find ways and means to achieve it by overcoming constraints on capabilities and resources. A vision should be a stake in the ground, a position, a dream; it should be prudent, but non-negotiable barring a few rare circumstances.
From the above guidance we have culled out the following key criteria for evaluating the quality of a Vision statement included in an RFD. A Vision statement should:
1. deal with "what" the organization wants to achieve, not "how" it intends to achieve it;
2. be forward-looking, focusing on the destination rather than on past achievements;
3. be succinct and clear;
4. be inspiring and engaging.
Table 2 below shows the distribution of weight across these criteria and an illustrative calculation of the quality rating for a Vision statement.
Table 2: Distribution of Relative Weights and Illustrative Calculation of Quality Rating for Vision Statement
(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%. Source of data: see the guidance above.)
Criterion | Weight | Criteria Value | Raw Score | Weighted Raw Score
1 The "What", not the "How" | 0.25 | Very Good | 90 | 22.5
2 Forward looking | 0.25 | Very Good | 90 | 22.5
3 Succinct and clear | 0.25 | Excellent | 100 | 25
4 Inspiring and engaging | 0.25 | Good | 80 | 20
Quality Rating for Vision Statement = 90.0
When the person evaluating an RFD gives less than 100% for any criterion, he or she must explain how that conclusion was reached. Clearly, these four criteria require judgment. But by narrowing down the criteria we believe that the variation between experts evaluating an RFD will be minimized, if not eliminated. Where we find that, using the same criteria, experts arrive at very divergent ratings, we may have to fine-tune the criteria and weights.
It is important to note that a flawed Vision can have an exponentially distorting effect on the quality of an RFD: if the Mission and Objectives are aligned to a flawed Vision, the document takes us in a completely different direction. Hence, the importance of a well-crafted Vision cannot be overstated. Ideally, the "Total Raw Score" of Vision, Mission and Objectives would be derived as a multiplicative rather than an additive score. However, in this version of REM we use additive scores and have not explicitly incorporated this source of potential distortion.

VII. EVALUATION OF ORGANIZATION'S MISSION (SECTION 1B)
An organization's Mission is the nuts and bolts of the vision. The Mission is the "who, what, and why" of the department's existence. We strongly recommend that the mission follow the vision, because the purpose of the organization may change in pursuit of its vision. The vision represents the big picture; the mission represents the necessary work. The Mission of the department is the purpose for which the department exists; it is one of the ways to achieve the vision. The management expert Mintzberg defines a mission as follows: "A mission describes the organization's basic function in society, in terms of the products and services it produces for its customers." Vision and Mission are part of the strategic planning exercise.
To see the relation between the two, consider the following definitions:
1. Vision: outlines what the organization wants to be, or how it wants the world in which it operates to be (an "idealised" view of the world). It is a long-term view and concentrates on the future. It can be emotive and is a source of inspiration. For example, a charity working with the poor might have a vision statement which reads "A World without Poverty."
(Table 2, concluded from the previous page: Quality Rating for Vision Statement = 90.0.)
2. Mission: defines the fundamental purpose of an organization or an enterprise, succinctly describing why it exists and what it does to achieve its vision. For example, the charity above might have the mission statement "providing jobs for the homeless and unemployed."
To evaluate the quality of a Mission statement in an RFD we have agreed to use the following criteria:
1. Is the Mission aligned with the Vision (does it follow the level of the Vision, and is it long-term)?
2. Does the Mission deal with "how" the Vision will be achieved, but at a higher level of conceptualization than the Objectives?
3. Is the Mission statement succinct and clear?
Table 3 below shows the distribution of weight across these criteria and an illustrative calculation of the quality rating for a Mission statement.
Table 3: Distribution of Relative Weights and Illustrative Calculation of Quality Rating for a Mission Statement
(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%. Source of data: see the guidance above.)
Criterion | Weight | Criteria Value | Raw Score | Weighted Raw Score
1 Aligned with Vision (follows the level of Vision and is long-term) | 0.4 | Very Good | 90 | 36
2 The "How" (at a higher level than Objectives) | 0.3 | Very Good | 90 | 27
3 Succinct & clear | 0.3 | Very Good | 90 | 27
Quality Rating for Mission Statement = 90

VIII. EVALUATION OF ORGANIZATION'S OBJECTIVES (SECTION 1C)
Objectives represent the developmental requirements to be achieved by the department in a particular sector through a selected set of policies and programmes over a specific period of time (short, medium or long). For example, the objectives of the Ministry of Health and Family Welfare could include: (a) reducing the rate of mortality for children below five years; and (b) reducing the rate of maternal deaths by the end of the development plan.
Objectives can be of two types: (a) Outcome Objectives, which address the ends to be achieved, and (b) Process Objectives, which specify the means of achieving them. As far as possible, the department should focus on Outcome Objectives.(1) Objectives should be directly related to the attainment and support of the relevant national objectives stated in the relevant Five Year Plan, National Flagship Schemes, Outcome Budget, relevant sector and departmental priorities and strategies, the President's Address, the manifesto, and announcements/agenda spelt out by the Government from time to time. Objectives should be linked to, and derived from, the departmental Vision and Mission statements. In view of the above, we believe that the quality of the objectives should be judged by the following four criteria:
1. Alignment with Mission and Vision
Here we should ask ourselves whether achievement of the specified objectives would lead us to achieve the departmental vision and mission. This is not an exact science, and judgment is required. For example, if the Vision of a department is "Healthy Nation", then "Reducing Child Mortality" is an objective that could be considered aligned with the departmental vision.
2. Results-driven (at the level of programs rather than actions)
If a department's vision includes "Safer Roads", then an objective of "increasing awareness about road safety" would be considered well aligned and focused at the program level, as it corresponds to a "Road Safety Awareness Program." However, if the department were to include an objective such as "conducting road safety awareness programs", it would still be aligned with the departmental vision of "Safer Roads", but it would be pitched at the level of an action rather than a program.
3.
Appropriate number of objectives
Management experts generally recommend that the number of objectives for a normal organization should be in the range of eight to ten. Of course, large organizations will tend to have more objectives and smaller ones fewer. We propose that the following guidelines be used for determining the appropriate number of objectives:
Figure 3: Appropriate Number of Objectives
Excellent: 8-10 | Very Good: 7 or 11 | Good: 6 or 12 | Fair: 5 or 13 | Poor: 4 or fewer, or 14 or more
(1) A distinction is often made between "Goals" and "Objectives": the former are more general, the latter more specific and measurable. The Vision and Mission statements are expected to capture the general direction and future expected outcomes for the department. Hence, only the inclusion of Objectives is required in Section 1.
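The bands of Figure 3 map directly onto the five-point scale; a small Python sketch (the function name is illustrative, not from the guideline):

```python
def objectives_count_rating(n: int) -> int:
    """Map the number of objectives to Figure 3's five-point scale.

    Excellent (100): 8-10; Very Good (90): 7 or 11; Good (80): 6 or 12;
    Fair (70): 5 or 13; Poor (60): 4 or fewer, or 14 or more.
    """
    if 8 <= n <= 10:
        return 100
    if n in (7, 11):
        return 90
    if n in (6, 12):
        return 80
    if n in (5, 13):
        return 70
    return 60

print(objectives_count_rating(9), objectives_count_rating(12))  # 100 80
```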
These are only guidelines and, like any other guidelines, should be used judiciously rather than mechanically.
4. Non-duplication, non-redundancy and absence of overt conflicts in stated objectives
It is important to make objectives crisp and non-duplicative. We should not include redundant statements and generalities as objectives. Even more importantly, we should not have explicitly contradictory or overtly conflicting objectives.
Table 4 below shows the distribution of weight across these four criteria and an illustrative calculation of the quality rating for the section dealing with Objectives.
Table 4: Distribution of Relative Weights and Illustrative Calculation of Quality Rating for the List of Organizational Objectives
(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%. Source of data: see the guidance above.)
Criterion | Weight | Criteria Value | Raw Score | Weighted Raw Score
1 Aligned with Mission and Vision | 0.3 | Excellent | 100 | 30
2 Results-driven (at the level of programs rather than actions) | 0.3 | Very Good | 90 | 27
3 Appropriate number of objectives | 0.2 | Excellent | 100 | 20
4 Non-duplication, non-redundancy and absence of overt conflicts in stated objectives | 0.2 | Excellent | 100 | 20
Quality Rating for Objectives = 97
IX. EVALUATION OF SECTION 2 OF RFD
The heart of any RFD is Section 2, and the heart of Section 2 is the format shown in Figure 4. That is why this section carries a weight of 40% in the overall rating of the RFD. The description of each column is given in the Guidelines for RFD.
Figure 4: Format of Section 2 of RFD
Column 1: Objective | Column 2: Weight | Column 3: Actions | Column 4: Success Indicator | Column 5: Unit | Column 6: Weight | Target / Criteria Value (Excellent 100%, Very Good 90%, Good 80%, Fair 70%, Poor 60%)
Each objective (Objective 1, Objective 2, Objective 3, ...) is listed with its associated actions (Action 1, Action 2, Action 3, ...).
The following is the summary table for the evaluation of Section 2 of the RFD:
Table 5: Distribution of Weight and Sample Calculation of Quality Rating for Section 2
(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%.)
Criterion to evaluate quality of Section 2 | Weight | Raw Score | Weighted Raw Score | Source of Data
1 Extent to which actions (Column 3 of RFD) adequately capture objectives | 15 | 90 | 13.5 | See guidance below
2 Extent to which success indicators (Column 4 of RFD) adequately capture actions | 15 | 100 | 15.0 | See guidance below
3 Quality / nature of Success Indicators (SIs) | 40 | 81 | 32.4 | Table 6
(Table 5 is continued on the next page.)
Brief guidance on Table 5: The following five criteria are proposed for assessing the quality of the elements of Section 2 of the RFD given in Table 5 above:
1. Do actions (Column 3) adequately capture objectives?
For each objective, the department must specify the required policies, programmes, schemes and projects. Often an objective has one or more policies associated with it. The objective represents the desired "end"; the associated policies, programs and projects represent the desired "means." The latter are listed as "actions" under each objective. Assessors and evaluators should use their domain knowledge and knowledge of the department to ensure that all key actions are listed under the various objectives. Often, departments omit key schemes from the actions simply because they fear they may not achieve the expected targets for such important schemes. Ideally, all actions taken together should cover close to 100% of plan funds. But money is not everything: evaluators must ensure that actions which do not require money are also adequately covered. In evaluating this aspect, we should also examine whether actions from previous years have been dropped for valid reasons.
2. Do success indicators (Column 4) adequately capture actions?
Table 5 (concluded from the previous page):
Criterion to evaluate quality of Section 2 | Weight | Raw Score | Weighted Raw Score | Source of Data
4 Appropriateness of the distribution of weight among objectives | 15 | 90 | 13.5 | See guidance below
5 Quality of targets for the respective Success Indicators in the RFD | 15 | 90 | 13.5 | Table 9
Quality Rating for Section 2 = 87.9
For each "action" specified in Column 3, the department must specify one or more "success indicators", also known as "Key Performance Indicators (KPIs)" or "Key Result Indicators (KRIs)." A success indicator provides a means to evaluate progress in
implementing the policy, programme, scheme or project. Sometimes more than one success indicator may be required to tell the entire story. Success indicators should capture both the qualitative and the quantitative aspects of departmental performance.
3. Quality / nature of Success Indicators (SIs)
In selecting success indicators, duplication should be avoided. The usual chain for delivering results and performance is depicted in Figure 5, and an example of this results chain is given in Figure 6. If we use an Outcome (e.g. increased literacy) as a success indicator, then it would be duplicative to also use the corresponding inputs and activities as additional success indicators. Ideally, one should have success indicators that measure Outcomes and Impacts; sometimes, however, for lack of data one is only able to measure activities or outputs. The common definitions of these terms are as follows:
i. Inputs: the financial, human and material resources used for the development intervention.
ii. Activities: actions taken or work performed through which inputs, such as funds, technical assistance and other types of resources, are mobilized to produce specific outputs.
iii. Outputs: the products, capital goods and services that result from a development intervention; they may also include changes resulting from the intervention that are relevant to the achievement of outcomes. Sometimes outputs are divided into two sub-categories: internal and external. "Internal" outputs are those over which managers have full administrative control. For example, printing a brochure is considered an internal output, as it involves spending budgeted funds on hiring a printer and placing an order to print a given number of brochures.
Figure 5: Typical Results Chain (Results-Based Management)
Inputs (financial, human and material resources) → Activities (tasks personnel undertake to transform inputs into outputs) → Outputs (products and services produced) → Outcomes (intermediate effects of outputs on clients) → Goal / Impacts (long-term, widespread improvement in society). Inputs through Outputs constitute Implementation; Outcomes and Impacts constitute Results.
Figure 6: An Example of a Results Chain (Results-Based Management: Adult Literacy)
Inputs (facilities, trainers, materials) → Activities (literacy training courses) → Outputs (number of adults completing literacy courses) → Outcomes (increased literacy skill; more employment opportunities) → Goal / Impacts (higher income levels; increased access to higher-skill jobs).
All actions
required to print a brochure are fully within the manager's control and, hence, this action is considered an "internal" output. However, having these brochures picked up by the targeted groups and, consequently, making the desired impact on the target audience would be an example of an external output. Thus, actions that exert influence beyond the boundaries of an organization are termed "external" outputs.
iv. Outcomes: the likely or achieved short-term and medium-term effects/impact of an intervention's outputs.
The quality score for SIs is calculated as shown in Table 6 below:
Table 6: Calculation of Quality Score for SIs
Criterion | Weight | Raw Score | Weighted Raw Score | Data flows from REM Table
1 Outcome-orientation of Success Indicators | 0.90 | 80 | 72.0 | Table 7
2 Quality-orientation of Success Indicators | 0.10 | 90 | 9.0 | Table 8
Quality Rating for SIs = 81.0
The outcome-orientation of success indicators is calculated as shown in Table 7:
Table 7: Calculation of Outcome-Orientation of Success Indicators
(Each SI is classified on the results chain; the criteria values are: Outcome = 100%, External Output = 90%, Internal Output = 80%, Activity = 70%, Input = 60%.)
Success Indicator | Weight | Raw Score | Weighted Raw Score
1 Success Indicator 1 | 0.3 | 80 | 24
2 Success Indicator 2 | 0.3 | 100 | 30
3 Success Indicator 3 | 0.2 | 70 | 14
(Table 7 is continued on the next page. See the guidance above.)
Table 7 (concluded from the previous page):
Success Indicator | Weight | Raw Score | Weighted Raw Score
4 Success Indicator 'N' | 0.2 | 60 | 12
Quality Rating for Outcome-Orientation of Success Indicators = 80
The quality-orientation of success indicators is calculated as follows:
Table 8: Calculation of Quality-Orientation of Success Indicators
(Criteria values: 5 SIs = 100%, 4 SIs = 90%, 3 SIs = 80%, 2 SIs = 70%, 1 SI = 60%.)
Criterion | Weight | Raw Score | Weighted Raw Score
1 Number of indicators explicitly measuring the quality of Government performance | 100 | 90 | 90
Rating for Quality-Orientation of Success Indicators = 90
4. Is the distribution of weight among objectives appropriate to capture the relative emphasis required for achieving the Mission and Vision of the organization?
Objectives in the RFD (Column 1 of Figure 4) should be ranked in descending order of priority according to their degree of significance, and specific weights should be attached to them. The Minister in charge ultimately has the prerogative to decide the inter se priorities among departmental objectives and all weights. If there are multiple actions associated with an objective, the weight assigned to that objective should be spread across the relevant success indicators.
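Putting Tables 6-8 together: each SI is rated by its rung on the results chain, the per-SI ratings are averaged using the SI weights (Table 7), and the result is blended 90/10 with the quality-orientation score (Tables 6 and 8). A Python sketch using the tables' illustrative numbers (the function names and dictionary are this sketch's own):

```python
# Ratings implied by Table 7's column headings.
CHAIN_RATING = {"outcome": 100, "external output": 90,
                "internal output": 80, "activity": 70, "input": 60}

def outcome_orientation(sis):
    """Table 7: weighted average of per-SI ratings; sis = [(weight, level), ...]."""
    return sum(w * CHAIN_RATING[level] for w, level in sis)

def si_quality_rating(outcome_score, quality_score):
    """Table 6: 90% weight on outcome-orientation, 10% on quality-orientation."""
    return 0.90 * outcome_score + 0.10 * quality_score

# Table 7's illustrative SIs score 80; with the quality-orientation score
# of 90 from Table 8, this gives the SI quality rating of 81 used in Table 5.
oo = outcome_orientation([(0.3, "internal output"), (0.3, "outcome"),
                          (0.2, "activity"), (0.2, "input")])
print(round(si_quality_rating(oo, 90), 1))  # 81.0
```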
5. What is the quality of targets for the respective Success Indicators in the RFD?

Targets are tools for driving performance improvements. Target levels should, therefore, contain an element of stretch and ambition. However, they must also be achievable. Targets for radical improvement may generate a level of discomfort associated with change, but excessively demanding or unrealistic targets can have a longer-term demoralising effect. The target for each SI is presented on the five-point scale given below:

Figure 7: Target of SI

  Excellent   Very Good   Good   Fair   Poor
    100%         90%       80%    70%    60%

It is expected that budgetary targets would be placed in the 90% (Very Good) column. For any performance below 60%, the department gets a score of 0%. Table 9 summarises the criteria for judging the quality of targets:

Table 9: Rating for Quality of Targets for each SI

  #  Criteria to evaluate Quality of Targets for SIs       Weight   Raw Score   Weighted Raw Score   Source of Data
  1  Consistency with Planning Commission / MOF Targets      70        90              63            See Figure 8
  2  Degree of Stretch                                       30       100              30            See Figure 8
     Rating for Quality of Targets = 93

The following figure provides guidelines for evaluating the degree of consistency of targets and for establishing the degree of stretch (challenge) built into them. The consistency criterion is rated on the standard scale: Excellent = 100% of targets are consistent, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%.
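The five-point scale in Figure 7 converts achievement against the column targets into a raw score. The text fixes only the column values and the below-60% cut-off; how achievement falling between two columns is scored is not spelled out here, so the linear interpolation below is an illustrative assumption, not the official rule:

```python
# The five scale columns and their scores, from Figure 7.
SCALE = [("Excellent", 100), ("Very Good", 90), ("Good", 80),
         ("Fair", 70), ("Poor", 60)]

def raw_score(achieved, targets):
    """Score an achieved value against per-column targets.

    targets maps each column name to its target value and is assumed
    strictly descending from Excellent to Poor. Achievement below the
    Poor (60%) column scores 0, as stated in the text; interpolation
    between columns is an assumption of this sketch.
    """
    if achieved >= targets["Excellent"]:
        return 100.0
    if achieved < targets["Poor"]:
        return 0.0
    for (hi_col, hi_score), (lo_col, lo_score) in zip(SCALE, SCALE[1:]):
        t_hi, t_lo = targets[hi_col], targets[lo_col]
        if t_lo <= achieved < t_hi:
            frac = (achieved - t_lo) / (t_hi - t_lo)
            return lo_score + frac * (hi_score - lo_score)
    return 0.0  # unreachable when targets are strictly descending

# Hypothetical targets for one SI, expressed as percentages.
targets = {"Excellent": 100, "Very Good": 90, "Good": 80,
           "Fair": 70, "Poor": 60}
```

For example, an achievement of 95 against these targets would fall halfway between the Very Good and Excellent columns and score 95 under this interpolation.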
Figure 8: Guidelines for Evaluating the Degree of Consistency of Targets

  Criterion            Excellent           Very Good           Good                Fair                Poor
  Degree of Stretch    100% of targets     90% of targets      80% of targets      70% of targets      60% of targets
                       are challenging     are challenging     are challenging     are challenging     are challenging

To establish whether targets are consistent with Planning Commission / MOF targets, departments will need to show evidence. For major targets, this can be ascertained from the Annual Plan, the 12th Plan documents, the approved Demand for Grants, and the Outcome Budget. To determine whether targets are challenging, one has to use one's judgment and look at many sources of information. Clearly, the data in Section 3 would be among the most useful sources for this purpose.

The summary table for the calculation of the quality of Section 2 of the RFD is reproduced below for reference.

Table 5: Distribution of Weight and Sample Calculation of Quality Rating for Section 2

  #  Criteria to evaluate Quality of Section 2                                         Weight   Raw Score   Weighted Raw Score   Source of Data
  1  Extent to which actions (Column 3 of RFD) adequately capture objectives             15        90            13.5            See above for guidance
  2  Extent to which success indicators (Column 4 of RFD) adequately capture actions     15       100            15.0            See above for guidance
  3  Quality / nature of Success Indicators (SIs)                                        40        81            32.4            Table 6
  4  Appropriateness of distribution of weight among objectives                          15        90            13.5            See above for guidance
  5  Quality of targets for respective Success Indicators in RFD                         15        90            13.5            Table 9
     Quality Rating for Section 2 = 87.9

X. EVALUATION OF SECTION 3 OF RFD

For every success indicator and the corresponding target, Section 3 of the RFD provides target and actual values for the past two years and projected values for the two years ahead (reproduced below). Comparing the target values for the past two years with the actual values is expected to help in assessing the robustness of the target value for the current year. However, one cannot begin to evaluate this robustness without the data in Section 3. Table 10, therefore, measures the degree to which the data for Section 3 has been provided in the RFD.

Figure 9: Section 3 of the RFD - Trend Values for Success Indicators

  Objective    Actions   Success     Unit   Actual Value   Actual Value    Target Value   Projected Value   Projected Value
                         Indicator          for FY 11/12   for FY 12/13    for FY 13/14   for FY 14/15      for FY 15/16
                                                           (anticipated)
  Objective 1  Action 1
               Action 2
               Action 3
  Objective 2  Action 1
               Action 2
               Action 3
  Objective 3  Action 1
               Action 2
               Action 3

To evaluate the quality of Section 3 of the RFD we have agreed to use the following criterion:
Table 10: Calculation of Quality Rating of Section 3 of the RFD

  #  Criterion to evaluate Quality of Section 3   Weight   Raw Score   Weighted Raw Score   Source of Data
  1  Percentage of data populated                  100        90             90             See above for guidance
     Quality Rating for Section 3 = 90

Percentage of Data Populated

This is a basic requirement for any management effort. Unless the trend values for the previous and future years are available, it is difficult to assess whether the targets in the RFD are challenging. To assess the value under this criterion, we count the total number of cells that contain data. For each success indicator there should be 5 data values: the data for the previous 2 years, the current-year target value, and the data for the 2 future years.

Figure 10: Formula for Calculating the Percentage of Data Populated

  Percentage of data populated = Number of cells containing data / (Total number of SIs x 5)

There are only two legitimate reasons for not having data in a particular cell of Section 3:
1. The success indicator is being used for the first time and no records were maintained in this regard in the past.
2. The values are dates; hence they do not represent a trend and using them is not meaningful.

XI. EVALUATION OF SECTION 4 OF RFD

The RFD is a public document and hence it must be easily understood by a well-informed average stakeholder. Towards this end, the RFD contains a section giving detailed definitions of the various success indicators and the proposed measurement methodology. Abbreviations/acronyms and other details of the relevant schemes are also expected to be listed in this section.
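The Section 3 data-coverage formula in Figure 10 above is straightforward to compute. A minimal sketch, representing the trend table as one list of five cells per success indicator with None marking an empty cell (a hypothetical representation, not the RFMS data format):

```python
def percent_data_populated(trend_rows):
    """Percentage of populated cells in a Section 3 trend table.

    trend_rows holds one list per success indicator, each with the 5
    expected values: two past years, the current-year target, and two
    projected years. None marks an empty cell.
    """
    total_cells = len(trend_rows) * 5
    filled = sum(1 for row in trend_rows for cell in row if cell is not None)
    return 100.0 * filled / total_cells

# Hypothetical trend data for three success indicators
rows = [
    [12, 14, 15, 16, 18],    # fully populated
    [None, 9, 10, 11, 12],   # no record kept for the earliest year
    [None, None, 5, 6, 7],   # SI used for the first time this year
]
coverage = percent_data_populated(rows)  # 12 of 15 cells -> 80.0
```

A cell left empty for one of the two legitimate reasons listed above would still count against this percentage unless the evaluator excludes it first; how such exemptions are netted out is left to the REM process.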
Wherever appropriate and possible, the rationale for using the proposed success indicators should be provided, as per the format in the RFMS. Figures 11 and 12 give a sample of Section 4 data from the Department of AYUSH (Ayurveda, Yoga and Naturopathy, Unani, Siddha and Homoeopathy).

Figure 11: Sample of Select Acronyms from Section 4 of the Department of AYUSH RFD

  Acronym    Description
  ASU        Ayurveda, Siddha and Unani
  ASUDCC     Ayurveda, Siddha and Unani Drugs Consultative Committee
  ASUDTAB    Ayurveda, Siddha and Unani Drugs Technical Advisory Board
  AYUSH      Ayurveda, Yoga and Naturopathy, Unani, Siddha and Homoeopathy
  CHCs       Community Health Centres
  COE        Centre of Excellence
  D and C    Drugs and Cosmetics

Figure 12: Sample of SI Definition and Measurement Methodology from Section 4 of the AYUSH RFD

  Success Indicator: [1.1.1] Primary Health Centres / Community Health Centres / District Hospitals covered
  Description: Completion of infrastructure, equipment, furniture and provision of medicines for the co-located AYUSH Units of Primary Health Centres (PHCs), Community Health Centres (CHCs) and District Hospitals (DHs).
  Definition: Co-located AYUSH Health Care Units at Primary Health Centres (PHCs), Community Health Centres (CHCs) and District Hospitals (DHs) imply facilities for the provision of AYUSH health services along with allopathic health services.
  Measurement: Number of Units.
  General Comments: As per approved norms, assessment of needs will be measured through the appraisal of the Programme Implementation Plan (PIP) of the State Governments, and the outcomes shall be monitored through progress reports and periodical reviews.

To evaluate Section 4 of the RFD we have agreed to use the following criteria:
1. Have all acronyms used in the body of the RFD been explained in simple layman's terms?
2. Have necessary explanations and justifications been given for using a particular type of success indicator, where required?
3. If so, what is the quality of the explanations?
Table 11 below shows the distribution of weight across these criteria and an illustrative calculation.

Table 11: Distribution of Weight among Criteria and Illustrative Calculation of Quality of Section 4 of the RFD
(Criteria values: Excellent = 100%, Very Good = 90%, Good = 80%, Fair = 70%, Poor = 60%.)

  #  Criteria to evaluate Quality of Section 4          Weight   Raw Score   Weighted Raw Score   Source of Data
  1  All acronyms have been explained                     0.1       100            10             See above for guidance
  2  Necessary explanations have been given for SIs       0.5        90            45             See above for guidance
  3  Quality of explanations                              0.4        80            32             See above for guidance
     Quality Rating for Section 4 = 87

XII. EVALUATION OF QUALITY OF SECTION 5 IN RFD

Section 5 of the RFD should contain expectations from other departments that impact the department's performance and are critical for the achievement of the selected Success Indicators. These expectations should be stated in quantifiable, specific and measurable terms. Care should be taken while recording them, as they will be communicated to the relevant Ministry/Department and should not be vague or general in nature. They should be given as per the new format incorporated in the RFMS.

Figure 13: Sample Section 5 from RFD

  Location Type | State | Organization Type | Organization Name | Relevant Success Indicator | What is your requirement from this organization | Justification for this requirement | Please quantify your requirement from this organization | What happens if your requirement is not met
  (Data for this table will be provided by the Department; not to be entered for REM.)

To evaluate Section 5 of the RFD we have agreed to use the following criteria:
1. Are the claims of dependencies/requirements from other departments appropriate?
2. Are the requirements from other departments / claims of dependencies specific?
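One practical wrinkle worth noting: the evaluation tables express weights in two conventions, fractions summing to 1 (Tables 6 and 11) and percentage points summing to 100 (Tables 5, 9 and 10). A small sketch that normalises either convention before computing a rating, checked against the illustrative values of Tables 11 and 9:

```python
def quality_rating(criteria):
    """Weighted rating for a list of (weight, raw_score) pairs.

    Weights may be fractions summing to 1 or percentage points summing
    to 100; dividing by their total handles both conventions uniformly.
    """
    total_weight = sum(w for w, _ in criteria)
    if total_weight == 0:
        raise ValueError("weights must not sum to zero")
    return sum(w * raw for w, raw in criteria) / total_weight

# Table 11 (fractional weights): 0.1*100 + 0.5*90 + 0.4*80 = 87
section4_rating = quality_rating([(0.1, 100), (0.5, 90), (0.4, 80)])

# Table 9 (percentage-point weights): (70*90 + 30*100) / 100 = 93
targets_rating = quality_rating([(70, 90), (30, 100)])
```

Normalising by the weight total also guards against a malformed RFD whose weights do not sum to the intended 1 or 100, which would otherwise silently scale the rating.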