1 Introduction
1.1 Background
1.2 Purpose of This Paper
1.3 Methodology and Limitations
2 Results of Analysis
2.1 Performance Reporting (DPRs) Progress Overall
2.2 Findings by Reporting Principles
2.2.1 Does the report provide a coherent and balanced picture of performance
information?
2.2.2 Focus on outcomes, not outputs
2.2.3 Associate performance with earlier commitments, and explain any changes
2.2.4 Set performance in context
2.2.5 Link resources to outcomes
2.2.6 Does this report provide factual, independently verifiable performance information?
3 Government-wide Initiatives and Management Issues
4 Conclusions
Appendix A: Figures
1 Introduction
1.1 Background
Each year,
the government prepares Estimates in support of its request to Parliament for authority to spend public monies. This request is
formalized through the tabling of appropriation
bills in Parliament.
The
Estimates of the Government of Canada are structured in several parts.
Beginning with an overview of total government spending in Part I, the
documents become increasingly more specific. Part II outlines spending
according to departments, agencies and programs and contains the proposed
wording of the conditions governing spending which Parliament will be asked to
approve.
Reports on Plans and Priorities (Part III) are individual
expenditure plans for each department and agency (excluding Crown corporations)
that elaborate on, and supplement, the information contained in Part II. They
provide increased levels of detail on a business line basis and contain
information on objectives, initiatives and planned results, including links to
related resource requirements over a three-year time horizon.
The
Departmental Performance Report (Part III) provides a focus on
results-based accountability by reporting on accomplishments achieved against
the performance expectations and results commitments as set out in the Report
on Plans and Priorities.
Departmental performance reports
play a key role in the cycle of planning, monitoring, evaluating, and reporting of results
through ministers to Parliament and citizens. They focus on outcomes - benefits
to Canadians and Canadian society - and describe the contribution the
organization has made toward those outcomes.
The
Estimates, along with the Minister of Finance's Budget, reflect the
government's annual
budget planning and resource allocation priorities. In combination with the
subsequent reporting of financial results in the Public Accounts and of
accomplishments achieved in Departmental Performance Reports, this material
helps Parliament hold the government to account for the allocation and
management of funds.
1.2 Purpose of this Paper
This is the second year in which a review of all DPRs has been undertaken. This report provides an overview of the
status of performance reporting as reflected in the DPRs for the period ending
March 31, 2002, and of the changes from last year, based on an assessment of
all DPRs. The
analysis focuses on how well the eighty-six 2002 DPRs followed the
reporting principles in the TBS DPR Preparation Guide. The paper is organized with an overview of
the analysis for all principles for all departments, followed by a more
detailed analysis for each principle in the guide. The discussion of each principle contains an analysis of how the
departments followed the principle or guideline, including a discussion of the
changes over the previous year.
This paper was
prepared with two aims in mind:
- to provide an assessment of
the overall progress that is being made in performance reporting, particularly
with respect to identifying good practices that can be shared and areas where
improvements are needed; and
- to assist TBS in providing feedback to the departments
on their adherence to the guidelines.
1.3 Methodology and
Limitations
The
TBS Guidelines for Preparing Departmental Performance Reports 2001-2002 listed six principles that the departments were expected
to follow. The features discussed under
each principle (i.e. those features that would help the department adhere to
the principle) were rated for each DPR, along with the overall question
relating to the principle. The review
did not assess the factual content of the reports, but rather their reporting
aspects. The assessment represents the
views and judgement of the assessors. The process for the 2002 DPRs differed in a few respects
from last year's exercise with the 2001 reports:
- an updated assessment form
was agreed between the TBS project authority and the consultants. This form included several new questions,
and some clarifications of wording as well.
In particular, the following questions were added:
Does the
report show that performance information has influenced, or been incorporated
into the decision-making process?
Does the
report link to government priorities and provide a good understanding of the
significance of the department's outcomes in that context?
Does the
department complete the required financial tables provided in the annex to the
guide? Is this done in a manner that is
understandable to the reader?
- It was agreed to write the assessment for each DPR presenting overall comments, strengths and weaknesses, based on each principle and sub-question. This was done for all departments.
- It was decided that Performance Reports for the period ending March 31, 2002 would be scored according to the following scale:
5 = Outstanding/Excellent
4 = Very Good
3 = Satisfactory
2 = Needs Improvement
1 = Poor/Weak
0 = No Evidence
N/A = Not Applicable
?? = Could not determine
This five-point scale replaced the three-point scale used last year. It permitted
us to recognize improvement in departments whose DPRs were poor or weak last
year according to the various criteria and better, though not yet fully
satisfactory, this year. It also allowed us to recognize departments whose DPRs
were satisfactory last year and even better, though not outstanding or
excellent, this year.
The ratings
for the 2002 reports were compared to the 2001 reports, in order to determine
if, and where, progress had been made.
However, to do this, the 2001 ratings were converted from their three-point
scale to the five-point scale used in 2002.
While the conversion was tempered using best judgement based on
familiarity with the reports, it produces a best approximation, rather than a
precise score. That factor, plus the
slight change in the rating tool questions for 2002, means that the comparisons
should be considered indicative only.
While the overall trends are likely reliable, comparison of individual
department changes from year to year should be interpreted with caution.
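The conversion described above can be pictured with a minimal sketch. The numeric mapping from the 2001 three-point categories onto the 2002 five-point scale shown here is an assumption chosen purely for illustration; the actual conversion relied on assessor judgement and familiarity with the reports, and its results are indicative only.

```python
# Illustrative sketch only: the three-point to five-point mapping below is an
# assumption for demonstration, not the actual conversion, which was tempered
# by assessor judgement and familiarity with the reports.

THREE_TO_FIVE = {
    "Poor": 1.5,          # assumed: between "Poor/Weak" (1) and "Needs Improvement" (2)
    "Satisfactory": 3.0,  # assumed: maps directly to "Satisfactory"
    "Excellent": 4.5,     # assumed: between "Very Good" (4) and "Excellent" (5)
}

def convert_2001_rating(three_point_rating: str) -> float:
    """Map a 2001 three-point rating onto the 2002 five-point scale."""
    return THREE_TO_FIVE[three_point_rating]

def indicative_change(rating_2001: str, score_2002: int) -> float:
    """Approximate year-over-year change for one department and criterion."""
    return score_2002 - convert_2001_rating(rating_2001)

# Example: a criterion rated "Poor" in 2001 and 3 ("Satisfactory") in 2002
# shows an indicative improvement of about 1.5 points.
print(indicative_change("Poor", 3))
```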
2 Results of Analysis
2.1 Performance
Reporting (DPRs) Progress Overall
In general,
the DPRs are better than last year, though there is still progress to be
made. As was the case last year, every
department has something to improve in its performance reporting; some
departments have more to improve than others.
However, it is evident that the departments are taking the requirement
to report on performance seriously.
Most of the departments reviewed seem to have made some effort to follow
the 2001-2002 DPR guidelines. In some
cases where the report is still not up to standard, it appears that the
department is revamping or setting up a new performance measurement and
reporting system, and all elements to produce an adequate performance report
may not yet be in place. In other
departments, though, there is still a considerable distance between what is in
the report and what the guidelines would suggest.
A comparison of the overall ratings of the DPRs from 2001 to
2002 showed that about one-quarter of the departments
showed enough improvement overall in their reports to move up at least one
point on the rating scale. (It must be
remembered that because the 2002 reports were rated on a 5-point scale, while
the 2001 DPRs were rated on a 3-point scale and converted to the 5 point scale
for comparison, and because the rating tool was changed slightly between the
two years, these comparisons should be taken as indicative only.) A few departments' scores declined over last
year. Some of these reports were rated
lower because, although they have an excellent performance measurement
framework and context, they contained very little actual performance
information other than anecdotal information.
Others do not relate the performance information to either earlier
planned results or to strategic outcomes.
Still others concentrated largely on operational and output information
without indicating the contribution to broader outcomes. While most departments scored the same as
last year, many had made improvements in some areas, although not enough to
affect the overall rating.
Areas of
weakness are largely the same as those found last year: the absence of measurable planned results
against which to report performance, the failure to associate resource
expenditures to outcomes, and the general lack of information about data
sources, validity and reliability.
These weaknesses are common to almost all departments.
The
understanding of the performance paradigm seems to be improving. Its technical terms are more likely to be
used - and used correctly - than was the case last year, although the terms outcome,
planned result, and key results commitments are still used almost
interchangeably, and what they refer to often varies from department to
department. Another improvement over
last year is that most of the required elements of performance reporting are
present, at least to some degree. As
discussed below, there are far fewer occurrences of "no evidence" in this year's scores as compared with last
year's. Notable exceptions are the
general absence of performance indicators, the lack of information about
changes in planned results, little information about the use of performance
information to learn or in decision-making, and the scarcity of information on
data reliability or validity.
We noted
last year that the success of performance reporting in a department appears to be
correlated with evidence that the information is used for decision-making or
program improvements. This conclusion
emerges even more clearly from this year's review. Departments whose DPRs were rated satisfactory to very good
generally scored high on questions 2.3 (use of performance information to
learn) and 3.3 (using performance information for decision-making). In most of these cases, the reporting has a
strategic quality to it, (i.e. it is part of a larger departmental strategy to
achieve particular outcomes) and the focus is on outcomes, with a clear
indication of the logic between what a department does and produces, and how
this contributes to the outcomes. In
other cases, where the DPR generally was rated lower, and the information was
not used, information is presented as a report on past events and what the
department had produced in the past year, with little attention paid to
outcomes and the department's role in contributing to them. It remains the case that departments that
appear to have performance measurement and reporting integrated to some degree
into their management systems (i.e. through use for decisions or program
improvement) did better than those for whom performance reporting appears to be
a more isolated task, disconnected from the management cycle.
This year,
there is a greater recognition in the reports that government organizations
contribute to public benefits in collaboration with other organizations, public
and private, rather than produce them directly by themselves. With this acknowledgement, departments are
able to include longer-term outcomes within their scope of reporting, and to
report results that are lower than expected by recognizing the shared
responsibility for them. The result is
more balance in the reports, including weak as well as strong performance.
Although
last year we found no departmental attributes associated with particular
problems in reporting performance, we did find this year that the smaller
departments and agencies had made less progress than the larger departments,
and generally scored lower for reporting overall. The average overall score of all large departments was 2.5, while
the average overall score for the smaller departments and agencies was
2.1. Last year's finding (see the TBS website)
that specific concepts and sections of the departmental performance report
(e.g. client satisfaction, risk, societal indicator) vary widely in importance
and meaning depending on the nature of the organization, and sometimes by
business line within the same organization, also remains valid.
Finally, one
area that has not improved is that most DPRs still appear to be presented as
annual reports, with a focus on activities carried out that year, rather
than on planned results accomplished as of March 31, 2002, regardless of
when they were initiated.
Of the
reports reviewed, those of the Canada Economic Development for Quebec Regions,
the Canadian Human Rights Tribunal, the Canadian International Development
Agency, the National Library, the National Parole Board, Natural Resources
Canada, the Office of the Auditor General, the Office of the Information
Commissioner, and Parks Canada were rated among the highest, being rated "4",
or "Very Good" overall. Though there is
room for improvement in each of these reports, and none scored consistently
high across all principles, they do provide good examples of clear, coherent
and relatively complete performance reporting.
No report
was rated "Excellent" overall, although our assessment found parts of some
reports were outstanding. For example:
- CIDA's and Natural Resources Canada's focus on outcomes that benefit Canadians and Canadian society,
- ACOA's, CEDQ's, CCRA's and the National Library's relation of performance to planned results,
- CIDA's, Correctional Service Canada's, the National Archives', the National Energy Board's, the National Film Board's and Parks Canada's discussion of context,
- Transport Canada's and Veterans Affairs Canada's relation of resources to outcomes, and
- the Office of the Auditor General's, Indian and Northern Affairs Canada's, Natural Resources Canada's and Parks Canada's discussion of methodology and data reliability.
No department was rated outstanding for the first principle
-- Providing a
coherent, balanced picture of performance that is brief and to the point.
2.2 Findings by Reporting Principles
Pursuant to
the TBS Guidelines, the six reporting principles were the basis of the main
criteria for the assessment exercise:
1) Does the report provide a coherent and balanced picture of performance information?
2) Does the report focus on outcomes that benefit Canadians and Canadian society?
3) How effectively does the report associate performance with earlier commitments, discussing any performance gaps?
4) On the basis of this report, how well do you understand the department's role and operating environment?
5) Does the report demonstrate the value of departmental performance by linking outcomes achieved with resources expended?
6) Does this report provide information that allows the reader to make informed decisions about the reliability of the performance information?
Each of these principles was further broken down into three
or four factors which contribute to the principle. The reports were assessed on each of the principles, and on the
component factors. (It should be
noted, however, that the principles are not equally applicable across all
departments. For example, a focus on
outcomes is more relevant to a policy department than to one that is strictly
providing a service.) This section reviews the ratings of the DPRs
for both of these elements, to determine the state of reporting with
respect to the principles and component factors, and to determine what
progress, if any, has been made since last year, and where.
In comparing the ratings for the principles and factors,
there were two substantial changes in the ratings between the 2001 and
2002 reports:
- the decline in the number of factors for which there was no evidence,
- the general upward shift of the ratings.
Figure 1 (see Appendix A)
shows the distribution of all scores for 2002, compared to 2001. Last year's review of the performance
reports frequently found no evidence of the factor being sought. In fact, fully half of the ratings last year
indicated that the factor being rated was not present in the report. For the 2002 reports, the ratings of "no
evidence" declined to about 15%.
Figure 1 also clearly illustrates how the ratings have
improved between 2001 and 2002 with a much higher proportion of the scores in
the higher categories. While the
percentage of scores in the "poor" category actually increased, analysis showed
that those criteria rated "poor" for a department this year often had improved,
from having "no evidence" of the criteria last year. Similarly, those rated "poor" last year frequently rated higher
this year. (Although
such an improvement would not show up in this chart, because the scores of "1"
(poor/weak) and "2" (needs improvement) on the 5-point scale from this year's
assessment were collapsed to be comparable with the "poor" rating of last year,
on the 3-point scale.) The proportion of scores in the other categories increased - from just over 1
in 10 rated satisfactory in 2001 to almost twice that in 2002, and from 3%
rated good-excellent in 2001 to over 20% in 2002.
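As an illustration of how the Figure 1 comparison is assembled, the sketch below collapses a set of five-point scores into the broader categories used for the chart and computes their percentage distribution. The sample scores and the exact category boundaries are assumptions for demonstration; the report itself states only that scores of 1 and 2 were collapsed into the "poor" category for comparability.

```python
from collections import Counter

# Illustrative sketch only: the scores below are invented sample data,
# not actual assessment results.
scores_2002 = [0, 1, 1, 2, 2, 3, 3, 3, 4, 5]  # five-point scale, 0 = no evidence

def collapse(score: int) -> str:
    """Collapse a five-point score into the categories used for the chart.
    The treatment of 0, 3, 4 and 5 is an assumed grouping; the text states
    only that 1 (poor/weak) and 2 (needs improvement) were combined."""
    if score == 0:
        return "No Evidence"
    if score <= 2:
        return "Poor"          # 1 and 2 collapsed, as described in the text
    if score == 3:
        return "Satisfactory"
    return "Good-Excellent"    # assumed grouping of 4 and 5

counts = Counter(collapse(s) for s in scores_2002)
total = sum(counts.values())
for category in ("No Evidence", "Poor", "Satisfactory", "Good-Excellent"):
    print(f"{category}: {100 * counts[category] / total:.0f}%")
```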
Figure 2 shows the average score
for each principle in the 2001 and 2002 DPRs.
The average score over all the principles has moved from 1.2 (poor) in
the 2001 DPRs to 2.2 (needs improvement) in the 2002 reports. We also examined the proportion of ratings
for each principle that have improved, stayed the same and declined. Principles 3 (reporting against outstanding
commitments) and 4 (explain strategic context) showed the most improvement,
followed closely by principle 5 (relating resources to outcomes). It must be noted, however, that there is
little relation between the degree of improvement and the rating of the
principle. For example, while Principle
3 improved the most, it did so from a very poor overall rating last year, and
still is not well done by departments.
In general, it improved from an overall position of mostly "no evidence"
with some "poor" last year to an average of "poor/needs improvement" this year
(although the range is significant with a few "no evidence" ranging to some
"excellent"). In comparison, Principle
4, which showed only slightly less improvement, improved from an average
position of "needs improvement" (although with significant numbers of ratings
of both "poor" and "satisfactory") to "satisfactory" overall (with almost no
ratings of "poor" and a number of "very good" and "excellent" ratings).
Findings for
each of these main criteria are reported below.
2.2.1 Does the report provide a
coherent and balanced picture of performance information?
Guideline
Principle 1 - Provide a coherent and balanced picture of performance that is
brief and to the point:
This
criterion was scored according to the reviewer's judgment of the DPR as a
whole. There were three sub-criteria:
- Does the
report present coherent performance information in a complete performance
story?
- To what
extent does this report describe performance problems and shortcomings as well
as accomplishments?
- Does this
report provide results by strategic outcomes identified by the department?
Very few
DPRs tell a complete and coherent performance story in which the reader can
determine what the department had planned to achieve, what it did achieve, and what
it is doing to improve performance in those areas where performance did not
meet expectations. Though most
departments still have a long way to go, the 2001-2002 DPRs are distinctly
better than the previous year's in telling the performance story. The problems remain the same, however. There is a tendency to use the vocabulary of
results-based management and performance reporting without applying the
performance paradigm and its focus on outcomes, reported against planned
results. Few departments report
obstacles, risks and shortcomings, or discuss corrective action to be
taken. Although in many cases, the
organization of the report is by business line, and what is reported are
activities rather than results, this sub-criterion was better adhered to than
the other two, with the description of shortcomings being the least adhered to.
The ratings for this principle were
distributed across the full range of possible scores for the DPRs assessed. No department presented an "excellent" and
coherent performance story, although Canada Economic Development for Quebec
Regions, the Canadian Human Rights Tribunal, CIDA, National Parole Board,
Natural Resources Canada, the Office of the Auditor General, Office of the
Information Commissioner and Parks Canada were rated "very good" on this
principle.
2.2.2 Focus on outcomes, not outputs
Guideline
Principle 2 - Focus on outcomes, not outputs:
This
criterion was scored on the extent to which the achievements reported were
outcomes, rather than outputs or activities.
There were three sub-criteria:
- Does the
report show the logical connections between resources, activities, outputs and
the outcomes toward which these contribute?
- Does the
report explain how achievements contribute to longer-term strategic outcomes?
- Does the
report demonstrate that the organization makes use of performance measurement
information to learn and improve?
There is a
much stronger focus on outcomes this year, although a lot of reports are still
largely focussed on activities, outputs, and immediate or intermediate
outcomes. By contrast with last year,
nearly all the DPRs attempted to define strategic outcomes, though not
all used that term and not all succeeded.
However, most of them did not break these down into more direct
outcomes, which could be shown to follow from the activities performed and
planned results achieved. Most reports
that did define strategic outcomes went on to talk about activities and
outputs--without attempting to measure or show that the desired outcomes were
coming to pass, and without explaining why their activities and outputs would
lead to the desired outcomes. While the
sub-criterion of linking achievements to longer-term strategic outcomes was done
better than the other two, there was a scarcity of discussion
or logic models to explain how activities and outputs might plausibly lead to
the desired outcomes.
Only a very
few departments had gotten so far as to define and justify performance
indicators for their strategic outcomes, and few of these (CED(Q), INAC,
Canadian Human Rights Tribunal, CIDA, Natural Resources Canada and Parks
Canada) actually reported any outcome performance data, let alone demonstrated
that the collection and use of such data was an integral part of departmental
management, strategic planning and resource allocation. The "performance indicators" were
often simply re-statements of the strategic outcome, rather than an attempt to
specify what success would look like and how it would be recognized and
measured. Only NRCan and the Office of
the Information Commissioner were rated "excellent" on this sub-criterion.
Of all the
DPRs, only CIDA and NRCan were rated "excellent", and the ratings were fairly
heavily distributed across the bottom half of the range, with an overall
average rating of 2.3, or "needs improvement".
Canada Economic Development for Quebec Regions, the National Parole
Board, Parks Canada, the Office of the Auditor General, and the Canadian Human
Rights Tribunal were all rated "very good" on this principle, often because
they had a logic chart or discussion to show how their activities and outputs
were linked to achievements and to the Strategic Outcomes. As well, they tended to discuss the areas
where they did not achieve their expected level of performance and what they
planned to do about it.
2.2.3 Associate performance with earlier
commitments, and explain any changes.
Guideline
Principle 3 - Associate performance with earlier commitments, and explain any
changes:
This
criterion was scored on the report's effectiveness in associating performance
with earlier planned results, and discussing any performance gaps. There were three sub-criteria:
- Does the
report compare performance accomplishments to commitments made in previous RPPs?
- Does the
report identify changes (if any) to
commitments made in previous RPPs, providing credible explanation for such
changes?
- Does the
report show that performance information has influenced, or been incorporated
into the decision-making process?
This
principle was the most problematic for the departments, with very few
presenting their accomplishments against their planned results as specified in
the RPP in a clear and systematic way.
In the federal context, effective performance reporting requires that
the performance story be readily comparable with commitments framed in RPPs
going back over at least a 3-year period and, in many cases, for much
longer. Performance was seldom reported
against planned results, and there were infrequent references to the
commitments made in the RPP. In some
cases these were noted, but performance was not assessed against them. As a result, the ratings for this principle
were generally low (no evidence or poor) with only a very few departments
handling this principle well. The
average score overall of 1.7 ("poor/needs improvement") is the lowest of all
the principles. Of all the DPRs, only
ACOA, CCRA, Canada Economic Development for Quebec Regions and the National
Library were rated "excellent", with these same departments reporting their accomplishments
relative to what they had indicated in the RPP they expected to
accomplish. The Canadian Human Rights
Tribunal, Parks Canada and INAC also did a very good job of reporting
achievements against planned results.
But even in these cases, there was little to no consideration of
outstanding commitments from previous years.
Only planned results from the 2001-2002 RPP were considered.
There was
also very little indication of use of the performance information for
decision-making. Only the Canadian
Centre for Occupational Health and Safety, CED(Q), NRCan and the Office of the
Information Commissioner provided good information about how performance
information had been used to make decisions.
2.2.4 Set performance in context
Guideline
Principle 4 - Set performance in context:
This
criterion was scored according to the over-all judgment of the reviewer of the
clarity and completeness of the report's description of the departmental role
and operating environment. There were
four sub-criteria:
- Does the
context identify key internal and external risks and challenges involved in
delivering, or not delivering, strategic outcomes to Canadians?
- Does the
report refer to societal indicators to help the reader assess the department's
performance?
- Does the
report identify key strategic partnerships/horizontal initiatives as important
links in achieving strategic outcomes?
- Does the
report link to government priorities and provide a good understanding of the
significance of the department's outcomes in that context?
The context
sections of the 2001 DPRs were sometimes good and, in a few cases, excellent;
but most organizations had not done a very good job on this aspect of
performance reporting, preferring to get right down to what they themselves
were doing. This prevented the reader
from understanding the organization, and the organizations from presenting
themselves realistically as players in a complex field. In the 2002 reports however, the departments
did a much better job of setting the context for performance. The departments generally made a good
presentation of the risks and challenges they face, and many linked what they
were doing to priorities listed in the Speech from the Throne and identified
strategic partnerships. However, few
identified horizontal initiatives as an important factor in achieving their
outcomes, and very few used societal indicators to place their performance in a
societal context.
The overall
average score of 2.9 ("satisfactory") is the highest of all the principles. This is one area where the smaller
departments did as well as the larger departments. CIDA, Correctional Service Canada, National Archives, National
Energy Board, National Film Board, Parks Canada and Western Economic
Development all did an excellent job in explaining their strategic context and
seventeen other departments of the 87 were rated "very good" on this
principle.
The factors
considered in describing the department's performance context also improved,
and there were numerous very good examples of specific factors. A number of departments
- CIDA, Correctional
Service Canada, Fisheries and Oceans Canada, Law Commission of Canada, National
Archives, National Parole Board, Office of the Chief Electoral Officer, Parks
Canada, Statistics Canada and the Canadian Human Rights Tribunal - did an "excellent" job of discussing the
risks and challenges they faced. ACOA
and Transport Canada used social indicators in an outstanding way to provide a
broader context for their own performance.
A number of departments provided excellent information and discussions
on partnerships or horizontal initiatives, and a number related their
performance to government priorities in a way that was particularly useful:
three departments - ACOA, HRDC and Environment Canada - were notable in doing
both of these excellently.
2.2.5 Link resources to outcomes
Guideline
Principle 5 - Link resources to outcomes:
This
criterion was scored according to the over-all judgment of the reviewer of the
extent to which resources are linked to achieving individual strategic
outcomes, or (preferably) to the lower-level outcomes that contribute to a
strategic outcome. There were four
sub-criteria:
- Does the
report provide information on the amount and type of resources used, i.e. appropriations,
capital, revenues, human resources and partnerships, linked to the outcomes
achieved?
- Does the
report explain the reasons for significant changes to plans and resource levels
(where applicable) over the planning period?
- Does the
report make use of a crosswalk if no direct alignment between resources
expended and strategic outcomes can be made?
- Does the
department complete the required financial tables provided in the annex to the
guide? Is this done in a manner that is
understandable to the reader?
More departments reported their expenditures by strategic outcome this year than
last year, and many of those that did not included a crosswalk showing the
relationship between strategic outcomes and expenditures by business line. Most departments reported only on
expenditures, with some also providing information on FTEs by
strategic outcome or by business line.
Only DFAIT included FTEs, capital and other types of resources
in the discussion of resources by strategic outcome. In discussing changes between planned and
actual figures, some departments provided footnotes in the financial tables to give
a reason for a significant difference, but most did not even do that. Health Canada, Transport Canada, INAC and
the Immigration and Refugee Board provided exemplary explanations of changes
over the planning period. The Guide was quite clear that a discussion, not just
a table, was required to adequately explain the financial picture. The idea was to explain the relationship
between resources expended (of all kinds, not just the total budget allocation
and FTEs), and the outcomes achieved.
Only a few organizations came close to this: the OAG, Fisheries and
Oceans Canada and AAFC all provided excellent tables in an understandable way,
but only Transport Canada and Veterans Affairs Canada did an excellent job in
discussing the linkage of outcomes to resources.
However, the
2002 DPRs still show significant improvement in this area over last year. While the ratings on this principle in 2001
were mostly "poor" or "needs improvement", the ratings for 2002 are better
distributed across the range of scores, with the above departments having consistently
good ratings for this principle. The
average score of 2.3 is among the higher scores for the principles. This was an area where the smaller
departments performed as well as the larger ones.
2.2.6 Does this report provide factual,
independently verifiable performance information?
Guideline
Principle 6 - Explain why the public can have confidence in the methodology and
the data used to substantiate performance:
This
criterion was scored according to the over-all judgment of the reviewer of the
extent to which the report helps its reader interpret performance information
and assess its reliability (e.g. sources of information, statistical
reliability of data, etc.). There were
four sub-criteria:
- Does the
report substantiate performance by including historical or other comparison
information? (earlier periods, similar
organisations, etc.)
- Does the
report refer to findings of evaluations and audits to substantiate its
performance information?
- Does the
report provide information on the validity and credibility of the data used?
- Does the report provide web links allowing a reader to "drill down" for more detailed information?
The actual
measurement of performance is still in its infancy. In the 2001 DPRs, it was apparent that few organizations had
defined meaningful performance indicators and justified their validity, let
alone put systems in place to collect and analyze actual performance data. This situation is somewhat improved in 2002,
although there is still a great deal of progress to be made. While more departments are using web references
effectively to point the reader to additional information on a topic, there is
still little use of evaluation or audit findings or of historical or other
comparative information. Very few
departments provide the reader with information about the source and
reliability of the performance information.
While
ratings of "no evidence" and "poor" predominated for the 2001 DPRs, a few
departments addressed the principle in a satisfactory manner. In comparison, the 2002 reports show many
more departments addressing the principle (i.e. fewer "no evidence" ratings),
although a number are rated poor. The
overall average score of 1.9 ("needs improvement") is still fairly low,
relative to most other principles.
The OAG was rated very good or excellent on this principle. The National Energy Board, the National Research
Council, NRCan and the OAG used historical data in an excellent fashion, while
CED(Q), the Centre for Occupational Health and Safety, CIDA, Fisheries and Oceans
Canada, INAC, NRCan and the OAG all used evaluations and audits to substantiate
performance in an outstanding way. The
provision of information on the validity and credibility of the data was the
worst of the sub-criteria, with almost one-half of the departments providing no
evidence of this. However, CED(Q),
DIAND, Statistics Canada and Transport Canada did excellent work on this
question. And while many departments
offered at least one web address for more information, Citizenship and
Immigration Canada, CIDA, INAC, ND, Fisheries and Oceans Canada, Health Canada,
HRDC, Industry Canada, NRCan, Office of the Chief Electoral Officer, Transport
Canada, Treasury Board of Canada Secretariat, Tax Court of Canada and the
Transportation Safety Board used these web links in a directed way to help the
reader find specific information to support the performance discussed. The use of web addresses was by far the most
adhered-to sub-criterion for this principle.
3 Government-wide Initiatives
and Management Issues
The
assessment also included a series of specific questions concerning the report's
treatment of various government-wide
initiatives and management issues, namely:
- Sustainable
Development;
- Sustainable
Development in Government Operations;
- Social Union
Framework Agreement (SUFA);
- Service
Improvement Initiative;
- Government On-Line;
- Modern
Comptrollership and management practices;
- Human resources management issues;
- Management of grants and contributions and the Policy on Transfer Payments;
- Alternative Service Delivery and
Foundations.
Section 3 of the
Guide for 2002 Performance Reports
indicated clearly that these government-wide themes are part of a balanced
and coherent performance story. Each
organization was expected to comment on each of these themes, if
applicable. When appropriate, each
organization was expected to tell what it was doing, and how far it had gotten,
in incorporating these government-wide commitments within its mandate and
programs.
These
were assessed by reviewing each DPR against a checklist to determine the extent
to which a department addressed each theme, as appropriate. There was a great deal of variation by
department in the amount of information presented for relevant themes: some
provided excellent information and others provided only brief mentions. This aspect was difficult to assess since it
was not always possible to determine whether a particular theme was relevant to
a department, and therefore whether it should be reported on. However, it appeared that departments
generally made an effort to report on those which were significant to the
mandate of the department, or where the department had a specific initiative
underway (such as modern comptrollership).
Initiatives that were management-oriented, such as modern comptrollership
and human resource management, tended to be better reported than those not
internal to the department's operations, such as SUFA and sustainable
development.
4 Conclusions
Considerable
progress has been made in the Departmental Performance Reports of 2002,
compared to 2001. There is more
information in the performance reports this year, and it is generally of higher
quality. While telling a coherent and
integrated performance story focussed on outcomes and relative to past planned
results is still a challenge for many departments, progress is being made, and
most departments have improved in this respect over last year.
Perhaps the
biggest impediment to good performance reporting is the lack of clear,
concrete, outcome-oriented and systematic planned results against which to report
performance. Planned results help drive
the collection of performance information, and keep it focussed towards the
strategic outcomes. Few departments
make the connections with the information included in their RPPs and there is
little evidence of planned results in most of the DPRs. As a result, there is no standard against
which the reader can determine whether the results reported are appropriate or
sufficient.
The DPRs are to report performance "as of
March 31, 200x". Yet, many departments appear to view the DPR as an annual
report, in which the department accounts for what it did or accomplished in the
past fiscal year. Thus, the focus is on
what was done or initiated in that year, while the outcomes that should be
reported on may have started a number of years earlier.
Many departments are now developing or
revamping their performance frameworks.
As these are developed and implemented, the performance reporting should
continue to improve. The progress over
the past year suggests that performance reporting is gaining momentum.
The
treatment of government-wide reporting is rather mixed: some departments are meticulous in reporting
on government-wide initiatives and commitments, while others are less so. But the reader has no way of knowing whether
a given initiative is adequately or appropriately addressed by a particular
department.
Appendix A: Figures
Figure 1
This chart shows the distribution of all scores for all
principles and their component factors. The total across all five categories for each year is 100%.
The 5-point rating scale (not including "Not
applicable" or "Not there"/"No Evidence") for the 2002 reports was collapsed to
conform to the 3-point scale ("Poor", "Satisfactory", "Excellent") used to
assess the 2001 reports, for purposes of comparison.
Figure 2
The scores shown in the chart are
the overall scores for each principle, averaged across all departments.