Assessment of 2001 Departmental Performance Reports - Summary Report

Treasury Board of Canada Secretariat
Results Based Management Division
Comptrollership Branch
2002-04-22

Objective of Report:

The review and analysis of the 84 Departmental Performance Reports (DPRs) for FY 2000-2001 had three goals:

  • to assess the effectiveness of the renewed guidance;
  • to provide a summary report on the status of DPRs with respect to each of the Principles; and
  • to identify good practices to support the overall improvement of Departmental Performance Reports.

The reports were read and rated on the six principles outlined in the TBS 2001 Performance Reports Guide. The assessment was done according to a list of criteria, conforming to the six principles and their sub-components, using a scoring system from 1 (meaning no evidence, or not found) to 4 (meaning excellent). The assessment was carried out by an independent third party contracted by TBS.
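To make the tabulation concrete, the sketch below is a minimal, hypothetical Python example: the department names and scores are invented, and the four score bands are assumed to map onto the 1-to-4 scale described above. It simply shows how per-principle ratings could be rolled up into the kind of distribution presented later in this report.

    from collections import Counter

    # Assumed mapping of the 1-4 scale onto the four score bands used in this report.
    LABELS = {1: "No Evidence", 2: "Poor", 3: "Satisfactory", 4: "Excellent"}

    # Hypothetical ratings: one 1-4 score per principle for each report.
    # The actual assessment covered 84 reports and six principles.
    ratings = {
        "Dept A": {"Tell Perf. Story": 3, "Outcome Focus": 2},
        "Dept B": {"Tell Perf. Story": 1, "Outcome Focus": 4},
        "Dept C": {"Tell Perf. Story": 2, "Outcome Focus": 2},
    }

    def distribution_by_principle(ratings):
        """Count how many departments fall into each score band, per principle."""
        tallies = {}
        for scores in ratings.values():
            for principle, score in scores.items():
                tallies.setdefault(principle, Counter())[LABELS[score]] += 1
        return tallies

    for principle, counts in distribution_by_principle(ratings).items():
        print(principle, dict(counts))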

The DPR review - focus on learning:

The review and analysis of the Departmental Performance Reports is being carried out primarily for learning purposes and the gathering of good practices, rather than for auditing or recognition. Hence, an overall score or ranking of DPRs has not been undertaken. The overall objective of this review is to support the collective development of DPRs. This report focuses on the individual principles and their component parts and provides practical feedback in these areas. It must be acknowledged that the guidance for the preparation of DPRs was significantly revised last year and released late in the DPR preparation cycle. Although many of the concepts and requirements around the development of DPRs remained the same, this renewal of the guidance, combined with its late introduction, made it difficult for some organisations to fully integrate many of the principles into this year's DPRs.

Findings:

Overall, it was found that there were instances of good practice in most of the DPR principles. However, departments and agencies were significantly challenged to fully integrate all six of the DPR principles into their reports. No single report can be held up as an overall model or good practice - every department has something to improve in its performance reporting.

DPR Strengths:

  • Most departments have adopted the concept of "strategic outcome" as an important part of their DPR, although the application of this concept was uneven.
  • Focusing on outcomes, not outputs, was the highest-rated principle for departments and agencies: 24 of the 84 entities scored satisfactory or better in their focus on outcomes. However, this finding is countered by the finding that 31 organisations did not focus on outcomes at all.
  • DPRs were generally shorter and more reader friendly than they have been in previous years.

DPR Challenges:

Areas departments and agencies found the most challenging included:

  • costing of outcomes (linking resources to results);
  • strengthening the context;
  • making a logical connection between activities and strategic outcomes; and
  • defining clear and concrete strategic outcomes and planned results based on commitments included in previous RPPs.

Distribution of Overall Scores by Principle

Number of departments with each score, by principle:

Principle                            No Evidence  Poor  Satisfactory  Excellent
Tell Perf. Story                              35    35            12          2
Outcome Focus                                 31    29            21          3
Report on Commitments                         34    38            10          2
Context                                       25    37            18          4
Resources to Results                          50    22            12          0
Validity of Performance Indicators            55    20             9          0

General Findings:

The following provides a brief summary of thematic findings which apply to all departments and agencies. Findings for each of the six principles of public performance reporting found in the DPR guidance have also been included in an annex.

(1) Shared challenge:

Analysis of the review findings identified no specific departmental attributes associated with particular problems or strengths in reporting performance. For example, economic departments do not measure outcomes better than social ones; big departments don’t report performance better than small ones; single-mandate departments don't provide more focussed performance information than departments with multiple business lines.

What emerged clearly are critical dimensions and issues of performance that differ from one organization to another, depending on the nature of the organization. The emphasis in departmental performance reports should vary accordingly. For example, client satisfaction reporting varies among departments both because of differences in clients, and differences in the concept of satisfaction. As well, notions and implications of risk and risk management articulated in the context section have a different significance for different departments.

A good performance report will reflect the uniqueness of an organization, and will provide different emphases on different aspects of performance, appropriate to that organization. It must discuss the indicators and measures by which the organization judges itself, and will be judged by others - notably by Parliamentarians, and by the Canadian public.

 

(2) Make the logical linkages:

Provide logical and plausible linkages between what an organisation is trying to achieve and the outcomes to which it is contributing. Several departments were able to deal effectively with this sort of attribution through discussion or by providing a diagram to help illustrate a clear linkage between their activities and outcomes. A logic chain on its own is not sufficient, however. It must be accompanied by performance indicators to help demonstrate impact. Considerable discussion may be needed to explain the logic by which a certain program, or group of programs, is linked to strategic outcomes and why it is believed that the associated measures are useful indicators of performance.
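As an illustration only, the sketch below represents such a results chain as a simple data structure, with each level linked to the outcome it contributes to and carrying the indicators used to demonstrate impact. The activities, outcomes and indicators are hypothetical and are not drawn from any department's report.

    # A minimal, hypothetical results chain: each level links to the level it
    # contributes to, and carries the indicators used to demonstrate impact.
    results_chain = [
        {"level": "activity", "name": "Deliver inspection program",
         "contributes_to": "Inspections completed"},
        {"level": "output", "name": "Inspections completed",
         "contributes_to": "Improved industry compliance",
         "indicators": ["number of inspections per year"]},
        {"level": "intermediate outcome", "name": "Improved industry compliance",
         "contributes_to": "Safer workplaces for Canadians",
         "indicators": ["compliance rate among inspected sites"]},
        {"level": "strategic outcome", "name": "Safer workplaces for Canadians",
         "contributes_to": None,
         "indicators": ["workplace injury rate (societal indicator)"]},
    ]

    def trace(chain, start):
        """Print the plausible linkage from an activity up to the strategic outcome."""
        by_name = {link["name"]: link for link in chain}
        node = by_name[start]
        while node is not None:
            print(f'{node["level"]}: {node["name"]}  '
                  f'indicators: {node.get("indicators", [])}')
            node = by_name.get(node["contributes_to"])

    trace(results_chain, "Deliver inspection program")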

 

(3) Building performance information:

Based on the content of the 2001 DPRs, many organizations need to begin to define meaningful performance indicators as well as put systems in place to collect and analyse actual performance data. Indicators need not be quantitative, but must be such that the needed information can be gathered and compared from year to year. The indicators chosen must be stable over a period of time, so that corresponding information can be collected and tabulated as a meaningful time series.

Efforts should also be made to develop appropriate indicators for the broader societal outcomes. Where performance indicators for strategic outcomes cannot be defined or measured, departments should consider reporting significant achievements as immediate outcomes, provided their contribution toward commitments is clearly demonstrated.
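A small illustrative sketch of the year-over-year comparison that stable indicators make possible: the indicator name and figures below are invented, and the point is simply that keeping the same indicator over time yields a comparable series.

    # Hypothetical values for one stable performance indicator.
    # The indicator name and figures are invented for illustration only.
    indicator = "client satisfaction (% satisfied or very satisfied)"
    series = {1999: 71.0, 2000: 73.5, 2001: 76.0}

    def year_over_year_change(series):
        """Change between consecutive years for a stable indicator."""
        years = sorted(series)
        return {year: round(series[year] - series[prev], 1)
                for prev, year in zip(years, years[1:])}

    print(indicator)
    print(year_over_year_change(series))  # {2000: 2.5, 2001: 2.5}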

(4) Linking resources to results:

A few organizations provided numbers for the financial resources expended on outcomes. Some did discuss partnerships in connection with specific outcomes. However, scarcely any included FTEs, capital, or any of the other resources mentioned. The general reason given by departments for not providing information on resources expended to achieve outcomes is that their financial systems do not capture or provide information in that way. However, since resources are allocated to programs linked to a particular strategic outcome, this should at least provide the basis for a notional linkage of resources to these outcomes.
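To illustrate the notional linkage described above, the sketch below rolls hypothetical program spending up to the strategic outcomes those programs support; all program names, outcome names and dollar figures are invented.

    # Hypothetical program spending and program-to-outcome mapping;
    # all names and figures are invented for illustration.
    program_spending = {            # actual spending by program, $ millions
        "Program A": 120.0,
        "Program B": 45.5,
        "Program C": 80.0,
    }
    program_to_outcome = {          # each program linked to one strategic outcome
        "Program A": "Strategic outcome 1",
        "Program B": "Strategic outcome 1",
        "Program C": "Strategic outcome 2",
    }

    def notional_allocation(spending, mapping):
        """Roll program spending up to the strategic outcomes the programs support."""
        totals = {}
        for program, amount in spending.items():
            outcome = mapping[program]
            totals[outcome] = totals.get(outcome, 0.0) + amount
        return totals

    print(notional_allocation(program_spending, program_to_outcome))
    # {'Strategic outcome 1': 165.5, 'Strategic outcome 2': 80.0}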

In discussing changes between planned and actual figures, some departments provided footnotes in the financial tables to give a reason for a significant difference. Most did not even do that. Yet the Guide was quite clear: What was wanted was a discussion, not just a table. The idea was to explain the relationship between resources expended (of all kinds, not just appropriations and FTEs), and the results and outcomes achieved.

An objection sometimes raised here is that financial reporting against outcomes is not always appropriate, where some resources contribute to a number of outcomes. In such a case, it is likely that the outcomes are defined at too low a level, and that these are immediate or intermediate rather than strategic outcomes.

(5) Reporting on Outcomes:

The concepts of outcome and strategic outcome need to be carefully distinguished by departments. Where the strategic outcomes are very general, such as economic growth (through research) or a harmoniously functioning society (through tribunals), work should be undertaken to focus on narrower conceptions of outcomes and explain how they, in turn, contribute to broader strategic outcomes.

For example, a tribunal that regulates competition or trade should work towards strategic outcomes that contribute to harmonious or competitive trade relationships, rather than the overall well-being of society. Research in the agricultural sector might contribute to economic growth in the agriculture and agri-food sector, or even to a particular part of the sector, rather than economic growth in Canada as a whole. Similarly, the strategic outcomes of government-oriented departments, such as the PCO, should be linked to the broader public benefits that result from their work, such as a decreased regulatory and administrative burden on the public.

While the linkage between the more focused strategic outcomes (growth in the agricultural sector) and the broader ones (economic growth) should in all cases be made, performance measurement and performance reporting should take place at the narrower level.

The review of the DPR guidance also suggests that certain wording may have caused confusion about how to define a strategic outcome. The Guide asks departments to state strategic outcomes that are public goods and that an organization can directly provide. However, the long-term and horizontal focus would render it almost impossible for organizations to provide a strategic outcome on their own. Thus, the Guide is being updated to reflect the shared or contributory nature of strategic outcomes. Departments are encouraged to utilize the Performance Reporting lexicon that was developed last year and included with the DPR guidance to help refine their strategic outcomes.

(6) Strengthen the Context:

The DPR assessment found that, on the whole, the strategic context section for most DPRs did not provide the reader with a sound understanding of the environment that the organization had been working within during the planning period. Since the performance of an organization cannot be explained without reference to this environment, a good performance report must include an appropriate context section. A good context section should include a brief overview of the organization (i.e., mission or vision); an environmental scan highlighting relevant statistics and societal-level indicators; references to the strategic outcomes and how they are linked to government priorities; important horizontal linkages to key stakeholders; and the key risks involved in delivering - or not delivering - outcomes to Canadians.

Departments are encouraged to utilize appropriate qualitative and quantitative sources to construct an informative strategic context. Using statistical sources at the societal-level helps to provide a context for situating government program performance. For example, programs affecting health can be placed in the context of societal indicators measuring the overall health status of Canadians.

One way in which departments can set this context is by linking their performance to the government-wide report on performance, Canada’s Performance 2001 (http://www.tbs-sct.gc.ca/report_e.html). The report presents information on a set of 19 societal indicators that have been grouped according to four main themes:

  • economic opportunities and innovation in Canada;
  • the health of Canadians;
  • the Canadian environment; and
  • the strength of Canadian communities.

The results achieved by the department toward each of its strategic outcomes should, as much as possible, be situated and aligned in relation to the societal outcomes and indicators used in this government-wide report.

Annex 1:

Findings by Principle:

Principle 1 - Tell a Coherent Performance Story:

The cornerstone of developing an effective performance report is to develop a coherent and balanced performance story. A good performance story addresses performance shortcomings as well as successes. Further, it allows readers to find more detailed information if required, through electronic links, reports or annexes.

Lessons Learned:

The 2000-2001 DPRs generally did not tell a coherent and logically linked performance story. The review suggested that in many cases, reports used the terminology of results-based management and performance reporting without applying the performance paradigm. In particular, only a few departments wrote frankly about shortcomings, indicating corrective action to be taken.

Good Practice:

Canada Economic Development for Quebec Regions

Why?:

The report focuses on outcomes, acknowledges shortcomings, and makes a significant attempt to measure performance.

Principle 2 (Focus on Strategic Outcomes):

The distinction between outcomes and outputs (see lexicon) is at the core of results-based management.1

Lessons Learned:

Few of the reports actually defined and focused on true strategic outcomes. Although most DPRs define one or more strategic outcome(s) (sometimes called business lines, objectives, etc.), many of these could be classified as outputs, as they were produced directly by the department and were focused on activities under the direct control of the organization. To a large extent, this inconsistent definition and use of strategic outcomes could be attributed to the fact that the concept of strategic outcomes was introduced only last year and will take time to be fully understood by departments.

The discussions of how organizational achievements contribute to longer-term outcomes are also inconsistent. Departments that were better able to demonstrate their contribution used a discussion, a graphic logic chart or a results chain to show the linkage between activities, outputs and immediate, intermediate and longer-term or strategic outcomes.

Discussions of how the organization uses performance information to learn and adapt its efforts also needed improvement; some reports left the item out altogether.

Good Practice:

Office of the Auditor General of Canada

Why?:

Use of a logic model or results chain to help link achievements to long-term outcomes.

Good Practice:

RCMP

Why?:

Good explanation of changes that were made as a result of meaningful performance information.

Principle 3 (Report against Outstanding Commitments):

In the federal context, effective performance reporting requires that the performance story be readily comparable with commitments framed in RPPs going back over at least a 3-year period and, in many cases, for much longer. Performance needs to be measured against such long-term commitments, with an indication of how much progress has been made as of a particular point in time. But this also requires that the commitments themselves be tangible, clear and concrete.

Lessons Learned

Overall, departments did a much better job of using the outcomes and commitments identified in the previous RPP as a basis for reporting than they did of associating their performance with these commitments. Typically, the chart of commitments and outcomes from the RPP was used to organize the report, but the report often fell short of comparing performance to these commitments. Only a few departments associated their performance with commitments from earlier years' RPPs, even though it would be reasonable to expect reporting on outcomes committed to several years ago.

Good Practice:

Treasury Board Secretariat

Why?:

Used the organization's strategic outcomes as the basis of discussion and reported progress towards outstanding commitments from previous years.

Principle 4 (Explain Strategic Context):

Organizations exist and function in an environment of clients and stakeholders, along with other organizations who are pursuing overlapping and/or competing goals of their own. Since the performance of an organization cannot be judged without reference to this environment, a good performance report must include an appropriate context section, which could include discussion of how risk was dealt with over the reporting period or any significant environmental changes. A department often must address societal issues that are complex and often beyond its control. Societal indicators offer some idea of the magnitude of the issues being tackled by the department.

Lessons Learned:

The reports generally leave a lot to be desired in terms of setting the context for performance. Information on risks and challenges and on strategic partnerships is a key part of the information needed to manage and assess the department and its programs. In addition, societal indicators were absent from most reports. Such information is often readily available through sources such as Statistics Canada, and helps provide credible context for Departmental Performance Reports. For many organizations, the lack of societal indicators is correlated with a general lack of focus on final outcomes (see lexicon) that would be measured by such indicators.

Good Practice:

Correctional Services of Canada

Why?:

Made good use of societal indicators in discussing the demographics of Canada's prison population and the resulting challenges.

Principle 5 (Link Outcomes to Resources Expended):

One major function of a performance report is to inform the allocation of public resources amongst the competing aims of government. To do this, it is not sufficient to report on what various activities cost in financial, human and other resources. The report should at least estimate - if no hard data can be provided - how resources were allocated amongst the department’s strategic outcomes. The overall objective is to explain the relationship between resources expended (of all kinds, not just the total budget allocation) and the results and outcomes achieved.

Lessons Learned:

It was recognized before this assessment exercise started that, as of yet, very few organizations have in place financial information systems that can provide the necessary data to identify the cost of achieving outcomes. That being said, few organizations attempted even to estimate the breakdown of resource allocation amongst their strategic and/or major intermediate outcomes. In discussing changes between planned and actual (variance) figures, some departments provided footnotes in the financial tables to give a reason for a significant difference, but most did not.

Good Practice:

Public Service Commission

Why?:

Presentation of financial information by "objective", clearly allocating resources directly to the strategic outcomes.

Good Practice:

Justice Canada

Why?:

Presented clear and understandable reasons for changes between planned and actual resources.

Principle 6 (Demonstrate the Validity of Performance Information):

A good performance report is based on factual performance information, presented in such a way that a reader can easily verify it and collect further information if desired.

Lessons Learned:

Almost two-thirds of the departments provided no evidence pursuant to the overall principle of providing factual, independently verifiable data. When data was included, it was often provided without interpretation or discussion of the organization’s role in attaining these results. Provision of historical or comparable data to substantiate their performance story was also a challenge for departments, with just over half the departments providing no evidence.

The concept of attribution was not an issue for some of the smaller departments with a single mandate. In these cases, the attribution or contribution was obvious. However, for the remaining departments, many reports left it to the reader to determine how organizations make a contribution to strategic outcomes. Several departments were able to deal effectively with attribution by using discussion or diagrammatic logic charts to help illustrate clear linkages between their activities and longer-term outcomes.

Good Practice:

Canada Customs and Revenue Agency

Why?:

Indicated the reliability of data used for performance reporting.

Good Practice:

Office of the Auditor General

Why?:

Discusses attribution through the use of logic charts to illustrate plausible contribution.

Horizontal Themes and Management Issues:

In its DPR, each organization was expected to comment on the following horizontal themes:

  • sustainable development for the economy at large;
  • the Social Union Framework Agreement (SUFA);
  • tracking of client/stakeholder satisfaction;
  • the government-on-line (GOL) initiative;
  • implementation of modern comptrollership and management practices;
  • human resources management issues in the delivery of outcomes;
  • management of grants and contributions for intended strategic outcomes.

The 2001 guidance asked that, to the greatest extent possible, departments incorporate these government-wide horizontal commitments within their overall performance story where applicable. Reporting in horizontal areas was uneven, perhaps because the Guidelines themselves were not as clear as they might have been.

Annex 2:

Details on Good Practices

1. Coherent, balanced picture of performance information?

Only Canada Economic Development for Quebec Regions (CEDQR) received a good rating for this principle, because the focus on outcomes was very strong in this report, and real attempts to measure performance were visible (pages 11-18).

2. Focuses on outcomes that benefit Canadians and Canadian society

For principle 2, the strongest showings were made by the Office of the Chief Electoral Officer (CEO) and by the Office of the Auditor General (OAG).

For CEO, excellent strategic outcomes are given (page 3), and these are well followed up in Section III, (starting on page 5). Although no performance data is given, some good performance indicators are mentioned (page 6), but more effort in this area is needed. Finally, the report effectively demonstrated (see bottom of page 3) that this organization makes use of performance measurement information to learn and improve.

The OAG also did very well in identifying strategic outcomes and presenting its achievements as progress toward outcomes (page 12). However, it did less well in explaining how achievements contribute to longer-term outcomes by focusing too much on its impact on government operations, and not enough on the critical strategic outcomes of honest and accountable government and public confidence. On the other hand, it did extremely well in demonstrating its good use of performance measurement information to learn and improve.

3. Associates performance with earlier commitments and explains changes to planned results?

Only the Canadian Centre for Occupational Health and Safety (CCOHS) and Treasury Board Secretariat (TBS) were rated well for this principle. The CCOHS DPR not only followed up on outcomes and commitments from last year’s and previous RPPs, but actually made use of performance indicators that had been identified in the RPP (page 10).

4. Explains department's role and operating environment (context)?

Of the principles identified as important for good performance reporting, this one was probably easiest to satisfy as it required no special data or grasp of the performance paradigm, but simply an ability and willingness to write frankly about the strategic realities of the department. By comparison with the other principles, it probably fared best in the 2001 reports. The following departments did well:

  • Canadian Human Rights Tribunal (CHRT);
  • Correctional Services Canada (CSC);
  • Office of the Chief Electoral Officer (CEO);
  • Public Service Commission (PSC).

In regard to departmental context, the two strongest DPRs were those of Correctional Services Canada (CSC) and the Public Service Commission (PSC). The CSC report gave a very thorough discussion of the demographics of Canada’s correctional populations, and the risks and challenges in coping with them. It also provided an excellent discussion (see especially pp. 47-53) of its roles with respect to various strategic partners. A good deal of societal data is presented in chart or table form.2 Fewer indicators, collected in one place and given appropriate interpretation, would have been even better.

The PSC report also gave a very good discussion of risks and challenges (see page 13), and made some use of societal indicators. An excellent discussion of key partners and clients is given on page 12.

CEO gave a very good discussion of risks and challenges on pp. 3-4, and of partners, also on page 4. The main societal indicator used is the voter participation rate on page 7. This could perhaps be broken down in several ways, and explored further. In particular, it would be good to know more about CEO’s role in ensuring a good turn-out and a fair election.

CHRT gave a good discussion of risks and challenges on pp. 5-6 and 18-19. Unfortunately, no indicators were used to measure the long-term, societal effects of the evolving body of case law in the field of pay equity and in other areas.  Strategic partnerships were not discussed; but then, for a tribunal, autonomy and independence are of the essence, so that it may be inappropriate to speak of "partnerships". A discussion of role vis-a-vis other jurisdictions and institutions would have been appropriate and welcome.

5. Links outcomes achieved with resources expended?

The following departments did moderately well on this principle:

  • Canadian Space Agency (CdnSA);
  • Citizenship and Immigration Canada (CIC);
  • Human Resources Development Canada (HRDC);
  • Millennium Bureau of Canada (MBC);
  • National Archives of Canada (NA);
  • Northern Pipeline Agency Canada (NPA);
  • Office of the Correctional Investigator (OCI);
  • Offices of the Information and Privacy Commissioners (OIPC);
  • Public Service Commission (PSC);
  • Status of Women Canada (SWC);
  • Transport Canada (TC);
  • Veterans Affairs Canada (VAC);
  • Department of Justice (Jus).

With regard to principle 5, none of these reports can really be taken as exemplary. They have been acknowledged because they went one step beyond the bare minimum of showing financial expenditures by strategic outcome.

The Millennium Bureau did well in this area, with a very detailed and clear explanation of how its budget was spent, and why. In CIC’s report, only planned and actual spending and authorities are provided, but see page 6 for a good explanation of expenditure variances. In the National Archives of Canada (NA) DPR, there are good discussions of changes on pp. 36 and 38-39. In the Department of Justice DPR, there is a discussion of changes on page 41.

6. Provides factual, independently verifiable performance information?

As with principle 5, none of the reports can really be taken as exemplary, because none provides solid information showing the trends in relevant performance indicators. To do well on principle 6, a department would have to:

  • define outcomes and planned results appropriate to its mission;
  • define and justify appropriate performance indicators showing changes and trends relevant to the outcomes and planned results;
  • establish good systems to collect and analyze information regarding the chosen performance indicators; and
  • tabulate the information collected in a meaningful fashion and provide a balanced interpretation of the changes and trends observed.

However, the following departments have made progress in this area:

  • Canadian Center for Occupational Health and Safety (CCOHS);
  • National Parole Board (NPB);
  • National Research Council (NRC);
  • Office of the Auditor General of Canada (OAG);
  • Tax Court of Canada (TCC);
  • Canada Customs and Revenue Agency (CCRA);
  • Transport Canada (TC).

The CCOHS report provides some historical comparison (page 7) and makes reference to an independent study on its website (http://www.ccohs.ca).

The OAG’s report is weak on historical information, but strong on attribution, credibility of data and balance in its reporting of successes and failures. For example, see the table on page 13, and the note below it. On attribution, see Exhibit 6 on pp. 15-16. Both successes and failures are reported. On credibility of data, see pp. 18 and 47.

TC’s report provides some good historical data (pages 8 and 16), and gives its sources for that data, but is weak on the issue of attribution.

CCRA’s DPR includes an independent assessment of performance information conducted by the Auditor General of Canada (pp. 65-68 of Section 1).

1 The 2001-2002 DPR Guide asks departments to state strategic outcomes, which are public goods that an organization can directly provide or contribute to.

2 See, for example, pp. 19, 22, 23, 25, 28, 31, 33.