CAMBRIDGE PAPER
Redefining Government Performance
Presented July 16, 1998

K. Ogata and R. Goodkey
Performance Measurement
Alberta Finance (formerly Alberta Treasury), Canada

Abstract

In 1993, the Province of Alberta began developing a comprehensive business planning and performance measurement system, and is now considered a leader in this field in Canada. This paper examines theoretical models and practical experience in designing performance measurement systems, and compares Alberta's approach with three other prominent North American public sector models (Oregon, Minnesota and Florida). Drawing upon the experiences of Alberta and these other jurisdictions, the paper discusses implications for the development and advancement of public sector performance measurement.

Introduction:

Performance measurement and quality management principles (e.g. reengineering) are in vogue with North American governments following the release of Reinventing Government (Osborne and Gaebler, 1992). Many jurisdictions now have, or are striving to implement, performance measurement systems. Among U.S. state governments, some of the older, more prominent systems are: Oregon Benchmarks, Minnesota Milestones, Florida Benchmarks, Texas Tomorrow, and Utah Tomorrow. Performance measurement is also becoming popular in Canada. Alberta’s system is the oldest, followed by New Brunswick, Nova Scotia, and the federal government. This paper will describe Alberta’s experience adopting performance measurement, compared to that of Oregon, Minnesota and Florida.

Most jurisdictions learn from the experience of their predecessors and adopt elements of others' systems; however, as with benchmarking initiatives, situation-specific environmental, systemic and cultural factors prevent mere duplication. The fundamental tenets and underlying assumptions of existing systems must be examined before their elements can be adapted to a new environmental context. By comparing jurisdictions' implementation experiences, this paper seeks to identify critical success factors, variances between theory and practice, and implications for practitioners.

Methodology:

A literature review on the design of performance measurement systems and public sector reform was conducted for this paper. Based upon this review, a framework of key determinants of success for performance measurement was developed, incorporating elements of both theory and best practice. Practitioners in Oregon, Minnesota and Florida were interviewed, and copies of their publications and reports were reviewed. Using this framework, Alberta's approach was compared with those of Oregon, Minnesota and Florida to identify similarities and differences, and their implications.

Theory and Best Practices:

Earlier performance measurement systems were an extension of an organization's accounting systems, usually intended to function as cost control mechanisms. These accounting-based systems prompted governments to focus on program efficiency, budget utilization and level of activity, rather than the achievement of policy objectives. This often resulted in punishing poor performance rather than learning from experience, and focused managers' attention on process efficiency rather than effectiveness in achieving policy outcomes.

Economic indicators have also traditionally been used to assess the economic "state of the state". Strong economic growth, low inflation and low unemployment were regarded as indicative of a healthy economic climate and believed to result in prosperity for citizens. However, citizens have become increasingly concerned about their relative quality of life, expressed in terms such as quality of education and health care, availability of recreational and cultural opportunities, a clean environment, and safety from crime. Accounting- and economics-based measurement systems were not designed to address these issues; thus, governments have introduced new systems for measuring progress, including policy outcome-based performance measurement.

To provide a framework for comparing the four jurisdictions' performance measurement systems, we have compiled a list of key system design factors (see Table 1). This list was synthesized from principles advocated by academics and practitioners (Osborne and Gaebler, 1992; Hatry, 1994; Kravchuk and Schack, 1996), and from findings of best practice studies (OECD, 1997; National Performance Review, 1997; Epstein and Olsen, 1997). The presence or absence of these factors may advance or impede system development, and enhance or limit system capabilities.

Table 1 - System Design Elements

Environment
  Political climate (public)
    • Best to have a climate for urgent action (crisis).
    • Public/stakeholder demands for increased accountability.
  Leadership
    • Top level support, including a "political" champion for the process.

Framework (system architecture)
  Vision
    • System designed to provide information:
      1. to improve program performance.
      2. to improve planning and decision making (including resource allocation).
      3. to improve accountability.
  Strategic planning
    • Define mission, goals and strategies.
    • Measurement is part of a larger planning process (to assess progress towards goals).
    • Define the logic chain of how strategies will influence outcomes and thereby achieve goals.
  Responsibility and accountability
    • Define parties responsible for specific outcomes.
    • Provide linkage between delivery agents and achievement of results.
    • Involve program experts in design to facilitate buy-in.

Culture
  Client-centred service delivery
    • Consult with customers/stakeholders.
    • Desired outcomes are consistent with customer needs.
    • Report information in user-friendly terms.
  High performing organization
    • Focus is on results and enhancing performance, not punishment.
    • Information is used to facilitate planning and resource allocation, and supports the decision making process.
    • Data must be analyzed/interpreted to identify required action.

The Alberta Experience:

In 1992, Alberta's future prospects seemed dim. The 1992-93 deficit was $3.4 billion (on a budget of $16.8 billion), and economic growth was sluggish due to weak oil and gas prices. The province's bleak fiscal and economic situation and growing public distrust of government provided a favourable "crisis" climate for change. The new government, led by Premier Ralph Klein and Provincial Treasurer Jim Dinning, seized upon this crisis and instituted sweeping reforms: three-year business plans, a 20% reduction in government spending, privatization and downsizing of government services, public accountability for community-based delivery agents, and outcome-based performance measurement. Performance measurement was thus just one of several reforms intended to make government more open and accountable to the public. By 1996-97, the Alberta government enjoyed a budget surplus of $2.5 billion (on a budget of $14.5 billion), and Alberta's economy grew by 6.6% in 1996.

The 1993 Budget launched the process of three-year business planning for the government and each ministry. In 1994, development began on a government-wide performance measurement system to monitor progress toward the goals stated in government and ministry business plans. In June 1995, the first annual Measuring Up report (Government of Alberta, 1995) was released containing 22 core government measures under 18 goals. Business planning and performance measurement were co-ordinated by the Treasury department as part of the regular budget process, which provided the authority to "reinforce" ministry cooperation.

Although the Premier and Provincial Treasurer advocated these reforms, there was less need to garner political support: Alberta follows the parliamentary system, and the ruling party held a majority. Parliamentary tradition also facilitated the definition and reinforcement of new and existing accountability relationships, which were extended from Cabinet through ministries to community-based delivery agents such as school boards and regional health authorities.

A parallel reform was developing a "results-oriented" culture within government. Spending money was no longer sufficient evidence of results. Ministries were encouraged to demonstrate how their programs and strategies were effective in producing desired or intended results. As stated by Osborne and Gaebler in Reinventing Government, "...a perfectly executed process is a waste of time and money if it fails to achieve the outcomes desired." Thus, government’s vocabulary shifted from the number of hospital beds and funding per student (inputs) to the quality of health care, student achievement, and a lower crime rate (outcomes). Balanced budget and government accountability legislation were enacted in 1995; however, legislation served to reinforce political commitment rather than effect change. Sufficient cultural change has occurred that the performance measurement process has survived the loss of its original political champion.

The public accountability focus has shaped the content, presentation and weighting of measures. Measuring Up was intended to be a user-friendly document, distilling sometimes complex policy issues into everyday language. Thus, graphical presentation of data has been used wherever possible. In addition to several economic measures, Measuring Up included several measures of government performance, reflecting an emphasis on public accountability.

Overview of Oregon, Minnesota and Florida:

In 1989, Governor Neil Goldschmidt unveiled Oregon Shines (Oregon Department of Economic Development, 1989), an economic development strategy designed to guide a shift from the state’s traditional resource-based economy to a new information based economy. Oregon Shines addressed not only economic diversification, but also the enhancement of citizens’ quality of life through a "circle of prosperity". The circle envisaged a diversified economy that created opportunities for Oregon citizens while protecting the environment and supporting quality public services, thereby attracting new businesses.

The Oregon Benchmarks (Oregon Progress Board, 1994) was developed to monitor the state's progress towards the goals identified in Oregon Shines. The 1990 Oregon Benchmarks contained 160 measures, a number that later grew to 259. Both the Oregon Benchmarks and Oregon Shines have been administered by the Oregon Progress Board, an independent agency created by the Legislature in 1989; the Board and its mandate have survived two changes in leadership. Current Governor John Kitzhaber prompted the development of Oregon Shines II (Oregon Progress Board, 1997), which significantly reduced the number of measures to 92 to focus attention on priority areas.

The weighting of measures in Oregon Benchmarks strongly reflects its roots in the Oregon Shines economic development strategy. Measures of economic growth, balanced by protecting the environment, top the new Oregon Benchmarks list, followed closely by knowledge/education measures, reflecting the importance of a highly skilled workforce.

Minnesota Milestones (Minnesota Planning, 1992) was initiated in 1991 at the direction of Governor Arne Carlson. The Governor was inspired by the Oregon Benchmarks and charged Minnesota Planning, the state's strategic planning agency, with developing a similar report. Minnesota started with an economic agenda similar to Oregon's, though with less of a "crisis" climate for change. Minnesota Milestones was intended to provide information to improve planning and decision making, as well as to enhance public accountability.

Minnesota Milestones was first published in 1992 and contained 79 indicators under 20 goals. Minnesota Planning recently revised the list down to 73 measures under 20 goals (Minnesota Planning, 1998). The weighting of Minnesota's measures is similar to the other jurisdictions', although striking in its lack of global economic measures; instead, Minnesota Milestones uses personal income to measure the effects of economic growth upon citizens.

The Florida Benchmarks (Florida Commission on Government Accountability to the People, 1996) was initiated in 1993 by Governor Lawton Chiles. It was developed as a report card on the "state of the state" in response to low public trust in government. As in Minnesota, there was no strong "crisis" situation driving the process; however, strong top level political leadership was present, as the Governor created the Florida Commission on Government Accountability to the People by executive order. Unfortunately, Florida also illustrates the potential effect of losing a political champion: with the Governor scheduled to complete his final term this year, a legislative committee has voted to eliminate funding for the Commission.

The 1996 Florida Benchmarks contained 270 measures, including 32 critical benchmarks. A second set of 60 critical benchmarks has since been released (Florida Commission on Government Accountability to the People, 1997). Although Florida Benchmarks was created in response to low public trust in government, acceptance has been limited. Within government though, performance measurement has garnered sufficient support that efforts are being made to implement a performance based budgeting system. This would represent a significant achievement in getting performance information used in the planning and decision making process.

Comparison of Approaches:

Examining the key similarities, all four jurisdictions have adopted an outcome-based system to provide information on progress towards stated long-term targets. All of the systems were driven by top level political leadership, and all involved public consultation during the development of their measures. Each system was designed both to support public accountability and to collect information for planning and decision making, though not necessarily with the same degree of emphasis. While the systems have enhanced accountability, there has been only limited success in using results information for future planning and decision making.

Perhaps most interesting, however, is the similarity and dissimilarity in the relative weighting of measures by policy area. Although each system faces unique environmental circumstances, there is general concurrence on the top policy areas: the economy, environment, education, income, and health. While relative weighting may not be indicative of each policy area's importance to a jurisdiction, the similarity is striking. Nevertheless, each system still displays its citizens' unique priorities in the weighting of other policy areas: Florida's list reflects its difficulties with crime and public safety, community values are prominent in Minnesota's, Alberta's highlights government performance measures, and Oregon's reveals its roots in Oregon Shines.

Perhaps the key difference between Alberta’s approach and the others is the number of measures. Alberta reports on only 25 macro level measures (Government of Alberta, 1998), while Florida reports on 270 measures. Oregon has significantly reduced the number of measures it reports on, while Minnesota has made a slight reduction. Part of the difference is due to the hierarchical structure of Alberta’s system. The 25 core measures are the focus of the system, but 126 ministry measures and other supporting information are used to develop an overall picture of performance.

Other key differences include the purpose for which information is collected, and the existence of a global strategic planning framework. Three of the four jurisdictions collect information to support public accountability, and all collect information for use in planning and decision making, but the relative importance of each function varies. Also, although each jurisdiction has elements of strategic planning, the breadth, depth and importance of strategic planning will determine future success in implementing performance based decision making.

One of the difficulties in using performance information for planning is the lack of formal accountability frameworks in most jurisdictions. Lead or responsible agencies have generally not been identified, except where agencies have volunteered. Until more agencies accept responsibility for specific macro measures, they are unlikely to use the information in their planning processes or documentation. In this respect, Alberta has benefited from a tradition of government accountability rooted in the parliamentary system. Table 2 summarizes the key attributes of the four measurement systems.

Table 2 - Comparison of Systems

(Entries are listed in the order: Alberta | Oregon | Minnesota | Florida)

Technical
  Name of system: Measuring Up | Oregon Benchmarks | Minnesota Milestones | Florida Benchmarks
  Number of goals: 17 | 22 | 20 | n/a
  Number of measures: 25 | 92 | 73 (proposed) | 270 (60 critical)
  Key target dates: 2000/other | 2000/2010 | 2020 | 2000/2010
  Legislation: Yes | Yes | Yes | Yes

Environment
  Separate coordinating board: No | Oregon Progress Board | No | Commission on Government Accountability to the People
  Political climate (public): Fiscal crisis, change in leadership and lack of public trust | Changing economy, change in leadership | Budget deficit and change in leadership | Change in leadership and lack of public trust
  Leadership: Provincial Treasurer | Governor | Governor | Governor

Framework
  Vision (primary information purpose): Government accountability | Improve planning and decision making | Improve planning and decision making | Government accountability
  Overall strategic plan: Government Business Plan | Oregon Shines II | No | No
  Accountability: Part of planning process | Part of planning process | Informal process | Part of planning process
  Development process: By program officials, vetted by Treasury Board | By citizens and process participants | By citizens and process participants | By citizens, process participants and measurement experts

Culture
  Stakeholder consultation: Mail survey, ministry consultation with stakeholders | Town hall meetings with business and community leaders | 15 town hall meetings, over 1,600 participants | Mail survey of 2,000 citizens
  Results publicly released: Yes | Yes | Yes | Yes
  Information used in planning: Limited | No | No | Limited

Conclusions:

Based upon Alberta's and the other jurisdictions' experiences, several espoused performance measurement system design principles appear to be substantiated. Political leadership is essential: each jurisdiction's process was championed by top level political leadership. Citizen feedback and participation are vital: each jurisdiction conducted extensive public consultation prior to implementation, which helped build citizen awareness. Although the systems exhibit a fair degree of convergence on key policy areas, local and global environmental factors produced somewhat different weightings of measures.

An excessive number of measures, a lack of legislation, or coordination by a separate board does not appear to represent a fatal system flaw. Even the lack of a global strategic plan does not appear to impede development; however, the lack of a strategic plan or of an integrating/coordinating mechanism will limit system effectiveness. While public accountability may not be affected, use of the information for planning and decision making will be restricted.

If performance measurement systems are to provide meaningful information to improve program performance, then a framework for the overall process must be in place. By defining the overall strategic planning context and linking together the results desired, actions taken to achieve outcomes, and the results actually achieved, a plan for enhancing program performance can be developed. This will provide the basis for creating a government that works more efficiently and effectively.

References:

  • Epstein, J. and Olsen, R.T. (1996). "Lessons Learned by State and Local Governments". The Public Manager, Fall 1996, p. 41-44.
  • Epstein, J. and Olsen, R.T. (1997). "Performance Management: Perspectives for Today’s Public-Sector Manager". PA Times, January 1997, Special Supplement.
  • Florida Commission on Government Accountability to the People (1996). The Florida Benchmarks Report.
  • Florida Commission on Government Accountability to the People (1997). Critical Benchmarks Goals.
  • Government of Alberta (1995). Measuring Up.
  • Government of Alberta (1998). Measuring Up.
  • Hatry, H.P. (1994). "Findings and Recommendations: Oregon Benchmarks and Associated Performance Measurement Process". Unpublished paper.
  • Kravchuk, R.S. and Schack, R.W. (1996). "Designing Effective Performance Measurement Systems under the Government Performance and Results Act of 1993". Public Administration Review, Vol. 56, No. 4, p. 348-358.
  • Minnesota Planning (1992). Minnesota Milestones.
  • Minnesota Planning (1998). Minnesota Milestones 1998: Proposed Revisions.
  • National Performance Review (1997). Serving the American Public: Best Practices in Performance Measurement.
  • OECD (1997). In Search of Results: Performance Measurement Practices.
  • Oregon Department of Economic Development (1989). Oregon Shines: An Economic Strategy for the Pacific Century.
  • Oregon Progress Board (1994). Oregon Benchmarks: Report to the 1995 Legislature.
  • Oregon Progress Board (1997). Oregon Shines II: Updating Oregon’s Strategic Plan.
  • Osborne, D. and Gaebler, T. (1992). Reinventing Government. Plume Books, New York, NY, p. 138-165, 349-359.

Copyright © 2000 Government of Alberta.