The Managing for Results Self-Assessment Tool

Consultations

This project had its genesis in the work of the Montebello Group of senior public servants,[3] who provided ongoing steering and oversight. The Montebello Working Group, which they established, provided advice to help refine the Tool.

In February 2002, consultations were also held with a small group of officers who were MFR leaders in Agriculture and Agri-Food Canada, the Canada Customs and Revenue Agency, Canadian Heritage, Industry Canada, Indian and Northern Affairs, and Transport Canada. The focus was on the lessons they had learned while pioneering MFR practices. Following are highlights of their comments:

  • MFR is not easy to implement.
  • You know you are progressing when people:
    • spend a lot of time trying to "get it right";
    • are put on teams specifically to work on MFR and they take ownership of the process; and
    • appropriately challenge what goes into a corporate business plan.
  • It's important to report on results.
  • Commitment to MFR by senior managers will invigorate the team under them.
  • This Tool can be used as:
    • a marketing tool;
    • an assessment tool;
    • a tool to develop an action plan; or
    • a tool to measure the progress of a program.
  • A problem-solving approach can help create a practical results focus for line groups.
  • The use of a logic model or results chain has helped groups to focus on appropriate outcomes.
  • A key is consistent, harmonized involvement and support across corporate and line functions, as well as between regions and headquarters.
  • Community ownership of the risk-results approach is important; all levels must see themselves in the approach.
  • A significant resource effort is required at all levels among all groups.
  • The clear and consistent integration of management concepts into a few key management processes is important.
  • Patience and persistence pay off.

The project team gratefully acknowledges the time and useful feedback contributed by those consulted as part of this process[4]:

  • David Enns
  • Vincent Ngan
  • Bryan Mclean
  • John Platts
  • Karen Swol
  • Aileen Pangilinan
  • Joyce Hue
  • Tim LaForce
  • Paulette Panzeri
  • Gail Young
  • Robert McDonald

[1] The Office of the Auditor General has developed a separate self-assessment tool for rating Departmental Performance Reports on the basis of how well departments report accomplishments -- that is, how they measure outcomes against previously stated performance expectations. See the April 2002 OAG Report, Chapter 6, "A Model for Rating Departmental Performance Reports." In addition, in 2001 the Treasury Board Secretariat issued principle-based guidance on performance reporting for Reports on Plans and Priorities and Departmental Performance Reports.

[2] This section draws on Beverly A. Parsons, "Finding Transformative Themes Across Multiple System Change Evaluations," paper presented to the November 1998 Annual Meeting of the American Evaluation Association. Ms. Parsons is Executive Director of InSites, a Colorado-based organization that conducts research and evaluation, and provides technical assistance in support of change in the field of education.

[3] Including Maria Barrados, Jennifer Benimadhu, Ivan Blake, Bob Cook, Keith Coulter, Bruce Deacon, Carolyn Farquhar, Jean-Pierre Gauthier, Paul Gauvin, Blair Haddock, Sherry Harrison, Cathy Livingstone, John Mayne, Lee McCormack, John McLure, Janet Milne, Bruce Sloan and Judy Watling, as well as Chris Mihm and Sarah Veale from the General Accounting Office in the United States.

[4] Work on managing for results in the United States was kindly provided to us by the Government Performance Project at Syracuse University.
