1.0 Introduction
1.1 Purpose of the RMAF
1.2 When Is An RMAF Required?
1.3 Linkage to the Management, Resources and Results Structure
2.0 Planning Basics
2.1 Who Should Be Involved
2.2 Determining Scope and Complexity
2.3 When to Begin
2.4 Guiding Principles
2.5 When Flexibility Is Required
3.0 Overview
4.1 Program Profile
4.2 Expected Results
4.3 Monitoring and Evaluation Plan
5.0 Common Pitfalls to Developing an RMAF
6.0 Implementing the RMAF
7.0 Integrating RMAFs and RBAFs
8.0 Need More Help?
Appendix A: The Review and Approval Process
Appendix B: Expenditure Review Committee Questions
Appendix C: Sample Evaluation Framework Table
Appendix D: Sample Integrated RMAF/RBAF Table of Contents
This Guide replaces the “Guide for the Development of Results-based Management and Accountability Frameworks” (August 2001). It is the result of lessons that the Centre of Excellence for Evaluation (CEE) has learned in working with departments to develop, review and approve Results-based Management and Accountability Frameworks (RMAFs).
In preparing this Guide, the CEE consulted a range of stakeholders on their experiences and needs. We discovered that, to be useful in today’s environment, we needed to change the Guide and our approach to review and approval. Our aim is to improve the quality of RMAFs and, in doing so, support their implementation.
What’s different? [1] First, we have streamlined our advice and direction. As a result, this Guide is much shorter than our 2001 Guide. We hope that this more concise and focused approach will encourage departments and agencies to respond with shorter, more strategic frameworks.
In the past, RMAFs have displayed a number of common weaknesses: for example, incomplete information on performance measurement strategies, results that were not focused on benefits to Canadians, and inadequate governance structures. We have tried to address these weaknesses by being more specific about the nature and level of information required, providing greater assistance through tools and training, and changing the approval process to ensure that the right players are participating at the right time.
Finally, we have updated the Guide to reflect and support changes in the environment. For example, integrating ongoing expenditure review activities and linking results to a department’s Management, Resources and Results Structure (MRRS) [2] in relation to its Program Activity Architecture (PAA) [3] are new requirements.
We hope that the changes and direction provided in this new and improved Guide will help departments achieve the results they are looking for and demonstrate the type of accountability and good management Canadians expect.
Who should use this Guide?
This Guide was prepared with Program Managers and Evaluation Unit staff in mind. Its purpose is to outline a clear set of expectations on what should be included in an RMAF, the level of detail required, and the review and approval process. Throughout this document the term “department” is used to mean both department and agency.
The RMAF was first introduced in 2000, shortly after the federal government introduced “Results for Canadians” – an expectation that managers focus on measuring progress towards achieving the results of their programs, policies and initiatives.
The Policy on Transfer Payments (June 2000) formalized the requirement of the RMAF and the Risk-Based Audit Framework (RBAF) as part of TB submissions involving transfer payments. [4] RMAFs and RBAFs ensure that managers have the means and measures for program monitoring, performance improvement, risk management and reporting.
The Government of Canada’s Evaluation Policy (April 2001) also encourages the development of an RMAF. The RMAF integrates the evaluation function within the context of results-based management and supports managers and decision-makers in objectively assessing program and policy results.
With the Government’s renewed focus on good management – including good planning, performance assessment and ongoing expenditure review – and Parliamentary pressure to increase transparency on the use of public funds, the RMAF and the RBAF remain critical planning and management tools. They not only provide frameworks to help monitor performance, manage risk and demonstrate results, but are also inextricably linked to the department’s MRRS. Results of monitoring and evaluation activities will feed into the MRRS reporting process. This makes the development and implementation of an RMAF an essential task for all Program Managers, regardless of the Policy on Transfer Payments requirements.
An RMAF provides Program Managers with a concise statement or road map to plan, monitor, evaluate and report on the results throughout the lifecycle of a program, policy or initiative. When implemented, it helps a Program Manager:
An RMAF is required for the approval of terms and conditions for grants to a class of recipients or for contributions, and must therefore be presented to TBS for review and approval as part of the related Treasury Board submission.
Because of its potential value as a management tool, the CEE also recommends developing an RMAF to support effective management decision-making and demonstrate clear accountability in all program areas. For example, senior managers may consider “rolling up” a number of related programs (including non-grants and contributions programs) into a single RMAF so as to better reflect how a department organizes its interventions to realize intended results. [5]
The creation of the MRRS, which replaces the Planning, Reporting and Accountability Structure policy beginning in 2005/06, requires that departments develop a Program Activity Architecture (PAA). The PAA reflects how a department allocates and manages the resources under its control to achieve intended results, and how programs are linked to the department’s strategic outcomes [6]. The MRRS also requires that departments provide information on results expectations and performance measures for the elements and levels of the PAA.
RMAF development and implementation will help support this requirement.
In particular, the process of developing an RMAF assures:
To this end, RMAFs help provide essential information needed for the MRRS.
Before preparing an RMAF, there are a few things to know such as who to involve and how to approach the process.
There are two key parties involved in preparing and implementing an RMAF: Program Managers and Evaluation Managers.
Program Managers hold the primary responsibility for preparing and implementing the RMAF. They are responsible for:
Departmental Evaluation Managers also play a key role in preparing and implementing the RMAF. They are responsible for:
Key stakeholders (e.g., third-party delivery agencies such as non-governmental organizations) should also be consulted in preparing elements of the RMAF. Their early buy-in to intended results and ongoing monitoring and reporting activities greatly supports the implementation process.
As of January 2005, the Senior Financial Officer (SFO) or his/her delegate is responsible for exercising due diligence for the quality and completeness of Transfer Payment Program TB submissions and related documents including the RMAF. The “sign-off” must denote, on behalf of department management, the acceptability of the proposed TB Submission with respect to the following:
SFO approval is required before a submission and related documents can be presented to TBS for review and approval.
TBS expects Program Management and Heads of Evaluation to support SFOs in the review and approval process by ensuring the quality and completeness of relevant sections of the RMAF. Appendix A presents a list of review and approval criteria for the SFO, ADM and Head of Evaluation.
Once “signed off” by the SFO, TBS will review an RMAF to ensure the following:
TBS Program Sector Analysts work hand in hand with the CEE and other analysts from the Results-Based Management Directorate (RBMD) to ensure that TBS’ review is complete. It is important to note that TBS analysts will not review RMAFs or associated documents without evidence of due diligence.
The scope and level of detail of an RMAF should be aligned with the risk and complexity of the program, policy or initiative. For example, a “low-risk” program should have a simple logic model and straightforward monitoring and evaluation activities. A more complex, “high-risk” program should provide additional information to clearly explain relationships, accountabilities, risks and performance measurement challenges.
At the outset, the RMAF should reference the overall level of risk associated with the program, policy or initiative and provide a clear explanation of how it was determined (e.g., risk factors considered and rating approach applied). This will help departments and TBS to place the level of complexity of the RMAF in perspective.
Program Managers are encouraged to consult with their Audit and Evaluation Units for assistance in completing a strategic risk assessment.
Departments should prepare RMAFs at the outset of a policy, program or initiative, ideally when decisions are being made about design and delivery approaches. Once approved, Program Managers should implement the RMAF immediately, beginning with the proposed performance measurement strategy and detailing the evaluation plan(s). (See also Section 6.0, Implementing the RMAF.)
Successful preparation and implementation of an RMAF follows when Program and Evaluation Managers adhere to the following guiding principles:
Departments may find the standard RMAF approach (presented in this Guide) does not adequately address or support their needs. If so, Program Managers and Evaluation Managers should consult the “Tools and Guidance” section of the CEE website (http://www.tbs-sct.gc.ca/eval/tools-outils_e.asp) for additional assistance. Here you will find some helpful resources. In particular,
The RMAF has been streamlined into three core components:
The remainder of this Guide outlines the required elements for each component and provides suggestions to enhance quality and ensure completeness.
To assist departments with “low-risk” programs, we have provided suggested page lengths for each major section. Please note that these are guidelines only. For medium- and high-risk programs, the department should still aim to keep the RMAF short and focused; page length, however, depends on the complexity of the program, policy or initiative.
In this section, we provide a summary of the purpose of each component and guidance on the information required. It is important to note that the information presented in the tables forms the basis for TBS’ expectations of an RMAF.
The Program Profile provides a concise description of why a program, policy or initiative exists, what issues or problems it addresses, who its key stakeholders and beneficiaries are, what it is intended to achieve, and its resource requirements.
A complete and concise Program Profile section will help clearly communicate what a program, policy or initiative aims to achieve and how. This and the next component – Expected Results - provide the basis upon which monitoring and evaluation activities are developed.
For “low risk” programs, the suggested length for this section is between two and four pages.
Table 4.0 Program Profile Information Requirements

1.0 Program Profile provides a concise description of the program, policy or initiative.

| Section | Key Elements |
| --- | --- |
| 1.1 Context | |
| 1.2 Objectives | |
| 1.3 Key Stakeholders and Beneficiaries | |
| 1.4 Resources | |
Expected Results presents the results that a program, policy or initiative intends to achieve, and the associated accountabilities. It is the focal point of the RMAF.
A key element of this component is the logic model. The logic model is a graphic representation of the causal or logical relationships (i.e., linkages) between the activities and outputs of a given policy, program or initiative and the outcomes (i.e., results) they are intended to produce. The model should be supplemented with explanatory text describing the linkages (i.e., how one set of results or project outcomes leads to the next). A good logic model validates the theory behind the program and is the first step in developing realistic and relevant performance measurement and evaluation strategies.
At this stage, TBS recommends taking stock of the potential internal and external risks associated with the program, policy or initiative and completing a risk assessment to confirm the appropriateness of proposed results and associated performance targets. [9]
For “low risk” programs, the suggested length for this section is between two and three pages.
2.0 Expected Results provides a logical description and illustration of how activities and outputs lead to results and associated accountabilities.

| Section | Key Elements |
| --- | --- |
| 2.1 Expected Results | |
| 2.2 Logic Model | |
| 2.3 Accountabilities | |
Developed in collaboration with a department’s Evaluation Unit, the Monitoring and Evaluation Plan represents a Program Manager’s strategy to monitor performance and demonstrate results.
The monitoring or performance measurement plan enables managers to establish the systems and processes needed to collect and analyze data and information so that program performance can be optimized. Evaluation studies generate accurate, objective and evidence-based information to help managers make sound management decisions, demonstrate success, show ongoing relevance and develop more cost-effective alternatives to service delivery.
It is only through the combination of these two activities that Program Managers and senior executives can demonstrate a program, policy or initiative’s benefit to Canadians. Hence, consideration should be given to the questions that comprise ongoing expenditure review activities.
For “low risk” programs, the suggested length for this section is between two and four pages.
3.0 The Monitoring and Evaluation Plan provides direction for ongoing performance measurement and evaluation activities.

| Section | Key Elements |
| --- | --- |
| 3.1 Performance | |
| 3.2 Evaluation Plan | |
The CEE reviews departmental evaluation plans to ensure an overall strategic approach to evaluating government priorities. The following criteria are used to support this process:
The RMAF’s true value is realized only when implemented. Suggestions on how to make implementation easy and cost-effective include:
The Policy on Transfer Payments requires the development of both an RMAF and an RBAF to ensure that managers have the means and measures for program monitoring, performance improvement, and reporting. The RMAF and RBAF are complementary. The processes used to develop them have natural points of integration that relate to the typical analytical and planning approaches used by managers to monitor program operations and performance. For example, program managers should simultaneously contemplate performance and risk issues when defining expected results, performance targets, roles and responsibilities.
Departments may consider integrating RMAF and RBAF documents to ensure effective coordination of these related activities and to gain efficiencies in preparation and the internal review process. (See Appendix D for an example of a Table of Contents for an integrated RMAF/RBAF.)
If you still need more help, contact your Evaluation Unit or visit the CEE website at http://www.tbs-sct.gc.ca/eval/eval_e.asp.
Senior Financial Officer (SFO) or his/her delegate
The acceptability of the proposed TB Submission with respect to the following:
Note: SFO approval is required before a submission and related documents can be presented to TBS.
Program Management (ADM-level)
Departmental Evaluation Unit (Head of Evaluation)
The Expenditure Review Committee, which was established in Winter 2004, assesses existing programs and government spending using two sets of criteria. The first set of criteria are policy tests for the program. They involve questions regarding:
The second set of criteria are implementation tests. These questions are examined only if there is a proposal to change expenditures. These criteria should be incorporated into evaluation studies as required and appropriate.
The Evaluation Framework presented in the RMAF is subject to review and updating as the program evolves, taking into account such factors as changing circumstances, program changes and lessons learned. It is the basis for developing a more detailed evaluation plan for the formative (if applicable) and summative evaluations.
| Evaluation Activity | Issues | Data Sources | Data Analysis Methods | Frequency of Analysis | Responsibility |
| --- | --- | --- | --- | --- | --- |
| Formative Evaluation [14] | Continuous Improvement: Are there ways to improve program delivery from either an effectiveness or efficiency perspective? | | | | |
| | Performance Measurement Systems: Is appropriate performance information being collected, captured, safeguarded and used? Is data quality assured? | | | | |
| | Program Design and Implementation: Is the program being delivered/implemented as it was designed? | | | | |
| | Other Issues | | | | |
| Summative Evaluation | Success: Is the program, policy or initiative effective in meeting its objectives, within budget and without unwanted outcomes? | | | | |
| | Relevance: Does the program, policy or initiative continue to be consistent with departmental and government-wide priorities, and does it realistically address an actual need? | | | | |
| | Cost-Effectiveness: Are the most appropriate and efficient means being used to achieve objectives, relative to alternative design and delivery approaches? | | | | |
| | Other Issues | | | | |
| | ERC Questions/Issues | | | | |
1.0 Introduction
1.1 Background
1.2 Level of Integration
1.3 Overall Risk Assessment
2.0 Program Profile
2.1 Context
2.2 Objectives
2.3 Stakeholders and Beneficiaries
2.4 Resources
3.0 Expected Results
3.1 Expected Results
3.2 Key Risk Areas
3.3 Logic Model
3.4 Accountabilities
4.0 Risk Assessment and Management Summary
4.1 Key Risks
4.2 Existing Mitigating Measures
4.3 Incremental Strategies
5.0 Monitoring, Evaluation and Auditing
5.1 Monitoring Plan
5.1.1 Performance
5.1.2 Risk
5.2 Evaluation Plan
5.3 Internal and Recipient Auditing
5.4 Reporting Commitments
Appendices (as required)
[1] For a complete summary of changes to the CEE’s guidance on RMAF development, see “Summary of Changes to Guidance on Developing Results-based Management and Accountability Frameworks” (February 2005).
[2] A Management, Resources and Results Structure (MRRS) is a comprehensive framework that consists of an organization’s inventory of activities, resources, results, performance measurement and governance information. Activities and results are depicted in their logical relationship to each other and to the Strategic Outcome(s) to which they contribute. The MRRS is developed from an organization’s Program Activity Architecture.
[3] A Program Activity Architecture is an inventory of all the activities undertaken by a department or agency. The activities are depicted in their logical relationship to each other and to the Strategic Outcome(s) to which they contribute.
[4] See Section 8.1.1 (xv) of the Policy on Transfer Payments (June 2000).
[5] Consolidating a number of programs under a single RMAF is called an Umbrella RMAF. For additional information on how and when to use Umbrella RMAFs, see “Guidance for Strategic Approach to RMAFs” (February 2005).
[6] Strategic outcomes are the long-term and enduring benefits to Canadians that stem from a department or agency’s mandate, vision or efforts. They represent the difference that a department or agency wants to make for Canadians.
[7] For example, evaluation strategies should adhere to the Evaluation Standards for the Government of Canada as set out in the Treasury Board Evaluation Policy (April 2001).
[8] This is an issue examined by the Expenditure Review Committee. For a complete list of ERC questions, see Appendix B.
[9] For additional assistance on completing risk assessments, consult the Risk-Based Audit Framework found on the Centre of Excellence for Internal Audit website (http://www.tbs-sct.gc.ca/ia-vi/home-accueil_e.asp).
[10] At this stage, a strategic risk assessment of the program, policy or initiative should be conducted to ensure that results statements and timeframes are aligned and balanced with the capacity (authorities, skills and resources) of the department and its partners to deliver.
[11] Examples of operating constraints include administrative rules and procedures, and policies such as values and ethics policies, privacy policies, etc.
[12] TBS recognizes that not all evaluation issues will be known at the time of preparing an RMAF. Therefore, evaluation issues identified in the RMAF represent a starting point on which to build a more detailed evaluation plan for the program, policy or initiative. Activities to support the development of a detailed evaluation plan should be incorporated into the operational plans of a program and begin in the early stages of delivery.
[13] The CEE recommends that an RMAF be concise and focused and, where possible, that RMAFs and RBAFs be integrated in order to efficiently consolidate findings and ensure enhanced coordination of explicitly linked tasks. For more information, see Section 7.0, Integrating RMAFs and RBAFs.
[14] As noted in Section 4.3, Monitoring and Evaluation Plan, the CEE recommends the judicious use of formative evaluations. Where appropriate, the use of formative assessments or other assessment or evaluation approaches should be considered. Issues identified in this table under formative evaluation are suggestions only.