About this Guide
1.0 Introduction
1.1 Purpose of the RMAF
1.2 When Is An RMAF Required?
1.3 Linkage to the Management, Resources and Results Structure
2.0 Planning Basics
2.1 Who Should Be Involved
2.2 Determining Scope and Complexity
2.3 When to Begin
2.4 Guiding Principles
2.5 When Flexibility Is Required
3.0 Overview
4.0 Description by Component
4.1 Program Profile
4.2 Expected Results
4.3 Monitoring and Evaluation Plan
5.0 Common Pitfalls to Developing an RMAF
6.0 Implementing the RMAF
7.0 Integrating RMAFs and RBAFs
8.0 Need More Help?
Appendix A: The Review and Approval Process
Appendix B: Expenditure Review Committee Questions
Appendix C: Sample Evaluation Framework Table
Appendix D: Sample Integrated RMAF/RBAF Table of Contents
About this Guide

This Guide replaces the “Guide for the Development of Results-based
Management and Accountability Frameworks” (August 2001). It is the result of
lessons that the Centre of Excellence for Evaluation (CEE) has learned in
working with departments to develop, review and approve Results-based Management
and Accountability Frameworks (RMAFs).
In preparing this Guide, the CEE consulted a range of stakeholders on their
experiences and needs. We discovered that, to be useful in today’s
environment, we needed to change the Guide and our approach to review and
approval. Our aim is to improve the quality of RMAFs and, in doing so, support
their implementation.
What’s different? [1]
First, we have streamlined our advice and direction. As a result, this Guide is
much shorter than our 2001 Guide. By being more concise and focused, we hope
that departments and agencies will respond with shorter, more strategic
frameworks.
In the past, RMAFs have displayed a number of common weaknesses: information
on performance measurement strategies was often incomplete, results were not
focused on benefits to Canadians, and descriptions of governance structures
were inadequate. We have tried to address these weaknesses by being more
specific on the nature and level of information required, providing greater
assistance through tools and training, and changing the approval process to
ensure that the right players are participating at the right time.
Finally, we have updated the Guide to reflect and support changes in the
environment. For example, integrating ongoing expenditure review activities and
linking results to a department’s Management, Resources and Results Structure
(MRRS) [2] in relation to its Program
Activity Architecture (PAA) [3] are
new requirements.
We hope that the changes and direction provided in this new and improved
Guide will help departments achieve the results they are looking for and
demonstrate the type of accountability and good management Canadians expect.
Who should use this Guide?
This Guide was prepared with Program Managers and Evaluation Unit staff in
mind. Its purpose is to outline a clear set of expectations on what should be
included in an RMAF, the level of detail required, and the review and approval
process. Throughout this document the term “department” is used to mean both
department and agency.
1.0 Introduction

The RMAF was first introduced in 2000 shortly after the federal government
introduced “Results for Canadians” – an expectation that managers focus on
measuring progress towards achieving results of their programs, policies and
initiatives.
The Policy on Transfer Payments (June 2000) formalized the requirement for an
RMAF and a Risk-Based Audit Framework (RBAF) as part of TB submissions
involving transfer payments [4].
RMAFs and RBAFs ensure that managers have the means and measures for
program monitoring, performance improvement, risk management and reporting.
The Government of Canada’s Evaluation Policy (April 2001) also encourages
the development of an RMAF. The RMAF integrates the evaluation function within
the context of results-based management and supports managers and
decision-makers in objectively assessing program and policy results.
With the Government’s renewed focus on good management (including sound
planning, performance assessment and ongoing expenditure review) and
Parliamentary pressure to increase transparency in the use of public funds, the
RMAF and the RBAF remain critical planning and management tools. Not only do
they provide frameworks to help monitor performance, manage risk and
demonstrate results, but they are also inextricably linked to the department’s
MRRS: results of monitoring and evaluation activities will feed into the MRRS
reporting process. This makes the development and implementation of an RMAF an
essential task for all Program Managers, regardless of the Policy on Transfer
Payments requirements.
1.1 Purpose of the RMAF

An RMAF provides Program Managers with a concise statement or road map to
plan, monitor, evaluate and report on the results throughout the lifecycle of a
program, policy or initiative. When implemented, it helps a Program Manager:
- Ensure clear and logical design that ties resources and activities
to expected results;
- Describe clear roles and responsibilities for the main partners
involved in delivering the program, policy or initiative;
- Make sound judgements on how to improve performance on an ongoing
basis;
- Demonstrate accountability and benefits to Canadians; and,
- Ensure reliable and timely information is available to senior
executives in the department, central agencies and other key stakeholders.
1.2 When Is An RMAF Required?

An RMAF is required for the approval of terms and conditions for grants to a
class of recipients or for contributions, and must therefore be presented to
TBS for review and approval as part of the related Treasury Board Submission.
Because of its potential value as a management tool, the CEE also recommends
developing an RMAF to ensure effective management decision-making and
demonstrate clear accountability in all program areas. For example, senior
managers may consider “rolling up or in” a number of related programs
(including programs other than grants and contributions) into a single RMAF to
better reflect how a department is organized, or how it intervenes, to realize
intended results. [5]
1.3 Linkage to the Management, Resources and Results Structure

The creation of the MRRS, which replaces the Planning, Reporting and
Accountability Structure policy beginning in 2005/06, requires that departments
develop a Program Activity Architecture (PAA). The PAA reflects how a department
allocates and manages the resources under its control to achieve intended
results and shows how programs are linked to the department’s strategic
outcomes [6]. The MRRS also requires that departments provide information on
results expectations and performance measures for the elements and levels of
the PAA.
RMAF development and implementation will help support this requirement.
In particular, the process of developing an RMAF ensures that:
- Sound program design takes place, through the development of a logic model;
- Intended results are clear, through the development of outcome statements;
and,
- A performance measurement strategy exists, through the identification of
key performance issues and meaningful indicators.
To this end, RMAFs help provide essential information needed for the MRRS.
2.0 Planning Basics

Before preparing an RMAF, there are a few things to know such as who to
involve and how to approach the process.
2.1 Who Should Be Involved

There are two key parties involved in preparing and implementing an RMAF:
Program Managers and Evaluation Managers.
Program Managers hold the primary responsibility for preparing and
implementing the RMAF. They are responsible for:
- Ensuring that the content is accurate and reflects the design, operation
and management of the policy, program or initiative; and,
- Implementing the RMAF. In particular, Program Managers should ensure
program staff and partners collect performance information, oversee the
quality and security of the data and information collected, monitor and
improve their performance on an ongoing basis and demonstrate results
through timely evaluation activities and reporting.
Departmental Evaluation Managers also play a key role in preparing and
implementing the RMAF. They are responsible for:
- Providing guidance and technical expertise throughout the development and
implementation of performance measurement and evaluation strategies; and,
- Managing or conducting evaluation activities according to the Government
of Canada’s Evaluation Policy (April 2001).
Key stakeholders (e.g., third party delivery agencies such as
non-governmental organizations) should also be consulted in preparing elements
of the RMAF. Their early buy-in to intended results and to ongoing monitoring
and reporting activities greatly supports the implementation process.
2.1.1 Departmental Approval
As of January 2005, the Senior Financial Officer (SFO) or his/her delegate is
responsible for exercising due diligence for the quality and completeness of
Transfer Payment Program TB submissions and related documents including the
RMAF. The “sign-off” must denote, on behalf of department management, the
acceptability of the proposed TB Submission with respect to the following:
- Authorities;
- Sources of funds;
- Detailed cost information;
- Audit and evaluation commitments;
- Terms and conditions;
- Results-Based Management and Accountability Framework;
- Risk-Based Audit Framework; and,
- Other accountability documents as specified by TBS.
SFO approval is required before a submission and related documents can be
presented to TBS for review and approval.
TBS expects Program Management and Heads of Evaluation to support SFOs in the
review and approval process by ensuring the quality and completeness of relevant
sections of the RMAF. Appendix A presents a list of review and
approval criteria for the SFO, ADM and Head of Evaluation.
Once an RMAF has been “signed off” by the SFO, TBS will review it to ensure
the following:
- Scope and level of detail proposed for the RMAF is appropriate given the
level of risk associated with the program, policy or initiative;
- Results of past evaluation studies have been incorporated into the
program’s design, performance measurement strategy and evaluation plans;
- Proposed evaluation issues are appropriate and address TB and TBS
information requirements;
- Estimated costs for evaluation activities have been provided and are
realistic; and
- Accountabilities for management, delivery and reporting are clear.
TBS Program Sector Analysts work hand in hand with the CEE and other analysts
from the Results-Based Management Directorate (RBMD) to ensure that TBS’
review is complete. It is important to note that TBS analysts will not review
RMAFs or associated documents without evidence of due diligence.
2.2 Determining Scope and Complexity

The scope and level of detail of an RMAF should be aligned with the risk and
complexity of the program, policy or initiative. For example, a “low-risk”
program should have a simple logic model and straightforward monitoring and
evaluation activities. A more complex, “high-risk” program should provide
additional information to clearly explain relationships, accountabilities, risks
and performance measurement challenges.
At the outset, the RMAF should reference the overall level of risk associated
with the program, policy or initiative and provide a clear explanation on how it
was determined (e.g., risk factors considered and rating approach applied). This
will help departments and TBS to place the level of complexity of the RMAF in
perspective.
Program Managers are encouraged to consult with their Audit and Evaluation
Units for assistance in completing a strategic risk assessment.
2.3 When to Begin

Departments should prepare RMAFs at the outset of a policy, program or
initiative, ideally when decisions are being made about design and delivery
approaches. Once approved, Program Managers should implement the RMAF
immediately, beginning with the implementation of the proposed performance
measurement strategy and the detailing of the evaluation plan(s). (See also
Section 6.0 – Implementing the RMAF.)
2.4 Guiding Principles

Successful preparation and implementation of an RMAF follows when Program and
Evaluation Managers adhere to the following guiding principles:
- Utility - to ensure managers can use the RMAF to explain their
policies, programs or initiatives to Canadians and institute sound
performance measurement and evaluation activities;
- Shared Ownership - to meet the needs of all stakeholders and ensure
the information needs and accountability requirements of all managers are met;
- Transparency - to ensure all stakeholders understand what results
are expected;
- Action-oriented - to ensure information needed by managers and
other stakeholders is available when it is required for key decisions;
- Focused and Concise - to ensure its immediate implementation by
managers and delivery partners; and,
- Credibility - to ensure professional standards [7]
are adhered to and commitments for monitoring, evaluation and reporting
are realistic.
2.5 When Flexibility Is Required

Departments may find the standard RMAF approach (presented in this Guide)
does not adequately address or support their needs. If so, Program Managers and
Evaluation Managers should consult the “Tools and Guidance” section of the
CEE website (http://www.tbs-sct.gc.ca/eval/tools-outils_e.asp) for additional
assistance. Here you will find some helpful resources. In particular,
- If you are interested in modifying an RMAF to fit a unique situation, you
should consult “Guidance for Strategic Approach to RMAFs”
(February 2005);
- If you are managing a horizontal initiative, you should consult “Companion
Guide – The Development of Results-based Management and Accountability
Frameworks for Horizontal Initiatives” (June 2002);
- If you need assistance in understanding or developing a common results
terminology, you should consult “Results-based Management Lexicon”
(December 2004).
3.0 Overview

The RMAF has been streamlined into three core components:
- Program Profile - a concise description of the policy, program
or initiative, including the context and need, stakeholders and
beneficiaries, and resource allocations;
- Expected Results - a description and illustration (i.e., logic
model) of how the activities of a policy, program or initiative are expected
to lead to the desired economic, social and/or environmental change, along
with the associated accountabilities and the critical assumptions on which
the program, policy or initiative is based; and,
- Monitoring and Evaluation - a detailed road map for ongoing
performance measurement and evaluation activities that will support
effective program management and accountability.
The remainder of this Guide outlines the required elements for each component
and provides suggestions to enhance quality and ensure completeness.
To assist departments with “low risk” programs, we have provided
suggestions for page length for each major section. Please note that these are
suggested guidelines only. For medium and high-risk programs, the department
should still focus on keeping the RMAF short and focused. Page length, however,
is dependent on the complexity of the program, policy or initiative.
4.0 Description by Component

In this section, we provide a summary of the purpose of each component and
guidance on the information required. It is important to note that the
information presented in the tables forms the basis for TBS’ expectations of an
RMAF.
4.1 Program Profile

The Program Profile provides a concise description of why a program,
policy or initiative exists, what issues or problems it addresses, who are the
key stakeholders and beneficiaries, what it is intended to achieve and the
resource requirements.
A complete and concise Program Profile section will help clearly communicate
what a program, policy or initiative aims to achieve and how. This and the next
component – Expected Results - provide the basis upon which monitoring and
evaluation activities are developed.
For “low risk” programs, the suggested length for this section is between two and four pages.
Table 4.0 Program Profile Information Requirements

1.0 Program Profile provides a concise description of the program, policy or
initiative.

1.1 Context

1.2 Objectives
- Clearly state the objectives of the program, policy or initiative.
- Describe how the objectives link to the department’s strategic outcomes as
identified in its Program Activity Architecture.

1.3 Key Stakeholders and Beneficiaries
- List all key stakeholders including delivery partners and project
beneficiaries.
- When information is available, identify targets in terms of reach to
project beneficiaries. When no targets are available, explain why and how and
when targets, if any, will be developed.

1.4 Resources
- Summarize (in a table) the annual resources allocated to the department and
each delivery partner, including salaries, O&M, transfers to partners and
capital costs.
- Specify estimated costs for ongoing performance measurement and evaluation
activities.
4.2 Expected Results

Expected Results present the results that a program, policy or initiative
intends to achieve and associated accountabilities. It is the focal point of the
RMAF.
A key element of this component is the logic model. The logic model is a
graphic representation of the causal or logical relationships (i.e., linkages)
between the activities and outputs of a given policy, program or initiative
and the outcomes (i.e., results) they are intended to produce. The model
should be supplemented with explanatory text to help describe the linkages
(i.e., how one set of results or project outcomes leads to the next). A good
logic model validates the theory behind the program and is the first step in
developing realistic and relevant performance measurement and evaluation
strategies.
At this stage, TBS recommends taking stock of the potential internal and
external risks that may be associated with the program, policy or initiative
and completing a risk assessment to confirm the appropriateness of proposed
results and associated performance targets [9].
For “low risk” programs, the suggested length for this section is
between two and three pages.
Table 4.1 Expected Results Information Requirements

2.0 Expected Results provides a logical description and illustration of how
activities and outputs lead to results and associated accountabilities.

2.1 Expected Results
- Identify the results expected at various stages of program, policy or
initiative delivery and specify anticipated timeframes for the achievement of
results.
- Identify internal and external factors that may influence the ability of a
program, policy or initiative to achieve results [10]. Reference to your RBAF
is acceptable but must be noted.

2.2 Logic Model
- Provide a logic model, including explanatory text if necessary, for the
program, policy or initiative, ensuring that there is a logical flow from
activities to outputs to the outcomes of the program.
- In the logic model, link final outcomes to the department’s strategic
outcomes as specified in its Program Activity Architecture.

2.3 Accountabilities
- Identify the roles and responsibilities (i.e., duties, obligations and
authorities) of the department and its delivery partners.
- Specify performance targets, reporting responsibilities and any operating
constraints [11] of the department or its partners that may impact the
department’s ability to deliver the program or report on performance.
- For collaborative arrangements (i.e., programs or initiatives managed or
delivered jointly by partners), outline how the relationship will be managed,
including how decision-making will take place.
4.3 Monitoring and Evaluation Plan

Developed in collaboration with a department’s Evaluation Unit, the
Monitoring and Evaluation Plan represents a Program Manager’s strategy to
monitor performance and demonstrate results.
The monitoring or performance measurement plan enables managers to establish
the necessary systems and processes to collect and analyze data and information
so that program performance can be optimized. Evaluation studies generate
accurate, objective and evidence-based information to help managers make sound
management decisions, demonstrate success, show ongoing relevance and develop
more cost-effective alternatives to service delivery.
It is only through the combination of these two activities that Program
Managers and senior executives can demonstrate a program, policy or
initiative’s benefit to Canadians. Hence, consideration should be given to the
questions that comprise ongoing expenditure review activities.
For “low risk” programs, the suggested length for this section is
between two and four pages.
Table 4.2 Monitoring and Evaluation Plan Information Requirements

3.0 The Monitoring and Evaluation Plan provides direction for ongoing
performance measurement and evaluation activities.

3.1 Performance Measurement Plan
- Outline the overall performance measurement strategy, including four to
five key performance issues, and provide a rationale for why this strategy is
proposed. The performance measurement strategy should outline what current
systems (i.e., information systems as well as operational systems) are in
place to support monitoring, and how, when and by whom performance will be
reviewed and adjustments made.
- For each key performance issue, identify the associated indicators/measures
and performance targets.
- Outline provisions to ensure data integrity.
- Provide estimated costs for performance measurement activities by year.
- List all performance reporting commitments on the part of the department
and all delivery partners. The purpose of a report should be clearly stated,
with an emphasis on how the report will be used to improve performance.

3.2 Evaluation Plan
- Outline the overall evaluation strategy and provide a rationale for why
this strategy is proposed.
- Formative evaluations should be used judiciously, primarily in instances
where questions arise as to the delivery of the program. They may address
specific delivery issues or focus on the quality of performance information
and reporting systems. Where “full” formative evaluations are undertaken,
outputs, early results, validation of program logic, and the likelihood of
long-term results achievement must be assessed.
- For summative evaluations, identify all known evaluation issues [12]; these
include success, relevance, cost-effectiveness, and any issues identified in
past evaluation studies.
- Identify how and when the Expenditure Review Committee’s questions will be
incorporated into evaluation activities. (For a list of the key questions,
see Appendix B.)
- Present an overall approach to evaluation (i.e., an evaluation framework)
including data sources, proposed methodologies, and responsibilities for data
collection. (See Appendix C for a sample framework table.)
- Provide estimated costs for evaluation activities.
- List all reporting requirements associated with the evaluation strategy,
including dates for development of the evaluation framework and completion of
evaluation studies.
The CEE also reviews departmental evaluation plans to ensure an overall
strategic approach to evaluating government priorities.

5.0 Common Pitfalls to Developing an RMAF

Experience to date points to a number of common pitfalls in developing
RMAFs:
- Writing with only TBS in mind. While it is important that a
program, policy or initiative meets the overall requirements of TBS, an RMAF
must, first and foremost, be a useful document for the Program Manager. It
is his/her responsibility to ensure effective program management and the
RMAF is the key tool to help him/her do this.
- Relying too heavily on external consultants. Limited time and
expertise may result in a Program Manager contracting the development of an
RMAF to an external consultant. While external consultants can provide
valuable expertise, they are not responsible for the implementation of the
RMAF nor are they responsible for achieving results. Program Managers must
ensure that the consultant’s report accurately reflects their program and
that he/she can execute what is being proposed.
- Lengthy, complex documents. RMAFs can quickly become lengthy
documents making them difficult to implement. Since details about the
program, policy or initiative should exist in other documents, care should
be taken to provide only the essential information required to explain the
program, policy or initiative and the overall monitoring and evaluation
plan. The scope and level of detail of an RMAF (and RBAF) should be aligned
with the scope and complexity of the program, policy or initiative. For
example, straightforward, low-risk programs can be less than 10 pages [13].
- Failing to coordinate or consult with a department’s Evaluation Unit.
In the past, the lack of coordination and consultation with a department’s
Evaluation Unit has led to inadequate or difficult to implement performance
measurement and evaluation strategies. This wastes resources
and greatly impedes a manager’s ability to provide credible, reliable, and
timely information on how a program is progressing. By engaging departmental
Evaluation Units early in the process, Program Managers can help ensure that monitoring
and evaluation activities “make sense” and can be implemented as
described.
- Submitting an incomplete performance measurement strategy. Many
Program Managers argue that it is difficult to know all their information
requirements at the time of preparing an RMAF. While this may be true,
especially for new programs, basic financial and administrative
information requirements are known and can represent the starting point for
monitoring activities. In addition, the longer it takes to develop a
complete performance measurement strategy, the longer it will take to
provide credible and reliable information to program management, key
stakeholders, senior executives and central agencies. The availability of
performance information will become increasingly important in ongoing
expenditure review activities and the use of the MRRS.
- Absence of performance targets and lack of associated baseline data.
Too often, performance measurement strategies have specified neither the
performance targets nor the associated data requirements (i.e., baseline data
requirements). Without this information, it will be difficult to assess the
relative contribution of the program, policy or initiative and ensure that
appropriate information is collected and captured at the outset of the
program, policy or initiative. The identification of key performance issues
and associated performance targets is a new requirement designed to address
this issue.
- Thinking the job is done after you have received approval. Many
Program Managers see the development of an RMAF as an end in itself. Once
approved, they “shelve” the RMAF only to pick it up again when preparing
for evaluation activities. The RMAF is a Program Manager’s road map to not
only evaluation work but, more importantly, to performance measurement and
good management. Hence, the development of an RMAF should be viewed as the
beginning of good management practice and, therefore, a means to an end.
Steps should be taken to ensure that resources for monitoring and evaluation
are available at the outset of the program, policy or initiative to ensure
that commitments can be fulfilled during program, policy or initiative
implementation.
6.0 Implementing the RMAF

The RMAF’s true value is realized only when implemented. Suggestions on how
to make implementation easy and cost-effective include:
- Distribute the final RMAF (or a summary document) to all key stakeholders
to confirm roles and responsibilities for delivery, performance measurement,
evaluation and reporting.
- Ensure performance measurement, evaluation and reporting activities are
included in terms and conditions and contribution agreements.
- Begin working with your Evaluation Unit and related offices in the
organization responsible for collecting results information, as soon as
possible, to create the databases and reporting templates, and develop
detailed evaluation plans required to support decision-making.
- Consult the growing number of special studies, reference materials and
other resources on effective results-based management, performance
measurement, risk management and evaluation available through TBS, the
Office of the Auditor General of Canada and departments actively involved in
the management of Grants and Contributions programs. These can assist with
developing and implementing results-oriented programs, performance
measurement and evaluation activities.
- At a minimum, meet once a year with program personnel including delivery
partners, your evaluation manager, and other key stakeholders to review and
update the RMAF. Ask if the information being generated about the program is
helping to demonstrate or improve performance on a timely basis and whether
intended results and targets are still relevant.
7.0 Integrating RMAFs and RBAFs

The Policy on Transfer Payments requires the development of both an RMAF and
an RBAF to ensure that managers have the means and measures for program
monitoring, performance improvement, and reporting. The RMAF and RBAF are
complementary. The processes used to develop them have natural points of
integration that relate to the typical analytical and planning approaches used
by managers to monitor program operations and performance. For example, program
managers should simultaneously contemplate performance and risk issues when
defining expected results, performance targets, roles and responsibilities.
Departments may consider integrating RMAF and RBAF documents to ensure
effective coordination of these related activities or to gain efficiencies in
preparation and the internal review process. (See Appendix D for an example
of a Table of Contents for an integrated RMAF/RBAF.)
8.0 Need More Help?

If you still need more help, contact your Evaluation Unit or visit the CEE
website at http://www.tbs-sct.gc.ca/eval/eval_e.asp.
Appendix A: The Review and Approval Process

Senior Financial Officer (SFO) or his/her delegate
Confirms, on behalf of departmental management, the acceptability of the
proposed TB Submission with respect to the following:
- Authorities;
- Sources of funds;
- Detailed cost information;
- Audit and evaluation commitments;
- Terms and conditions;
- Results-Based Management and Accountability Framework;
- Risk-Based Audit Framework; and,
- Other accountability documents as specified by TBS.
Note: SFO approval is required before a submission and related documents can
be presented to TBS.
Program Management (ADM-level)
- There is a clear rationale presented for the scope and level of detail
proposed as determined by the level of risk associated with a program,
policy or initiative.
- Content is accurate and reflects the design, operation and management of
the policy, program or initiative.
- There is commitment to monitor and evaluate the performance and risks of
the program/policy/initiative and funds will be reserved and transferred as
required.
- Accountabilities for delivery, collection and reporting of performance
information and evaluation activities are clear.
Departmental Evaluation Unit (Head of Evaluation)
- A performance measurement strategy exists and allows the department to
monitor and report on performance towards the achievement of results with a
high degree of reliability.
- A preliminary evaluation strategy and framework exists and meets TBS
Evaluation Policy requirements.
- Evaluation activities are timed appropriately to ensure evidence-based
results information is available to senior management on a timely basis.
- Expenditure Review Committee questions have been incorporated into the
preliminary evaluation framework where appropriate and logical.
- Estimated costs for monitoring/performance measurement and evaluation
activities have been provided and are realistic.
Appendix B: Expenditure Review Committee Questions

The Expenditure Review Committee, which was established in Winter 2004,
assesses existing programs and government spending using two sets of criteria.
The first set of criteria comprises policy tests for programs. They involve
questions regarding:
- Public Interest - Does the program area or activity continue to
serve the public interest?
- Role of Government - Is there a legitimate and necessary role for
government in this program area or activity?
- Federalism - Is the current role of the federal government
appropriate, or is the program a candidate for realignment with the
provinces?
- Partnership - What activities or programs should or could be
transferred in whole or in part to the private/voluntary sector?
- Value-For-Money - Are Canadians getting value for their tax
dollars?
- Efficiency - If the program or activity continues, how could its
efficiency be improved?
- Affordability - Is the resultant package of programs and activities
affordable? If not, what programs or activities would be abandoned?
The second set of criteria comprises implementation tests. These questions are
examined only if there is a proposal to change expenditures. They should be
incorporated into evaluation studies as required and appropriate.
- Achievability - Are proposed expenditure reductions and timelines
achievable and sustainable? How will their impacts be managed over time?
- Future Cost - Do the proposed changes avoid or create future cost
or program pressures?
- Capacity - What is the effect of any proposed changes on policy and
analytical capacity? On operational and delivery capacity?
- Human Resource Management - What is the effect of any proposed
changes on human resource management, staffing levels, and compensation
costs?
- Program Integrity - Do any proposed changes address existing
operational and program integrity pressures? Do proposed changes ensure
ongoing integrity of departmental corporate governance and comptrollership
capacity, and information management systems?
- Horizontal Implications - Has the impact of any proposed changes on
other departments been clearly specified? What is the effect of proposed
changes on other levels of government, the private sector, and the voluntary
sector?
- Risk - What is the effect of proposed changes on the departmental
corporate risk profile, and what strategies does the department recommend to
mitigate unacceptable risk? Does the proposal incorporate contingencies to
address major risks associated with implementation of any changes proposed?
Appendix C: Sample Evaluation Framework Table

The Evaluation Framework presented in the RMAF is subject to review and
updating as the program evolves, taking into account such factors as changing
circumstances, program changes and lessons learned. It is the basis for
developing a more detailed evaluation plan for the formative (if applicable)
and summative evaluations.
The sample framework table has six columns: Evaluation Activity, Issues, Data
Sources, Data Analysis Methods, Frequency of Analysis, and Responsibility. The
last four columns are left blank in this sample, to be completed for each
issue. The issues listed below are illustrative.

Formative Evaluation [14]
- Continuous Improvement: Are there ways to improve program delivery, from
either an effectiveness or efficiency perspective?
- Performance Measurement Systems: Is appropriate performance information
being collected, captured, safeguarded and used? Is data quality assured?
- Program Design and Implementation: Is the program being
delivered/implemented as it was designed?
- Other Issues:

Summative Evaluation
- Success: Is the program, policy or initiative effective in meeting its
objectives, within budget and without unwanted outcomes?
- Relevance: Does the program, policy or initiative continue to be consistent
with departmental and government-wide priorities, and does it realistically
address an actual need?
- Cost-Effectiveness: Are the most appropriate and efficient means being used
to achieve objectives, relative to alternative design and delivery approaches?
- Other Issues:
- ERC Questions/Issues: (See Appendix B.)
Appendix D: Sample Integrated RMAF/RBAF Table of Contents

1.0 Introduction
1.1 Background
1.2 Level of Integration
1.3 Overall Risk Assessment
2.0 Program Profile
2.1 Context
2.2 Objectives
2.3 Stakeholders and Beneficiaries
2.4 Resources
3.0 Expected Results
3.1 Expected Results
3.2 Key Risk Areas
3.3 Logic Model
3.4 Accountabilities
4.0 Risk Assessment and Management Summary
4.1 Key Risks
4.2 Existing Mitigating Measures
4.3 Incremental Strategies
5.0 Monitoring, Evaluation and Auditing
5.1 Monitoring Plan
5.1.1 Performance
5.1.2 Risk
5.2 Evaluation Plan
5.3 Internal and Recipient Auditing
5.4 Reporting Commitments
Appendices (as required)
Notes

[1] For a complete summary of changes to the CEE’s guidance on RMAF
development, see “Summary of Changes to Guidance on Developing Results-based
Management and Accountability Frameworks” (February 2005).
[2] A Management, Resources and Results Structure
(MRRS) is a comprehensive framework that consists of an organization’s
inventory of activities, resources, results, performance measurement and
governance information. Activities and results are depicted in their logical
relationship to each other and to the Strategic Outcome(s) to which they
contribute. The MRRS is developed from an organization’s Program Activity
Architecture.
[3] A Program Activity Architecture is an inventory
of all the activities undertaken by a department or agency. The activities are
depicted in their logical relationship to each other and to the Strategic
Outcome(s) to which they contribute.
[4] See Section 8.1.1 (xv) of the Policy on Transfer Payments (June 2000).
[5] Consolidating a number of programs under a single
RMAF is called an Umbrella RMAF. For additional information on how and when to
use Umbrella RMAFs, see “Guidance for Strategic Approach to RMAFs”
(February 2005).
[6] Strategic outcomes are the long-term and enduring
benefits to Canadians that stem from a department or agency’s mandate, vision
or efforts. They represent the difference that a department or agency wants to
make for Canadians.
[7] For example, evaluation strategies should adhere
to the Evaluation Standards for the Government of Canada as set out in the
Treasury Board Evaluation Policy (April 2001).
[8] This is an issue examined by the Expenditure
Review Committee. For a complete list of ERC questions see Appendix B.
[9] For additional assistance on completing
risk assessments consult the Risk-based Audit Framework found on the Centre
of Excellence for Internal Audit website (http://www.tbs-sct.gc.ca/ia-vi/home-accueil_e.asp).
[10] At this stage, a strategic risk assessment of
the program, policy or initiative should be conducted to ensure that results
statements and timeframes are aligned and balanced with the department’s (and
its partners’) capacity (authorities, skills and resources) to deliver.
[11] Examples of operating constraints include
administrative rules and procedures, as well as requirements such as values
and ethics policies, privacy policies, etc.
[12] TBS recognizes that not all evaluation issues
will be known at the time of preparing an RMAF. Therefore, evaluation issues
identified in the RMAF represent a starting point on which to build a more
detailed evaluation plan for the program, policy or initiative. Activities to
support the development of a detailed evaluation plan for a program, policy or
initiative should be incorporated into the operational plans of a program and
begin in the early stages of delivery.
[13] The CEE recommends an RMAF be concise and
focused and, where possible, integrate RMAFs and RBAFs in order to efficiently
consolidate findings and ensure enhanced coordination of explicitly linked
tasks. For more information see Section 7.0 Integrating RMAFs and RBAFs.
[14] As noted in Section 4.3 Monitoring and
Evaluation, the CEE recommends the judicious use of formative evaluations. Where
appropriate, the use of formative assessments or other assessment or evaluation
approaches should be considered. Issues identified in this table under formative
evaluation are suggestions only.