
Audit & Advisory Services

Audit on Quality of Information Used for Resource Related Decision Making - Phase 1 Report

November 2005

File number: 1577-04-016



table of contents

executive summary

1. introduction

2. findings, recommendations and management action plan

Appendix A – Current State of Implementation: Summary by Service Line


executive summary

There are a number of external pressures and initiatives, such as government-wide expenditure reviews, that are challenging Transport Canada managers to continually justify their resource requirements and to fund new or changing priorities within existing resource allocations. These pressures and initiatives are increasing the need for reliable and accurate information that clearly links an organization’s resources with the results it achieves.

This audit assesses the quality of information available to make resource related decisions. The audit is divided into two phases. The first phase profiles the current state of progress that each Safety and Security service line, and the Business Line as a whole, has made in implementing a performance management approach that provides a solid basis for resource related decisions. The findings and recommendations for Phase 1 are documented in this report. Phase 2 will examine two service lines, Civil Aviation and Marine Safety, that employ different approaches to planning and monitoring FTE resource utilization, and will specifically assess the quality of their information. Phase 2 results will be documented in a subsequent report.

The state of information that provides management a solid basis for making resource related decisions, such as resource allocation decisions, varies significantly between Safety and Security service lines. A few service lines have in place processes and systems that allow them to plan and track resource utilization for their activities. For the majority of Safety and Security service lines, significant progress is required to establish the required elements (e.g., commonly defined activities/tasks, quantified workload drivers, service level standards, etc.) for improved resource planning and tracking.

Although significant work has been carried out by the individual service lines to define their own performance frameworks, only a few currently have the ability to categorize their FTE resource utilization by the activities defined in those frameworks. Also, even though there is a high degree of similarity between the various Safety and Security service line performance frameworks, there are still sufficient differences to make it impractical to assess and compare the performance of programs/services in support of overall Safety and Security strategic objectives.

As it relates to Safety and Security, the currently defined Program Activity Architecture (PAA), whose purpose is to link resources (budget and expenditure data) to the PAA defined activities, is of limited use for improving resource related decision-making, since the definitions do not describe the common horizontal functions that are carried out by the Safety and Security service lines. For example, the resources expended on a common function such as compliance monitoring and enforcement cannot currently be tracked using the PAA. Moreover, the PAA and the individual Safety and Security service lines’ performance framework activity areas are not consistent. This situation creates additional workload for service lines, which must maintain their own performance frameworks and associated data as well as report into the PAA, the department’s official reporting system.

To address the phase 1 findings it is recommended that the ADM Safety and Security:

  • determine the need for comparative analysis of service lines’ performance within and across service lines;

  • design and implement a resource information system that will link resources (e.g., FTE) to activities, to outputs and to results:

    • define the resource related information requirements (e.g., FTEs) and provide specific direction to S&S service lines as to how FTE resource information should be planned, collected and monitored;

    • determine the requirements for a standardized approach to planning and tracking resources and activities, including assessing the feasibility of a common automated planning and activity reporting system, and establish a feasible implementation plan.

Also, the ADM Safety and Security and ADM Corporate Services should collectively determine appropriate changes to the PAA activity structure to enable Safety and Security to improve its ability to monitor service lines’ performance and improve resource related decisions, especially for the key functions and processes that cut horizontally across organizational boundaries. The appropriate changes should be implemented in a timely manner.

Audit and Review Committee Decision

The Audit and Review Committee approved the audit report and management action plan as presented on November 16, 2005.


1. introduction

1.1 purpose of the audit

The audit is to assess the quality of information available to make resource related decisions. The initial scope is to examine the Safety and Security Business Line with a specific focus on time and activity tracking systems that are used to plan and track Full Time Equivalent (FTE) resource utilization. The audit is divided into two phases. The first phase profiles the current state of progress each Safety and Security service line and the Business Line as a whole has made in implementing a performance management approach that provides a solid basis for resource related decisions. The findings and recommendations for Phase 1 are documented in this report. Phase 2 will examine two service lines that employ different approaches to planning and monitoring FTE resource utilization and specifically assess the quality of information. Phase 2 results will be documented in a subsequent report.

1.2 background

There are a number of external pressures and initiatives that are challenging Transport Canada managers to continually justify their resource requirements and to fund new or changing priorities within existing resource allocations. These pressures and initiatives are increasing the need for reliable and accurate information that clearly links an organization’s resources with the results it achieves.

Since the mid 1990s, the government has moved towards a results-based approach to management. A series of reforms, including Estimates Reform, Comptrollership Modernization, and Results for Canadians have been undertaken, challenging Deputy Heads to build a “new management culture” that focuses on results that matter to Canadians, and to demonstrate that results have been achieved. More recently, the Management Accountability Framework has been introduced to highlight a Deputy’s responsibility for providing Ministers with evidence about results that supports continued funding of programs.

The Prime Minister established an Expenditure Review Committee (ERC) of Cabinet in December 2003 as part of a series of initiatives designed to strengthen the Government’s financial management and accountability. The ERC’s mandate was to carry out rigorous reviews of federal spending, testing for relevance, efficiency and excellence, and to submit recommendations to the Prime Minister. The ERC is reviewing existing programs and government spending by assessing six areas: Public Interest Role of Government; Federalism; Partnership; Value-for-money; Efficiency; Affordability. The last three areas are directly related to the effective use of resources.

Moreover, Treasury Board continues to express the view that Departments and Agencies should look to internal reallocation of resources, both as a good management practice and to assist departments to manage within their limited funds.

Treasury Board replaced the Program Resourcing Activity Structure (PRAS) with the Management Resources and Results Structure (MRRS), as of April 1, 2005. The underlying objective of MRRS is to link objectives, results and resources. Departments were requested to develop a Program Activity Architecture (PAA) that links resources to activities and results. As of April of this year, Transport Canada’s Chart of Accounts was updated with new PAA coding.1

Over the past few years Transport Canada’s Senior Management Executive Committee (TMX) expressed the need for validation of service lines’ existing resource base to support their decision-making during resource allocation exercises.2 A series of service line resource reviews were launched to review existing resource allocations within and among service lines. A service line review was expected to provide assurance that:

  • TC is spending an appropriate level of resources to undertake activities that are relevant and contribute to the department’s objectives and priorities.

  • The level of resources is based on a benchmark applied in a consistent manner nationally.

  • Programs are delivered in an efficient and effective manner.

Given that these reviews were carried out directly by a specific service line, they are primarily self-assessment exercises and, as such, do not necessarily provide the degree of objective analysis that would be expected from an independent review carried out by a third party. Consequently, these exercises often result in a request for additional resources as opposed to recommending changes to priorities and a re-allocation of existing resources.

Most recently, TMX announced that comprehensive reviews would be carried out in Safety and Security, Corporate Services and Communications, likely by an external third party to provide an independent assessment of the various programs and services. In a communiqué from the Deputy Minister to his Executive Management Team in April 2005, the Deputy described the characteristics of the reviews:

  • They are not a downsizing exercise. It is part of our on-going commitment to ensure best use of resources and to better understand our flexibility to meet changing priorities and pressures;

  • They will start this fiscal year;

  • They cover functions in both Regions and Headquarters;

  • They will have similar methodologies, and

  • They will look at current spending but will also attempt to look forward at priorities and needs over the next five to ten years.

All of the government-wide and internal exercises and initiatives described above focus on ensuring limited resources are allocated to the highest priorities and that resources are efficiently utilized. Simply put, the goal for every manager is to do the right things the best way. Making informed decisions on where to allocate resources based on an assessment of effectiveness and efficiency requires quality information.

Three fundamental pieces of information are needed to ensure that resources are allocated and operations managed with due regard for value for money: information on effectiveness, efficiency, and costs. A framework clearly describing the links between resources, activities, outputs and results, along with meaningful indicators and measures to assess the level of achievement is fundamental to assessing program performance and overall cost effectiveness.

Resources → Activities → Outputs → Results (Immediate to Long-Term)

There are a number of additional factors that an organization would also incorporate into its decision making, such as clearly defining its relationships with regulatees, direct clients, service delivery partners and other stakeholders, along with employing a sound method of identifying, ranking and managing risks to the successful delivery of its program. This audit does not examine these additional factors.

1.3 objectives and scope

The Audit and Review Committee approved this project as part of the 2004-05 annual audit and review plan. The purpose is to assess the quality of information available to make resource related decisions. The overall objectives are to:

  • identify the information sources used for determining resource requirements and making allocation decisions;

  • assess the quality of the information being used; and

  • determine the capacity to respond to on-going expenditure/resource reviews.

The project is being carried out in two phases. The first phase profiles how the Safety and Security Business Line and individual Service Lines link resources, activities and outcomes/results and identifies the mechanisms in place to ensure the quality of the information. Given that FTEs represent the largest portion of the Safety and Security Service Lines’ budgets, the primary focus is to assess the availability of information to identify FTE requirements and make resource allocation decisions. The Strategies and Integration Directorate was not included in the scope since it is primarily a support organization for the other Safety and Security Headquarters Directorates. The first phase findings and recommendations are described in this report.

The second phase of the audit will examine selected time and activity reporting systems and assess the quality of the data (e.g., accuracy, relevance, reliability, integrity and timeliness etc.) and how it is used in decision-making. Two systems will be examined, Civil Aviation’s Activity Reporting and Standards System (ARASS) and Marine Safety’s National Time and Activity System (NTARS).

1.4 criteria

In general, the expectation is that Safety and Security, at both the Business Line and individual service line level, would have available quality information that links resources to activities and outputs and, to the extent possible, to results to allow informed resource related decisions. Given that building the capacity to measure performance and manage by results is an evolutionary process, the assessment of the current state of each service line’s progress was gauged using a five stage model: 1 – Awareness; 2 – Exploration; 3 – Transition; 4 – Full Implementation; 5 – Continuous Learning. (See Appendix A).

The following Phase 1 criteria focus primarily on the availability of:

  1. Basic budget information that describes the resources allocated to the organization to fulfill its mandate. (Resources)

  2. Information that describes the activities that the organization carries out to fulfil its mandate and the extent to which activity data is linked and integrated with basic budget data. (Activities)

  3. Information that describes the services delivered by the organization. (Outputs)

  4. Clear descriptions of the results an organization is trying to achieve along with indicators to measure the achievement of results and overall service-line performance. (Results/Outcomes)

1.5 methodology

  • Developed a Model to rate service line capacity according to five criteria categories based on measurement tools and criteria from the Treasury Board’s Work Book for Reporting on Results in Departmental Performance Reports; the TBS & Office of the Auditor General Managing for Results Self-Assessment Tool; Canadian Comprehensive Auditing Foundation (CCAF) Principle Reporting Documents; and “The Three Rs of Performance” by Performance Management Network’s Steve Montague.

  • Carried out detailed document reviews to assess the actual status of each organization’s progress in developing the capacity to plan and track resource usage and manage by results. The following lists the primary generic documents reviewed for most organizations.

    • Service Line Reports
    • Service Line Resource Review Reports (if completed)
    • Departmental Performance Report
    • Report on Plans and Priorities
    • High Level Assessment of Performance Measures and Integrated Risk Management in Transport Canada (TC)3

  • Consulted with Planning and Performance Management Team Members (PPMT)

  • Interviewed Financial Management

  • Participated on Safety and Security Service Line Resource Reviews Working and Steering Committees.

1 Presentation Deck RDIMS # 1036724: Program Activity Architecture Status Finance and Administration Extended Management Meeting Jan. 18, 2005
2 A Service Line is comprised of HQ functional management; Regions deliver the programs.
3 The Avcon Group – Report presented to the Audit and Review Committee April 2005.


2. findings, recommendations and management action plan

2.1 performance-based information for resource related decision-making

Almost all service lines in Safety and Security, working individually and in some cases with the direct assistance of external consultants or TC’s Evaluation Services, have developed performance frameworks that link macro activities/functions with the results the organization is trying to achieve. With few exceptions, the macro activity/function categories are not further broken down into specific activities and tasks nor are they linked directly to resource data that would allow management to plan and track resource utilization to monitor efficiency and assess overall cost effectiveness.

2.1.1 Required elements for resource planning and tracking have yet to be developed: commonly defined activities/tasks, quantified workload demands, service level standards, baseline data, etc.

To enable on-going resource planning and tracking, a number of basic elements are required. Clearly defined activities/tasks, quantified or estimated workload demands, and service level standards are some of the key elements that need to be in place to allow a program to accurately plan and track its resource utilization.

Other than Civil Aviation, Marine Safety and parts of Aircraft Services, the remaining Safety and Security service lines have not defined common activity/task definitions that would allow resource utilization to be planned or tracked. Without commonly defined work activities/tasks it is not possible to accurately determine resource requirements and compare relative resource needs to determine equitable resource allocations. This creates the potential for unequal workloads and resource levels that could strain every aspect of a program, from policy development to program delivery.

Quantifying workload demand is another important information element that has yet to be well defined. Again, there are a few exceptions, but many service lines have only begun to clearly define their various clients/stakeholders in order to quantify the expected demand for their regulatory services. Defining the level of demand for service requires a number of basic factors to be quantified, such as:

  • The total size of the target population;

  • The frequency of regulatory activity (e.g., safety audits, inspections, licensing and certification requirements etc.) -- defined by a common approach such as risk factors, regulatory requirements etc.;

  • Service level standards (e.g., response times, approval or turnaround times).

Once service demands are well defined, it is important to have the ability to estimate activity/task workload effort standards, i.e., the estimated average time required to complete specific activities/tasks. This information, combined with the workload demand information, allows resource requirements to be accurately estimated, as illustrated in the sketch below.
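
To illustrate how these elements fit together, the following sketch combines quantified demand, average effort standards and available working hours to produce an FTE estimate. The activity names, volumes, effort standards and available-hours figure are hypothetical assumptions chosen for illustration only; they are not drawn from any Transport Canada system.

```python
# Illustrative sketch only: all figures below are hypothetical assumptions,
# not actual Transport Canada data.

# Quantified workload demand: expected number of times each activity/task
# must be performed in a year (target population x required frequency).
annual_demand = {
    "safety_inspection": 1200,    # inspections required per year
    "certificate_issuance": 450,  # certificates expected per year
    "audit": 60,                  # audits required per year
}

# Effort standards: estimated average hours to complete one unit of each task.
hours_per_unit = {
    "safety_inspection": 6.0,
    "certificate_issuance": 3.5,
    "audit": 40.0,
}

# Available operational hours per FTE after leave, training, administration, etc.
available_hours_per_fte = 1300

total_hours = sum(annual_demand[task] * hours_per_unit[task] for task in annual_demand)
fte_required = total_hours / available_hours_per_fte

print(f"Estimated workload: {total_hours:.0f} hours, or about {fte_required:.1f} FTEs")
```

Without each of the inputs above (demand, effort standards and available time), an estimate of this kind cannot be produced, which is the gap described for most service lines.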

Other than Civil Aviation and to a certain extent Marine Safety and Aircraft Services, which have the benefit of time and activity reporting systems, the other S&S service lines maintain limited baseline and historical data. This type of information is needed to allow on-going trend analysis of resource utilization at the activity level to better support resource allocation decision-making.

2.1.2 Two existing but different approaches to planning and activity reporting systems

There are various approaches that can be taken to define common activities/tasks and implement a process to plan and track resource utilization. Civil Aviation and Marine Safety have established national systems. Aircraft Services has time and activity reporting systems in place for two of its functions, Technical Services and Engineering, which are integral to their cost recovery activities.

Civil Aviation has an activity planning and resource utilization tracking system, which is based on defined work task standards that detail the amount of effort expended, on average, to complete a task. The approach requires time availability formulas to be defined for various work categories, since actual time expenditures are not recorded. This approach is sound, as long as the risk of inaccurate work task standards is sufficiently controlled. Targeted real-time reporting and time and motion assessments are the primary means Civil Aviation employs to validate its workload standards on an on-going basis.

Marine Safety has taken another approach by creating a national time and activity reporting system that records 100 percent of an employee’s time expended on commonly defined activities/tasks. This approach captures the total time expended to complete a task and does not require available working time to be calculated, since the actual time expended on leave, training, administration etc., is recorded. The system has been in place for just over two years. The service line has plans to analyze the data collected to calculate average times for the various activities/tasks and once this next step is completed, it will have the ability to assess resource requirements and the equitable allocation of resources.
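
The analysis step planned by Marine Safety, deriving average activity times from recorded entries, could in principle look like the sketch below. The record layout, activity codes and hours are hypothetical assumptions for illustration; they do not represent the actual NTARS data structure.

```python
from collections import defaultdict

# Hypothetical time records: (activity/task code, hours recorded).
# The codes and values are illustrative, not the actual NTARS layout.
time_records = [
    ("inspection_small_vessel", 4.5),
    ("inspection_small_vessel", 6.0),
    ("inspection_small_vessel", 5.5),
    ("certificate_renewal", 2.0),
    ("certificate_renewal", 2.5),
]

totals = defaultdict(lambda: [0.0, 0])  # task -> [total hours, number of records]
for task, hours in time_records:
    totals[task][0] += hours
    totals[task][1] += 1

for task, (hours, count) in totals.items():
    print(f"{task}: average {hours / count:.1f} hours over {count} records")
```

Averages of this kind are what would allow the recorded-time approach to support the same kind of resource requirement estimates as a standards-based system.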

The various benefits and risks of the different approaches to time and activity planning and reporting taken by Civil Aviation and Marine Safety will be examined in Phase 2 of this audit. Aircraft Services will not be included in the scope of Phase 2, since a planned audit of Cost Recovery will likely include Aircraft Services in its scope and the ASD approach does not apply to the whole service line.

2.1.3 There is a lack of consistent and complete resource planning and utilization data necessary to strengthen resource related decisions.

With slightly over 60% of Safety and Security’s budget directly related to personnel costs (FTEs), it is critical to understand how effort is expended. As stated in the Rail Safety Resource Review draft report, “until an ongoing time recording system is in place it will not be possible to accurately assess where resources are, and have been, deployed.”

The resource review exercises and other similar studies, e.g. Security and Emergency Preparedness and Marine Security (SEP&MARSEC) Effectiveness Review, currently being carried out in Safety and Security clearly demonstrate there is a lack of complete and consistent resource utilization data that allows comparative analysis both within a Service Line and between service lines.

Resource reviews have been carried out in Civil Aviation, Marine Safety, Rail Safety, and Security and Emergency Preparedness and Marine Security (i.e., the Effectiveness Review), and are currently underway in Road Safety and Transportation of Dangerous Goods. Other than the Civil Aviation resource review, each of these reviews demonstrates that the current state of resource utilization data is inadequate to allow for meaningful comparisons to determine optimal levels of efficiency. For example, for most service lines an equitable allocation of resources between regions cannot be accurately determined, given the current state of information available.

As described in the Rail Safety Resource Review “performance data are not available, output information is inconsistent and requires further development and resource information is simply financial reports, which is inadequate for costing analysis and linkages to outputs and results.”4 Similar situations exist to varying degrees in the other Safety and Security service lines with Civil Aviation and specific areas in Aircraft Services as exceptions.

2.1.4 In most cases Performance Framework activity categories have not been linked to resource information.

Most Service Lines that have created a performance framework recognize the importance of trying to link resources to the activity areas defined in their logic model. Ultimately, these frameworks should demonstrate the links between resources, activities, and outputs to a service line’s results to provide a basis for assessing overall cost effectiveness.

To date, with two exceptions, Safety and Security Service Lines lack the information to accurately plan and report resources according to their logic model activity areas.

Civil Aviation’s Activity Reporting and Standards System (ARASS) can map FTE resources (as defined in ARASS task codes) to the Service Line’s logic model, enabling it to manually report resource expenditures against its five activity areas: Qualifying Aeronautical Products, Individuals & Organizations; Oversight of the Aviation System; Education/Promotion/Evaluation; Rulemaking and Agreements; and Leadership and Management. This is a significant step towards being able to plan and report resources against activity areas. Marine Safety has also mapped its NTARS codes to its performance framework activity categories.
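
Conceptually, this mapping is a simple roll-up: each task code is assigned to one logic model activity area and FTE effort is summed by area. The sketch below illustrates the idea; the task codes and FTE figures are hypothetical, although the activity area names follow the Civil Aviation framework described above.

```python
# Hypothetical task codes and FTE figures; the activity area names are taken
# from the Civil Aviation performance framework described in this report.
task_to_activity_area = {
    "T101": "Oversight of the Aviation System",
    "T102": "Oversight of the Aviation System",
    "T201": "Rulemaking and Agreements",
    "T301": "Education/Promotion/Evaluation",
}

fte_by_task = {"T101": 12.5, "T102": 8.0, "T201": 3.2, "T301": 1.8}

fte_by_area = {}
for task, fte in fte_by_task.items():
    area = task_to_activity_area.get(task, "Unmapped")
    fte_by_area[area] = fte_by_area.get(area, 0.0) + fte

for area, fte in sorted(fte_by_area.items()):
    print(f"{area}: {fte:.1f} FTEs")
```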

The other service lines have yet to establish the required elements for resource planning and tracking. Without the capability to link resources to activities and activities to outputs and results, it is not possible to assess overall cost effectiveness.

2.2 performance framework

2.2.1 The lack of a fully implemented Safety and Security logic model and performance framework impedes the ability to compare service lines.

A coherent and complete Safety and Security Business Line level logic model has not yet been fully implemented. In the past few years most Safety and Security service lines have developed their own logic models and performance frameworks that define their key activities, outputs, and desired results along with the associated measures and indicators. Table 1 on the next page displays the similarities and differences between the logic model activity areas as defined by the various service lines.

Although there is a high degree of similarity between the frameworks, there are still sufficient differences that make it impractical to assess and compare performance of programs/services in support of strategic objectives. Also, there is a risk of misinterpreting the data and/or creating an incomplete and inaccurate performance profile of the overall Safety and Security business line.

In addition, there is a lack of consistent, complete data to make informed resource decisions. Each Safety & Security service line is progressing at a different rate towards developing useful data to accurately measure and monitor its organization’s performance and manage by results. Most service lines have defined their own performance frameworks with activities, outputs, results and associated indicators. For example, both Marine Safety and Civil Aviation certify operators. Civil Aviation has created a unique activity for this in its performance framework, whereas Marine Safety defines the activity under its Compliance and Enforcement category.

Another example where common definitions would be expected relates to defining and measuring compliance. Given that all the safety regulatory service lines share the goal of achieving compliance with their regulations, one would expect each performance framework to define compliance as one of the desired results and that a common approach to measuring the rate of compliance could be defined. For example, as part of TDG’s resource review currently underway, compliance data based on random sampling will be collected to determine a “compliance rate” baseline against which to track performance over time. Currently, no other service line measures rates of compliance the way TDG plans to.

In the absence of a fully implemented common S&S performance framework and commonly defined performance indicators for the Safety and Security Business Line as a whole, significant effort and interpretation would be required to assess the use of resources between programs and service lines, the relative levels of efficiency and overall performance.

Table 1: Comparison of S&S Activity Performance Framework Categories

| Service Line | Policies & Rulemaking | Monitoring | Outreach | Management & Admin | Other |
| --- | --- | --- | --- | --- | --- |
| Aircraft Services ** |  |  |  | Client & Internal services | Other Aircraft Services Functions |
| Civil Aviation | Rulemaking & Agreements | Oversight of the System | Education, Promotion and Evaluation | Leadership & Management | Qualifying Products, Individuals & Organizations |
| Rail Safety | Policy, Regulation, Standards & Research | Monitoring, Investigation & Enforcement | Outreach – Education, Awareness & Funding Programs | Program Planning & Management |  |
| Road Safety | Policy and Regulatory Instrument Development | Safety Enforcement | Leadership & Promotion |  | Research |
| SEP |  | Oversight and verification; Analysis & evaluation | Strategic stakeholder relationships | Leadership & Effective management | Emergency and critical incident management |
| Marine Safety | Regulatory Framework | Compliance & Enforcement | Education & Awareness | Program Management |  |
| Marine Security * |  |  |  |  |  |
| TDG * |  |  |  |  |  |

Notes:
* At the time of the study, logic models were still in draft or under development.
** Aircraft Services is a common service organization, unlike the other service lines, which are regulatory oversight programs. Therefore it is expected that they have defined unique activity categories.

2.2.2 There has been limited direction to guide the development of consistent performance frameworks that would incorporate core common elements.

To date there has been limited direction provided to the S&S service lines to guide the development of core common elements that would allow comparative analysis of performance information. Although variations exist between the service lines’ performance frameworks, they are still fundamentally similar. A common framework with core activities, outputs, results and associated indicators could be defined.

The Safety and Security Planning and Performance Management Team (PPMT) is a collegial body whose mandate is to help ensure that planning and results-based management initiatives are undertaken in an integrated and consistent manner throughout the Safety and Security business line.5 The PPMT network provides an ideal forum for sharing approaches and practices; however, it has not been given the responsibility to establish a comprehensive framework and reporting process that would provide senior management with a complete Business Line-level performance framework allowing the tracking of comparable data and providing an objective basis for resource decision making. The ADM and his Senior Management Committee are ultimately responsible for ensuring a common framework and approach for monitoring Service Lines’ performance is in place.

2.3 departmental resource information

2.3.1 The PAA, whose purpose is to link resources (budget and expenditure data) to the PAA defined activities, is of limited use for improving resource related decision-making since the sub-activity definitions do not describe horizontal functions/processes.

The PAA groups all Safety and Security Service Lines under an activity described as Policies, Rulemaking, Monitoring and Outreach in support of a safe and secure transportation system. Virtually every organization within the Safety & Security Business Line fits under this activity category. The sub and sub-sub activities describe the existing departmental organization structure. Further breakdowns, for example, define Aviation Safety as a sub-activity and Commercial and Business aviation as a sub-sub-activity.

The department’s chart of accounts already utilizes codes, such as region, organization and responsibility centre, to describe the department’s organization structure. This allows reporting of financial resource information by organizational hierarchy. The program activity codes, which for the most part duplicate the organization codes that already exist in the TC Chart of Accounts, provide no additional value to help managers make more informed resource related decisions, especially for the key functions and processes that cut horizontally across organizational boundaries within and across service lines.

2.3.2 The PAA does not match the activity categories defined by the Service Lines in their Performance Frameworks (logic models).

The PAA activity definitions are not consistent with the activities defined in the performance frameworks of the various Safety and Security service lines.

This situation requires Service Lines to maintain two different reporting frameworks and increases the risk of inconsistent performance reporting. Given limited resources, Service Lines should not be required to develop and maintain performance data for both the PAA and their own performance frameworks.

Although the PAA is the official departmental resource reporting framework, the individual service line performance frameworks’ activity definitions, such as monitoring and outreach, are for the most part better descriptions of the horizontal activities that each service line carries out to meet its strategic objectives and those of the Safety and Security Business Line as a whole. Although each service line has slightly different activity/function categories, as displayed in Table 1, they are fundamentally very similar and closely resemble the PAA’s overarching activity, Policies, Rulemaking, Monitoring and Outreach in support of a safe and secure transportation system.

If the PAA activity description were divided into its component parts to define the sub-sub-activities, it would be consistent with most Safety and Security Service Lines’ performance framework activity categories. For example, there could be five core sub-sub-activities for all Safety and Security Service Lines:

  • Policies;
  • Rulemaking;
  • Monitoring;
  • Outreach; and
  • Program Management (program support activities).

This would allow for both an organizational roll-up at the sub-activity level (e.g., Aviation Safety) and a horizontal grouping by activities that cut across organization boundaries (i.e., policies, rulemaking, monitoring, outreach and program management). Moreover, this would also be consistent with Safety and Security’s mission statement to advance safety, security, efficiency and environmental protection to achieve a sustainable transportation system through:

  • Policy Development
  • Rule-Making
  • Monitoring and Enforcement
  • Outreach

Additional adjustments would likely be required to accommodate Aircraft Services and Strategies and Integration Service Lines into a revised PAA given their unique roles and responsibilities as compared to the other regulatory oversight functions performed by the majority of the other Safety and Security service lines.6
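
To illustrate the benefit of such a structure, the sketch below shows how expenditure records tagged with both a sub-activity (service line) and one of the five proposed core sub-sub-activities could be rolled up vertically by organization and horizontally by function. The records and dollar figures are hypothetical and for illustration only; they do not reflect actual PAA coding or expenditures.

```python
from collections import defaultdict

# Hypothetical expenditure records: (sub-activity, sub-sub-activity, $ thousands).
# The sub-sub-activity names follow the five core categories proposed above;
# the amounts are invented for illustration.
records = [
    ("Aviation Safety", "Monitoring", 950),
    ("Aviation Safety", "Rulemaking", 300),
    ("Marine Safety", "Monitoring", 620),
    ("Marine Safety", "Outreach", 110),
    ("Rail Safety", "Program Management", 180),
]

by_service_line = defaultdict(int)  # vertical (organizational) roll-up
by_function = defaultdict(int)      # horizontal roll-up across service lines

for sub_activity, sub_sub_activity, amount in records:
    by_service_line[sub_activity] += amount
    by_function[sub_sub_activity] += amount

print("By service line:", dict(by_service_line))
print("By horizontal function:", dict(by_function))
```

The same records support both views only because every record carries both codes; this is the capability the current PAA structure does not provide.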

Recommendations and Management Action Plan


Recommendation(s)

The ADM Safety and Security should:

  • Determine the need for comparative analysis of service lines’ performance within and across service lines.
  • Design and implement a resource information system that will link resources (e.g., FTE) to activities, to outputs and to results.
    • Define the resource related information requirements (e.g., FTEs) and provide specific direction to S&S service lines as to how FTE resource information should be planned, collected and monitored.
    • Determine the requirements for a standardized approach to planning and tracking resources and activities, including assessing the feasibility of a common automated planning and activity reporting system, and establish a feasible implementation plan.

Management Action Plan with Expected Completion Date

  • Agree with the need for some comparisons in key areas of Safety & Security (for example, regulation-making; outreach). The Comprehensive Review will serve as the catalyst for determining the basis for such comparisons and for benchmarking key areas. (OPI – Lead for the Safety component of the Comprehensive Review)
  • The Safety & Security Planning & Performance Management Team (PPMT) will continue its work on finalizing a Safety & Security performance logic model as a useful step in moving towards a common approach to reporting on results within the group.
  • The outcome of the Comprehensive Review and Phase II of the Audit will also influence any future design and implementation of a resource information system. Next steps will be determined upon completion of these initiatives.

Recommendation(s)

The ADM Safety and Security and ADM Corporate Services should collectively determine appropriate changes to the PAA activity structure to enable Safety and Security to improve its ability to monitor service line performance and improve resource related decisions, especially for the key functions that cut horizontally across organizational boundaries. The appropriate changes should be implemented in a timely manner.

Management Action Plan with Expected Completion Date

  • The ADMSS agrees to consult with the ADMCS (upon completion of the Safety component of the Comprehensive Review, planned for June 2006) with respect to this recommendation to determine the feasibility of making any changes to the PAA.

4 Rail Safety Resource Review Final Report March 2005 pg. 48
5 Planning And Performance Management Team (PPMT) Terms Of Reference
6 Financial Codes Annex 10 TP 117 Segment 4 Program Activity Codes Printed May 16, 2005


Appendix A — Current state of Implementation: Summary by service line

In general, the expectation is that Safety and Security, at both the Business Line and individual service line level, would have available quality information that links resources to activities and outputs and, to the extent possible, to results. Given that building the capacity to measure performance and manage by results is an evolutionary process, the assessment of the current state of each service line’s progress was gauged using a five stage model: 1 – Awareness; 2 – Exploration; 3 – Transition; 4 – Full Implementation; 5 – Continuous Learning.

These stages and their definitions are consistent with a sequence of implementation common to most organizational transitions. These are conceptual stages that describe the predominant behaviors of the organization at a particular point in time.

Definitions for each of the stages of implementation are as follows:

  1. Awareness: The organization is aware of, but not yet organized to explore, better ways to define, collect and analyze activity data as part of the overall objective to monitor the organization’s performance and manage by results. In this stage people in the organization recognize that what they have been doing is inadequate and that there must be a better way of proceeding.

  2. Exploration: The organization begins to commit to defining the type of information needed to allow meaningful analysis of resource utilization and management by results. Different approaches are researched and primary steps are taken. During this stage, people begin to pick up on new ideas from a variety of sources. The exploration may take the form of learning groups, benchmarking studies and pilot projects.

  3. Transition: The organization has committed itself to managing for results and is attempting to make the transition from previous approaches. In this stage, people begin to make a commitment to the new practices required, such as linking resources to common definitions of activities, outputs and short to long-term results. Approaches are being developed to collect, analyze and report on performance.

  4. Full Implementation: The organization fully implements managing for results in all areas. In this stage, groups across the organization begin to see and look forward to the real benefits of the new management approach. Resources are allocated and plans are designed to support new practices, not to maintain old and outdated ones.

  5. Continuous Learning: The organization periodically adjusts and updates existing tools, methods and processes that support the use of information in the organization, including training tools, new approaches to planning, experimentation with advanced measurement tools, and development of reporting mechanisms that further align internal and external reporting.

The following table summarizes the stage each Service Line has achieved to date. A critical point to bear in mind is that no organization fits neatly into any one stage. Rather, the assessment may show that an organization is at different stages with respect to various elements that were examined. It is also expected that activity and output information from the earlier stages will continue to be produced in the more advanced stages. The key difference is that the increasing use of outcome information at the more advanced stages will supplement activity and output information used in decision making. The combination of activity, output and outcome data will provide both efficiency and effectiveness indicators to gauge the overall cost effectiveness of a program.

Current State of Implementation:
S&S Service Lines’ Resource and Performance Data

Criteria assessed under Resource & Activity Data:
  • Resources linked to Clearly Defined Activities & Workload Drivers
  • Common approach to Plan and Track Activities & Effort Expended
  • Potential to Cost Activities

Criteria assessed under Results Measurement Data:
  • Logic Model Linking Resources, Activities, Outputs to Results
  • Collecting Indicator & Measures Data

| Safety & Security Directorate | Resource & Activity Data | Results Measurement Data |
| --- | --- | --- |
| Aircraft Services | Full Implementation (Technical Services & Engineering) & Awareness (Other Areas) | Transition |
| Civil Aviation | Full Implementation | Transition |
| Marine Safety | Transition | Transition |
| *Marine Security | Exploration | Transition |
| Rail Safety | Exploration | Transition |
| Road Safety | Exploration | Transition |
| SEP | Exploration | Transition |
| *S&I | Not evaluated | Not evaluated |
| *TDG | Exploration | Awareness |

*Notes:

  1. Marine Security, being a new organization, is currently still in transition and is designing its overall management framework.
  2. The assessment excluded Strategies and Integration since the function primarily fulfills a coordination and support role for the Safety and Security business line.
  3. Based on Audit and Advisory Services participation on both the Steering Committee and Working Groups for the TDG Service Line Resource Review exercise, it is evident that the Service Line is at the preliminary stages of developing the essential elements necessary to monitor performance.

aircraft services (asd)

Resource & Activity Data

Overall the Service Line has implemented processes and systems to plan and track resource utilization for activities that are cost recoverable. In two key areas, Engineering and Technical Services, there are time and activity reporting processes and systems that generate resource utilization data that are manually fed into the Department’s financial system for billing purposes. These time and activity reporting systems have been developed where costs are recoverable from external clients. Sixty-five to sixty-eight percent of the ASD budget is cost recovered. The need for accurate data accounting for the resources expended providing services to clients has been the motivation for developing these systems. Currently, there are no specific plans to introduce time and activity processes and systems for the other ASD functions.

Clearly Defined Common Work Activities and Workload Drivers:
The Service Line’s clients are clearly defined and only the volume of workload varies from year to year. Memoranda of Understanding are negotiated on an annual basis with each client. The client provides a work plan describing their needs and level of demand for ASD’s services.

Common Approach To Plan And Track Activities and Effort:
Technical Services captures labour time expenditures for all operational staff via the “Daily Labour Distribution Forms”. The forms also capture indirect “Non-Aircraft” related work order activities/tasks such as manual amendments, health and safety activities, attendance at meetings, etc. Administrative support staff activities are not recorded. Engineering, for cost recovery purposes, has developed its own MS Access database to capture time expended on specific clients.

Potential to Cost Activities:
For the functions and activities that are eligible for cost recovery, costs are being calculated based on MOU agreements and charged to clients. ASD’s Finance unit receives monthly reports that are categorized by client. The reports detail the hours expended for each activity converted into the labour costs to be invoiced. The exception is CCG bases, where the ASD staff is dedicated to CCG activities. The time and activity information in this case isn’t used for billing purposes; it is used as a resource requirements planning tool.

Performance Data

The Service Line’s performance framework is still under development and more work is required to complete it. The resource, activity and output linkages are clearly defined for the Engineering and Technical Services functions. Defining immediate, intermediate and long-term result objectives along with indicators and measures needs to be finalized. The data collection and reporting processes will also need to be put in place.

Logic Model Linking Resources, Activities, Outputs to Results:
A logic model was developed a few years ago but data is not captured on a regular basis. Performance indicators related to specific cost-recovery activities are captured and reported regularly.

Collecting and Monitoring Indicator and Measurement Data:
For the most part, data sources have been identified but not all indicators are being captured; client satisfaction measures, for example, are not being captured.

Civil Aviation

Resource & Activity Data

The Service Line has in place an automated activity planning and reporting system that allows work planning and tracking of FTE resources. The Activity Reporting and Standards System (ARASS) is the Civil Aviation Service Line’s primary work planning and resource management information system; the Service Line also uses TC financial systems (Oracle 11i and SMS) and Civil Aviation operational information systems (NACIS, FTAE, etc.). ARASS is based on a task time methodology that was originally utilized in 1983 and 1984 as part of a comprehensive A-Base Review. The system tracks over 85% of total program FTEs. The remaining resources not directly accounted for in the system are primarily overhead functions such as Regulatory Services, Learning Services, Resource Management, etc. The Service Line has reached a continuous improvement stage in which system enhancements are being added.

Clearly Defined Common Work Activities and Workload Drivers:
ARASS task descriptions define the objective of the task and include a detailed description of the activity along with the units of measure describing how the task is tracked. ARASS task definitions are standardized for the whole service line, regions and headquarters. Workload demand drivers are clearly defined and incorporated in ARASS as “units required”. ARASS task frequencies are based on safety risk assessments and documented in the Service Line’s “Frequency of Inspection Policy”.

Common Approach to Plan and Track Activities and Effort:
ARASS is a mature system that has recently been modified to be web-based. It provides a standardized approach to plan work activities and track effort. The system’s capabilities provide the Service Line with a structured common approach to planning FTE resource allocations and tracking FTE resource utilization. Functional priorities can be set and implementation monitored via ARASS. ARASS is based on effort standards that define the average time that it takes to complete a specific task. Task standard times are verified on an “as required” basis, through actual task time trials and/or time and motion studies. Another notable characteristic of ARASS is that the officer/support time available to complete operational tasks is based on standardized calculations to account for leave, training and other non-operational tasks.
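
The availability calculation described above can be illustrated with a simple sketch. The hour figures and task volumes below are assumptions for illustration only; ARASS’s actual standardized values and formulas are not reproduced in this report.

```python
# Illustrative only: the figures below are assumptions, not ARASS values.
paid_hours_per_year = 1950            # e.g., 37.5 hours/week x 52 weeks
annual_leave = 150
statutory_holidays = 82.5
sick_and_other_leave = 60
training = 75
administration_and_meetings = 160

available_operational_hours = paid_hours_per_year - (
    annual_leave + statutory_holidays + sick_and_other_leave
    + training + administration_and_meetings
)

# Planned effort for assigned tasks: planned units x task standard hours.
planned_effort_hours = 180 * 6.0 + 20 * 12.0  # e.g., inspections plus audits

utilization = planned_effort_hours / available_operational_hours
print(f"Available: {available_operational_hours:.0f} h, "
      f"planned: {planned_effort_hours:.0f} h, "
      f"utilization: {utilization:.0%}")
```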

Potential to Cost Activities:
The Service Line has another activity reporting system specifically designed for and used by its Aircraft Certification Branch to track cost recoverable activities. The Standardized Cost Recovery and Activity Monitoring System (SCRAM) is a system that records Aircraft Certification staff time expended and units of work completed, on a daily basis. Enhancements are also being designed for ARASS to allow users to track travel and overtime costs related to an ARASS task. In addition, a project-tracking feature is being incorporated to allow specific projects or initiatives to be monitored providing data that can better track effort expenditures.

Performance Data

Overall the Service Line is in the process of fully implementing its performance framework. It has developed a comprehensive performance framework that describes activities, outputs and results. With the mapping of ARASS categories to the performance framework activities, FTE resources can be linked. The functional managers are able to monitor the allocation and utilization of FTE resources to ensure national priorities and direction are being followed. The Service Line is currently developing other performance indicators defined in the logic model.

The Service Line is also actively implementing an “integrated management system” which ties the performance framework and all the key elements of the civil aviation business model into a broader management framework that is designed to instill a continuous improvement ethic into the organization.

Logic Model Linking Resources, Activities, Outputs to Results:
A logic model has been developed that defines 5 activity areas linked to immediate, intermediate and long-term/ultimate results. The activity areas are also mapped to the ARASS categories, allowing FTE resource expenditures to be aligned with the logic model.

Collecting and Monitoring Indicator and Measurement Data:
Safety indicators and targets have been established and results monitoring for many safety targets is in place. In addition, ARASS output indicators have been defined that group a number of tasks together to allow reporting and monitoring of specific activities, such as audits. Other indicators for the performance model are being put in place.

Marine Safety

Resource & Activity Data

The Service Line has taken significant steps to improve the type of activity data available to managers to allow meaningful resource utilization analysis that will contribute to better resource decision-making. Further progress is needed to move from the transition stage to fully implement a time and activity reporting system that provides the basis for linking resource expenditures to program results allowing accurate value for money assessments to be carried out.

Clearly Defined Common Work Activities and Workload Drivers:
Other than a few new functions added to Marine Safety from the Department of Fisheries within the past couple of years, all Marine Safety activities are defined at a high level. Detailed descriptions of the activities, including work flow process mapping and, more importantly, standard average activity effort estimates, have to date not been defined.7

Although most workload drivers are intuitively known, many have not been quantified. For example, the vessel population, one of the key workload drivers for many activities, is not accurately known.8

Common approach to Plan and Track Activities and Effort:
The Service Line is in transition, striving to fully implement an effective time and activity reporting system that will provide management with useful resource utilization data to improve its resource related decision-making.

Within the past two years, a National Time and Activity Reporting System (NTARS) was put in place. The system is designed to capture time utilization (effort) down to the individual employee level.

Although NTARS is not specifically designed as a resource-planning tool, analysis of the data could form the basis for determining resource requirements and optimizing resource allocations.

Potential to Cost Activities:
Although NTARS data has been mapped to the performance framework activity definitions, there is currently no systematic direct link between activities, outputs and resources that allows full costing of activities. Work is currently underway in Finance and Administration (F&A) to update the costing models related to the marine safety fees. F&A was provided the latest completed year of NTARS data at the end of May 2005. The costing exercise will review all areas where Marine Safety has existing fees. Additional line objects have been defined to allow for more accurate tracking of revenue generation.
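
The basic step such a costing model would take is to apply a fully loaded labour rate to the hours recorded against each activity, as in the sketch below. The rates and hours shown are hypothetical assumptions; the actual F&A costing methodology for marine safety fees is not described in this report.

```python
# Hypothetical inputs; the hours and rates are illustrative assumptions only.
hours_by_activity = {
    "small_vessel_inspection": 5400.0,  # hours recorded against the activity
    "certificate_issuance": 2100.0,
}

hourly_salary_rate = 42.0   # assumed average salary cost per hour
overhead_loading = 0.30     # assumed loading for benefits, accommodation, etc.
loaded_rate = hourly_salary_rate * (1 + overhead_loading)

for activity, hours in hours_by_activity.items():
    print(f"{activity}: ${hours * loaded_rate:,.0f} estimated full cost")
```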

Performance Data

The Service Line is developing the essential elements of their performance framework and determining how best to link resource data collected in NTARS to other types of performance data necessary to provide a comprehensive profile of efficiency and effectiveness. A significant amount of additional work is required to put all the necessary elements in place. For example, service level and service delivery standards need to be defined to be able to gauge the cost of the various regulatory activities. In turn, activity costs need to be understood to calculate the net benefits of the activity generated outcomes/results. Without this data, determining resource optimization is not possible.

Logic Model Linking Resources, Activities, Outputs to Results:
The Service Line has developed a performance framework that defines a logic model linking four high-level activity areas to immediate, intermediate and long-term outcomes/results. Although the NTARS activities have been mapped to the performance framework activity areas, resource utilization is not being reported against these activity areas. By reporting results on projects/initiatives, the Service Line will be able to better report on the cost of program results.

Collecting and Monitoring Indicator and Measurement Data:
Some performance target information related to accident and incident rates is being monitored but the majority of performance indicators and measures are still under development.

Service level standards describing the targets related to quality and timeliness of services delivered as well as service delivery standards (the average time to complete an activity) have yet to be developed. Without these types of information Marine Safety will not be in a position to maximize program cost effectiveness.

Rail Safety

Resource & Activity Data

The Service Line recognizes the need for accurate, meaningful activity data that will link resources to activities and ultimately to program outcomes/results. It is currently exploring various approaches to improving the data available to support priority setting and resource related decisions. The Service Line, through its recent resource review exercise, has identified specific information gaps and is proposing to address them by implementing the Rail Safety Integrated Gateway (RSIG), a national integrated data system to collect, analyze and report on results in support of Rail Safety’s Integrated Risk & Performance Management Framework.9

Clearly Defined Common Work Activities and Workload Drivers:
As part of the Service Line resource review exercise, a survey was carried out to gather workload and effort expenditure estimates for nine common program functions. As described in their report, “there were a number of variations in interpretations applied to completing the survey”.

Common Approach to Plan and Track Activities and Effort:
Rail Safety does not have a time recording system. Given that over 70% of the expenditures are personnel costs, Rail Safety recognizes it is critical to understand how time is used. Until an ongoing time recording system is in place it will not be possible to accurately assess where resources are, and have been, deployed.10

The Service Line is currently starting to determine the level of activity and time data required for decision-making at the various management levels. The challenge here is to determine the appropriate level of detail required for a resource/time management system in order to make decisions, but not to over-burden the users with a labor-intensive data collection process.

Potential to Cost Activities:
Again, the Service Line recognizes the advantages of being able to accurately cost activities, particularly those subject to cost recovery. As described in the recent resource review, “further study is required to assess the appropriate rates that should be used to achieve cost recovery.”
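
By way of illustration only (the figures, overhead factor and function name are assumptions, not values from the Rail Safety review), a cost-recovery rate could be derived by spreading the full cost of a recoverable activity over its expected volume:

    # Illustrative only: deriving a cost-recovery rate from activity cost data.
    def cost_recovery_rate(direct_cost, overhead_factor, volume):
        """Full cost of a recoverable activity spread over its expected volume."""
        full_cost = direct_cost * (1 + overhead_factor)
        return full_cost / volume

    # e.g., $300,000 direct cost, 25% overhead, 500 chargeable transactions
    print(cost_recovery_rate(300_000, 0.25, 500))  # 750.0 per transaction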

Performance Data

The Service Line’s plan to integrate a risk assessment process directly into its performance framework demonstrates its commitment to improving the type of information available for sound resource decision-making. The Service Line is clearly in a transitional stage as it develops the essential elements of the proposed framework.

Logic Model Linking Resources, Activities, Outputs to Results:
The Service Line has drafted a logic model that links activities, outputs and results; however, it does not link resources to activities. As described above, the Service Line is starting to explore how best to collect meaningful data without creating a bureaucratic system. This work is part of the broader initiative under Rail Safety’s Integrated Risk and Performance Management Framework.

Collecting and Monitoring Indicator and Measurement Data:
The Service Line is at a preliminary stage in identifying its specific performance data requirements. A variety of safety-related data is available, and significant work has been carried out to create RSIG, which will provide a data warehouse solution to integrate the various sources of performance information. The Deputy Minister approved a Program Approval Document (PAD) in the fall of 2004 to proceed with phase 1 development of RSIG.

Road Safety

Resource & Activity Data

The Service Line is currently exploring how best to align resource data with its performance framework. It has made some attempts, through its budget allocation exercises, to align responsibility centre budgets with the activity areas defined in the framework. However, there is no formal process in place to track resource expenditures against the organization’s activity areas.

Clearly Defined Common Work Activities and Workload Drivers:
High-level activities and outputs have been defined as part of the performance framework, but these activities have not been broken down further, nor have specific workload drivers been defined. Stakeholders are identified only at a high level; there are no breakdowns showing actual numbers of clients or transaction volumes by branch or activity area, and potential population demands are not described.

Common Approach To Plan and Track Activities and Effort:
There is no time and activity reporting, and there are no plans to develop such a process.

Potential to Cost Activities:
Not applicable, as activity and effort data are not tracked.

Performance Framework – Key Elements

The Service Line has completed a logic model in draft form and is finalizing its performance framework as part of a Service Line resource review exercise. As with the other Safety and Security Service Lines, the logic model does not link resources to activities. There is recognition that it is important to understand the resource costs associated with the results achieved. As part of the resource review exercise, the performance model is being refined to ensure it adequately represents the multi-jurisdictional nature of the Road Safety program, including the accountabilities shared with co-delivery partners inside and outside the organization and the linkages between the two performance measurement systems.

Logic Model Linking Resources, Activities, Outputs to Results:
The logic model describes activity areas, key outputs and immediate, intermediate and ultimate outcomes; however, it does not identify or link resource inputs. As stated earlier, the Service Line recognizes the need to link resources to results but currently has no specific plans to put a formal process in place.

Collecting and Monitoring Indicator and Measurement Data:
The performance framework is still in draft form; although indicators and data sources have been established, baseline data has yet to be collected.

Security and Emergency Preparedness

Resource & Activity Data

In a recent “Effectiveness Review”, it was concluded that “SEP and MARSEC need to make an investment in better defining and documenting their plans and resource requirements in order to better articulate and support resource requests, identify opportunities for reallocation or reduction, and aid in planning and managing the work.”11 The Service Line recognizes the requirement to define, collect and monitor resource utilization information and is currently exploring various approaches to do so.

Clearly Defined Common Work Activities and Workload Drivers:
The Service Line does not have a comprehensive listing of work activities and clearly identified workload drivers in place, but some parts of the organization have defined common work activities. For example, the Security and Emergency Preparedness Information Reporting System (SEPIRS) is an activity reporting system for inspectors; it is both a data collection and management information system designed to capture, monitor and track security-related activities of interest to TC. Some parts of the organization have also mapped key processes (e.g., the Security Clearance Process) and analyzed resource utilization.

Common Approach To Plan and Track Activities and Effort:
SEPIRS currently cannot record or track the time expended on the various inspection activities, but there is potential to integrate time reporting. The Service Line is exploring the use of MS Project and other approaches to plan, track and monitor key projects and initiatives that cut across the organization. The goal is to track and measure the timeliness of completing project milestones and the associated resource utilization.
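
As a simple sketch of the kind of milestone-timeliness measure being contemplated, the example below counts on-time milestones and computes slippage in days. The project names and dates are invented for illustration and do not represent SEPIRS data or MS Project functionality.

    # Hypothetical sketch of measuring milestone timeliness across projects;
    # names and dates are invented, not SEPIRS or MS Project output.
    from datetime import date

    milestones = [
        {"project": "Project A", "milestone": "Plan approved",
         "planned": date(2005, 3, 31), "actual": date(2005, 4, 15)},
        {"project": "Project A", "milestone": "Rollout complete",
         "planned": date(2005, 9, 30), "actual": date(2005, 9, 20)},
        {"project": "Project B", "milestone": "Plan approved",
         "planned": date(2005, 6, 30), "actual": date(2005, 6, 30)},
    ]

    on_time = sum(1 for m in milestones if m["actual"] <= m["planned"])
    print(f"Milestones on time: {on_time}/{len(milestones)}")  # 2/3
    for m in milestones:
        slippage = (m["actual"] - m["planned"]).days
        print(m["project"], m["milestone"], f"slippage: {slippage} days")  # 15, -10, 0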

Potential to Cost Activities:
Until resource utilization can be tracked and measured, the Service Line is not in a position to accurately cost activities.

Performance Framework – Key Elements

SEP has drafted a comprehensive performance framework that links its high-level activities (referred to as functions) to outputs and results. The framework elements, including the proposed measurement strategy, are described in detail. The Service Line is in a transition stage as it refines and validates the framework prior to full implementation. As described above, a key element that will need to be integrated into the framework is accurate resource utilization data to help the Service Line assess the cost effectiveness of its program activities. Indicators measuring the success of outcomes/results achieved will be incomplete without an understanding of the costs of the activities that led to those results, and without this information value-for-money assessments will be difficult to make.

Logic Model Linking Resources, Activities, Outputs to Results:
The 2002 Service Line Plan did attempt to link the budget to the broad activity areas of the performance framework. This attempt underlines the Service Line’s awareness of the importance of being able to link resources to activities and ultimately to results.

Collecting and Monitoring Indicator and Measurement Data:
Performance indicators and data sources have been defined, but data has yet to be collected. The Service Line is currently reviewing and validating the performance framework, and significant work will be required to fully implement it.

7 Marine Safety Service Line Resource Review Report, April 2004. Service delivery standards identify the time that should be taken to complete a specific activity (e.g., periodic inspections on passenger vessels > 150 GRT should take 7.5 hours). These will need to be determined once each inspection activity has been accurately defined as to what it entails. Some estimates were developed as part of the service line review; however, these will need to be validated.
8 Marine Safety Service Line Resource Review Report, April 2004. The report recommended a need to “Revise the data systems to allow for an accurate assessment of populations (i.e., vessel populations) in each of the activity areas. This is critical to the performance management, risk management and operational assessment aspects of Marine Safety.”
9 Rail Safety Resource Review Report, March 2005. Rail Safety does not have adequate information, nor the capacity to analyze the data should it become available. Without the data, Rail Safety cannot conduct proper risk assessments or perform the risk analysis necessary to support the decision-making process. Performance data are not available, output information is inconsistent and requires further development, and resource information consists simply of financial reports, which is inadequate for costing analysis and for linking resources to outputs and results.
10 Rail Safety Resource Review Report, March 2005.
11 Effectiveness Review Report, March 2005.

