Atlantic Canada Opportunities Agency

Assistant Auditor General: Wm. F. Radburn
Responsible Auditor: John O'Brien

Introduction

The Case for Economic Development in Atlantic Canada

18.5 By most objective measures, Atlantic Canada is at an economic disadvantage when compared to Canada as a whole. Participation in the labour force is lower than the national average (Exhibit 18.1). Fewer in the labour force have employment compared to the national average (Exhibit 18.2). Those who are employed earn less than the national average (Exhibit 18.3).

18.6 Atlantic Canada is highly dependent on primary industries such as agriculture, fishing, forestry and mining, and on resource-based, export-oriented manufacturing. These industries are highly cyclical and subject to influences outside of Canada. Studies have indicated that Atlantic Canada's economy is characterized by low productivity, high costs of transportation to markets, inadequate investment and lack of technological innovation.

ACOA - A Decentralized Approach

18.7 The Atlantic Canada Opportunities Agency (the Agency) is the most recent in a series of initiatives designed to reduce the economic disparity between Atlantic Canada and the rest of Canada. The evolution of regional economic development programs in Canada is described in Chapter 17, "Overview of Regional Economic Development Programs".

18.8 The Agency, with its head office in Moncton, New Brunswick, is one of the few federal departments whose headquarters are outside the National Capital Region. The Agency itself is highly decentralized, with offices headed by vice-presidents located in each Atlantic provincial capital and in Sydney, Nova Scotia. A great deal of decision-making authority is delegated to each vice-president and, in turn, to the officers in each region.

18.9 The legislation establishing the Agency states that its purpose is to "increase economic opportunity in Atlantic Canada and, more particularly, to enhance the growth of earned incomes and employment opportunities in that region."

18.10 The Agency has three primary expenditure programs that focus on economic development.

18.11 The Agency's expenditures are about three percent of federal government spending in the Atlantic region.

18.12 Other Agency activities include advocacy of Atlantic Canada's interests and co-ordination among federal government departments of activities intended to stimulate economic growth.

Audit Objectives and Scope

Objectives

18.13 Our objectives in conducting this audit were to assess:

Scope

18.14 Our scope involved auditing certain key results that the Agency has reported to Parliament in its Part III of the 1994-95 Estimates and its 1988-1993 Five-Year Report to Parliament. We also audited a sample of accepted and rejected projects to determine whether the Agency had applied the terms and conditions approved by Treasury Board, to determine whether it had applied key economic development concepts when making decisions, and to assess whether it had exercised appropriate monitoring and control of projects. During our review of these files, we assessed the Agency's compliance with its own standards for service delivery. We also performed case studies on sectors of the Atlantic economy that the Agency supports to assess the extent to which it co-ordinated its efforts internally and externally.

18.15 The quantitative information in this chapter has been drawn from various government sources indicated in the text. Unless otherwise indicated, however, this quantitative information has been checked for reasonableness but has not been audited.

Observations and Recommendations

Measuring and Reporting Results

Background
18.16 In its Part III of the 1994-95 Estimates and its 1988-1993 Five-Year Report to Parliament, the Agency reported that its programs and activities had achieved significant results. In reviewing the results reported by the Agency, we sought to answer the following questions:

18.17 The Agency's enabling legislation requires it to evaluate its activities and report on the "impact [its] activities have had on regional disparity". The Agency completed program evaluations of its two major expenditure programs, the Action Program (completed in 1992) and the COOPERATION Program (completed in 1993). These evaluations are the primary sources of publicly reported information on results.

18.18 Exhibit 18.4 displays the key results that the Agency has reported to Parliament.

18.19 In addition, the Agency has reported levels of client satisfaction, measures of efficiency and operational activity, and increases in federal procurement in the Atlantic region since its inception.

Commendable initiatives to measure and report results
18.20 Evaluating the results of economic development programs is difficult and expensive, especially for a new organization. Nonetheless, the questioning of program relevance and cost effectiveness is vital for good management and accountability.

18.21 Although the evaluation of economic development programs is not new, the Agency is the only federal organization required by law to evaluate its programs' impact on regional disparity. It is important to emphasize that in addressing this difficult task, the Agency has gone beyond most current practice for measuring and reporting results of economic development programs. The remainder of this section will comment on the strengths and weaknesses in the measurement and reporting of results.

Results measured are relevant to key stewardship issues
18.22 Results selected for measurement and reporting must be relevant to the organization's key objectives. The key results selected - namely, jobs created and maintained, total effect on employment and total impact on gross domestic product - are relevant and linked closely to the Agency's legislated objectives. The number of jobs created and maintained is a common measure for economic development programs. However, in measuring the total effect on employment and the total impact on gross domestic product (both of which are measures of macro economic impact not typically subject to program evaluation by federal economic development agencies), the Agency has advanced the state of the practice.

18.23 In addition to the macro economic indicators, there are other intermediary indicators directly related to program objectives that could be measured. Measurement of these intermediate indicators could provide a link between the Agency's programs and the reported macro economic results. While the Agency established program objectives for both the Action and COOPERATION programs, they were not stated in a manner that was clear and measurable. In addition, achievement of the objectives was either not measured or only partially or indirectly measured in the evaluations. Exhibit 18.5 shows the objectives by program and whether results were measured for each objective.

18.24 The Agency should improve the clarity and measurability of objectives for its key programs and measure their achievement in its program evaluations.

Agency's response: ACOA appreciates the positive comments of the Report concerning the progress made by the Agency in what the OAG acknowledges to be the difficult task of evaluating the impact of regional development.

From its inception the Agency recognized that there are few measurement models available that meet regional development evaluation needs. Accordingly, the Agency used a multi-disciplinary team approach, including contracts with independent, national consulting firms, to evaluate both major ACOA programs and to provide recommendations regarding appropriate models and assumptions to be used. As a result, the Agency has focussed its efforts and resources on measuring the most important results of its programming, the ultimate impacts on the Atlantic economy and regional disparity. The Agency has measured job creation, the top priority of the government, and produced a value-for-money evaluation of its programming, which is rarely available for economic development programs. Based upon the lessons learned from this effort, the Agency is taking steps to establish a few key measurable objectives in addition to jobs, such as productivity, sales and export sales, which support the ultimate objectives of job creation and increased income. The Agency plans to continue its practice of follow-up surveys of representative samples of assisted projects as a means of evaluating the achievement of objectives. This approach produces sound results while being most cost effective.

The need to gather and maintain information on an ongoing basis
18.25 For most new programs, the Treasury Board recommends that an evaluation framework be developed as early as possible in a program's design or implementation. An evaluation framework provides the basis for future evaluation. Among other things, it identifies the evaluation issues and indicators. It also sets out the data requirements for evaluation - including identification of the data that need to be collected on an ongoing basis.

18.26 The Agency developed evaluation frameworks for the Action Program (July 1990) and the COOPERATION Program (November 1990). Both evaluation frameworks indicated that there was a need to develop a set of clearly defined and measurable objectives and to collect and manage all relevant baseline data. The Agency did not collect all the baseline data necessary for evaluation as identified in the evaluation frameworks. Clear, measurable objectives are important to the ongoing management of the programs and are needed to determine what information should be gathered to assess the achievement of results.

18.27 Where the Agency was directly responsible for the delivery of the Action Program, it had some program information available when completing the evaluation. For each project, it had gathered basic information about recipients, along with such information as job creation and maintenance expected at the time of application, funds approved and spent, and cash-flow projections. While this information was useful, the Agency did not, as a matter of course, maintain information on actual project results in its database.

18.28 Most of the individual agreements under the COOPERATION Program were delivered by provincial governments or other federal agencies. Although for each COOPERATION agreement there was an evaluation framework setting out the information requirements, these concentrated on operational or service issues rather than on results information linked to the objectives of the agreement. Information on projects was kept by the delivering organizations. As a result, the fundamental information on program activities necessary for accountability and evaluation (for example, a complete list of recipients) was not readily available. Even basic information was often very difficult to obtain, as the delivering organizations took numerous and varied approaches to collecting it.

18.29 The 1990 framework for the evaluation of the COOPERATION Program identified the need to gather appropriate data, and recommended that ACOA immediately proceed with the steps necessary to collect and manage all relevant baseline data.

18.30 Furthermore, the 1993 evaluation report itself stated the following:

Strong and credible material, records and information for program evaluation purposes were not readily available. This suggests that ACOA is at risk in demonstrating an important condition of accountability.
18.31 In our view, this problem restricted the Agency's ability to conduct a cost-effective evaluation of the COOPERATION Program.

18.32 For future COOPERATION agreements, the Agency should ensure that information on recipients, activities and results is maintained in a consistent form and is readily accessible.

Agency's response: Since its inception, the Agency, in regard to programs it delivers in co-operation with the provinces, has recognized the complexity of performing program evaluations, given the differing practices and requirements of program evaluation among the provinces and their varying capabilities to deliver such products.

For this reason, ACOA initiated the evaluation of the COOPERATION Program, following which the Agency implemented steps to ensure the ready availability of data on COOPERATION program recipients and activities. These steps are continuing and include investments in computerized systems so that project-level data maintained by provincial governments, for example, can be transferred to ACOA's database and be readily accessible to ACOA officials. It is planned that data on results will be obtained through follow-up surveys of representative samples of recipients.

Measuring results - Action Program
18.33 The evaluation process. Information was gathered for the Action Program evaluation through a survey of clients, interviews, case studies and internal program sources. The macro economic impacts were established through a technique known as econometric modelling. The Agency selected the number of jobs created as the measure of the program's direct impact. This was then translated into a measure of economic activity ("value added"), by sector. Value added, by sector, was the basic input into the econometric model that estimated the effect on gross domestic product and other macro economic impacts of the Action Program. In our view, this approach to measuring the key results of the Action Program was valid and appropriate. The evaluation process followed is outlined in Exhibit 18.6.

18.34 We performed a limited survey of the state of evaluation of similar economic development programs in other countries. The type of evaluation conducted for the Action Program is similar to evaluations carried out in Western European countries, although those evaluations do not always measure the macro economic impacts of economic development programs.

18.35 Problems with input data and modelling assumptions. The results reported by an econometric modelling exercise are dependent on the model selected, the modelling assumptions applied and the data input. While the model the Agency used was appropriate, our analysis of the quantification of the direct impact selected by the Agency (jobs created), used as the basic input data, revealed problems.

18.36 To assist in determining the program's direct impact, the Agency conducted a survey of its clients asking them for estimates of the number of jobs created and maintained by the projects selected for the survey that could be attributed to Action Program assistance. The Agency received survey responses from 607 clients who had received assistance during the period February 1988 to March 1992. The survey produced an estimate of the number of jobs created that ranged from a low of 18,346 to a high of 23,181, with the midpoint at 20,763. The Agency has informed us that one of the main objectives of the survey was to establish a success rate for the achievement of actual job creation and maintenance. The job creation success rate was to be used to adjust the Agency's database of expected job creation, for modelling purposes.

18.37 We reviewed a sample of 51 of the 607 projects of clients who responded to the survey. Our review was designed to determine the reliability of the survey's job creation and maintenance information by comparing it with information from the Agency's own files, including information obtained through site visits by Agency officials and self-reporting by clients. Where differences existed, the number of jobs created according to the survey was consistently higher than our estimate based on the documentation in the Agency's files. Our findings, while based on a significant portion of the reported jobs created, cannot be used in place of the survey as an estimate of the jobs created by the Agency's clients.

18.38 We found several important reasons for the differences between the information included in the survey and information contained in the Agency's files:

18.39 We have further concerns about the conduct of the survey of clients:

18.40 Since the Agency's database contains information gathered at the time of application, it is not a source of information on the program's actual direct impact. Notwithstanding the problems we observed with the survey of clients, we believe that if the results had been fully adjusted, they would have provided a more accurate estimate of the number of jobs created than does applying a success rate to the database of jobs expected.

18.41 The Agency conducted a case study of tourism accommodation to assess the impact that its support to recipients had on competitors in the selected areas. This impact, known as displacement, was examined but not quantified and, therefore, could not be incorporated into the determination of the program's impact. Although we were able to identify examples of the measurement of displacement in European evaluations, we could not find examples in evaluations of similar Canadian programs.

18.42 An important assumption included by the Agency in the modelling exercise was that all of the jobs created by the program would last for a period of 10 years. Although this assumption had a significant impact on the output of the econometric modelling exercise, we found no reliable evidence to validate its use. The Agency believes that this is a reasonable assumption: some projects will do better than expected and some will do worse than expected over the 10-year period, but the results will balance out over time. The Agency had a short program history upon which to base the assumption of the duration of the jobs created.

18.43 Subsequent adjustments for reporting to Parliament. The final results reported to Parliament incorporated the following adjustments, which were estimated as part of the evaluation but were not used during the evaluation to measure direct impacts:

18.44 In addition, the Agency adjusted the output of the modelling exercise so that another year of experience could be included in its reports to Parliament.

18.45 We concur in principle with these adjustments but are concerned that the Agency incorporated an estimate of jobs maintained into the econometric modelling without a rigorous estimation of the input. The Agency essentially followed the same process to estimate jobs both created and maintained. We noted the same problems in the estimate of jobs maintained as we found in the estimate of jobs created. Further, we noted that some of the surveyed projects reported significant numbers of jobs maintained, when job maintenance was not part of the Agency's rationale for its original support of the projects. In future evaluations, the Agency needs to analyze critically the likelihood and extent of job loss in projects before it estimates the "value added" to be input for modelling purposes.

18.46 Conclusion. We found that the overall approach and the econometric model used by the Agency were appropriate. However, because of the concerns noted about the input data and modelling assumption used, the Action Program results reported in both the evaluation and reports to Parliament have significant limitations. In our view, the Agency can correct these problems in future evaluations.

18.47 The Agency should improve its measurement processes for future evaluations of the Action Program to reduce the problems associated with the input and assumptions used in the econometric modelling exercise.

Agency's response: In measuring the impact of its programming on job creation and economic growth, the Agency, as the AG indicates, "has gone beyond most common practice..." In order to do so, it had to develop new measurement techniques. Inevitably, the first-time use of these techniques involves limitations on the precision of the estimated results. While the Report notes examples of the 51 cases it has analyzed where it believes the number of jobs created has been overestimated, it notes there is no basis to suggest a different estimate of overall jobs created for the approximately 6,000 companies supported. The client survey was designed to capture the full employment impact on a firm basis of marketing and innovation projects resulting from increased product sales and not just the expansion of employment in the marketing or research department. Also, it could be argued that the number of jobs was underestimated for those projects that had not reached their full potential at the time of the survey. While there will always be limits to the accuracy of estimates, ACOA believes its estimates are reasonable given available resources and the methodology of the day.

Now that the measurement techniques have been established, the Agency plans to move to improve their accuracy on the basis of additional program experience and with the benefit of the OAG's observations.

Measuring results - COOPERATION Program
18.48 The Agency's evaluation of the COOPERATION Program was conducted using information from client surveys, interviews, and case studies and by analysis of information from individual COOPERATION agreements and external sources. The evaluation was based on all COOPERATION agreements signed by 31 March 1993.

18.49 Determining the direct impacts of the COOPERATION Program was much more complex than for the Action Program because of the nature of the activities funded and, as we have noted, the absence of results information for individual COOPERATION agreements. Hence, a large number of assumptions had to be made to estimate the direct impacts of the program. The reasonableness of assumptions is crucial to the ultimate estimate of macro economic impacts indicated by the econometric modelling exercise.

18.50 We examined the five sectors or expenditure areas on which the Agency estimated the COOPERATION Program had the most significant direct impacts: human resource development, mining, forestry, highways and business support. We have concerns about some of the underlying assumptions that were used:

18.51 The COOPERATION Program evaluation was conducted on both a retrospective and a prospective basis. That is, the evaluation incorporated results from actual expenditures up to 31 March 1993, the effective date of the evaluation, with projected results based on commitments for future expenditures from 1 April 1993 to 31 March 1997. Of the $1.3 billion in total program spending covered by the evaluation, approximately 44 percent represented expenditures actually incurred, while the remaining 56 percent represented commitments for future expenditures. Therefore, for the last four years covered by the evaluation, the reported results included both the expected future impacts of incurred expenditures and the expected future impacts of planned expenditures.

18.52 The Agency should ensure that its results measurement clearly distinguishes between the results of the direct impacts of incurred expenditures and the forecasts of impacts of future expenditures.

Agency's response: The Agency's first Five Year Report to Parliament emphasized the additional jobs resulting from Agency programming over the 1987-92 period, the first five years of the Agency's life. Clearly, the job impact to 1992, a total of 42,000 jobs from the Action and COOPERATION programs combined, results only from incurred expenditures. However, in order to provide information to guide decisions on future funding for the COOPERATION Program, ACOA, in consultation with the provincial governments, believed it was necessary to estimate the impact resulting from the total commitments already made by the federal government and the Atlantic provinces as of March 31, 1993. In this way, the full impact of the agreed-upon funding allocation to the COOPERATION Program could be examined.

In the future, the Agency will endeavour to more clearly delineate the impacts of incurred versus forecast expenditures in both program evaluations and within the evaluation of the individual COOPERATION agreements.

Measuring results - Advocacy
18.53 ACOA has reported that federal procurement in the Atlantic provinces increased by 70 percent in the three years after its creation, compared with the three preceding years. In addition, the Agency used case studies to demonstrate the impact of individual advocacy initiatives. Our audit included a review of the data used to make the overall claims.

18.54 We found that the data used to make the overall claims about increases in federal procurement in the region came from a Department of Supply and Services document. However, that document states:

The statistics provided cannot properly be used to identify the effects of federal contracting in generating economic activity within Canada.
18.55 We were not able to identify any other source of information to support the results claimed by the Agency.

Reporting results to Parliament
18.56 The most important vehicles for reporting the results of the Agency's activities have been its Part III of the 1994-95 Estimates and its 1988-1993 Five-Year Report to Parliament. Information contained in these reports came, for the most part, from the program evaluations, as subsequently adjusted. The results information is considerable and relevant to the Agency's legislated objectives.

18.57 Information on the number of jobs created by the Action Program, the COOPERATION Program and both programs combined was reported on different bases and for different time frames. Measures presented were not fully explained in the reports. In certain instances, the reported number of jobs created included only those jobs expected to be created directly by the program, without adjusting for incrementality or the measured success rate. In others, it included an estimate of both the direct and indirect jobs created and maintained, and was adjusted for incrementality. Exhibit 18.4 displays the information on results that the Agency has reported to Parliament. We are concerned that information prepared using different methodologies, over different time frames, and reported in a fragmented way is difficult to understand and place in an appropriate context.

Recent Agency activity
18.58 The first evaluations undertaken by the Agency represent a major effort to measure the macro economic impact of its programs. The Agency has learned a great deal from this experience and is taking steps to build on the knowledge acquired. In the Action Program, the Agency is developing intermediary indicators for such areas as export trade, tourism and entrepreneurship. It is also undertaking a new survey of Action Program clients to update information on the program's impact.

Project Management and Decision Making

Introduction
18.59 We audited a random sample of 100 projects approved or applied for between 1 October 1993 and 30 September 1994 under the Action, Fisheries Alternatives and COOPERATION programs. In addition, we audited all 16 projects with assistance greater than $1 million approved during the same period.

18.60 We reviewed the documentation in each project file and met with the responsible account manager and compliance officer. We visited a few of the applicants to get their views, to see the projects and to ensure that the Agency's documentation described the projects accurately.

18.61 We audited the Action and Fisheries Alternatives programs separately from the COOPERATION Program because there are significant differences in the programs' objectives, control frameworks and delivery.

18.62 The Action Program is delivered directly by Agency officials. Terms and conditions approved by the Treasury Board provide regulatory guidance for the program, supplemented by internal policies and procedures. The Fisheries Alternatives Program had a similar control framework.

18.63 The COOPERATION Program is a federal-provincial, cost-shared program, with a series of agreements covering specific industry sectors, geographic areas or subjects in each province. Individual agreements outline objectives and eligibility criteria for accepting or rejecting project proposals. COOPERATION agreements are delivered primarily by provincial officials, on behalf of both levels of government. Each agreement is managed by a committee of federal and provincial officials, who are responsible for its overall delivery.

18.64 Typically, the federal government funds 70 percent of the cost of projects under the COOPERATION agreements. Exhibit 18.7 summarizes the COOPERATION agreements in place as of 31 March 1995. COOPERATION agreements implemented by other federal government departments were outside the scope of our audit.

Assessment of projects under the Action and Fisheries Alternatives programs
18.65 The private sector invests in physical assets, market studies or research and development for commercial reasons, based on the anticipated return. When government provides support for private sector activities, it assumes part of the private sector's business risk in order to obtain economic benefits for a region or for the country as a whole. In both instances, an assessment of the proposed activity is necessary before a decision is made to spend funds. Inevitably, some investments will not be successful. The use of public funds to support commercial activities places a great deal of responsibility on public servants to ensure that reasonable and prudent analysis is carried out before a decision is made to support a project.

18.66 The basis for ACOA's support of an economic development project under the Action and Fisheries Alternatives programs is set out in the terms and conditions approved by Treasury Board. For commercial projects, the key factors to consider are:

18.67 For non-commercial projects, we asked whether the Agency determines that a proposed project addresses a need already being met by the government, not-for-profit or private sector. We also assessed, where appropriate, whether such projects are sustainable in the long run.

18.68 The Agency also has the following guidelines for assessing projects. If total project cost is:

18.69 The terms and conditions for the Action Program specify basic criteria such as eligible activities, eligible enterprises, eligible costs and levels of assistance relative to cost. All of our sample projects met these basic eligibility requirements. Account managers are knowledgeable about these requirements and the procedures necessary to ensure that they are applied properly.

18.70 Beyond the basic eligibility requirements, the analysis performed in deciding whether to provide financial assistance was limited to what the Agency's guidelines required. Although the appropriate information required by Agency guidelines was generally gathered, we noted instances where obvious concerns were not addressed.

18.71 Definition of project objectives. We had expected that project objectives would be defined to show the expected economic results of the project. But in 26 percent of the cases, the outcomes were described as activities to be accomplished rather than as expected economic results.

18.72 Exhibit 18.9 shows examples that illustrate our concerns, and also provides instances where results were stated in terms of the economic benefits to be achieved. Our concern is that a project cannot "fail" if its objective is limited to an activity such as hiring a marketing manager or conducting a study, without linkage to a result such as increased sales or use of underutilized production capacity.

18.73 Incrementality. Generally, we found that the Agency's procedure for determining the need for government funding was to ensure that the applicant signed a declaration stating that the funds were required. In 19 percent of the commercial cases we reviewed, there were indications that the applicant had sufficient means to raise the necessary funds for a proposed project.

18.74 There may be other appropriate reasons for funding an applicant. For example, other jurisdictions may offer incentives or the funding may be necessary to yield a sufficient return for a project to proceed in the Atlantic region. However, we could not find evidence that the Agency had analyzed these issues. Exhibit 18.10 shows examples that illustrate the nature of our concerns.

18.75 Net economic benefit. If the benefits of an Agency-supported project are less than any damage to an existing enterprise, there is no net benefit to the region. The Agency's terms and conditions approved by the Treasury Board require that project proposals be considered for their net benefit to Canada and Atlantic Canada. The Agency's internal guidelines for applications of $200,000 or less require consideration only of the impact on an applicant's local market area. In cases where the analyzed impact was wider, it was usually limited to a province rather than applied to the region as a whole. However, the Agency has indicated that this assessment is supplemented by ad hoc studies of industry sectors and other research.

18.76 Nevertheless, we found in 23 percent of projects we examined that the Agency did not appear to have considered adequately the net economic benefit to the Atlantic region. Based on the limited information gathered under the Agency's guidelines, we noted instances where providers of a service or product already in business in Atlantic Canada could be hurt by Agency support of a project.

18.77 Viability. From the point of view of economic development, this criterion is important because viable commercial projects provide ongoing employment, create wealth and pay taxes. In 17 percent of the commercial cases we examined, we found that the project analysis did not deal with key issues affecting the viability of projects. In particular, projections were based on incomplete or unreasonable assumptions, and significant risks associated with future success were not addressed. Exhibit 18.11 provides examples of the Agency's assessments.

18.78 Risk sharing. The terms and conditions approved by Treasury Board require applicants to invest in their own projects. Although the amounts invested varied, the required level of investment was adhered to in all the projects in our sample.

18.79 Non-commercial projects. For the non-commercial projects we examined, we found examples where the services of the applicant were currently being provided by another entity, where the net economic benefit to the Atlantic region had not been considered adequately or where the financial sustainability of the applicant had not been addressed adequately.

18.80 Efficiency of delivery. As part of our review, we assessed whether the Agency was meeting its own standard of 45 days to reach a decision on an application for Action Program assistance. In a random sample of 50 project files, the average time to process an application was approximately 40 days.

18.81 Measurement of decision-making time should begin on the day when substantially all information is received, but that date was not regularly identified in Agency records. As a result, the average processing time recorded in our sample could overstate the actual time taken to process applications.
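The measurement issue can be sketched with a few hypothetical dates (none of these figures come from the audit): starting the clock at the submission date, rather than at the date the file was substantially complete, inflates the recorded processing time.

```python
from datetime import date

# Hypothetical applications (illustrative only, not Agency data):
# (submitted, substantially all information received, decision reached)
applications = [
    (date(1995, 1, 3), date(1995, 1, 20), date(1995, 2, 10)),
    (date(1995, 2, 1), date(1995, 2, 15), date(1995, 3, 12)),
    (date(1995, 3, 5), date(1995, 3, 5),  date(1995, 4, 10)),
]

# Clock started at submission (what an unrecorded complete-file date forces)
recorded = [(d - s).days for s, i, d in applications]
# Clock started when the file was substantially complete (the standard)
actual = [(d - i).days for s, i, d in applications]

avg = lambda xs: sum(xs) / len(xs)
print(f"recorded average: {avg(recorded):.1f} days")
print(f"actual average:   {avg(actual):.1f} days")
```

Whenever any information arrives after submission, the recorded figure exceeds the actual processing time, which is the direction of overstatement noted above.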

Monitoring and control of projects under the Action and Fisheries Alternatives programs
18.82 The Agency's monitoring of supported projects begins with the release of funds to the applicant. The Agency's compliance staff examine documentation submitted by the applicant to ensure that it supports the costs claimed. They also review compliance with the terms of the contract before any funds are released. After an applicant has received all of the allotted funding, the Agency contacts the applicant toward the end of the control period (normally 24 months after commencement of the project's commercial production) to determine the status and results of project activity.

18.83 Monitoring the success of individual projects, provided it is done on a regular and timely basis, can give account managers significant information for analyzing subsequent proposals and can give the Agency feedback on the achievement of its objectives.

18.84 We expected that the Agency would play an active role in monitoring all stages of supported projects. In particular, we expected that it would ensure that conditions of assistance were met, that the government's financial and development interests were protected and that project progress and results were identified.

18.85 The importance of these activities increased with the Agency's announcement that effective 7 February 1995, all direct assistance to commercial organizations under the Action Program would be made repayable.

18.86 Monitoring compliance with payment conditions. The Agency does a good job of monitoring compliance with the payment conditions of contracts with applicants. Compliance officers are attentive to the Agency's guidelines and to conditions established in the contract. Compliance monitoring consumes the majority of the Agency's monitoring resources.

18.87 Monitoring project progress. Besides the monitoring for contractual commitments, the Agency's priority areas for monitoring are:

18.88 While we recognize the importance of these matters as well as the limitations of the Agency's resources, these priorities do not currently apply to the majority of the Agency's activities. We therefore believe it is also important for the Agency to ensure that the largest portion of its activities is subject to review, so that supported projects continue to operate and remain financially viable. Such monitoring is important to the achievement of the Agency's development objectives.

18.89 For 28 percent of the projects we audited, progress had not been adequately monitored, or timely financial information had not been obtained. For example, we noted instances where advances of funds had been made six months or more before our review, but the Agency did not have information on the progress of the project or the current financial status of the applicant.

18.90 Monitoring project results. It is important to obtain information on the results of the project and to use it as input to future decisions on whether to support similar projects or industries. Agency officials have informed us that the extent of monitoring of results is limited to self-assessment by the applicant, verified only infrequently and not recorded in the management information system. The Agency has stated that rather than assessing results on a project-by-project basis, it evaluates project results through ad hoc surveys and studies of industrial sectors and through periodic program evaluations.

18.91 Although the nature of our sample did not permit us to evaluate the Agency's results-monitoring activity, we have previously noted that the Agency does not, as a matter of course, gather information on actual project results. The Agency is currently reviewing the cost effectiveness of all of its payment and monitoring processes.

Conclusion
18.92 Obviously, it is not possible to design a system that can guarantee that all projects approved will succeed. In many cases, the Agency's analysis dealt appropriately with the risks to its economic development objectives. However, in a number of cases the analysis did not address significant risks to meeting one or more of the key economic development criteria.

18.93 The guidelines issued to account managers were developed at a time when the typical project had a higher value than it does today. Now that the vast majority of Agency projects fall into a category requiring limited review, we believe the Agency needs to re-evaluate the appropriateness of its guidelines. The underlying question that must be addressed for each proposal is why the government should provide support for the project.

18.94 The Agency's current monitoring activities do not adequately reflect the current nature of its development activities. While the Agency concentrates its monitoring resources on issues of compliance with payment conditions, we are concerned that project progress and results are not addressed adequately.

18.95 The Agency should ensure that the expected project results are specified and clearly linked to the objectives of its programs (increased export sales, innovation, etc.). The Agency should modify its assessment procedures to require that analysis be based on risks to achieving its goals, as well as on project size.

18.96 The Agency should consider implementing monitoring procedures that address its current development activities. Account managers should have an understanding of client operations and the progress and results of projects.

Agency's response: The Report identifies the fundamental criteria for the evaluation of applications and comments on the adequacy of Agency performance in satisfactorily addressing each. We are pleased that in the majority of cases, ranging from 72 percent to 83 percent, the OAG has not found any significant problems.

Project Objectives

Through its program evaluation exercises, the Agency has addressed project results measurement for the majority of cases. Nonetheless, ACOA recognizes the need to identify project objectives, both in terms of each project's success and in terms of its relationship to overall program objectives and Agency strategies. To ensure that this requirement is addressed effectively, ACOA has completed a review of the applicable functions; recommendations from that exercise include improvements to the internal process of determining and measuring objectives. Since early in its mandate, the Agency has had program evaluation frameworks to help address the need for results measurement. Similarly, the Agency has initiated an evaluation framework for new programming; its deliverables include the identification of the data and information required for project objectives and their linkage to Agency strategies.

The Agency has also used results from the OAG audit to improve its application assessment procedures. The Agency has always weighed risk in relation to project size; however, formal procedures are being developed that will evaluate risk to the achievement of project goals and link that analysis to assessment decisions.

Monitoring

The Agency accepts the OAG findings that, while ACOA has done a good job of monitoring compliance with the payment conditions, Agency monitoring of individual project progress/results should be improved. Complete repayability of commercial assistance, and other results of Program Review in 1994, changed the nature of the Agency. In light of this, we have implemented program enhancements and realigned resources, so as to further emphasize the role of the Account Manager throughout the monitoring function, thereby improving current knowledge of project progress and results. ACOA will continue to perform aggregate benefits monitoring through application of statistically valid sampling.

Assessing, monitoring and controlling projects under the COOPERATION Program
18.97 COOPERATION agreements are typically delivered by provincial officials on behalf of both levels of government. It is clearly not cost-effective for the Agency to operate a duplicate control structure. However, it needs a means of obtaining assurance that the federal government's objectives and interests are being protected.

18.98 Individual COOPERATION agreements are the documents governing the implementation of the program. The agreements do not explicitly require consideration of incrementality, net economic benefit or commercial viability/sustainability. It is therefore not surprising that the application of these criteria was often not evident when we reviewed the project files. The percentage of exceptions was the same as or greater than what we found in the Action Program. We also reviewed COOPERATION projects to determine if project objectives were defined in terms of expected results rather than as activities. As with the Action Program, we found that objectives were frequently specified as an activity to be completed rather than as an expected result.

18.99 We found that most projects were eligible under the federal-provincial agreement. In fact, the eligibility criteria and objectives are quite broad for most agreements - so broad that projects that appear to be peripheral to the overall program objectives are supported. We found it difficult to determine how some projects were contributing to COOPERATION Program objectives. For example, we reviewed an agreement intended to revitalize urban cores. Many of the projects involved painting buildings or replacing siding. Another agreement was intended to diversify the economy of an area by developing human resources and supporting economic diversification to increase incomes and employment. In reviewing a sample of approved projects under this agreement, we found they assisted industries that were already well established in the area. Although the projects were eligible under the agreements, it was difficult to link them to the economic diversification objective.

18.100 Also, in reviewing agreements designed to support community economic development, we could not identify a clear definition of roles and responsibilities or what these organizations are accountable for achieving. This is particularly important to avoid duplication, because the federal and provincial governments continue to deliver economic development programming in these communities.

18.101 Many of the COOPERATION agreements expired on 31 March 1995. The federal government has announced its intention to negotiate one broad federal-provincial agreement with each province. We believe this is an opportunity to negotiate agreements that incorporate the accountability features, key economic assessment criteria and results orientation necessary to ensure that federal priorities are addressed.

18.102 In negotiating future COOPERATION agreements, the Agency should attempt to incorporate the following aspects of accountability:

Agency's response: ACOA recognizes the intent of the recommendations, and the Agency has already made significant strides in addressing the federal responsibilities through the program evaluation exercise. The co-ordination of program evaluation requirements in the context of differing practices and capabilities among the four provinces remains a complex and challenging activity. ACOA has led progress on the accountability aspects of past agreements and remains committed to a similar effort in future agreements.

Additionally, the Agency:

Co-ordination

Background
18.103 The direction to ACOA to co-ordinate federal economic development in Atlantic Canada is clear and specific:

The Minister shall co-ordinate the policies and programs of the Government of Canada in relation to opportunity for economic development in Atlantic Canada.
18.104 As part of this responsibility, the Minister responsible for the Agency has the authority to enter into agreements with the government of a province to carry out the Agency's programs.

18.105 The federal, provincial and municipal governments all have economic development policies and programs for their respective jurisdictions. Within the federal government, many programs have significant implications for regional economic development. While ACOA can attempt to co-ordinate or influence the programming and activities of other departments, at least within Atlantic Canada, it is not responsible for delivering their programs.

18.106 Federal and provincial governments in Atlantic Canada have a long history of working together in the field of regional economic development. However, expenditure reductions have led all governments to examine ways to work more co-operatively and to increase the efficiency and effectiveness of programming.

18.107 The Agency's most visible co-ordination efforts are the federal-provincial agreements under the COOPERATION Program. The Agency has become increasingly involved in co-ordinating the economic development component of federal government responses to key industry crises, such as the fishery crisis, and to the economic impacts of significant federal expenditure reductions, such as military base closures.

Support of third-party economic development initiatives
18.108 Through both the Action and COOPERATION programs, the Agency provides core funding support to various non-commercial organizations involved in economic development, such as universities, community economic development organizations and industry associations. Although the majority of funding goes toward the supported development activity, in many instances the Agency is funding administrative overhead. Our analysis of the funding arrangements with third parties in 3 of the 16 high-value Action and COOPERATION projects in our sample found that 24 percent of the funds went toward the cost of administrative overhead. Although the Agency records the costs of individual projects, it does not accumulate the overall cost of administering the economic development activities it supports.
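The overhead-share arithmetic pools funding and overhead across the sampled projects rather than averaging per-project percentages. The figures below are hypothetical, chosen only to reproduce a 24 percent share:

```python
# Hypothetical third-party funding arrangements (not audit data):
# each entry is total Agency funding and the portion spent on
# administrative overhead, in dollars.
projects = [
    {"total": 500_000, "overhead": 130_000},
    {"total": 300_000, "overhead": 60_000},
    {"total": 200_000, "overhead": 50_000},
]

total_funding = sum(p["total"] for p in projects)
total_overhead = sum(p["overhead"] for p in projects)

# Pooled share: total overhead as a fraction of total funding
print(f"overhead share: {total_overhead / total_funding:.0%}")
```

Pooling weights larger projects proportionally, so the result can differ from a simple average of each project's own overhead percentage.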

18.109 The current COOPERATION agreements have been structured to rationalize the community economic development network in three of the four Atlantic provinces. The thrust of the rationalization is a reduction in the number of community economic development organizations, with a resulting reduction in administrative overhead. Although the Agency is a significant contributor to the funding of the community economic development network, these organizations provide very little in the way of performance information to the Agency. Therefore, the Agency does not have assurance that their activities are contributing significantly to the economic development of the region.

18.110 The Agency should require improved reporting of results by non-commercial organizations, including the community economic development network.

Agency's response: The Agency will clarify with non-commercial organizations, including community economic development organizations, the nature and scope of the performance information it requires.

The Agency recognizes the positive comments in the Report concerning the pan-Atlantic initiatives and will continue to pursue opportunities within that framework of delivery.

Assessment of recent co-ordination initiatives
18.111 The Agency is taking an active approach to co-ordinating certain initiatives. It has forged several sectoral partnerships in an effort to provide a more co-ordinated approach to development activities in these sectors. We examined two of the Agency's recent co-ordination initiatives.

18.112 Atlantic Canada Tourism Partnership. In 1992 and subsequent years, the Agency entered into arrangements collectively with the provincial tourism associations and provincial governments, covering tourism industry activities such as domestic and international marketing, accommodation grading, research and human resource development. Together these arrangements are called the Atlantic Canada Tourism Partnership. Prior to the Partnership, the Agency provided support directly to provincial governments or provincial tourism associations but not on an Atlantic-wide basis.

18.113 The Partnership is a semi-formal association of the key players in the tourism industry in Atlantic Canada, including the four provincial tourism deputy ministers, the four presidents of the provincial tourism industry associations, an Agency representative and a Canadian Tourism Commission representative. The members meet regularly to discuss and develop co-ordinated tourism development strategies and activities. Individual initiatives are delegated to either provincial or industry staff. As a result of the Partnership, the significant participants in Atlantic Canada's tourism industry work together on common industry-wide initiatives.

18.114 The Agency is in the process of establishing baseline data for future evaluations. This Partnership is a major step toward achieving a focussed, co-ordinated approach to developing Atlantic Canada's tourism industry.

18.115 Geomatics Industry Development Initiative. The Geomatics Industry Development Initiative began as a proposal to support a development fund for research in Atlantic Canada in geomatics, or automated geographic information systems. The Agency responded to the proposal by entering into a process of negotiation, consultation and co-ordination with stakeholders to determine the appropriate approach to development of the industry. In addition, it commissioned a market study of the geomatics industry to identify opportunities for Atlantic Canadian firms. The process led to a clear definition of goals and objectives for the initiative. The consultation and co-ordination were important to the development of the final plan.

18.116 The final initiative was an undertaking by the Agency to provide $10 million over three years to help private sector projects designed to develop new products and services. Proposals were reviewed against a detailed set of criteria by a panel of experts from government departments, academe and industry. The process for selection of the projects was extensive and thorough.

18.117 Several marketable products have resulted, although it is too early to evaluate the overall success of the initiative. The Agency's approach ensured a co-ordinated effort to develop the geomatics industry in Atlantic Canada.

Conclusion
18.118 We are encouraged by the Agency's efforts to co-ordinate its economic development activities with other levels of government. In our view, such efforts to improve the co-ordination of all economic development programming are important opportunities to reduce overlap and duplication.

Audit Team

Nancy Adams
Glenn Doucette
Clyde MacLellan
Donald MacNeill
Heather McManaman
Kevin Potter

For information, please contact John O'Brien, the responsible auditor.