Evaluation of the Community Futures Program - April 2003

Evaluation Findings


D. Monitoring and Reporting on Performance


Information and reporting systems should provide executives, directors, management and staff, and to some extent other stakeholders, with the information needed to carry out their responsibilities and to gauge program performance effectively. The process involves monitoring and reporting on specific performance measures that are relevant, balanced and practical.

Two critical questions that can aid in determining the effectiveness of a monitoring and reporting process are:

  • Do we know what dimensions of performance are most important?
  • Do we receive information that reflects these dimensions?

In essence, it is necessary to focus on the most critical, or core, aspects of performance that best define success for the Program and WD. This requires a limited number of measures tailored to the intended audiences and the judgments likely to be rendered, clear definitions, and evident linkages between the measures and desired outcomes.

1. Is there an appropriate balance, definition and quality of performance measures?

Need to “define what is measurable” in community economic development projects – Senior Staff Respondent

There are measurable results in some programs, such as Self Employment, but it is hard to measure results in other program areas – “how do you measure the impact of community economic development, the evaluation is very subjective?” – Senior Staff Respondent

“Sheer numbers do not reflect activity” – Senior Staff Respondent

Performance measures are “too generic” – Senior Staff Respondent

There is “no correlation” between what is projected in the annual operations plans and what is reported on a quarterly basis – Senior Staff Respondent

“Currently see limited value” in reporting templates and requirements, with this being evidence of “WD micro-managing at its worst” – Executive Respondent

“Seems like gathering numbers” – Senior Staff Respondent

Reporting requirements are “heavy” – Senior Staff Respondent

“Recognize the need for accountability, but the current approach is way in excess of what is needed” – Executive Respondent

“Too many systems in use and not integrated, overkill on reporting” – Executive Respondent

Based on a review of a sample of CFDC contribution agreements, it was noted that in some cases there are over forty performance measures, diverse in nature and tending to be more activity and output related, including:
  • Number of Board members and meetings (output);
  • Total number of clients receiving business services (output);
  • Number of business plans completed (output);
  • Number of jobs created and maintained (outcome);
  • Number of new business starts (outcome);
  • Percentage of bad-debt write-offs (outcome);
  • Number of clients trained (output);
  • Number of web-site inquiries and visits (output); and
  • Number of presentations and participants (output).

This indicates the need for further refinement of performance measures, with a focus on ensuring that a smaller set of metrics is applied and that they are both relevant and targeted towards the key decisions to be made. As stated by one respondent, the current mix of measures is such that the “real picture is not shown”. This is particularly true of community economic development, for which a shortage of appropriate measures was noted. Put differently, far fewer measures have been defined in support of community economic development, and those that do exist focus primarily on the number of related projects and events, as well as on responses to inquiries and community meetings.

Frustration was also expressed by a majority of management and senior staff respondents over the monitoring and reporting process being too time and resource intensive. Reported inconsistencies in statistics, and between the templates in use and reporting formats, have resulted in what is seen to be a “very bureaucratic process”. Some respondents indicated that there is no correlation between annual operating plans, particularly targets, and quarterly reports, creating frustration over “gathering numbers” that have little meaning for management decisions. Furthermore, quarterly reports were found to be inconsistent in how certain measures, including the creation and maintenance of jobs, are tracked and reported. More specific comments made by respondents included:

  • Currently investing too much time and resources towards reporting.
  • Has turned into a stats activity instead of doing the work.
  • Have to report and it is very onerous.
  • Template does not adapt well to customizing descriptions of performance and results.
  • TEA system has been a big improvement, but still a frustration in collecting the data.
  • There are problems with the numbers that the TEA system generates in terms of accuracy.
  • The type of client is not recorded in reports.
  • More time spent tracking and less time is available for clients.
  • Annual statement requirements are different than the quarterly reports.
  • Some problems with the TEA system: the report format is slightly different from WD’s template, and certain statistics, such as loan delinquencies, are not picked up. Marketing and community economic development activities are hard to track.
  • Difficult to capture the impact of some initiatives.
  • There should be an ongoing project section in the report.
  • Quarterly reporting gives a fair picture of the activities of the business analysts, but is cumbersome.
  • Have never been formally shown how to do the reporting.

An independent review of CFDC performance reporting processes carried out in 2002⁷ reaffirmed these observations, stating that “there were gaps in reporting and … many different forms and formats – the information is found in different places and captured in various ways.” Examples of gaps reported in that study included data on jobs created and maintained in relation to the number and value of loans issued. Similarly, a noted example of inconsistency in reporting was the variation in descriptions of community economic development projects. The author further noted a need to address dissimilarities in report formats (over 20 different report form versions were observed), report guidelines and definitions, and the verification of reported information, and to improve communications between WD and the CFDCs.

2. How is information on performance being applied?

“Never had feedback on quarterly reports – does anybody read them?” – Senior Staff Respondent

“Why report, who is looking at this information, and how is it being used?” – Senior Staff Respondent

“WD is using the information for ‘auditing purposes’, but not really sure.” – Executive Respondent

“Assume WD uses statistics to justify the program, but don’t really know” – Executive Respondent

“It’s the black hole” – Executive Respondent

In response to this evaluative question, there was a general lack of certainty among Board, management and senior staff respondents as to how reported information is being used by WD for decision-making. The lack of performance feedback from WD was seen as one factor contributing to diminishing levels of thoroughness in reporting, particularly on more qualitative measures such as descriptions of community economic development initiatives. Essentially, the quarterly reports were viewed as a tool for ensuring accountability to WD, rather than as an effective means of monitoring and commenting on actual performance in relation to established targets. Individual comments from the respondents included:
  • Have not really seen how WD uses this information.
  • Limited or no feedback is received from WD on the submittals.
  • Quarterly reports are completed for compliance purposes only. It is uncertain what WD does with them, if anything.
  • WD uses reported data to convert pilots into fully funded initiatives and to profile the Community Futures program at a federal level. However, WD staff can also become so focused on reporting details that they “lose sight of the big picture”.
  • Not sure what some information is used for.
  • Received a call recently about 2001 statistics, so doubt that WD uses this information in a timely fashion.
  • WD rarely provides feedback.
  • Assume WD uses it for their purposes and to lobby government.
  • Don’t feel that WD is properly using this information.

Based on interviews with WD regional representatives, however, there appear to be some positive changes in this regard. In British Columbia, for example, more of a variance analysis is now being performed using an Excel program, comparing reported data on specific measures against the targets established in operations plans. An alternative practice that addresses similar issues is the goals and accomplishments template developed in Manitoba, which CFDCs file along with their operations plans and which provides both an assessment of accomplishments over the prior fiscal year and an understanding of performance expectations for the subsequent year. It is also recognized that WD is operating in an environment of fiscal restraint that has left limited resources for ongoing interaction with CFDCs.
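
As a rough illustration of the variance analysis described above, the following sketch compares year-to-date reported results against operations-plan targets. It is written in Python rather than the Excel program actually used in British Columbia, and all measure names and figures are hypothetical; it is a minimal sketch of the comparison, not a description of WD's actual workbook.

    # Illustrative variance analysis: compare year-to-date reported
    # results against annual operations-plan targets.
    # All measure names and figures below are hypothetical.

    targets = {
        "Loans issued": 40,
        "Jobs created and maintained": 120,
        "Business plans completed": 60,
    }

    reported = {
        "Loans issued": 28,
        "Jobs created and maintained": 95,
        "Business plans completed": 31,
    }

    print(f"{'Measure':<30}{'Target':>8}{'YTD':>8}{'Var %':>8}")
    for measure, target in targets.items():
        actual = reported.get(measure, 0)            # default to 0 if unreported
        variance = (actual - target) / target * 100  # negative = shortfall
        print(f"{measure:<30}{target:>8}{actual:>8}{variance:>7.1f}%")

A report along these lines makes shortfalls against plan visible each quarter, which is the kind of comparison between operations plans and quarterly reports that respondents said was missing.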

7 Refer to “Final Report of Western Economic Development’s Community Futures Development Corporations Performance Reporting Process”, Marshall Management Inc., 2002, p. 5.

 
