Evaluation of the Community Futures Program – April 2003
Information and reporting systems provide executives, directors, management, staff and, to some extent, other stakeholders with the facts and data they need to carry out their responsibilities and to effectively gauge program performance. The process involves monitoring and reporting on specific performance measures that are relevant, balanced and practical.
Two critical questions that can aid in determining the effectiveness of a monitoring and reporting process are: (1) Is there an appropriate balance, definition and quality of performance measures? (2) How is information on performance being applied?

In essence, it is necessary to focus on the most critical, or core, aspects of performance that best define success for the Program and WD. This requires that there be a limited number of measures specific to the intended audiences and judgments likely to be rendered, clear definitions, and evident linkages between the measures and desired outcomes.

1. Is there an appropriate balance, definition and quality of performance measures?
Need to “define what is measurable” in community economic development projects – Senior Staff Respondent
There are measurable results in some programs, such as Self Employment, but it is hard to measure results in other program areas – “how do you measure the impact of community economic development, the evaluation is very subjective?” – Senior Staff Respondent
“Sheer numbers do not reflect activity” – Senior Staff Respondent
Performance measures are “too generic” – Senior Staff Respondent
There is “no correlation” between what is projected in the annual operations plans and what is reported on a quarterly basis – Senior Staff Respondent
“Currently see limited value” in reporting templates and requirements, with this being evidence of “WD micro-managing at its worst” – Executive Respondent
“Seems like gathering numbers” – Senior Staff Respondent
Reporting requirements are “heavy” – Senior Staff Respondent
“Recognize the need for accountability, but the current approach is way in excess of what is needed” – Executive Respondent
“Too many systems in use and not integrated, overkill on reporting” – Executive Respondent
Based on a sample of reviewed contribution agreements for the CFDCs, it was noted that in some cases there are over forty performance measures that are diverse in nature, and that tend to be more activity and output related, including:
This indicates the need for further refinement of performance measures, with a focus on ensuring that a smaller set of metrics is applied and that these are both relevant and targeted towards the key decisions to be made. As stated by one respondent, with the current mix of measures the “real picture is not shown”. This is particularly the case with respect to community economic development, where a shortage of appropriate measures was noted. Far fewer measures have been defined in support of community economic development, and those that do exist appear to focus primarily on the number of related projects and events, as well as responses to inquiries and community meetings.

Frustration was also expressed by a majority of management and senior staff respondents over the monitoring and reporting process being too time and resource intensive. Reported inconsistencies in statistics, and between the templates in use and the reporting formats, have resulted in what is seen to be a “very bureaucratic process”. Some respondents indicated that there is no correlation between annual operating plans, particularly their targets, and quarterly reports, creating frustration over “gathering numbers” that have little meaning for management decisions. Inconsistencies were also identified between quarterly reports in the tracking and reporting of certain measures, including the creation and maintenance of jobs. More specific comments made by respondents are reflected in the quotations above.
An independent review of CFDC performance reporting processes carried out in 2002 reaffirmed these observations, stating that “there were gaps in reporting and … many different forms and formats – the information is found in different places and captured in various ways.” Examples of gaps reported in that study included data on jobs created and maintained in relation to the number and value of loans issued. In like fashion, a noted example of inconsistency in reporting was the variation in descriptions of community economic development projects. The author further noted a need to address dissimilarities in report formats (i.e., over 20 different report form versions were observed), report guidelines and definitions, and the verification of reported information, and to improve communications between WD and the CFDCs.
2. How is information on performance being applied?
“Never had feedback on quarterly reports – does anybody read them?” – Senior Staff Respondent
“Why report, who is looking at this information, and how is it being used?” – Senior Staff Respondent
“WD is using the information for ‘auditing purposes’, but not really sure.” – Executive Respondent
“Assume WD uses statistics to justify the program, but don’t really know” – Executive Respondent
“It’s the black hole” – Executive Respondent
In response to this evaluative question, there was a general lack of certainty among Board, management and senior staff respondents as to how reported information is being used by WD for decision-making. The lack of performance feedback from WD was seen as one factor that has contributed to diminishing levels of thoroughness in reporting, particularly on more qualitative measures such as descriptions of community economic development initiatives. Essentially, the quarterly reports were viewed as a tool for ensuring accountability to WD, rather than as an effective means for monitoring and commenting on actual performance in relation to established targets. Individual comments from respondents are reflected in the quotations above.
Based on interviews with WD regional representatives, however, there appear to be some positive changes in this regard. In British Columbia, for example, more of a variance analysis is now being performed using an Excel program, with reported data on specific measures being compared to the targets established in operation plans. An alternative practice that addresses similar issues is the goals and accomplishments template developed in Manitoba, which is filed by CFDCs along with their operation plans and provides for an assessment of accomplishments over the prior fiscal year as well as an understanding of performance expectations for the subsequent year. It is also recognized that WD is operating in an environment of fiscal restraint that has left limited resources to interact on an ongoing basis with CFDCs.
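The target-versus-actual variance analysis described above can be illustrated with a short sketch. This is purely illustrative: the report notes only that an Excel-based comparison of reported results against operation-plan targets is performed, so the measure names and all figures below are hypothetical assumptions, not actual program data.

```python
# Illustrative sketch of a target-vs-actual variance analysis of the kind
# described above. All measure names and numbers are hypothetical.

# Annual targets from a CFDC operation plan (hypothetical values).
targets = {"loans_issued": 40, "jobs_created": 120, "ced_projects": 10}

# Results reported across the four quarterly reports (hypothetical values).
quarterly_actuals = {
    "loans_issued": [8, 11, 9, 10],
    "jobs_created": [25, 30, 28, 22],
    "ced_projects": [3, 2, 2, 4],
}

def variance_report(targets, quarterly_actuals):
    """Return {measure: (target, actual, variance, pct_of_target)}."""
    report = {}
    for measure, target in targets.items():
        # Sum the quarterly figures to get the year-to-date actual.
        actual = sum(quarterly_actuals.get(measure, []))
        variance = actual - target
        pct = round(100 * actual / target, 1) if target else None
        report[measure] = (target, actual, variance, pct)
    return report

for measure, (target, actual, var, pct) in variance_report(
        targets, quarterly_actuals).items():
    print(f"{measure}: target={target} actual={actual} "
          f"variance={var:+d} ({pct}% of target)")
```

The point of such a comparison, as the text suggests, is that reported numbers gain meaning only when set against the targets in the operation plan, so shortfalls or over-achievement on each measure become visible at a glance.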
7 Refer to “Final Report of Western Economic Development’s Community Futures Development Corporations Performance Reporting Process”, Marshall Management Inc., 2002, p. 5.