Government of Canada | Gouvernement du Canada


3. Evaluation Approach


This section presents the issues identified for the evaluation of the SDPP. It also highlights the challenges involved in evaluating a partnership program like the SDPP, the methods used to evaluate the program, and the strengths and limitations of the evaluation approach.

3.1 Evaluation Issues

Thirteen issues were identified for the evaluation of the SDPP and grouped into five categories:

    Program Rationale and Current Status:
  1. Are the strategic objectives of the program still relevant from governmental and departmental perspectives?
  2. How well do the activities and resource allocations of the SDPP reflect the program's strategic objectives?
  3. Do funding priorities (originally established in consultation with the Reference Groups) adequately reflect program objectives?

    Design and Delivery:

  4. Do the funding decisions reflect the program priorities?
  5. Does the SDPP have processes for informing potential applicants, processing applications, and approving funding that are clear, comprehensive, transparent and equitable?

    Accountability:

  6. How well do the program systems meet standards for accountability?
  7. Should information-collection and reporting requirements be adapted for particular projects and sponsors (e.g. different for small and large recipients)?
  8. Does the SDPP have adequate resources for the full implementation of new financial and program accountability measures?

    Achievement of Objectives:

  9. Are the activities of the SDPP successful in identifying, developing and promoting nationally significant best practices and models of service delivery? Are there outcomes additional to those expected?
  10. Do the activities of the SDPP successfully contribute to building community capacity to meet the social development needs and aspirations of populations who are, or may be, at risk? Are there outcomes additional to those expected?
  11. Do all types of funded activities contribute effectively to program objectives (capacity-building, applied research, development and organizational support grants)?
  12. Has the SDPP led to the establishment of effective capacity of the social non-profit sector for contributing information and knowledge on current and emerging social issues to government and others?

    Cost-Effectiveness:

  13. Is the SDPP cost-effective? Could any SDPP activities be undertaken more cost-effectively?

3.2 Evaluation Methods

Evaluating partnership programs like the SDPP presents a number of practical challenges. The SDPP is designed to achieve its objectives indirectly through funding projects and organizations that are not administered by the federal government. Also, the intermediate and final outcomes of the SDPP involve responses from additional organizations or groups that are beyond the direct control of the program. As a result, in individual instances specific final outcomes of SDPP funding are not easy to track or measure.

The outputs of the SDPP are also difficult to measure. First, there is the challenge of obtaining data through project sponsors. Second, the outputs of the SDPP are highly diverse. For example, the purpose of one funded project was to reduce the symptomatology of war trauma in immigrant children, while another sought to improve building standards to accommodate motorized wheelchairs. Given this diverse range of projects, there is no major or common metric (such as number of job placements) on which performance may be quantified.

Further challenges arose from the fact that grants represented 40% of SDPP expenditures in 2001/02 (and about 35% of SDPP expenditures since the start of the program). By definition, grants have no specific deliverables and are not auditable. They also carry only limited accountability requirements under which information may be collected.

Prior to the evaluation, an international literature review of evaluations of similar programs in other countries was undertaken to help identify examples of current evaluation methods that could be useful in evaluating the SDPP. The Literature Review of Selected Funding Programs Similar to the Social Development Partnership Program3 confirmed that there is no well-developed formula for evaluating partnership programs similar to the SDPP, but noted that some examples of methods used in other countries may be helpful.

In particular, the international literature review highlighted the importance of making effective use of qualitative information in the case of partnership programs like the SDPP. For example, the literature review noted the importance of collecting complementary information from a range of informants in a manner that controls for the subjectivity of opinions. It also highlighted the importance of using case studies and site visits to obtain in-depth information to help explore and complement the findings of the qualitative analysis and other lines of evidence.

The methodology used to evaluate SDPP incorporates the findings of the international literature review. Multiple lines of evidence were drawn from case studies, key informant interviews, focus groups, and a review of program-related files and documents. A broad range of key informants was interviewed. Case studies were also conducted, including many on-site visits. The collection of the case study information used a "rating" format to allow for a broadly based assessment of the degree to which various program outputs contributed to the achievement of program objectives. In addition, the on-site evaluation work used a technique identified by the literature review in attempting to confirm that improved technical systems, best practices and service delivery models were not only reported, but were also implemented. This involved determining whether there was evidence to support the claim that a specific improvement/output had occurred (e.g. gathering on-site, complementary evidence on whether a reported best practices model was actually being used and to what extent it was being used).

Case Studies

Case studies were particularly important in examining the effectiveness of the SDPP in achieving its objectives. The case studies were conducted by the firm Goss Gilroy and reported separately. A broad range of 30 cases (i.e. 26 projects and four grants) was used in order to capture the diversity of project types and recipient organizations. The 30 cases spanned 17 organizations.

To help control for subjectivity and to quantify diverse experiences, the case studies included a numerical rating system for closed-ended questions:

  • Respondents for contribution projects were asked to rate their project's performance on a three-point expectation scale (1=Below Expectations, 2=Met Expectations, 3=Exceeded Expectations) for a range of performance indicators (e.g. relevance of information/report produced, impact on government's awareness of issues related to the project, impact on clients in terms of access to services, impact on clients in terms of participation in society).
  • Respondents for grants were asked to rate the impact of the grant on the organization on a three-point impact scale (1=Low Impact, 2=Medium Impact, 3=High Impact) for a range of performance indicators (e.g. sustainability of the organization, capacity to train staff/volunteers, capacity to raise funds, and capacity to improve the situation of targeted clients in terms of access to services and participation in society).

To reduce bias, each case study included interviews with a representative of the organization receiving funds and with a client or stakeholder of the funded project or organization. By including client and stakeholder views as well as those of a representative of the sponsored project, the rating was more objective than one based solely on the views of the sponsoring organization. In addition, those who scored case studies also provided evidence supporting their ratings, which validates the ratings and enhances their usefulness in assessing project impacts.

Key Informant Interviews

Thirty-three key informant interviews were conducted covering:

  • Nineteen funded organizations (e.g. Canadian Council on Rehabilitation and Work, Mohawk Council of Akwesasne, Canadian Association for Community Care and the National Network for Mental Health);
  • Eight program staff;
  • Two organizations potentially eligible for funding who had not applied;
  • Two organizations whose funding applications had been rejected;
  • Representatives of two provincial initiatives that performed similar functions to the SDPP at the provincial level (Ontario Trillium Foundation, which is an agency of the Ontario Ministry of Tourism, Culture and Recreation, and the Secrétariat à l'action communautaire autonome du Québec (SACA)).

Focus Groups

Unresolved issues emerging from the key informant interviews were further explored in two focus groups. Each of the focus groups involved five organizations. Issues examined through the focus groups included general reaction to the SDPP's mandate, terms and conditions, and the program's funding profile. The list of issues also included the program's current approach to consultation with potential recipients, funding priorities and the use of grants versus contributions. Participation in the focus groups included organizations from "Reference Groups" that the SDPP has historically consulted.

Samples for the case studies, focus groups and key informant interviews were independent of one another (i.e. organizations chosen for case studies were not chosen for interviews). In total, the evaluation reached representatives from 44 organizations associated with the program. This corresponded to more than a quarter of the 164 organizations that had received funding commitments as of 2000/2001. The evaluation was conducted in the last two quarters of fiscal year 2001/2002.

3.3 Strengths and Limitations of Evaluation Approach

Although the evaluation approach developed for the SDPP recognized and attempted to address the challenges of evaluating this type of partnership program and the diversity in funded projects and organizations, the following limitations should be noted:

  • For the case studies, the choice was made to conduct a large number of case studies in less depth rather than a small number in greater depth. Greater reliance was therefore placed on interviews with case study respondents and their ratings of the program, combined with the methods noted above to help control for subjectivity (i.e., inclusion of an independent project stakeholder or client in rating project impacts, taking the lower of the two scores provided by a representative of the sponsoring organization and a stakeholder or project client, and gathering specific evidence on program impact).
  • In a number of instances the information or data needed to confirm a particular outcome was simply not available or obtainable. This occurred in the case of leveraging and ongoing partnership impacts. Although the SDPP collects some information related to leveraging of contributions from others, a quantitative analysis of leveraging could not be undertaken at the time of the evaluation because the SDPP data systems were in the process of being modified in response to the need for improvements identified by the Auditor General. It should be noted that the unavailability of certain administrative/operational data is a serious problem frequently encountered in program evaluations, and is not unique to the SDPP. The need to improve this situation is a very important issue for programs with results-driven accountability frameworks.
  • The evaluation examined the relative effectiveness of funding different types of project activities, but was unable to explore broader cost-effectiveness issues because the diverse range of outputs from diverse projects, combined with working through intermediaries, precluded a quantitative analysis of cost-effectiveness.
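As an illustration, the conservative dual-rating rule used in the case studies (each case scored on a three-point scale by both a sponsor representative and an independent stakeholder or client, with the lower of the two scores retained) can be sketched in a few lines of Python. The function name and structure below are hypothetical, not part of the evaluation's actual tooling:

```python
# Hypothetical sketch of the conservative dual-rating rule: each case is
# scored by a sponsor representative and by an independent stakeholder or
# client on a three-point scale, and the lower of the two scores is kept.

def conservative_rating(sponsor_score: int, stakeholder_score: int) -> int:
    """Return the lower of two ratings on the 1-3 scale."""
    for score in (sponsor_score, stakeholder_score):
        if score not in (1, 2, 3):
            raise ValueError("ratings must be 1 (below), 2 (met) or 3 (exceeded)")
    return min(sponsor_score, stakeholder_score)

# Example: the sponsor rates the project as exceeding expectations (3), but
# the stakeholder rates it as only meeting them (2); the retained score is 2.
print(conservative_rating(3, 2))  # -> 2
```

Taking the minimum rather than the average means a sponsor's optimistic self-assessment can never raise a case's score above what the independent respondent was willing to confirm.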


Footnotes

3 Literature Review of Selected Funding Programs Similar to the Social Development Partnerships Program (Evaluation and Data Development, HRDC, October 2001).

