This section presents the issues identified for the evaluation of the SDPP. It also highlights some of the challenges involved in evaluating a partnership program like the SDPP, the methods used to evaluate the program, and the strengths and limitations of the evaluation approach.

3.1 Evaluation Issues

Thirteen issues were identified for the evaluation of the SDPP and grouped into five categories:
3.2 Evaluation Methods

Evaluating partnership programs like the SDPP presents a number of practical challenges. The SDPP is designed to achieve its objectives indirectly, through funding projects and organizations that are not administered by the federal government. Also, the intermediate and final outcomes of the SDPP involve responses from additional organizations or groups that are beyond the direct control of the program. As a result, specific final outcomes of SDPP funding are not easy to track or measure in individual instances.

The outputs of the SDPP are also difficult to measure. First, there is the challenge of obtaining data through project sponsors. Second, the outputs of the SDPP are highly diverse. For example, the purpose of one funded project was to reduce the symptomology of war trauma in immigrant children, while the purpose of another was to improve building standards to accommodate motorized wheelchairs. Given the diverse range of projects, there are no major or common metrics (such as number of job placements) on which performance may be quantified.

Further challenges arose from the fact that grants represented 40% of SDPP expenditures in 2001/02 (and about 35% of SDPP expenditures since the start of the program). By definition, grants have no specific deliverables and are not auditable. They also have only limited accountability requirements under which information may be collected.

Prior to the evaluation, an international literature review of evaluations of similar programs in other countries was undertaken to help identify examples of current evaluation methods that could be useful in evaluating the SDPP. The Literature Review of Selected Funding Programs Similar to the Social Development Partnership Program3 confirmed that there is no well-developed formula for evaluating partnership programs similar to the SDPP, but noted that some examples of methods used in other countries may be helpful.
In particular, the international literature review highlighted the importance of making effective use of qualitative information in the case of partnership programs like the SDPP. For example, the literature review noted the importance of collecting complementary information from a range of informants in a manner that controls for the subjectivity of opinions. The literature review also highlighted the importance of using case studies and site visits to obtain in-depth information to help explore and complement the findings of the qualitative analysis and other lines of evidence.

The methodology used to evaluate the SDPP incorporates the findings of the international literature review. Multiple lines of evidence were drawn from case studies, key informant interviews, focus groups, and a review of program-related files and documents. A broad range of key informants was interviewed. Case studies were also conducted, including many on-site visits. The collection of the case study information used a "rating" format to allow for a broadly based assessment of the degree to which various program outputs contributed to the achievement of program objectives. In addition, the on-site evaluation work used a technique identified by the literature review in attempting to confirm that improved technical systems, best practices and service delivery models were not only reported, but were also implemented. This involved determining whether there was evidence to support the claim that a specific improvement/output had occurred (e.g. gathering on-site, complementary evidence on whether a reported best practices model was actually being used, and to what extent).

Case Studies

Case studies were particularly important in examining the effectiveness of the SDPP in achieving its objectives. The case studies were conducted by the firm of Goss Gilroy and documented in a separate report. A broad range of 30 cases (i.e. 
26 projects and 4 grants) was used in order to capture the diversity of project types and recipient organizations. The 30 cases spanned 17 organizations. To help control for subjectivity and to quantify diverse experiences, the case studies included a numerical rating system for closed-end questions:
To reduce bias, each case study included interviews with both a representative of the organization receiving funds and a client or stakeholder of the funded project or organization. By including client and stakeholder views, as well as those of a representative of the sponsored project, the rating was more objective than one based solely on the views of the sponsoring organization. In addition, those who scored the case studies also provided evidence supporting each rating, which validated the ratings and enhanced their usefulness in assessing project impacts.

Key Informant Interviews

Thirty-three key informant interviews were conducted covering:
Focus Groups

Unresolved issues emerging from the key informant interviews were further explored in two focus groups, each involving five organizations. Issues examined through the focus groups included general reaction to the SDPP's mandate, terms and conditions, and the program's funding profile, as well as the program's current approach to consultation with potential recipients, its funding priorities, and the use of grants versus contributions. Participants in the focus groups included organizations from "Reference Groups" that the SDPP has historically consulted.

Samples for the case studies, focus groups and key informant interviews were independent of one another (i.e. organizations chosen for case studies were not chosen for interviews). In total, the evaluation reached representatives from 44 organizations associated with the program. This corresponded to more than a quarter of the 164 organizations that had received funding commitments as of 2000/2001. The evaluation was conducted in the last two quarters of fiscal year 2001/2002.

3.3 Strengths and Limitations of Evaluation Approach

Although the evaluation approach developed for the SDPP recognized and attempted to address the challenges of evaluating this type of partnership program and the diversity in funded projects and organizations, the following limitations should be noted: