
2.0 Methodology


2.1 Overview

An Evaluation Committee was established to guide the planning and conduct of the evaluation. In carrying out this responsibility, the committee reviewed all aspects of the research design, participated in developing the program logic model, reviewed and made recommendations for modifications to the questionnaires, and provided much of the information that was needed to contact potential interview candidates.

The research design included the following components:

  • program logic model;
  • evaluation criteria;
  • data collection plan; and
  • interview format and protocol.

2.2 Evaluation Criteria

Evaluation criteria describe the aspects of program operation on which the evaluation is intended to concentrate. In this case the criteria are program design, program delivery, program outcomes, and implications for future planning. Each criterion has been broken down into a number of questions, the answers to which are provided by one of three types of data drawn from one of five sources. The following description identifies the general questions posed under each criterion and the corresponding indicators included within each question.

Criterion: Program Design

1. What were the strengths and weaknesses of the program design?

Indicators:

  • degree of congruence regarding the nature of the problem and the intended response;
  • degree of co-ordination between funding bodies; and
  • fit between available resources and expected level of demand.

Criterion: Program Delivery

2. Was the QRTA program delivered as intended, and if not, why?

Indicators:

  • extent to which employers and workers represented the target population;
  • level of co-operation between funding bodies; and
  • consistency of communication between funding bodies, employers and workers.

3. How efficiently was the program delivered?

Indicators:

  • comparison of actual to expected demand;
  • timeliness of response; and
  • proportion of federal, provincial and employer contributions.

Criterion: Program Outcomes

4. Could the delivery process have been modified to enhance program outcomes?

Indicators:

  • level of specification related to expected outcomes;
  • identification of performance problems; and
  • response to performance problems.

5. Would program outcomes have improved over a longer period of time?

Indicators:

  • patterns and trends inherent in outcome data;
  • projected need for program services; and
  • expected response rate from target population.

Criterion: Implications for Future Programming

6. What factors contributed to the termination of the program as part of the Strategic Initiative?

Indicators:

  • priorities in federal and provincial policies;
  • program co-ordination and management; and
  • perceptions of target population.

7. What can be learned from this Initiative that will improve future efforts?

2.3 Data Collection Plan

Data Types

Three types of data were collected during the evaluation. The first is the written historical record: the terms of reference under which the program was intended to be delivered, brochures and information pamphlets describing the program, and program files containing requests for service, project proposals, contracts for service, follow-up reports and related correspondence.

The second data type included the perceptions that program managers, consultants, employers and workers representatives had about the program structure and implementation. These perceptions were examined through a series of structured questions that were based on descriptions contained in the historical record. The structured questions were intended to determine the level of understanding regarding the way the program was designed and intended to be carried out.

The third type of data was the considered opinion of respondents regarding why the program evolved in the manner it did, and what conclusions may be drawn from that experience. These data were solicited through a series of open-ended questions that gave respondents an opportunity to express those opinions.

The second and third data types were drawn from the interviews conducted with respondents in one of the following five categories: Program Managers, Adjustment Consultants, Labour Market Services Consultants, Program Service Officers and Employers.

Interviews

A total of 22 interviews were conducted with managers, directors and Adjustment Consultants from the Ministry of Education, Skills and Training (MoEST); managers, directors, Program Service Officers and Labour Market Services Consultants from HRDC; and representatives of four employment settings that had received service under QRTA.

The list of 65 sites that had completed a QRTA contract was examined for sites that met the following criteria:

  • services were provided to a substantial number of participants (a minimum of 75 was selected);
  • work site locations were representative of the major geographical areas of the province;
  • contracts were jointly funded by MoEST and HRDC;
  • services were provided between August 1995 and March 1996; and
  • representatives were available to participate in interviews.

Four work sites were selected that matched all of these criteria, and the sample of interview respondents was then built around that selection. The representatives from MoEST and HRDC included program managers and consultants who had been directly involved in the provision of services to the selected sites, as well as others who had experience with QRTA elsewhere. In three of the four sites, the interview respondents were representatives of employers who had received services under the QRTA program. The employer in the fourth setting had had minimal involvement in the application and service delivery process. In that setting, the Steering Committee determined that, since the employer was no longer in business, the staff member from the workers' union who had been closely involved with the application and service delivery was the appropriate person to comment on the process.

The sample is small, containing 6% of the total number of sites in which QRTA services were provided. However, it is a stratified sample based on predetermined criteria held to be reliable descriptors of the universe of QRTA sites. Although it would not be advisable to generalise the findings beyond the sample without qualification, they are believed to be a reliable indication of what a larger and more comprehensive sample would show.

An introductory letter and the interview schedule were provided to all candidates before their interviews. Each interview began with an introduction in which the interviewer described the nature of the interview and the expectations regarding length, confidentiality and importance to the evaluation. The interviewer then solicited the interviewee's agreement to proceed and, upon receiving it, conducted the formal interview.

All of the candidates readily recalled their involvement with QRTA, and in only one case did the person being interviewed have any difficulty differentiating the previous QRT program from QRTA. Some interview candidates in management positions within both MoEST and HRDC expressed the opinion that the limited extent of their direct involvement with QRTA might restrict their capacity to comment on some of the questions in the interview schedule. In those situations, questions that were clearly outside their range of experience were noted and passed over.

One candidate refused, without explanation, to answer the question concerning the reasons the QRTA program was terminated. A second candidate offered what was described as an official explanation, but would not provide a personal perception of those reasons. Apart from these two responses to that one question, the interview candidates readily engaged in the interviews with a high level of interest, and freely provided their informed and articulate views of the program and the way in which it was delivered.

All of the interviews were recorded on tape, with an assurance of anonymity, and the tapes were then transcribed to files that were used to conduct the analysis. The thematic analysis indicates a high degree of consistency among responses to the structured questions concerning program purpose and procedures. Responses to the open-ended questions indicate a wider range of perceptions, but the major themes emerged readily. Question 18 in the MoEST and HRDC interview schedule was generally not well understood and failed to generate useful data; all of the other questions were easily understood and answered.

Readers can consult the Technical Report for additional information pertaining to the interview protocol and corresponding findings.

