Introduction


This report presents a general framework that can form the core of any well-designed evaluation of the EBSMs.¹ It identifies the questions that should be addressed, assesses the strengths and weaknesses of various measurement approaches, and highlights issues toward which further research efforts might be directed.²

Although the paper goes into considerable depth on some matters, it should not be read as a definitive, detailed plan for any given evaluation. As discussed throughout, the framework will have to be adapted (modified and extended) to reflect features unique to the requirements of each evaluation. Any final design must ultimately depend on the goals set for that evaluation, the nature of the programmatic interventions to be studied, the quality of the data available, and on such practical considerations as time and resource availability.

An important consideration for the evaluation of the EBSMs concerns the reporting and ultimate perceived validity of evaluation evidence. Although this paper attempts to describe the state of the art in evaluation methodology, it does not pretend to be the final word on what a "good" evaluation is, nor on how the evidence issuing from such methodology should be used. Assessing the overall validity of the evidence relies, in large measure, on: (1) an a priori understanding of the applicability and limitations of the general approach being used (e.g., the limitations of quasi-experimental designs); and (2) an ex post appraisal of the quality of the data collected and of the robustness of the estimation strategies used.

These considerations are especially important in the context of the EBSM evaluations, where a major component of most evaluations will likely be to provide evidence for assessing the "incremental effects" of the interventions being offered. While well-understood and well-organized data can contribute a great deal, the problems associated with deriving reliable estimates of such incremental effects in non-experimental contexts are pervasive. In many cases it may simply prove impossible to provide valid results based solely on the available data. Hence, planned evaluations must include procedures for assessing strengths and weaknesses, together with a clear process for integrating results and disseminating them to wider audiences.
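To make the notion of an "incremental effect" more concrete, the sketch below illustrates one common quasi-experimental estimator, the difference-in-differences, applied to hypothetical earnings observed before and after a program for participants and a comparison group. The data values, variable names, and the choice of estimator are illustrative assumptions on our part, not prescriptions drawn from the Panel's framework.

    # Illustrative sketch only: a difference-in-differences estimate of a
    # program's "incremental effect" on earnings. All figures are invented.
    from statistics import mean

    def diff_in_diff(part_pre, part_post, comp_pre, comp_post):
        """Incremental effect = (participant change) - (comparison change).

        The comparison group's change stands in for what participants
        would have experienced without the program (the counterfactual).
        """
        participant_change = mean(part_post) - mean(part_pre)
        comparison_change = mean(comp_post) - mean(comp_pre)
        return participant_change - comparison_change

    # Hypothetical annual earnings (dollars), pre- and post-program.
    participants_pre = [18000, 21000, 17500, 20000]
    participants_post = [24000, 26500, 23000, 25500]
    comparison_pre = [18500, 20500, 17000, 21000]
    comparison_post = [21000, 23000, 19500, 23500]

    effect = diff_in_diff(participants_pre, participants_post,
                          comparison_pre, comparison_post)
    print(f"Estimated incremental effect: ${effect:,.0f}")  # prints $3,125

Note that this estimator is only as credible as its identifying assumption: that, absent the program, participants and the comparison group would have followed parallel trends. This is precisely the kind of a priori limitation of quasi-experimental designs flagged above.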

These cautionary notes are an integral part of the rationale for the framework articulated in this paper and are reinforced throughout. A proper understanding of the Panel's suggestions requires that they be viewed within this cautionary context. Recognizing the strengths and weaknesses of the various approaches used in developing the summative evaluations will be instrumental in properly interpreting and presenting the evidence gathered and reported in the future.

The paper is divided into five analytical sections that focus on the following general conceptual points:

  1. Definition of EBSM Program Participation
  2. Analysis Methods (including Comparison Group Selection)
  3. Sample Design
  4. Outcome Measurement
  5. Integration with the MTI Project

A final section (F) summarizes some of the issues raised here that should be kept in mind when designing quasi-experimental evaluations and includes some guidelines regarding implementation of the EBSM evaluations.


Footnotes

¹ EBSM stands for "Employment Benefits and Support Measures". This terminology is used at the national level, but regional terms for the program may vary.
² This report assembles, organizes, and extends the contents of a series of papers that summarize the views of the EBSM Expert Panel. The Panel's views were obtained during a two-day meeting (March 8 and 9, 2001; see Nicholson 2001) and in ongoing ad hoc discussions.

