
Appendix A: Methodology: Survey of Youth Applicants, Participants and Discontinuants


A. Survey Methodology

The sampling frame for the survey consisted of respondents to the Baseline Survey administered to applicants to YSC projects. Baseline data were available for 3,656 youth. Distinguishing successful applicants (“participants”) from unsuccessful ones (“non-participants”) required identifying participants using SICs from administrative data provided by the Program. Because SIC data were missing from some Baseline records, and because of other selection difficulties, the decision was made to survey what was essentially a census of all applicants and then post-code respondents as participants or non-participants on the basis of their self-identification. However, 120 names were eliminated at the outset because these applicants had previously been surveyed by the YSC program.

The survey proceeded as follows:

  • A single survey instrument was developed with appropriate skip patterns to accommodate the responses of participants who completed the project, participants who stayed nearly to the end, early discontinuants (“dropouts”) and non-participants (a simplified sketch of this routing follows the list).
  • The instrument was pretested on March 8, 1998. As a result, a few changes were made: to clarify the youth employment program being asked about, to clarify the identification of the date when participants left the project, and to add a few answer categories that arose during the interviewing.
  • The telephone survey was conducted in English and French from March 9 to March 29, 1998, using Canadian Facts’ computer-assisted FACTS system. Calling was done from Central Location Telephones in Toronto, Edmonton, London, Quebec City and Bathurst. Callbacks continued until they became unproductive and the field had to be closed in order to keep to the production schedule for tables.
  • Open-ended questions and items requiring specification were coded, reviewed by the research team, revised, and included in tables along with prelisted responses.
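For illustration only, the following present-day Python sketch shows the kind of routing that such skip patterns implement. The status codes, branch names and question-section labels are hypothetical; they are not taken from the actual YSC survey instrument.

    # Hypothetical sketch of the skip-pattern routing described above.
    # Status codes, branch names and question-section labels are
    # illustrative only; they are not taken from the YSC instrument.

    from enum import Enum, auto

    class Status(Enum):
        COMPLETED_PROJECT = auto()    # participant who completed the project
        LEFT_NEAR_END = auto()        # participant who stayed nearly to the end
        EARLY_DISCONTINUANT = auto()  # "dropout" who left early
        NON_PARTICIPANT = auto()      # applicant who never took part

    def next_section(status: Status) -> str:
        """Return the (hypothetical) questionnaire section to ask next."""
        if status in (Status.COMPLETED_PROJECT, Status.LEFT_NEAR_END):
            return "participant experience questions"
        if status is Status.EARLY_DISCONTINUANT:
            return "reasons-for-leaving questions"
        return "non-participant questions"

    for s in Status:
        print(s.name, "->", next_section(s))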

The record of call, shown below, indicates that, despite efforts to improve contact and tracing of applicants, a great many respondents could not be reached because no active telephone number could be obtained.

We started with 3,100 unduplicated names and addresses, but 38.2% of these yielded no reply, did not reach the applicant, were not in service, or produced no usable telephone number.

Outright refusals by potential respondents were relatively few (4.8%), but respondents who were never available, and interviews terminated for a variety of other reasons, added to that proportion.

A curious category of result is found in the 158 persons who said that they did not apply. This raises questions about how the Baseline surveys were administered within projects, but the outcome may also be explained by some implicit refusals, misperceptions of the process or poor memory of events.

  Result of call                               Number of cases       %
  Completed                                              1,242    40.1
  Engaged                                                   10      0.3
  No reply                                                 112      3.6
  Appointment                                               34      1.1
  Respondent not available                                   2      0.1
  Refusal                                                  149      4.8
  Language barrier                                          19      0.6
  Not in service                                           469     15.1
  Non-residential                                           72      2.3
  Respondent ill/never available                           233      7.5
  Respondent did not apply                                 158      5.1
  No one by that name and QB (no, refused)                 242      7.8
  Refused to give phone number                               9      0.3
  No phone number obtained                                 290      9.4
  Terminated (other reason)                                 59      1.9
  Total                                                  3,100    100.0
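For reference, the percentages above can be reproduced from the raw counts, as in the Python sketch below. The grouping used to recover the 38.2% figure cited earlier (no reply, not in service, non-residential, no one by that name, and no phone number obtained) is an assumption about which categories were combined; the text does not list them explicitly.

    # Reproduce the record-of-call percentages from the raw counts.
    # The grouping used for the 38.2% figure is an assumption about
    # which categories were combined in the text above.

    record_of_call = {
        "Completed": 1242,
        "Engaged": 10,
        "No reply": 112,
        "Appointment": 34,
        "Respondent not available": 2,
        "Refusal": 149,
        "Language barrier": 19,
        "Not in service": 469,
        "Non-residential": 72,
        "Respondent ill/never available": 233,
        "Respondent did not apply": 158,
        "No one by that name and QB (no, refused)": 242,
        "Refused to give phone number": 9,
        "No phone number obtained": 290,
        "Terminated (other reason)": 59,
    }

    total = sum(record_of_call.values())              # 3,100 names
    for outcome, n in record_of_call.items():
        print(f"{outcome:42s}{n:6d}{100 * n / total:7.1f}")

    no_contact = ["No reply", "Not in service", "Non-residential",
                  "No one by that name and QB (no, refused)",
                  "No phone number obtained"]
    share = sum(record_of_call[k] for k in no_contact) / total
    print(f"No usable contact: {100 * share:.1f}%")   # 38.2%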

B. Non-Response Bias Analysis

A logistic regression model identified several variables from the baseline survey that were associated with the likelihood of responding to the follow-up survey (a sketch of such a model follows the lists below). The following conditions were associated, to a statistically significant degree, with a greater likelihood of responding:

  • Person is located in the Atlantic region.
  • The person’s future plans include wanting to find a job (Q.20).
  • “I’d turn down a better-paying job if I had to move from my community to get it” describes the person’s self-image relatively well (Q.21).
  • Frequency of reading simple instructions, such as those in recipes or on packaged goods (Q.22).

The following conditions were associated, to a statistically significant degree, with a lesser likelihood of responding:

  • “Being unemployed is one of the worst things I can think of” describes the person’s self-image relatively well (Q.21).
  • “I know how to find a job” describes the person’s self-image relatively well (Q.21).
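For illustration, a response-propensity model of this kind might be specified as in the present-day Python/scikit-learn sketch below. The file and column names are hypothetical, and this is not the software or the model specification actually used in the evaluation.

    # Sketch of a response-propensity (non-response) logistic regression.
    # File and column names are hypothetical; the actual model
    # specification used in the evaluation is not reproduced here.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # One row per baseline respondent; "responded" flags completion of
    # the follow-up survey (1,242 of the 3,100 names surveyed).
    baseline = pd.read_csv("baseline_with_followup_flag.csv")

    predictors = [
        "atlantic_region",        # located in the Atlantic region
        "plans_find_job",         # Q.20: future plans include finding a job
        "would_turn_down_move",   # Q.21: would turn down a job requiring a move
        "reads_instructions",     # Q.22: frequency of reading simple instructions
        "unemployment_worst",     # Q.21: "being unemployed is one of the worst..."
        "knows_how_to_find_job",  # Q.21: "I know how to find a job"
    ]

    model = LogisticRegression(max_iter=1000)
    model.fit(baseline[predictors], baseline["responded"])
    for name, coef in zip(predictors, model.coef_[0]):
        print(f"{name:24s}{coef:+.3f}")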

We tabulated the relative frequencies of response to the follow-up survey for the above-listed variables. For each variable that had several levels of response, the levels were combined, on the basis of similarity of response rates, into just two categories, and the response rates recalculated for the resulting categories. The inverses of these response rates were then applied to data for each of the respondents to the follow-up survey, with an appropriate adjusting factor to ensure that the total weighted sample size was equal to the unweighted sample size (1,242). The weights ranged from 0.68 to 2.05, with a mean (by design) of 1.00.
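One plausible reading of this weighting step is sketched below in Python: each follow-up respondent receives the inverse of the response rate for their combination of dichotomized variables, rescaled so that the weights sum to the unweighted sample size (1,242) and average 1.00. The file and column names are hypothetical, and the actual two-category groupings are not reproduced here.

    # Sketch of the inverse-response-rate weighting described above.
    # "cell" stands for the (hypothetical) combination of dichotomized
    # baseline variables; the actual groupings are not reproduced here.

    import pandas as pd

    baseline = pd.read_csv("baseline_with_followup_flag.csv")  # hypothetical file
    response_rate = baseline.groupby("cell")["responded"].mean()

    followup = baseline[baseline["responded"] == 1].copy()
    followup["weight"] = 1.0 / followup["cell"].map(response_rate)

    # Rescale so the weighted sample size equals the unweighted one (1,242)
    followup["weight"] *= len(followup) / followup["weight"].sum()

    print(followup["weight"].describe())  # report: min 0.68, max 2.05, mean 1.00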

These weights were then used in all the multivariate analyses performed. The purpose of doing so was simply to ensure that the people who responded to the follow-up survey represented the baseline population as closely as possible on the basis of characteristics associated with differing response rates to the follow-up survey. For example, the response rate was significantly higher in the Atlantic region than elsewhere in the country. Therefore, responses from that region would receive relatively less weight in the analyses because they represent fewer people than do responses from other parts of the country.
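Continuing the previous sketch, applying the weights in a multivariate analysis can be as simple as supplying them as observation weights; the outcome and predictor names below are hypothetical and do not correspond to the analyses actually reported.

    # Using the non-response weights as observation weights in a
    # multivariate analysis. Outcome and predictors are hypothetical.

    from sklearn.linear_model import LogisticRegression

    X = followup[["participant", "atlantic_region"]]  # hypothetical predictors
    y = followup["employed_at_followup"]              # hypothetical outcome

    weighted_model = LogisticRegression(max_iter=1000)
    weighted_model.fit(X, y, sample_weight=followup["weight"])
    print(weighted_model.coef_)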

