4. Satisfaction With COMPASS


This chapter explores the level of satisfaction with Compass on the part of its SAR participants and the employers who hired them. It also investigates the extent of discontinuation from the program before completion, and the reasons for dropping out. Finally, it looks at participant perceptions of how well Compass prepared them for achieving economic self-sufficiency.


4.1 Client Satisfaction

The participant survey explored satisfaction with all major facets of Nova Scotia Compass. Survey respondents were asked to assign letter grades to indicate their degree of satisfaction. Chart 4.1 displays clients' overall high level of satisfaction with the Compass Program. Most participants thought the program was excellent: 57% awarded the program an A. Few gave the program a failing grade (4%) or a D (3%). The mean overall grade was B+ [18].

Chart 4.1: Overall Grade Given to Compass (N=650, Mean=B+)

Given the limited variance in the overall grade, it is not surprising that there were no significant differences in opinion between options, regions, or sexes. There was not even a difference between those who were offered a job by their placement employer when the subsidy ended and those who were not.

A key aspect of Compass was the placement with an employer. The next graph reveals that half the participants thought that their placement was excellent. But a quarter of the sample rated their placement a C or lower: 5% gave their placement a failing grade. Also of note, WEO participants rated their placements significantly higher than did TTO participants: on average, WEO clients assigned their placement a B+ grade, and TTO clients gave a B (t=2.6, df=581, p<.01).
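The option comparison above is a standard independent-samples t-test on the numeric grade scale described in footnote 18. The sketch below illustrates that kind of test only; it is not the evaluators' code, and the DataFrame column names (option, placement_grade) are hypothetical.

```python
# Minimal sketch (not the evaluators' code) of an independent-samples t-test
# comparing WEO and TTO placement grades on the A=1 ... F=5 scale.
import pandas as pd
from scipy import stats

def compare_placement_grades(df: pd.DataFrame):
    """Return (t, p, df) for WEO vs. TTO placement grades."""
    weo = df.loc[df["option"] == "WEO", "placement_grade"].dropna()
    tto = df.loc[df["option"] == "TTO", "placement_grade"].dropna()
    t, p = stats.ttest_ind(weo, tto, equal_var=True)  # pooled-variance test
    return t, p, len(weo) + len(tto) - 2               # df = n1 + n2 - 2
```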

Chart 4.2: Grade Given to Placement (N=583, Mean=B)

The remaining graphs show at a glance how satisfied participants were with the different aspects of Compass. All facets received a B average or better from clients, except for guidance on services available after Compass, which rated a B-. There was a significant difference between options for only one of these measures: WEO clients gave the direction and supervision provided by their employer a mark of B+; TTO clients gave this aspect a B. Four facets were rated differently by region: information provided about your options was graded B- in the Western region and B elsewhere; help provided by the job developer before the placement was B in the Western region and B+ elsewhere; help provided by the job developer during the placement was B in the Western region and B+ elsewhere; and direction and supervision by the employer was rated a B in the North Shore and Western regions and a B+ in the Halifax and Cape Breton regions. In short, clients in the Western region were least satisfied with these services.

Chart 4.3: Information Provided About Your Options (N=613, Mean=B)
Chart 4.4: Help Provided by Job Developer Before Placement (N=583, Mean=B+)
Chart 4.5: Help Provided by Job Developer During Placement (N=570, Mean=B+)
Chart 4.6: Suitability of Your Placement to Your Job Skills (N=579, Mean=A-)
Chart 4.7: Suitability of Your Placement to Your Career Interests (N=586, Mean=B)
Chart 4.8: Direction and Supervision Provided by Employer (N=585, Mean=B)
Chart 4.9: Guidance on Services Available After Placement (N=520, Mean=B-)
Chart 4.10: Level of Financial Support While in Compass (N=552, Mean=B)

The next three charts pertain only to EDO participants [19]. They show that even though EDO participants did not rate the program as a whole any differently than did their counterparts in WEO or TTO, they were much more apt to rate specific aspects low. Thus, two important facets of EDO - the role model (mentor) and the availability of income assistance - received only C+ and B- averages, respectively, from participants. Of particular note, a quarter of EDO clients gave their role model an F.

But EDO clients considered the small business training aspects of EDO to be very good. Over 40% rated the self-employment business skills taught by EDO an A; an equivalent proportion gave a B grade. Interestingly, interviewees were less enamoured of the training, or at least of one version of it. In some cases, SARs were grouped together into training courses. In others, they were integrated into mainstream training programs (seats were purchased in community college programs). Interviewees much preferred the latter arrangement. "The SARs-only classes gave us grief." "Clients found it distracting and took away from the course. Clients talked about differences in assistance received and compared size of the loans received." Mainstream courses were preferred because they were seen as being "more realistic" and providing more of a challenge to the clients. Furthermore, "the Single Seat purchase was born out of necessity really because it provided the program in a timely fashion. We couldn't wait until we got 25 people who were expressing an interest . . . to whittle it down to 12 which we would have been keeping people on assistance for much, much longer than necessary."

Chart 4.11: Self-Employment Business Skills Taught by EDO (N=38, Mean=B)
Chart 4.12: Help Provided by Mentor (N=35, Mean=C+)
Chart 4.13: Availability of Income Assistance While in Compass, EDO participants only (N=39, Mean=B-)


4.2 Participant Discontinuation

Because there was no variable in the database to identify a drop-out [20], we had to devise a scheme to classify cases as drop-outs. It yielded 249 drop-outs among 1,609 participants, for a discontinuation rate of 16% [21]. This is substantially lower than in many other welfare reform programs, where a 50% drop-out rate is not uncommon.

In the exit survey, we asked those who failed to complete the program why they discontinued (Table 4.1). Mentioned most often - by about one-fifth of drop-outs - was that they were fired or laid off by the placement employer. About one in seven left because they found a job with another employer. And about one in eleven left due to illness. Five percent disliked the job. No other reason was mentioned by more than three respondents.

Table 4.1 Reasons for Discontinuation
REASON FOR DISCONTINUATION Number of Cases Percent of Cases
Fired or laid off by placement employer 16 20.5%
Found a job with another employer 12 15.4
Illness 7 9.0
Disliked the job 4 5.1
Financial difficulties 3 3.8
Hired by placement employer (without subsidy) 3 3.8
Went back to school/started training course 1 1.3
Other 29 37.2
Don't remember 3 3.8

We checked to see if drop-outs had different opinions about the program that may have been associated with quitting. It turns out that those who completed the program were significantly more satisfied with Compass than those who quit (t=2.2, df=648, p<.05): those who finished gave Compass a B+ average; those who dropped out gave it a B. There was also a large difference in their rating of the placement: whereas those who completed the placement gave a B+ grade on average, drop-outs gave only a C+ (t=5.6, df=581, p<.001). This is to be expected given that 21% were fired or laid off by their placement employer (this group gave their placement a C- grade). As for more specific aspects of Compass, two were rated lower by drop-outs: direction and supervision provided by the employer (B by finishers, C+ by quitters); and - perhaps not surprisingly - guidance on services available after the placement (B- by finishers, C+ by quitters).


4.3 Employer Satisfaction With Compass

Employers were asked in the employer survey to assign letter grades to rate their satisfaction with various aspects of the Compass Program. Their responses are displayed in the charts below. In general, employers were very satisfied with Compass. The overall average grade assigned to Compass was A-, with 57% giving the program an A, 38% a B, and 6% a C; no D or F was given by any employer.

Chart 4.14: Overall Grade Given to Compass

Employers showed some dissatisfaction with certain facets of Compass, however. A mean grade of B- was given to the quality of the employees referred and to employees' work attitudes; 11% of employers gave Compass failing grades on both of these aspects. At the other extreme, employers were particularly happy with the service they received from the job developer, with 68% awarding an A grade, and with the level of the wage subsidy, with 62% assigning an A.

Chart 4.15: Service Received from Job Developer
Chart 4.16: Communications with Job Developer During Placement
Chart 4.17: Quality of Employees Referred
Chart 4.18: Suitability of Employees for Work
Chart 4.19: Employees' Work Attitudes
Chart 4.20: Amount of Paperwork Required
Chart 4.21: Length of Placement
Chart 4.22: Wage Subsidy
Chart 4.23: Payment Response Time

As mentioned above, employers were very happy with the job developer. They were asked how the job developer was most helpful. No single reason predominated (Table 4.2). Mentioned most often was that the job developer informed them about Compass.

Table 4.2 Chief Value of the Job Developer
PRIMARY VALUE OF THE JOB DEVELOPER % of Employers
Informed me about the program 19.1%
Identified appropriate employee(s) 18.0
Assisted with employer-employee interventions (troubleshooting) 16.9
Saved my time by screening employees 13.5
Supportive 13.5
Administrative help 10.1
None 7.9
Not sure 1.1
N 89


4.4 Preparation for Economic Self-sufficiency

The next series of charts gives participants' feedback on how well Compass prepared them for achieving economic self-sufficiency. For the most part, respondents gave high marks to the program. All but two aspects were graded B or higher on average. One of the two aspects receiving a lower grade - upgraded educational skills - was not an objective of Compass [22]. But the other aspect certainly was: helping participants to find a permanent job, which was graded only a C+ on average. Indeed, almost a quarter of the respondents gave Compass a failing mark in this respect. Not surprisingly, those who were offered a permanent job by the placement employer after the subsidy ended gave this aspect of Compass a much higher average mark (B) than those who were not (C-).

Ratings on preparation for self-sufficiency differed significantly by option in four areas.

It is not surprising that EDO got lower marks for providing work experience and helping to find a permanent job than did the other two groups. That WEO clients gave a higher grade for improved job skills than did TTO clients is a surprise given the nature of the two options.

Two categories also differed by region.

Chart 4.24: Increased Motivation (N=558, Mean=B+)
Chart 4.25: Developed Your Career Action Plan (N=554, Mean=B)
Chart 4.26: Improved Job Skills (N=553, Mean=B)
Chart 4.27: Improved Job Search Skills (N=537, Mean=B)
Chart 4.28: Upgraded Your Education Skills (N=509, Mean=B-)
Chart 4.29: Provided Work Experience (N=547, Mean=B+)
Chart 4.30: Helped You Find a Permanent Job (N=537, Mean=C+)


4.5 Conclusion

As a useful summary, multiple regression analysis was used to determine which aspects of the Compass Program were most important to participants and employers in awarding an overall grade. The next table displays the results of the participant analysis. The final column shows the level of significance [23]. Variables significant at the 5% level were (in order of importance): help provided by the job developer during the placement; help provided by the job developer before the placement; guidance on services available after the placement; information provided about options under Compass; suitability of placement to career interests; and level of financial support while in Compass. Clearly, the job developer was of central importance when it came to rating the program. For the most part, participants were very happy with the job developer, and so they were very happy with Compass.

Table 4.3 Regression Analysis of Overall Rating of Compass by Participants
Specific Facet of Compass β SE t p
Rating of placement .045 .034 1.4 .158
Information provided about your options in Compass .090 .035 2.6 .010
Help provided by job developer before your placement .191 .043 4.5 .000
Help provided by job developer during your placement .210 .041 5.2 .000
Suitability of your placement to your job skills -.026 .040 0.7 .515
Suitability of your placement to your career interests .079 .034 2.3 .021
Direction and supervision provided by your employer .019 .033 0.6 .568
Guidance on services available after your placement .096 .030 3.2 .001
Level of financial support while in Compass .058 .029 2.0 .050
Constant .108 .077 1.4 .158
Statistics: adj. R2 = .526, F = 54.8, df = 9/428, p < .001
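As an illustration only, the sketch below shows how a regression of this form could be fitted with statsmodels, assuming a pandas DataFrame of numeric grades (A=1 ... F=5) with hypothetical column names; it is not the evaluators' actual code.

```python
# Minimal sketch of an OLS regression of the overall Compass rating on the
# specific facets, mirroring the layout of Table 4.3. Column names are
# hypothetical; grades are coded A=1 ... F=5 as in footnote 18.
import pandas as pd
import statsmodels.api as sm

FACETS = [
    "placement_rating", "info_about_options", "jd_help_before",
    "jd_help_during", "fit_to_job_skills", "fit_to_career_interests",
    "employer_supervision", "guidance_after_placement", "financial_support",
]

def overall_rating_model(df: pd.DataFrame):
    data = df[["overall_rating"] + FACETS].dropna()  # listwise deletion
    X = sm.add_constant(data[FACETS])                # intercept term
    model = sm.OLS(data["overall_rating"], X).fit()
    # model.params, model.bse, model.tvalues and model.pvalues correspond to
    # the coefficient, SE, t and p columns; model.rsquared_adj to adj. R2.
    return model
```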

Table 4.4 displays what aspects of Compass were uppermost in the minds of employers when rating their overall satisfaction with Compass, using stepwise regression with overall satisfaction as the dependent variable and the various aspects of the program as independent variables. The best model, explaining 32% of the variance in the overall rating, included three aspects of Compass.

Table 4.4 Regression Analysis of Overall Rating of Compass by Employers
Variable β SE t p
Communications with job developer .282 .078 3.600 .001
Length of placement .212 .072 2.939 .004
Payment response time .158 .072 2.189 .032
(Constant) .373 .190 1.957 .055
Adj. R2 = .316, F = 11.9, df = 3/68, p < .001

Employers' satisfaction with these three aspects of Compass helps to explain their high degree of satisfaction with the overall program.
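Statsmodels has no built-in stepwise routine, so the sketch below uses a simple forward-selection loop on adjusted R-squared as one way to approximate the stepwise procedure described above. The column names are hypothetical and the entry criterion is an assumption; the report does not state the exact selection rule used.

```python
# A simple forward-selection loop, offered only as an approximation of the
# stepwise regression behind Table 4.4. Candidate columns are hypothetical
# names for the employer-rated aspects; grades are numeric (A=1 ... F=5).
import pandas as pd
import statsmodels.api as sm

def forward_select(df: pd.DataFrame, target: str, candidates: list[str]):
    selected: list[str] = []
    remaining = list(candidates)
    best_adj_r2 = float("-inf")
    improved = True
    while improved and remaining:
        improved = False
        # Score each remaining variable by the adjusted R2 of the model that
        # adds it to the variables already selected.
        scores = []
        for col in remaining:
            X = sm.add_constant(df[selected + [col]])
            fit = sm.OLS(df[target], X, missing="drop").fit()
            scores.append((fit.rsquared_adj, col))
        adj_r2, best_col = max(scores)
        if adj_r2 > best_adj_r2:          # keep the variable only if it helps
            best_adj_r2 = adj_r2
            selected.append(best_col)
            remaining.remove(best_col)
            improved = True
    return selected, best_adj_r2
```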


Footnotes

[18] Mean grade is calculated by setting A=1, B=2, C=3, D=4, and F=5 (the values used in the questionnaire). Equal intervals are established to stand for the average grade: 1 to 1.167=A; 1.168 to 1.5=A-; 1.501 to 1.834=B+; 1.835 to 2.167=B; 2.168 to 2.5=B-; 2.501 to 2.834=C+; 2.835 to 3.167=C; and so on. For the overall grade given to Compass, the mean was 1.67, with a standard error of .04.
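As a small illustration of this calculation (not the evaluators' code), the sketch below codes the letter grades numerically and maps a mean back to a letter; the bins below C are extrapolated from the "and so on" in the footnote.

```python
# Illustrative sketch of the mean-grade calculation in footnote 18.
# Letter grades are coded A=1 ... F=5; the mean is mapped back to a letter
# using the cut points listed above (bins below C extrapolated).
GRADE_VALUES = {"A": 1, "B": 2, "C": 3, "D": 4, "F": 5}

BINS = [  # (upper bound of interval, letter assigned to means in that interval)
    (1.167, "A"), (1.500, "A-"), (1.834, "B+"), (2.167, "B"), (2.500, "B-"),
    (2.834, "C+"), (3.167, "C"), (3.500, "C-"), (3.834, "D+"), (4.167, "D"),
    (4.500, "D-"), (5.000, "F"),
]

def mean_letter_grade(grades: list[str]) -> str:
    mean = sum(GRADE_VALUES[g] for g in grades) / len(grades)
    return next(label for upper, label in BINS if mean <= upper)

# e.g. the overall grade given to Compass: a mean of 1.67 falls in the
# 1.501 to 1.834 interval, i.e. B+.
```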
[19] Note the small number of cases represented by each graph.
[20] There is a field to indicate if the client was discontinued from the program, but this appears to have been frequently filled out even when the placement was completed (i.e., the client 'discontinued' because the program was finished): about half the clients discontinued according to this variable. Our algorithm identified drop-outs using several variables: outcome=quit or discontinued; date of discontinuation earlier than scheduled end date; or end date a few days after start date.
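A rough sketch of this classification rule follows. The field names, the "a few days" threshold, and the exact Boolean combination of the criteria are assumptions, since the footnote does not spell them out.

```python
# Rough sketch of the drop-out classification described in footnote 20.
# Field names, the FEW_DAYS threshold, and the way the criteria are combined
# are assumptions; this is one plausible reading, not the evaluators' code.
from datetime import date
from typing import Optional

FEW_DAYS = 7  # assumed threshold for "end date a few days after start date"

def is_dropout(outcome: str,
               start_date: date,
               scheduled_end_date: date,
               end_date: Optional[date],
               discontinuation_date: Optional[date]) -> bool:
    flagged = outcome in ("quit", "discontinued")
    left_early = (discontinuation_date is not None
                  and discontinuation_date < scheduled_end_date)
    very_short = (end_date is not None
                  and (end_date - start_date).days <= FEW_DAYS)
    # Require a corroborating date condition, since the outcome field alone
    # over-identifies drop-outs (see the first sentence of the footnote).
    return flagged and (left_early or very_short)
```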
[21] The calculated drop-out rate should be viewed with caution. The survey found that almost half of those we had initially classified as drop-outs said they had completed the program. It turned out that most of these had discontinued but then returned later to complete a placement, so our figures were adjusted for subsequent placements that were completed. Still, about a fifth of those we had classified as drop-outs said they had completed the program (for many of these clients, the outcome listed on the system was 'discontinued'). We took the respondents at their word and re-coded these cases, but this suggests that others who were coded as drop-outs but were not surveyed may in fact have completed the program.
[22] We included a question on this facet because it was posed in the Terms of Reference.
[23] The column labeled β is the regression coefficient, SE is its standard error, and t is the t-test statistic. The regression coefficients indicate the relative importance of each variable in explaining the overall rating (since all variables are measured in the same units). Standard errors indicate how precisely each coefficient is estimated (for inference to the population): the lower the SE, the more precise the estimate. The t-test statistic tests the hypothesis that there is no linear relationship between the independent and dependent variables and is the quotient of β and its SE for each independent variable. A significance level (p) below .05 indicates that the independent variable (e.g., rating of the placement) has a statistically significant influence on the dependent variable (overall rating of Compass).
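For example, using the rounded values in Table 4.3, help provided by the job developer during the placement has t ≈ .210/.041 ≈ 5.1, in line with the reported 5.2 once rounding of the coefficient and standard error is taken into account.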

