This chapter explores the level of satisfaction with Compass on the part of its SAR participants and the employers who hired them. It also investigates the extent of discontinuation from the program before completion, and the reasons for dropping out. Finally, it looks at participant perceptions of how well Compass prepared them for achieving economic self-sufficiency.

4.1 Client Satisfaction

The participant survey explored satisfaction with all major facets of Nova Scotia Compass. Survey respondents were asked to assign letter grades to indicate their degree of satisfaction. Chart 4.1 displays clients' overall high level of satisfaction with the Compass Program. As is evident, most participants thought the program was excellent: 57% awarded the program an A. Few gave the program a failing grade (4%) or a D (3%). The mean overall grade was B+.18

Chart 4.1
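The letter-grade means reported throughout this chapter imply that grades were coded numerically before averaging. A minimal sketch of one such coding, assuming a hypothetical 12-point scale (the report does not state the scale actually used):

```python
# Hypothetical 12-point coding of letter grades; the actual scale
# used in the Compass evaluation is not stated in the report.
SCALE = {"A+": 12, "A": 11, "A-": 10, "B+": 9, "B": 8, "B-": 7,
         "C+": 6, "C": 5, "C-": 4, "D+": 3, "D": 2, "D-": 1, "F": 0}

def mean_letter_grade(grades):
    """Average a list of letter grades and report the nearest letter."""
    avg = sum(SCALE[g] for g in grades) / len(grades)
    # Map the numeric mean back to the closest letter grade.
    return min(SCALE, key=lambda g: abs(SCALE[g] - avg))

print(mean_letter_grade(["A", "A", "B"]))  # numeric mean 10.0 -> "A-"
```

On such a scale, a mean falling nearest 9 would be reported as the B+ average quoted above.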
With so little variance in the overall grade, it should come as no surprise that there were no significant differences in opinion between options, regions, or sexes. There was not even a difference between those who were offered a job by their placement employer when the subsidy ended and those who were not.

A key aspect of Compass was the placement with an employer. Chart 4.2 reveals that half the participants thought that their placement was excellent, but a quarter of the sample rated their placement a C or lower, and 5% gave their placement a failing grade. Also of note, WEO participants rated their placements significantly higher than did TTO participants: on average, WEO clients assigned their placement a B+ grade, and TTO clients gave a B (t=2.6, df=581, p<.01).

Chart 4.2
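The option comparisons above rest on two-sample t-tests (e.g. t=2.6, df=581). The report does not show the computation, but a pooled-variance t statistic, which is consistent with the single df figure quoted, can be sketched as follows; the numeric grade coding fed into it is an assumption:

```python
from math import sqrt
from statistics import mean, variance

def pooled_t(x, y):
    """Pooled-variance two-sample t statistic and its degrees of freedom.

    x, y: lists of numerically coded grades for the two groups.
    The coding itself is hypothetical; the report quotes only the
    resulting statistics (e.g. t=2.6, df=581, p<.01).
    """
    nx, ny = len(x), len(y)
    # Pooled variance combines both groups' spread, weighted by their df.
    sp2 = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    t = (mean(x) - mean(y)) / sqrt(sp2 * (1 / nx + 1 / ny))
    return t, nx + ny - 2
```

With 583 placement ratings split between WEO and TTO, this formula gives the df = 583 − 2 = 581 quoted above.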
The remaining charts show at a glance how satisfied participants were with the different aspects of Compass. All facets received a B average or better from clients, except for guidance on services available after Compass, which rated a B-. There was a significant difference between options for only one of these measures: WEO clients gave the direction and supervision provided by their employer a mark of B+; TTO clients gave this aspect a B. Four facets were rated differently by region: information provided about your options was graded B- in the Western region and B elsewhere; help provided by the job developer before the placement was B in the Western region and B+ elsewhere; help provided by the job developer during the placement was B in the Western region and B+ elsewhere; and direction and supervision by the employer was rated a B in the North Shore and Western regions and a B+ in the Halifax and Cape Breton regions. In short, clients in the Western region were least satisfied with these services.

Chart 4.3  Chart 4.4  Chart 4.5  Chart 4.6  Chart 4.7  Chart 4.8  Chart 4.9  Chart 4.10

The next three charts pertain only to EDO participants.19 They show that even though EDO participants did not rate the program as a whole any differently than did their counterparts in WEO or TTO, they were much more apt to rate specific aspects low. Thus, two important facets of EDO - the role model (mentor), and the availability of income assistance - were given only a C+ and a B- average by its participants. Of particular note, a quarter of EDO clients gave their role model an F. EDO clients did, however, consider the small business training aspects of EDO to be very good: over 40% rated the self-employment business skills taught by EDO an A, and an equivalent proportion gave a B grade.

Interestingly, interviewees were less enamoured of the training, or at least of one version of it. In some cases, SARs were grouped together into training courses.
In others, they were integrated into mainstream training programs (seats were purchased in community college programs). Interviewees much preferred the latter arrangement. "The SARs-only classes gave us grief." "Clients found it distracting and took away from the course. Clients talked about differences in assistance received and compared size of the loans received." Mainstream courses were preferred because they were seen as being "more realistic" and providing more of a challenge to the clients. Furthermore, "the Single Seat purchase was born out of necessity really because it provided the program in a timely fashion. We couldn't wait until we got 25 people who were expressing an interest . . . to whittle it down to 12 which we would have been keeping people on assistance for much, much longer than necessary."

Chart 4.11  Chart 4.12  Chart 4.13

4.2 Participant Discontinuation

Because there was no variable on the database to identify a drop-out,20 we had to devise a scheme to classify cases as drop-outs. It yielded 249 drop-outs out of 1,609 participants, for a discontinuation rate of 16%.21 This is substantially lower than in many other welfare reform programs, where a 50% drop-out rate is not uncommon. In the exit survey, we asked those who failed to complete the program why they discontinued (Table 4.1). Mentioned most often, by about one-fifth of drop-outs, was that they were fired or laid off by the placement employer. About one in seven left because they found a job with another employer, and about one in eleven left due to illness. Five percent disliked the job. No other reason was mentioned by more than three respondents.

Table 4.1 Reasons for Discontinuation
We checked to see if drop-outs had different opinions about the program that may have been associated with quitting. It turns out that those who completed the program were significantly more satisfied with Compass than those who quit (t=2.2, df=648, p<.05): those who finished gave Compass a B+ average; those who dropped out gave it a B. There was also a large difference in their valuation of the placement: whereas those who completed the placement gave a B+ grade on average, drop-outs gave only a C+ grade (t=5.6, df=581, p<.001). This is to be expected given that 21% were fired or laid off by their placement employer (this group gave their placement a C- grade). As for more specific aspects of Compass, two were rated lower by drop-outs: direction and supervision provided by the employer (B by finishers, C+ by quitters); and, perhaps not surprisingly, guidance on services available after the placement (B- by finishers, C+ by quitters).

4.3 Employer Satisfaction With Compass

Employers were asked in the employer survey to assign letter grades to rate their satisfaction with various aspects of the Compass Program. Their responses are displayed in the charts below. In general, employers were very satisfied with Compass. The overall average grade assigned to Compass was A-, with 57% giving the program an A, 38% a B, and 6% a C; no employer gave a D or an F.

Chart 4.14

Employers showed some dissatisfaction with certain facets of Compass, however. A mean grade of B- was given both to the quality of the employees referred and to employees' work attitudes; 11% of employers gave Compass failing grades on both of these aspects. At the other extreme, employers were particularly happy with the service they received from the job developer, with 68% awarding an A grade, and with the level of the wage subsidy, with 62% assigning an A.
Chart 4.15  Chart 4.16  Chart 4.17  Chart 4.18  Chart 4.19  Chart 4.20  Chart 4.21  Chart 4.22  Chart 4.23

As mentioned above, employers were very happy with the job developer. They were asked how the job developer was most helpful. No single reason predominated (Table 4.2); mentioned most often was that the job developer informed them about Compass.

Table 4.2 Chief Value of the Job Developer
4.4 Preparation for Economic Self-sufficiency

The next series of charts gives participants' feedback on how well Compass prepared them for achieving economic self-sufficiency. For the most part, respondents gave high marks to the program. All but two aspects were graded B or higher on average. One of the two aspects receiving a lower grade, upgraded educational skills, was not an objective of Compass.22 But the other aspect certainly was: helping participants to find a permanent job, which was graded only a C+ on average. Indeed, almost a quarter of the respondents gave Compass a failing mark in this respect. Not surprisingly, those who were offered a permanent job by the placement employer after the subsidy ended gave this aspect of Compass a much higher average mark (B) than those who were not (C-).

Ratings on preparation for self-sufficiency differed significantly by option in four areas. It is not surprising that EDO got lower marks for providing work experience and helping to find a permanent job than did the other two groups. That WEO clients gave a higher grade for improved job skills than did TTO clients is a surprise, given the nature of the two options. Two categories differed by region.

Chart 4.24  Chart 4.25  Chart 4.26  Chart 4.27  Chart 4.28  Chart 4.29  Chart 4.30

4.5 Conclusion

As a useful summary, multiple regression analysis was used to determine what aspects of the Compass Program were most important to participants and employers in awarding an overall grade. The next table displays the results of the participant analysis. The final column shows the level of significance.23
Variables significant at the 5% level were (in order of importance): help provided by the job developer during the placement; help provided by the job developer before the placement; guidance on services available after the placement; information provided about options under Compass; suitability of the placement to career interests; and level of financial support while in Compass. Clearly, the job developer was of central importance when it came to rating the program: for the most part, participants were very happy with the job developer, and so they were very happy with Compass.

Table 4.3 Regression Analysis of Overall Rating of Compass by Participants
Table 4.4 displays what aspects of Compass were uppermost in the minds of employers when rating their overall satisfaction with the program. Stepwise regression was used, with overall satisfaction as the dependent variable and the various aspects of the program as independent variables. The best model, explaining 32% of the variance in the overall rating, included three aspects of Compass.

Table 4.4 Regression Analysis of Overall Rating of Compass by Employers
Employers' satisfaction with these three aspects of Compass helps to explain their high degree of satisfaction with the overall program.