Western Economic Diversification Canada | Diversification de l'économie de l'Ouest Canada


Canada Business Service Centres
in Western Canada:
Evaluation Report 2002

Appendix C:

Methods

Scope

Three issues defined the scope of the evaluation: relevance, success, and effectiveness related to design and delivery alternatives.

Methodology

The evaluation methodology was consistent with the National Evaluation Framework for the Canada Business Service Centres developed by Industry Canada.

Five methods were used:

  1. A survey of 282 clients, stratified by province.*
  2. A survey of 134 representatives (potential clients) of small and medium-sized enterprises (SMEs) in western Canada, who were not previously clients of the CBSC.
  3. A census of front-line and second-line staff in each Centre.
  4. Interviews with 37 senior officials.
  5. Document reviews.
* In fact, 300 responses were received, but the province of origin could not be established for 18. The total stratified by province is therefore 282.

The surveys were primarily web-based and self-administered. After discussion with the client, the Macleod Institute sent an introductory e-mail from Western Economic Diversification (WD) to all prospective respondents. The Institute followed up with e-mail invitations to all prospective respondents. Respondents accessed the survey questionnaire over the Internet, and responded directly on-line. A small number of surveys were administered over the telephone, using the same questionnaires.

Client Survey

A survey instrument was prepared (Appendix B), based on the National Evaluation Framework (NEF) as requested by WD. All questions in the Common Measurement Tool (CMT) were included, although some required minor modifications (they were either split into two questions, or slightly rephrased to suit the survey medium). All other mandatory questions in the NEF were also included, with similar modifications where appropriate. Some optional NEF questions were not pertinent to the three evaluation issues being addressed, and were therefore not included. As a result, the survey totaled 58 questions.

The Institute asked each Centre to provide either a list of all clients who had accessed the Centre over the past 12 months, or a random, stratified sample of those clients. The lists were to provide names, mode of access, phone numbers and e-mail addresses for each client. The Centres provided the following:

  • British Columbia: Approximately 300 client authorization cards from walk-in clients, collected in two installments over a total of about four weeks after the evaluation was initiated.
  • Alberta: Approximately 3,900 names and phone numbers, with a few hundred e-mail addresses.
  • Saskatchewan: Approximately 900 names and phone numbers, with a few e-mail addresses.
  • Manitoba: Approximately 1,050 names and phone numbers, without e-mail addresses.

The sample populations provided by the Centres consisted mainly of walk-in and telephone clients. Sample frame coverage is therefore weighted towards walk-in clients, particularly in BC, where it is also biased in favour of clients who visited the Centre within a single four-week period.

The Institute randomized the client lists provided by the Centres, using a random number generator. Where there were insufficient e-mail addresses, clients from the randomized list were called and asked to provide their e-mail address. The WD introductory letter, the Institute's invitation and the survey questionnaire were then sent by e-mail.
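As a rough illustration, the randomize-then-draw step might be sketched as follows. The helper, its parameters, and the field names (`province`, `name`) are hypothetical, not part of the evaluation's actual tooling; this is only a minimal sketch of shuffling a client list with a random number generator and drawing a fixed number per provincial stratum.

```python
import random

def draw_stratified_sample(clients, per_province, seed=42):
    """Shuffle a client list with a random number generator, then
    draw up to `per_province` clients from each provincial stratum.

    `clients` is a list of dicts with illustrative keys such as
    'name' and 'province'; real lists also carried phone numbers
    and e-mail addresses.
    """
    rng = random.Random(seed)   # seeded RNG so the draw is reproducible
    shuffled = list(clients)
    rng.shuffle(shuffled)       # randomize the full list first
    sample, counts = [], {}
    for client in shuffled:
        prov = client["province"]
        if counts.get(prov, 0) < per_province:
            sample.append(client)
            counts[prov] = counts.get(prov, 0) + 1
    return sample
```

Drawing from the shuffled list until each province reaches its quota is one simple way to stratify; it assumes each list entry records the client's province, as the Centres' lists did.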

A sample target of 68 was set for each province. The sample size was selected to yield a precision of +/- 10% at a 90% confidence level in each province and an overall precision of +/- 5% at a 90% confidence level. An initial sample size of 113 per province was chosen, reflecting an expected response rate of 60% (based on predicted response rates in recent web-survey literature).
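For reference, these targets follow from the standard sample-size formula for proportions, n = z^2 * p(1-p) / e^2, with z = 1.645 at the 90% confidence level and the conservative p = 0.5. The sketch below is illustrative only; the report does not state which tool was used for the calculation.

```python
import math

Z90 = 1.645  # z-score for a 90% confidence level

def required_sample(margin, p=0.5, z=Z90):
    # Standard proportion formula: n = z^2 * p * (1 - p) / e^2,
    # rounded up to the next whole respondent.
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

per_province = required_sample(0.10)          # 68 respondents per province
initial_invites = round(per_province / 0.60)  # 113 invitations at a 60%
                                              # expected response rate
```

The same formula at +/- 5% gives 271 respondents, which the four provincial targets combined (4 x 68 = 272) just exceed.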

During the first ten days of the survey, the Institute sent two reminder e-mails to non-respondents. The Institute followed up with telephone reminders to the remainder of non-respondents. The response rate was about half what was anticipated in most provinces, so the Institute sent out a second batch of invitations in order to reach the target sample size. The statistics are summarized in Table 1.

Table 1: Client Survey Statistics

                                        BC         AB         SK         MB     Totals
  Total Sample Size                    222        350        139        160        871
  Responses (#)                         66         69         63         78       282*
  Response Rate (%)                   29.7       19.7       35.3       48.6       32.3
  Precision @ 90% confidence level   +/- 10.1%  +/- 10.0%  +/- 10.4%  +/- 9.4%  +/- 4.9%

* In fact, 300 responses were received, but the province of origin could not be established for 18. The total stratified by province is therefore 282.
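The precision figures in Tables 1 and 2 can be reproduced by inverting the same proportion formula: e = z * sqrt(p(1-p) / n). A minimal check, assuming z = 1.645 and the conservative p = 0.5:

```python
import math

Z90 = 1.645  # z-score for a 90% confidence level

def achieved_margin(n, p=0.5, z=Z90):
    # Margin of error for a proportion: e = z * sqrt(p * (1 - p) / n)
    return z * math.sqrt(p * (1 - p) / n)

round(achieved_margin(66), 3)   # 0.101 -> the +/- 10.1% reported for BC
round(achieved_margin(282), 3)  # 0.049 -> the overall +/- 4.9%
```

The same call with n = 134 gives 0.071, matching the overall +/- 7.1% reported for the potential client survey in Table 2.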

Potential Client Survey

A survey instrument was prepared (Appendix B), based on the National Evaluation Framework (NEF) as requested by WD. All questions in the Common Measurement Tool (CMT) were included, although some required minor modifications (they were either split into two questions, or slightly rephrased to suit the survey medium). All other mandatory questions in the NEF were also included, with similar modifications where appropriate. Some optional NEF questions were not pertinent to the three evaluation issues being addressed, and were therefore not included. As a result, the survey totaled 60 questions.

The Institute surveyed 134 representatives of small and medium-sized enterprises (SMEs) in western Canada. The sample frame was a database of western Canadian SMEs maintained by a commercial directory service. The Institute randomized the SME lists provided by the directory service, using a random number generator, and stratified the lists by province. An introductory letter from WD, the Institute's invitation and a survey questionnaire were then sent by e-mail.

A sample target of 68 was set for each province. The sample size was selected to yield a precision of +/- 10% at a 90% confidence level in each province, to allow meaningful analysis within provincial strata. The survey ultimately achieved an overall precision of +/- 7.1% at a 90% confidence level. An initial sample of 250 SMEs in each province was prepared (based on predicted response rates in recent web-survey literature). During the first ten days of the survey, the Institute sent an e-mail reminder to non-respondents. The Institute followed up with telephone reminders to the remainder of non-respondents. When response rates fell below expectations, a second sample of 250 SMEs in each province was invited to take the survey. In total, 2,000 SMEs were contacted. The statistics are summarized in Table 2.

Table 2: Potential Client Survey Statistics

                                        BC         AB         SK         MB     Totals
  Total Sample Size                    500        500        500        500      2,000
  Responses (#)                         21         31         46         36        134
  Response Rate (%)                    4.2        6.2        9.2        7.2        6.7
  Precision @ 90% confidence level   +/- 17.9%  +/- 14.8%  +/- 12.1%  +/- 13.7%  +/- 7.1%

Staff Census

The Institute identified 59 front-line and second-line staff members across the four CBSCs from personnel lists provided by the Centres. An introductory letter from WD, the Institute's invitation and a survey questionnaire were then sent by e-mail. The Institute sent e-mail reminders within the first ten days to prospective respondents who had not replied. In addition, CBSC General Managers were asked to send a blanket e-mail reminder to all staff.

The survey instrument was prepared and finalized after discussions with WD's Evaluation Steering Committee (Appendix B). A number of mandatory questions from the NEF were mirrored in the staff survey. The questions were intended to elicit thoughtful responses from staff involved in day to day client interactions and Centre operations, and to generate ideas for possible improvements.

A total of 46 staff members responded to the census. There is no sampling error associated with a census. Since the response rate was 78%, the results should be highly representative of all staff members. The statistics are summarized in Table 3.

Table 3: Staff Survey Statistics

                         BC       AB       SK       MB    Totals
  Census Size            14       18       10       17        59
  Responses (#)           8       14        7       17        46
  Response Rate (%)    57.1     77.8     70.0    100.0      78.0

Senior Officials Interviews

The Institute interviewed 37 senior officials in-person or by telephone. The non-random sample (selected after discussions with the Centres' Evaluation Steering Committee) included CBSC general managers, federal and provincial Managing Partners in each province, and a selection of key informants representing Co-operating and other Partners, the National Secretariat and WD headquarters staff (see Appendix E for a list of interviewees, together with a copy of the interview guide). An interview guide was prepared and finalized after discussions with WD's Evaluation Steering Committee. The interviews were partially structured, but largely open-ended. They were intended to elicit in-depth responses from officials, to help provide a deeper understanding of the issues and to generate ideas for possible improvements. The statistical reliability of the sample is not known.

Document Review

The Institute reviewed a wide range of documents (Appendix F) including:

  • current and previous Partnership Agreements,
  • annual reports and operating plans, budgets, and incorporation documents,
  • data from the National Secretariat's Client Service System (usage statistics by mode of access provided by the Centres),
  • previous evaluation reports, round table discussions and other internal documents,
  • sample informational and promotional items,
  • client satisfaction surveys, an exit survey (BC) and feedback on the website, and
  • relevant literature.

Observations

The National Evaluation Framework stipulated that the client questionnaire include 8 CMT questions (prescribed by the Treasury Board Secretariat) and 14 mandatory NEF questions (prescribed by the CBSC National Secretariat). The NEF also prescribed 20 mandatory questions for the potential client questionnaire. As a result, the client and potential client surveys evolved into long and unwieldy instruments.

CMT questions largely measure a citizen's impression of service quality (staff courtesy, competence, fairness and timeliness). They are intended to track levels of client satisfaction across the entire federal government; however, none of the quality-of-service questions** was germane to the current evaluation. The mandatory NEF questions serve a similar tracking purpose: the CBSC National Secretariat's intention is to "facilitate the compilation of a national picture and to help track trends in subsequent evaluations." The NEF mandatory questions were far more to the point, but even so there were a few that might have been replaced with others better suited to the evaluation's purpose.

Survey design is a concern because it affects the willingness of prospective respondents to participate. Answering 50 to 60 questions requires a significant time commitment from respondents. Redundancy (asking the same question several times) is useful when it is meant to triangulate evidence; when it is merely an artifact of combining surveys designed for quite separate purposes, it unnecessarily burdens the respondent. In addition, too many open-ended questions in a self-administered survey are inconvenient for the respondent. Many participants, once they realize how long the survey is taking, abandon the exercise by leaving the questionnaire incomplete. Others (rightly or wrongly) anticipate tedium and simply avoid getting involved.

The Institute recognizes and appreciates the valid objectives behind both the CMT and the NEF mandatory questions. Some consideration could, perhaps, be given to running separate surveys, each dedicated to a primary set of objectives.*** Efforts to focus the question set may help reduce respondent fatigue.

Another important design factor is the sample frame. As mentioned earlier, the sample frames given to the Institute did not provide complete coverage of the clientele, since they were weighted to walk-in and telephone customers. This distribution limited how the samples could be used; it prevented the Institute from stratifying the samples by mode of access, for example.**** Another limitation was imposed by the way BC collected its sample population, in two installments within about one month, starting after the evaluation was initiated. This sample frame provided a snapshot of clients, rather than the intended sample of individuals who had used the Centre over the past twelve months. Consideration could be given to maintaining appropriate sample frames for use as and when required over time. A customer relationship management (CRM) strategy is one option that would provide ongoing sample frames.

** The CMTs include a simple query regarding outcomes ("In the end, did you get what you needed from our organization?"). This one question was relevant to the current evaluation.

*** The BC CBSC conducted an exit survey, for example. Exit polling is a good device to check service quality issues because the experience is still fresh in the client's mind. Pop-up surveys are also being run on CBSC websites, and are a good tool to track quality of service.

**** It can be noted, however, that in the whole population of clients, the Internet is the most common mode of access.
