3.8 Statistics are also used for statutory or regulatory purposes such as determining electoral boundaries; distributing federal funds to provinces; apportioning federal and provincial taxes; determining the eligibility of the unemployed for employment insurance; and indexing payments to beneficiaries (for example, Canada Pension Plan payments).
3.9 In addition, statistics about Canada's economic and social conditions keep us informed. They help us in deciding where to live, what careers to pursue and what investments to make, for example. They also support our democracy by helping us make informed decisions about voting and other actions designed to influence governments.
3.10 Rapid and accelerating social, economic and geopolitical changes have heightened the demand for reliable, objective statistical information on a wide spectrum of issues such as the environment, health and provincial economies. As the demand for and use of statistics grow, their quality becomes increasingly important.
3.12 With planned gross expenditures of almost $435 million in 1998-99 (funded by a parliamentary appropriation of $360 million and revenues of $75 million), the Agency has some 360 statistical programs and releases more than 1,000 statistical products each year. Its objective is to provide:
...comprehensive and relevant statistical information on the economic, demographic and social structure of Canada in order to support the development, implementation and evaluation of policies, programs and decision-making.

3.13 As noted in the most recent Annual Report to Parliament of the President of the Treasury Board (Managing for Results 1998), Statistics Canada has committed itself to furnishing Canadians with objective and non-partisan statistics. These statistics are intended to provide, for various aspects of Canada's economy and society, measures that:
3.16 While there is a general recognition among statistical agencies that quality is multi-dimensional, and some convergence of opinion about the range of characteristics that make up quality, there is no international standard definition for statistical quality. In common with other reputable statistical agencies, Statistics Canada approaches quality from the user's perspective. Its Quality Assurance Framework document identifies six characteristics of quality that its policies and practices need to address: relevance, accuracy, timeliness, accessibility, interpretability and coherence (see Exhibit 3.1).
3.17 Over the past 20 years, Statistics Canada has put in place a wide variety of policies and processes to:
3.19 Within the framework of the policies, guidelines and other initiatives that are in place, it is generally left to each program manager to select and implement the quality management techniques that are appropriate to their specific programs.
3.21 In view of the technical complexities involved and the professional expertise available within the Agency, we proposed that it carry out self-assessments of a number of statistical programs as an integral part of our audit, focussing on the adequacy of systems and practices for managing quality. The Agency agreed to this proposal and completed assessments of four of its major surveys.
3.22 Our objectives in this audit were to determine whether Statistics Canada:
3.25 In addition to this internal balancing act, the Agency is committed to external reporting. It reports to Parliament and the public on its performance, and to users on the quality of its statistics. We believe that Statistics Canada thus needs to have meaningful information on the adequacy of its quality management systems and practices (including information on the quality of its statistics) to help improve its statistical programs, support corporate decision making and assurance, and report externally on its performance in achieving quality.
3.26 Thus, we expected that the Agency would systematically assess the adequacy of its quality management systems and practices in individual statistical programs, and assess the quality actually achieved. Such assessments, reported at the corporate level, are particularly important given the wide latitude managers have to select and implement quality management techniques in individual programs.
3.28 Program evaluations. In the 1980s the Agency established a corporate program evaluation function. Over a span of five years, it completed 19 evaluations of statistical programs, drawing heavily on feedback from major users. The evaluations made some 200 recommendations for improvements to help meet user needs; many of the recommendations were implemented. In 1991-92, following an assessment of this first cycle of evaluations, the Agency devolved responsibility for evaluation to program managers (see paragraph 3.31). There have been no corporate program evaluations by the Agency since then.
3.29 Internal audits. In 1990 the Agency carried out an internal audit of compliance by individual statistical programs with the Agency's Policy on Informing Users of Data Quality and Methodology. It found that many of the audited programs did not fully comply with the Policy, and proposed that a follow-up audit be carried out two to three years later (that is, in 1994-1995). In view of other priorities, the follow-up was postponed to 1998. After we set out our plans for this audit, the Agency decided not to proceed with the follow-up. Although the Agency retains an internal audit function, we found that in recent years it has carried out no audits focussed specifically on managing the quality of statistics.
3.30 Program Reports. As a means of devolving responsibility for evaluation to program managers, in 1991-92 the Agency introduced a requirement that each program produce a detailed annual Program Review Report as input to the Agency's Long-term Planning Process. A subsequent change required Program Reports to be produced every two years. The Program Reports are designed to be self-evaluation reports from program managers to the Chief Statistician of Canada on the achievement of program objectives. According to the planning guidelines, the reports should pull together from various sources the key findings on program performance and include extensive indicators of quality.
3.31 Although these Program Reports have been used in the Agency's Long-term Planning Process, we found that they have not always been submitted on time. Twelve of the 19 product-related Program Reports scheduled for the first two-year cycle had been submitted when we completed this audit, and only one of the 15 scheduled for the second cycle. One reason why the Reports are late is that many of the programs are being revised under the Agency's ongoing Project to Improve Provincial Economic Statistics (PIPES). That project itself has important implications for the assessment and reporting of statistical quality (see Exhibit 3.2).
3.32 We also found that when Program Reports were submitted, they did not always include quality indicators as described in the guidelines for preparing these reports. Finally, we found that some Reports did not cover all the programs for which the reporting manager was responsible. We note, however, that some of the reporting managers are responsible for a great many programs - more than 60 in one case.
3.33 User input and feedback. Statistics Canada uses a variety of mechanisms to obtain user input and feedback on its statistical outputs. These help it to keep its programs relevant and to meet the evolving needs of those who use its products. The mechanisms include:
3.35 We found that the information entered into the system has been little used for evaluation and planning. In recent years the SDDS has been used mainly to produce the Guide to Statistics Canada's Programs and Products, the first document that many users consult for information about the Agency's products.
3.36 Program managers submit annual updates to the SDDS, but SDDS staff do not verify their completeness. A "data quality" section for each survey is required to describe briefly the most important sources of error, and to provide quantitative measures and qualitative descriptions of data quality. However, we were told that one in every six surveys had no information about data quality in the SDDS. Further, our review showed that where such information was available, it often failed to describe quality in a consistent and meaningful way.
3.38 As already noted, Program Reports, one of the Agency's key formal mechanisms currently used for evaluating programs, are frequently not submitted on time and may lack some of the required information. Although valuable in their own right, other formal mechanisms (such as internal audits, the processes related to the SDDS, and processes for gathering client input and feedback) have not filled this gap.
3.39 We concluded that the Agency's formal quality assessment mechanisms are not applied consistently. As a result, they do not provide, either individually or collectively, systematic and transparent information on the adequacy of quality management systems and practices in the Agency's statistical programs or on the quality they actually achieve.
3.40 Statistics Canada should ensure that formal quality assessment mechanisms are applied consistently so that they provide systematic information on the adequacy of quality management systems and practices in individual statistical programs and, to the extent possible, information on the quality that they achieve.
Agency's response: We agree with the need to improve compliance related to some of our internal reporting mechanisms. However, we maintain that this lack of compliance has not in any way jeopardized the quality of our statistical output. We have already launched an initiative to improve internal reporting.
3.42 We selected the surveys jointly with Statistics Canada, based on their significance to users and their public profile, as well as the range of subject matter and data collection techniques they represented. Exhibit 3.3 provides brief descriptions of the selected surveys: the Consumer Price Index, the Labour Force Survey, the Monthly Survey of Manufacturing and the Uniform Crime Reporting Survey.
3.44 The Agency developed a set of guidelines to help each Program Division describe systematically the way it manages quality in the planning, design, execution and analysis phases of the survey, as well as in its overall management. Exhibit 3.4 shows the types of information the Program Divisions were asked to provide.
3.45 The second step in the self-assessment involved review teams of experienced Statistics Canada staff drawn from outside the program areas. They assessed the information and reached conclusions about the adequacy of quality management systems and practices. The review teams provided a degree of independence to help ensure an objective, fair and equitable review process. A senior-level Steering Committee provided guidance and oversight to the work of the review teams.
3.46 The self-assessment process began in January 1998, with the Uniform Crime Reporting Survey serving as a pilot. The pilot provided feedback and evidence on the viability of the approach. On the basis of this pilot, the self-assessments of the other surveys were initiated in May 1998. Reports documenting the results of the self-assessments were available to us by early September 1998.
3.47 Throughout the process, we maintained close contact with the Agency's Steering Committee and the review teams. We reviewed the planning and execution of the self-assessments, reviewed the supporting documentation and held follow-up discussions with the Agency's review team and program managers as well as interviews with users of statistical products.
3.48 Overall, we found that the self-assessments were carried out in accordance with the methodology and schedule agreed to with our Office. The descriptions prepared by Program Divisions provided the background information needed to understand the programs in general and the management of their quality in particular. The review teams were independent of Program Divisions and knowledgeable about quality management as well as about the surveys they were reviewing. We concluded that the self-assessments were generally well planned and executed.
3.51 As a component of the National Justice Statistics Initiative, the purpose of the UCR Survey is to help ensure that the Canadian public has accurate information on the nature and extent of crime in Canada. The UCR Survey relies on administrative data provided by police forces across the country. As a result, Statistics Canada has less direct control over the completeness and the quality of the data than in those cases where it collects survey data itself. The two key factors that affect the quality of the UCR statistics, therefore, are the participation of all police forces in Canada and the integrity of the record systems maintained by them. The self-assessment clearly reports quality weaknesses in both these areas.
3.52 An additional complicating factor is that the UCR Survey relies on two conceptually different surveys that run in parallel. Some police forces report summary data on the number of offences (UCR1). Others provide detailed data on individual criminal incidents, including, for example, the age and sex of victims and offenders, the level of injury, weapons involved, location of incident and dollar values of property and drug crimes (UCR2). Statistics Canada converts the UCR2 data to summary data and combines them with UCR1 to publish crime statistics at the national level. In addition, it publishes the UCR2 data that are available. Although considerably richer for purposes of analysis, the published UCR2 data are limited by the fact that they cover only about half the reported crime in Canada and are not representative of Canada as a whole.
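The conversion described above, in which detailed UCR2 incident records are collapsed into UCR1-style summary counts before the two sources are combined, can be sketched as follows. This is an illustrative simplification only; the record fields and offence categories shown are invented for the example and do not reflect the actual UCR record layouts.

```python
from collections import Counter

# Hypothetical UCR2-style records: one entry per criminal incident,
# carrying detail (victim age, weapon) that summary data cannot capture.
ucr2_incidents = [
    {"offence": "theft", "victim_age": 34, "weapon": None},
    {"offence": "assault", "victim_age": 22, "weapon": "knife"},
    {"offence": "theft", "victim_age": 51, "weapon": None},
]

# Hypothetical UCR1-style summary data: offence counts reported directly
# by other police forces, with no incident-level detail.
ucr1_summary = Counter({"theft": 120, "assault": 45})

# Collapse the incident-level records to offence counts (the richer
# victim/weapon detail is lost in this step), then combine the two
# sources to produce national totals.
ucr2_summary = Counter(rec["offence"] for rec in ucr2_incidents)
national_totals = ucr1_summary + ucr2_summary

print(national_totals)  # Counter({'theft': 122, 'assault': 46})
```

The sketch also shows why the published UCR2-only figures are not representative of Canada as a whole: the detailed records cover only the forces that report at the incident level, while the national totals require folding them back into summary form.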
3.53 In our view, these circumstances place a particular onus on the Agency to inform potential users as clearly as possible about the quality of the UCR data and limitations on their use. We note that while the UCR review team made recommendations to strengthen practices for managing the relevance and interpretability of the survey (see Appendix B), its overall conclusion states that "no major changes are being recommended" and labels the recommendations as "fine tuning". Although the meanings of terms such as "major changes" and "fine tuning" are clearly matters of judgment, we believe the recommendations are more important than the self-assessment suggests and deserve the attention of senior management.
3.54 Appendix B includes excerpts that summarize the self-assessments' conclusions about the four surveys under each of the six characteristics of quality, along with our comments. In most cases, we agree with the review teams' conclusions and with the opportunities for improvement that they identified. In some cases, we point to additional areas where we believe that quality management practices, including the reporting of quality, can be improved.
3.56 The constraints on the self-assessments carried out for this audit would need to be removed. The four self-assessments set out to examine the adequacy of quality management practices explicitly within the context of the priorities and resources allocated to the programs by corporate management. Consequently, the assessments looked only at whether the programs did what could reasonably be expected in prevailing circumstances to manage quality. In addition, they did not set out to assess or to comment on the quality actually achieved in the four surveys.
3.57 We believe that a stronger focus on results would enhance the self-assessment technique. The following kinds of issues would need to be assessed:
3.59 Statistics Canada should review the potential for a wider application of an enhanced self-assessment technique as one component of the formal processes it uses to assess quality.
Agency's response: We will consider the benefits and costs of applying this technique as one component of our quality management approach.
3.61 As already noted, the Agency has committed itself to providing Canadians with high-quality statistics that are relevant to policy making and responsive to emerging issues. In many of its policies and processes it gives a central place to the quality of statistics. We therefore expected to find quality-related performance indicators in the Agency's Performance Report for the period ended 31 March 1998, which was tabled in October 1998.
3.62 Under the heading "Information Quality", the Performance Report notes that Statistics Canada uses a wide range of quality assurance practices and conducts intensive "institutional" quality verification of all data releases. Under the same heading, the Report notes that indicators of data quality are included in all publications, although our review, and the Agency's own assessments, showed that this is not always the case (see Informing Users About Data Quality and Methodology).
3.63 The Performance Report includes some descriptions of processes used to keep programs relevant - including external advice and user feedback - and provides examples of changes made as a result. With respect to timeliness, the Report shows the time elapsed between the reference period and the release dates for six major surveys; but it provides no rationale for selecting only these surveys from among some 360 statistical programs. For 28 selected outputs in "major subject areas", the Report includes the frequency of publication (for example, monthly or quarterly) and indicates whether they were released on schedule in 1997-98.
3.64 The only other quality-related indicators in the Report are indicators of accessibility. They provide some information about the growing use of the Agency's Internet site, as well as changes to the statistical information available there.
3.65 Overall, we concluded that the Performance Report contains only partial information about the Agency's actual performance in terms of the six characteristics it has identified as central to quality. In particular, the Report provides no performance information at all about the accuracy, interpretability or coherence of statistics.
3.66 We were told that the process used to gather information for the Performance Report was informal and not linked explicitly to other quality-related initiatives. Program Reports were consulted to some extent. However, no use was made of the Statistical Data Documentation System, which could be a key database to generate quality-related indicators for external reporting.
3.67 Statistics Canada should improve the coverage and content of information on statistical quality in its annual Performance Report to Parliament by drawing on quality information available from internal assessment and reporting systems.
Agency's response: We will review the content of the annual Performance Report to Parliament to improve the way in which information on statistical quality is presented. However, we do not believe that it is possible to produce simple summary quality measures across a wide variety of programs in a way that is useful and meaningful for Parliament.
3.69 Statistics Canada recognizes the need to provide potential users with information about data quality and the concepts, definitions and methods used, so they can determine whether the statistics fit their purposes and can make informed use of them. Since 1978, the Agency has had a Policy on Informing Users of Data Quality and Methodology.
3.71 The internal audit report made a number of important observations and recommendations on Statistics Canada's dissemination of quality-related information to users. For example, the report stated that:
3.73 We compared Statistics Canada's Policy with the approaches taken by reputable statistical agencies in other countries. We found that while all of these agencies recognized the need to inform users of data quality and methodology and had taken steps to do so, not all of them had documented policies in place. The policies that we saw varied in their structure and content. Although some were quite similar to that of Statistics Canada, none, in our view, were more advanced.
3.75 We assessed a selection of 10 products to determine the nature and extent of disclosure about data quality and methodology both in hard copy publications and electronic media. We used criteria based on the 11 mandatory minimum requirements specified in the Agency's current Policy on Informing Users of Data Quality and Methodology (see Exhibit 3.5).
3.76 Our test showed that disclosure of information on data quality and methodology in the Agency's products did not always comply with the mandatory minimum requirements of the Policy. Therefore, users are not always appropriately informed of the strengths and limitations of statistics. Disclosure was inconsistent across products. Specifically, we noted the following:
3.78 To ensure that its Internet site complies with its policy on informing users, the Agency recently began to develop an Integrated Meta Database to replace the SDDS. It is to contain information on the data quality, concepts and underlying methodology of each Agency survey, and is to be accessible to all Internet users. Because the new database was still being developed at the time of our audit, we were unable to assess the nature and extent of the information it contains.
3.79 Our findings on the lack of information and the inconsistencies in disclosure are consistent with the Agency's findings on "interpretability" in the four self-assessments it carried out (see Appendix B).
3.80 Statistics Canada's Methods and Standards Committee has functional responsibility for the Policy on Informing Users of Data Quality and Methodology, and is mandated to monitor its implementation. However, we found that the Committee has not done so, nor has it produced the periodic reports on the state of compliance that the Policy requires. We noted, too, that while program managers can ask the Methods and Standards Committee for exemptions from compliance with the Policy, no such exemptions have been requested or granted.
3.81 The Agency's Guide to Statistics Canada's Programs and Products includes some information on data quality and methodology. However, our review showed that practice with respect to the inclusion of such information is inconsistent across programs. In some cases no information is provided; in others, information on the source and definition of possible errors is included. In still others, actual indicators of data quality are provided. Although the policy on informing users does not apply to the Guide, including information on data quality and methodology could help the many users who would consult the Guide first when seeking information about the Agency's products.
3.82 Officials in the other statistical agencies we visited indicated that their own quality disclosure practices were also inconsistent. Most officials acknowledged that they could, and should, do better. Disseminating statistics by means of new technologies, such as the Internet and compact discs, was widely recognized as a particular challenge.
3.83 Statistics Canada should ensure that its Policy on Informing Users of Data Quality and Methodology is applied consistently across products and dissemination media.
Agency's response: Agreed.
3.84 In informing users of data quality and methodology, Statistics Canada should:
3.86 However, we believe there is a need for each statistical program to demonstrate regularly - through self-reporting, some form of independent assessment, or both - that it has met design parameters or quality targets, such as sample size and response rate (see paragraphs 3.40 and 3.59). The frequency and depth of such reporting would need to be decided with due consideration to the cost, complexity and importance of the statistical programs concerned. Without systematic quality assessments, it is questionable whether the Agency can assure itself, or others, that the systems and practices it uses to build in quality are effective.
3.87 The Agency has many quality-related policies, guidelines and systems that could become the basis for effective quality assessment and reporting. Many of the necessary building blocks are already in place, and recent experience with the self-assessment approach may point to a useful addition. However, we believe that the Agency needs to reshape some of these building blocks and reorient others to make them more cohesive. A basis for such integration could be the newly documented Quality Assurance Framework, which sets out what the Agency considers to be the key characteristics of quality in statistics, and describes the processes already in place to manage quality.
3.89 We noted that definitions and requirements relating to quality of statistics are not always consistent. For example, the term "quality" itself has taken on new dimensions over time, expanding from the more traditional meaning of accuracy to one that reflects an explicit user orientation. Today, quality is defined or described differently in various policies, guidelines and systems.
3.90 We also noted that internal and external reporting requirements are not co-ordinated and sometimes overlap. Besides being required to disclose information on data quality and methodology to users of each of their products, program managers are currently required to update the Statistical Data Documentation System annually, prepare Program Reports every two years and provide input to the Agency's annual Performance Report to Parliament. Because these different reporting requirements are not co-ordinated, each one leads to additional work and can generate new streams of quality-related information. This may contribute to the difficulties that program managers and the Agency itself face in satisfying all the reporting requirements adequately.
3.91 The Agency's Policy on Informing Users of Data Quality and Methodology establishes expectations for managers that are stronger than those established by the Quality Guidelines. The former clearly sets out "minimum standards" for disclosure by program managers. The latter - a "collection of methods, procedures and practices that govern the pursuit of quality objectives" - includes no mandatory requirements. As already noted, the Quality Guidelines give program managers latitude to select and implement whatever practices they consider appropriate in their particular circumstances.
3.92 The fact that standards for informing users are mandatory, while the guidelines for documenting quality are discretionary, may contribute to program managers overlooking or violating mandatory minimum requirements to inform users. Both our audit and the Agency's self-assessments showed that many of the programs examined did not comply fully with the Policy on Informing Users of Data Quality and Methodology.
3.93 We believe that if quality were defined more consistently, and if quality assessment, documentation and reporting were better integrated, the burden on program managers and the Agency could be reduced. The Integrated Meta Database (the planned successor to the Statistical Data Documentation System) could be used consistently to document, for each survey, key quality-related decisions and the rationales for them, as well as quality indicators (as the Agency's Quality Guidelines already suggest). The Database could then become a central repository of information on quality to support consistent and effective internal reporting and external disclosure.
3.94 We believe that co-ordinated quality-related policies, guidelines, systems and processes, and a more disciplined approach to documentation, would help provide more systematic information about quality to support management and reporting. A more disciplined approach means better, not necessarily more, documentation.
3.95 Statistics Canada should make its quality-related policies, guidelines and systems more coherent and cohesive by:
3.96 Statistics Canada should co-ordinate the development of the Integrated Meta Database with other quality-related initiatives and take steps to ensure the ongoing completeness and reliability of the Database.
Agency's response: Agreed. The integration of quality-related information was already recognized as one objective of the development of the Integrated Meta Database, and is under way.
3.97 Statistics Canada should assign to a corporate focal point the responsibility for promoting an integrated, consistent approach to developing and implementing quality-related initiatives throughout the Agency, including the assessment and reporting of quality.
Agency's response: We agree with the objective underlying this recommendation. However, we believe that the most effective arrangement is to have the maintenance of quality as a prime responsibility of every line manager. We would be concerned about introducing any organizational arrangement that suggests to program managers that "someone else" is looking after quality issues. We will consider this issue further.
3.99 Statistics Canada has in place a wide range of policies and processes to ensure the ongoing relevance of its programs, to build quality in through design, execution and the use of new technologies, and to maintain an environment that encourages a concern for quality throughout the organization. However, we found that its achievement of quality is not sufficiently assessed and reported either within or outside the Agency.
3.100 Its formal quality assessment mechanisms are not sufficiently co-ordinated and are not applied consistently. Program managers do not always comply with reporting requirements. As a result, these mechanisms do not provide systematic, transparent information about the adequacy of quality management systems and practices in the Agency's statistical programs or on the quality they actually achieve.
3.101 However, the Agency's existing policies, guidelines and systems can become the basis for more effective quality assessment and reporting practices. Integrating them better and taking a more disciplined approach to documentation would improve the nature and extent of information available to support internal decision making and assurance, as well as external reporting to Parliament and the public on performance, and to users on data quality and methodology.
3.102 The four self-assessments Statistics Canada carried out for this audit were well planned and executed. All reached positive conclusions about the adequacy of quality management in the four surveys. For three of the four (the Consumer Price Index, the Labour Force Survey and the Monthly Survey of Manufacturing), we concluded that the self-assessments provided reasonable assurance that quality management systems and practices are adequate. In our judgment, the evidence presented in the self-assessment of the Uniform Crime Reporting Survey could have led to a stronger conclusion about the weaknesses identified and the importance of recommended improvements.
3.103 The quality of statistics figures prominently in Statistics Canada's effectiveness and in its commitments to Parliament for results. We therefore expected to find quality-related performance information in its most recent Performance Report to Parliament, tabled in October 1998. Overall, we concluded that the Report provided only limited information on the Agency's performance with respect to the quality of the statistics that it produces.
3.104 Because statistics have to be used in full awareness of their strengths and limitations, we assessed Statistics Canada's policy and practices for informing users about data quality and methodology. We concluded that the Agency's Policy on Informing Users of Data Quality and Methodology is well structured and sets out clear expectations for program managers. However, the Agency's practices in informing users are inconsistent, and users are not always appropriately informed about the strengths and limitations of the statistics.
In addition to auditing Statistics Canada's self-assessments to determine whether we could rely on their conclusions, we reviewed documents and interviewed Agency staff. We also interviewed key users of statistics in the federal and provincial governments and the private sector. In addition, we compared the Agency's approach to managing the quality of statistics with practices in a number of respected statistical agencies in other countries - including Australia, the Netherlands, Sweden, the United Kingdom and the United States.
Doreen Deveen
Werner J. Müller-Clemm
For information, please contact Henno Moenting.
For household surveys, data collection activities are subject to market testing. As part of the process, quality targets are clearly specified and the achievements by contractors (internal and external) are monitored. The surveys equivalent to the Labour Force Survey and the Consumer Price Index were two examples shown to us.
In addition, a separate internal audit process helps assure the quality of the systems that produce statistical outputs and identifies any general risks associated with them for corporate action.
In its annual Compliance Plan (submitted to Parliament), the ONS reports on changes in the response burden on businesses, by individual survey and in total. In the same document, planned and actual response rates for each survey are reported, providing some indication of data quality.
The ONS launched its StatBase in October 1998. It includes a meta-database in which users can find information about the data quality and methodology of each survey. This initiative, we understand, is based on Statistics Canada's Statistical Data Documentation System (SDDS).
The results of these self-assessments are summarized for the agency's management. They are also included in the agency's annual report to Parliament. The Swedish National Audit Office audits the report and certifies the reliability of information on quality. Statistics Sweden's self-assessment procedure is currently being revised so as to complement it with continuous measurement of key process variables and specific quality indicators.
All audits are carried out by a pool of about 25 agency staff, drawn from various divisions and working part-time on statistical auditing. They receive training from a private consulting firm with experience in auditing and quality management.
Accuracy. "The constraints imposed by budget reduction have led to sample reductions in the program. However, these reductions were implemented without compromising the accuracy of the most widely used measures."
Timeliness. "The monthly measure of the CPI is produced in a timely fashion."
Accessibility. "Current information on the CPI is freely and widely available."
Interpretability. "The CPI program provides information on concepts and definitions and on issues of data quality through various channels, both through personal contact and through a series of publications designed to meet the needs of various users."
We agree with this statement. However, we noted that quarterly publications do not consistently refer the reader to sources where such information can be found.
Coherence. "The conceptual framework and classifications used in the CPI reflect the needs of its clients."
Accuracy. "The LFS is exemplary in terms of regularly monitoring the accuracy of data, primarily through the Data Quality Committee, which meets each month prior to the release of the survey results. A whole range of quality measures is monitored, including coefficients of variation, non-response, slippage, and coding error rates. Information products are reviewed in compliance with bureau policy."
Regional rates of unemployment are used, pursuant to the Employment Insurance Act, to determine the number of hours of work necessary to qualify for employment insurance benefits and the number of weeks of benefit. The LFS allocates its sample to achieve accuracy targets (coefficients of variation - CVs) for statistics on the unemployment rate in each Employment Insurance Region. The target CV, as established by a long-standing agreement between the Agency and Human Resources Development Canada (HRDC), is 15 percent. The Agency told us that occasionally the accuracy targets are not attained. When this happens, the Agency, in agreement with HRDC, may reallocate sample sizes to achieve the desired targets.
Our review shows that the LFS quality reports did not include the CVs for the unemployment rates in each region. The Agency informed us that in the future these accuracy measures will be formally reviewed in the LFS quality reports, and that HRDC, the user of the information, will be regularly informed.
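The accuracy target described above can be sketched as a simple calculation. The following is an illustrative sketch only, using invented figures rather than LFS data; the function names and numbers are assumptions for illustration, and only the 15 percent target comes from the text.

```python
# Illustrative check of a coefficient-of-variation (CV) accuracy target
# like the one agreed between the Agency and HRDC for regional
# unemployment-rate estimates. All figures are hypothetical.

def coefficient_of_variation(estimate, standard_error):
    """CV expressed as a percentage of the estimate."""
    return 100.0 * standard_error / estimate

def meets_target(estimate, standard_error, target_cv=15.0):
    """True if the estimate's CV is at or below the target
    (15 percent per the agreement cited in the text)."""
    return coefficient_of_variation(estimate, standard_error) <= target_cv

# A hypothetical regional unemployment rate of 8.0% with a standard
# error of 1.0 percentage point gives CV = 12.5%, within the target.
print(meets_target(8.0, 1.0))   # True
# A standard error of 1.5 points gives CV = 18.75%, missing the target.
print(meets_target(8.0, 1.5))   # False
```

When a target is missed, the text notes that the Agency may reallocate sample among regions; in this sketch that would correspond to reducing the standard error until `meets_target` returns True.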
Timeliness. "...the LFS results are released in very timely fashion - two weeks after the end of the survey collection period."
Accessibility. "Information on program outputs is widely available to users and the public through Statistics Canada's Daily and the LFS release on the Statistics Canada's world-wide web site, through copies of the publication available free of charge at Regional Offices and depository libraries, and via wide media coverage of the highlights of the survey results."
We agree with this conclusion, but note that users consulted by the Agency have indicated some concerns about accessibility, including the cost of accessing labour market information. During our audit, several users we interviewed also expressed concerns about the high cost of LFS products.
Interpretability. "LFS products conform to the Policy on Informing Users of Data Quality and Methodology."
The review team told us that although individual products may not conform to the requirements of the Policy on Informing Users of Data Quality and Methodology, collectively they are in compliance because they make reference to other documents. With the December 1998 release of a publication on the methodology of the LFS, users now have access to a wide range of quality indicators (for example, vacancy rates, non-response rates, design effects, sample sizes, sampling errors).
Coherence. "The LFS has taken adequate measures to ensure coherence, including use of international standards for definition of key labour market variables, such as unemployment, employment and the unemployment rate. The program also uses standard classification systems for industry, occupation and geography. To further ensure coherence, the LFS program undertakes, in concert with other program areas, analysis of LFS estimates with those from other sources."
We concur generally with the conclusion about coherence. However, two major users told us that the Agency could do more to improve the continuity of the LFS time series when technical changes, such as changes to classification systems, are introduced.
There have been questions about the international comparability of unemployment rates. Although most Western countries follow the International Labour Organization (ILO) guidelines in measuring unemployment, each country may build into its surveys special features for its own needs. For example, although Canada and the United States both follow the ILO guidelines, Canada counts "passive job seekers" among the unemployed while the United States does not. It is important to take account of such measurement differences when comparing unemployment rates.
A discussion of this definitional difference and the resulting unemployment gap between the two countries was featured in the November 1998 issue of the Labour Force Update. LFS management told us that, in the future, key LFS publications would warn users of the dangers of international comparison of unemployment rates.
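The arithmetic behind this definitional gap can be illustrated with a small sketch. All figures below are invented for illustration and are not LFS or Current Population Survey data; the sketch simply shows how counting or excluding passive job seekers shifts the measured rate.

```python
# Hypothetical figures showing how the treatment of "passive job
# seekers" changes a measured unemployment rate. Not actual survey data.

active_seekers = 1_200_000    # unemployed who actively searched for work
passive_seekers = 100_000     # e.g. only read job advertisements
employed = 14_000_000

def unemployment_rate(unemployed, employed):
    """Unemployed as a percentage of the labour force."""
    labour_force = unemployed + employed
    return 100.0 * unemployed / labour_force

# Canadian-style measure: passive seekers counted as unemployed.
rate_including = unemployment_rate(active_seekers + passive_seekers, employed)
# U.S.-style measure: passive seekers excluded from the labour force.
rate_excluding = unemployment_rate(active_seekers, employed)

print(round(rate_including, 2))  # 8.5
print(round(rate_excluding, 2))  # 7.89
```

On these invented figures the definitional choice alone moves the measured rate by roughly half a percentage point, which is why the text stresses caution in international comparisons.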
Accuracy. "It is the judgement of the [review team] that, given its budget, the accuracy aspect of quality is very well managed in this survey." The self-assessment goes on to state, "The survey produces estimates of high standard as judged by measures of quality that are observed...response rates are usually in the nineties which compares favorably to the best sample surveys."
Discussions with MSM staff confirmed that with the stratified sample used in this survey, a small number of data sources represent a very large proportion of the value of the measured variables. In these circumstances, a more useful indicator of data quality than response rate might be, for example, the coverage achieved in terms of the value of shipments.
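The distinction between a unit response rate and value-weighted coverage can be sketched as follows. The figures are hypothetical, chosen only to show how the two indicators can diverge when a few large respondents dominate the totals, as in a stratified business survey like the MSM.

```python
# Hypothetical stratified-survey sample: a few large units account for
# most of the value of shipments. Figures are invented for illustration.

respondents = [
    # (responded?, value of shipments, $ millions)
    (True, 500), (True, 300), (False, 5), (False, 4), (True, 2), (False, 1),
]

# Unit response rate: share of sampled units that responded.
unit_response_rate = 100.0 * sum(r for r, _ in respondents) / len(respondents)

# Value-weighted coverage: share of total shipment value reported.
total_value = sum(v for _, v in respondents)
covered_value = sum(v for r, v in respondents if r)
value_coverage = 100.0 * covered_value / total_value

print(round(unit_response_rate, 1))  # 50.0 (only 3 of 6 units responded)
print(round(value_coverage, 1))      # 98.8 (the large units responded)
```

In this sketch the unit response rate looks poor while value coverage is high, which illustrates why the audit suggests value-based coverage as a potentially more useful quality indicator for this survey.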
Timeliness. "It is the judgement of the [review team] that the MSM's timeliness is acceptable and, while there should be a continual striving to improve it, it should not be at the expense of a deterioration in accuracy."
We have no reason to question the judgment of the review team in this regard. However, we note that the report provides no evidence to support conclusions about timeliness versus accuracy.
Accessibility. "...the MSM information is readily available through Statistics Canada's Daily publication, in printed copy and through electronic media."
Interpretability. "The survey largely complies with the agency's Policy on Informing Users of Data Quality and Methodology."
Despite this conclusion, the review team recommends the inclusion of a new "Concepts and Methods" section in the key publication that "more clearly explains the data quality issues" and that includes tables and graphs to "illustrate the size and impact of revisions". We agree with this recommendation.
Coherence. "In the opinion of the [review team] the MSM does as good a job on the coherence dimension of quality as can reasonably be expected from a monthly survey."
The review team goes on to recommend, "The establishment of an advisory committee on analytical studies and requirements be considered to enhance the perception of independence of the UCR and to broaden the scope and extent of data use." We agree with this recommendation.
Accuracy. "The data are at most as good as the information available within the policing system itself...effective monitoring and assessment of quality needs an examination of data at the level of police forces. This is done in part through the editing process. Some of the primary checks and analysis at this level are performed by the respondents themselves, who "sign off" on the results. (But this process) does not necessarily assure accuracy and it does not permit an independent assessment of accuracy. It does not account for differences in reporting and enforcement practices."
In 1997 Statistics Canada distributed a self-audit manual to police forces across the country. This initiative was based on the recommendation of a 1989 study, which found that police forces needed to conduct periodic audits of their internal systems to ensure accurate reporting of crime statistics. The Agency's follow-up with a number of major police forces showed that the forces did not have the resources to conduct the audits. In the meantime, there is continuing uncertainty about the quality of the information that is available within the policing system and reported to Statistics Canada. We were told that an internal proposal has recently been submitted within the Agency, requesting resources to perform a data quality audit with selected major police forces.
Timeliness. "Given the nature of the survey, there is strong support for concluding that the data are as timely as is reasonable to expect under the current operational constraints...Any significant improvement in timeliness would have to come in respondent related activities."
Accessibility. "The practices used to disseminate and ensure accessibility of UCR data are consistent with Agency practices in general, and with requirements and views of major clients."
Interpretability. "In all cases (including the releases in the Daily), information on the concepts and definitions is given and amply meets the requirements and the intent of the Policy on Informing Users of Data Quality and Methodology..."
With respect to information on data quality (as distinct from information on concepts and definitions), the review team makes three recommendations for improvement: "An assessment of the impact of the coverage limitations of the UCR2 might be helpful (for example, by examination of aggregate data differences between the UCR1 and UCR2 populations)"; "Cross-references in all products to the report "Canadian Crime Statistics" ... for details on methodology and data quality would also be useful"; and "Response rates and an assessment of the effect of imputation by major variable should be provided." In view of the nature of this survey, we agree with these recommendations and believe them to be important.
Coherence. "There are clear and convincing attempts to provide a broad picture of criminal incidents, victims and perpetrators - integration of aggregate data across the surveys within the Police Services Program, integration of data from a variety of statistical sources (including some international data) and development across statistical programs."