Statistics Canada

Managing the Quality of Statistics


Introduction

Statistics are essential to our society
3.7 All levels of government, industry and other non-governmental organizations need statistics for a variety of purposes. These include monitoring the country's economic and social conditions; developing performance information; planning and evaluating the policies, programs and investments of governments and the private sector; entering into labour contracts; supporting policy debates and advocacy; and keeping the public informed.

3.8 Statistics are also used for statutory or regulatory purposes such as determining electoral boundaries; distributing federal funds to provinces; apportioning federal and provincial taxes; determining the eligibility of the unemployed for employment insurance; and indexing payments to beneficiaries (for example, Canada Pension Plan payments).

3.9 In addition, statistics about Canada's economic and social conditions keep us informed. They help us in deciding where to live, what careers to pursue and what investments to make, for example. They also support our democracy by helping us make informed decisions about voting and other actions designed to influence governments.

3.10 Rapid and accelerating social, economic and geopolitical changes have heightened the demand for reliable, objective statistical information on a wide spectrum of issues such as the environment, health and provincial economies. As the demand for and use of statistics grow, their quality becomes increasingly important.

Statistics Canada is responsible for Canada's national statistics
3.11 Statistics Canada's mandate derives mainly from the Statistics Act. The Act requires the Agency to collect, compile, analyze and publish statistical information on the economic, social and general conditions of the country and its citizens. Statistics Canada also has a mandate to provide professional co-ordination and leadership of the country's statistical system. Consistent with this mandate, it identifies as its mission "to inform Canadian citizens, businesses and governments about the evolution of their society and economy and to promote a high quality national statistical system".

3.12 With planned gross expenditures of almost $435 million in 1998-99 (funded by a parliamentary appropriation of $360 million and revenues of $75 million), the Agency has some 360 statistical programs and releases more than 1,000 statistical products each year. Its objective is to provide:

...comprehensive and relevant statistical information on the economic, demographic and social structure of Canada in order to support the development, implementation and evaluation of policies, programs and decision-making.
3.13 As noted in the most recent Annual Report to Parliament of the President of the Treasury Board (Managing for Results 1998), Statistics Canada has committed itself to furnishing Canadians with objective and non-partisan statistics. These statistics are intended to provide, for various aspects of Canada's economy and society, measures that:

3.14 In its view, Statistics Canada's effectiveness depends on, among other things, the safeguarding of respondents' confidentiality, the relevance of its programs, the quality and accessibility of its products, the attainment of high professional standards, and the control of the burden on survey respondents.

The Agency is committed to producing statistics of high quality
3.15 The quality of statistics thus figures prominently in Statistics Canada's effectiveness and in its commitments to Parliament for results. The quality of any product or service is measured by how well it serves users' needs and meets their expectations. From a user's perspective, therefore, the quality of statistics is their "fitness for use". It is a complex concept, which depends on certain fundamental characteristics of quality and on the intended uses of the statistics.

3.16 While there is a general recognition among statistical agencies that quality is multi-dimensional, and some convergence of opinion about the range of characteristics that make up quality, there is no international standard definition for statistical quality. In common with other reputable statistical agencies, Statistics Canada approaches quality from the user's perspective. Its Quality Assurance Framework document identifies six characteristics of quality that its policies and practices need to address: relevance, accuracy, timeliness, accessibility, interpretability and coherence (see Exhibit 3.1).

3.17 Over the past 20 years, Statistics Canada has put in place a wide variety of policies and processes to:

3.18 In 1981, for example, senior management of the Agency urged staff to follow a code of behaviour for statistical agencies with respect to quality assurance. Our 1985 follow-up of an earlier audit indicated that the Agency had taken a number of steps to improve its quality management processes. These included issuing comprehensive Quality Guidelines; revising its 1978 Policy on Informing Users of Data Quality and Methodology; establishing a National Statistics Council; and implementing a long-term planning process. These quality-related tools and mechanisms remain in effect, and others have since been introduced.

3.19 Within the framework of the policies, guidelines and other initiatives that are in place, it is generally left to each program manager to select and implement the quality management techniques that are appropriate to their specific programs.

Focus of the audit
3.20 We examined Statistics Canada's systems and practices for assessing the adequacy of quality management in its statistical programs, and for reporting to users on the quality of its statistics and to Parliament and the public on its performance.

3.21 In view of the technical complexities involved and the professional expertise available within the Agency, we proposed that it carry out self-assessments of a number of statistical programs as an integral part of our audit, focussing on the adequacy of systems and practices for managing quality. The Agency agreed to this proposal and completed assessments of four of its major surveys.

3.22 Our objectives in this audit were to determine whether Statistics Canada:

3.23 Further details on the audit can be found in About the Audit.

Observations and Recommendations

Corporate Assessment of Quality

Information on quality is required for management and accountability
3.24 Because of constantly changing user needs and the complex nature of quality, it is an ongoing balancing act to manage quality across statistical programs as well as in any single program. For example, in a climate of fiscal restraint, demands for new statistical series may require the acceptance of lower quality in some existing series. Similarly, meeting demands for increased timeliness in a statistical program may involve some trade-off with other characteristics of quality, such as accuracy.

3.25 In addition to this internal balancing act, the Agency is committed to external reporting. It reports to Parliament and the public on its performance, and to users on the quality of its statistics. We believe that Statistics Canada thus needs to have meaningful information on the adequacy of its quality management systems and practices (including information on the quality of its statistics) to help improve its statistical programs, support corporate decision making and assurance, and report externally on its performance in achieving quality.

3.26 Thus, we expected that the Agency would systematically assess the adequacy of its quality management systems and practices in individual statistical programs, and assess the quality actually achieved. Such assessments, reported at the corporate level, are particularly important given the wide latitude managers have to select and implement quality management techniques in individual programs.

A number of assessment mechanisms have been used
3.27 We found that, like other statistical agencies, Statistics Canada has used a number of different mechanisms to assess quality. Appendix A provides a brief overview of approaches to quality assessment and reporting that we observed in some of the other statistical agencies we visited.

3.28 Program evaluations. In the 1980s the Agency established a corporate program evaluation function. Over a span of five years, it completed 19 evaluations of statistical programs, drawing heavily on feedback from major users. The evaluations made some 200 recommendations for improvements to help meet user needs; many of the recommendations were implemented. In 1991-92, following an assessment of this first cycle of evaluations, the Agency devolved responsibility for evaluation to program managers (see paragraph 3.31). There have been no corporate program evaluations by the Agency since then.

3.29 Internal audits. In 1990 the Agency carried out an internal audit of compliance by individual statistical programs with the Agency's Policy on Informing Users of Data Quality and Methodology. It found that many of the audited programs did not fully comply with the Policy, and proposed that a follow-up audit be carried out two to three years later (that is, in 1994-1995). In view of other priorities, the follow-up was postponed to 1998. After we set out our plans for this audit, the Agency decided not to proceed with the follow-up. Although the Agency retains an internal audit function, we found that in recent years it has carried out no audits focussed specifically on managing the quality of statistics.

3.30 Program Reports. As a means of devolving responsibility for evaluation to program managers, in 1991-92 the Agency introduced a requirement that each program produce a detailed annual Program Review Report as input to the Agency's Long-term Planning Process. A subsequent change required Program Reports to be produced every two years. The Program Reports are designed to be self-evaluation reports from program managers to the Chief Statistician of Canada on the achievement of program objectives. According to the planning guidelines, the reports should pull together from various sources the key findings on program performance and include extensive indicators of quality.

3.31 Although these Program Reports have been used in the Agency's Long-term Planning Process, we found that they have not always been submitted on time. Twelve of the 19 product-related Program Reports scheduled for the first two-year cycle had been submitted when we completed this audit, and only one of the 15 scheduled for the second cycle. One reason why the Reports are late is that many of the programs are being revised under the Agency's ongoing Project to Improve Provincial Economic Statistics (PIPES). That project itself has important implications for the assessment and reporting of statistical quality (see Exhibit 3.2).

3.32 We also found that when Program Reports were submitted, they did not always include quality indicators as described in the guidelines for preparing these reports. Finally, we found that some Reports did not cover all the programs for which the reporting manager was responsible. We note, however, that some of the reporting managers are responsible for a great many programs - more than 60 in one case.

3.33 User input and feedback. Statistics Canada uses a variety of mechanisms to obtain user input and feedback on its statistical outputs. These help it to keep its programs relevant and to meet the evolving needs of those who use its products. The mechanisms include:

3.34 The Statistical Data Documentation System. In the early 1980s the Agency implemented the Statistical Data Documentation System (SDDS), an information system to support planning, evaluation and marketing as well as other management functions. The SDDS was designed to contain detailed information on each survey (for example, its content, coverage, design and methods).

3.35 We found that the information entered into the system has been little used for evaluation and planning. In recent years the SDDS has been used mainly to produce the Guide to Statistics Canada's Programs and Products, the first document that many users consult for information about the Agency's products.

3.36 Program managers submit annual updates to the SDDS, but SDDS staff do not verify their completeness. A "data quality" section for each survey is required to describe briefly the most important sources of error, and to provide quantitative measures and qualitative descriptions of data quality. However, we were told that one in every six surveys had no information about data quality in the SDDS. Further, our review showed that where such information was available, it often failed to describe quality in a consistent and meaningful way.

Formal assessment mechanisms are not applied consistently
3.37 We recognize that the Agency has a number of informal processes for assessing and monitoring quality, including personal knowledge and verbal presentations. Nevertheless, in government documents and other sources it has emphasized the importance of formal and interrelated processes (paragraphs 3.28-3.36) for documenting and monitoring performance, including quality.

3.38 As already noted, Program Reports, one of the Agency's key formal mechanisms currently used for evaluating programs, are frequently not submitted on time and may lack some of the required information. Although valuable in their own right, other formal mechanisms (such as internal audits, the processes related to the SDDS, and processes for gathering client input and feedback) have not filled this gap.

3.39 We concluded that the Agency's formal quality assessment mechanisms are not applied consistently. As a result, they do not provide, either individually or collectively, systematic and transparent information on the adequacy of quality management systems and practices in the Agency's statistical programs or on the quality they actually achieve.

3.40 Statistics Canada should ensure that formal quality assessment mechanisms are applied consistently so that they provide systematic information on the adequacy of quality management systems and practices in individual statistical programs and, to the extent possible, information on the quality that they achieve.

Agency's response: We agree with the need to improve compliance related to some of our internal reporting mechanisms. However, we maintain that this lack of compliance has not in any way jeopardized the quality of our statistical output. We have already launched an initiative to improve internal reporting.

Statistics Canada's Self-Assessment of Four Surveys

The Agency assessed the adequacy of quality management systems and practices
3.41 Statistics Canada agreed at the outset of our audit to carry out self-assessments of four major surveys as a one-time project. The objective was to assess the adequacy of quality management systems and practices in the four surveys. The work was carried out under the Quality Assurance Framework that the Agency had documented before undertaking the assessments, and in the light of the priorities and resources that management had allocated to the surveys.

3.42 We selected the surveys jointly with Statistics Canada, based on their significance to users and their public profile, as well as the range of subject matter and data collection techniques they represented. Exhibit 3.3 provides brief descriptions of the selected surveys: the Consumer Price Index, the Labour Force Survey, the Monthly Survey of Manufacturing and the Uniform Crime Reporting Survey.

The self-assessments were well planned and executed
3.43 The self-assessments were conducted in two stages. As a first step, Program Divisions were asked to document the quality management activities for the selected surveys. This recognized the fact that each Program Division possesses a unique set of knowledge and expertise.

3.44 The Agency developed a set of guidelines to help each Program Division describe systematically the way it manages quality in the planning, design, execution and analysis phases of the survey, as well as in its overall management. Exhibit 3.4 shows the types of information the Program Divisions were asked to provide.

3.45 The second step in the self-assessment involved review teams of experienced Statistics Canada staff drawn from outside the program areas. They assessed the information and reached conclusions about the adequacy of quality management systems and practices. The review teams provided a degree of independence to help ensure an objective, fair and equitable review process. A senior-level Steering Committee provided guidance and oversight to the work of the review teams.

3.46 The self-assessment process began in January 1998, with the Uniform Crime Reporting Survey serving as a pilot. The pilot provided feedback and evidence on the viability of the approach. On the basis of this pilot, the self-assessments of the other surveys were initiated in May 1998. Reports documenting the results of the self-assessments were available to us by early September 1998.

3.47 Throughout the process, we maintained close contact with the Agency's Steering Committee and the review teams. We reviewed the planning and execution of the self-assessments, reviewed the supporting documentation and held follow-up discussions with the Agency's review team and program managers as well as interviews with users of statistical products.

3.48 Overall, we found that the self-assessments were carried out in accordance with the methodology and schedule agreed to with our Office. The descriptions prepared by Program Divisions provided the background information needed to understand the programs in general and the management of their quality in particular. The review teams were independent of Program Divisions and knowledgeable about quality management as well as about the surveys they were reviewing. We concluded that the self-assessments were generally well planned and executed.

The conclusions of three of the four self-assessments were consistent with the evidence
3.49 The following excerpts show that the self-assessments reached positive conclusions about the overall adequacy of quality management systems and practices in each of the four surveys:

3.50 In three of the four cases, we found sufficient appropriate evidence to support these overall conclusions. We concluded that the self-assessments provided reasonable assurance that quality management systems and practices are adequate in the Consumer Price Index, Labour Force Survey and Monthly Survey of Manufacturing. In the case of the Uniform Crime Reporting Survey, we believe the weaknesses identified, and the recommendations made, are more important than the self-assessment suggests.

3.51 As a component of the National Justice Statistics Initiative, the purpose of the UCR Survey is to help ensure that the Canadian public has accurate information on the nature and extent of crime in Canada. The UCR Survey relies on administrative data provided by police forces across the country. As a result, Statistics Canada has less direct control over the completeness and the quality of the data than in those cases where it collects survey data itself. The two key factors that affect the quality of the UCR statistics, therefore, are the participation of all police forces in Canada and the integrity of the record systems maintained by them. The self-assessment clearly reports quality weaknesses in both these areas.

3.52 An additional complicating factor is that the UCR Survey relies on two conceptually different surveys that run in parallel. Some police forces report summary data on the number of offences (UCR1). Others provide detailed data on individual criminal incidents, including, for example, the age and sex of victims and offenders, the level of injury, weapons involved, location of incident and dollar values of property and drug crimes (UCR2). Statistics Canada converts the UCR2 data to summary data and combines them with UCR1 to publish crime statistics at the national level. In addition, it publishes the UCR2 data that are available. Although considerably richer for purposes of analysis, the published UCR2 data are limited by the fact that they cover only about half the reported crime in Canada and are not representative of Canada as a whole.
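To make the conversion described above concrete, the sketch below shows how incident-level records might be rolled up into summary counts and combined with summary reports to produce national totals. It is a minimal illustration only: the offence categories, field names and figures are invented, and the Agency's actual processing is more elaborate.

```python
from collections import Counter

# Hypothetical incident-level records (UCR2-style): one record per criminal
# incident, carrying detail that the summary survey does not capture.
ucr2_incidents = [
    {"offence": "break_and_enter", "victim_age": 42, "weapon": None},
    {"offence": "theft", "victim_age": 30, "weapon": None},
    {"offence": "break_and_enter", "victim_age": 55, "weapon": "other"},
]

# Hypothetical summary counts (UCR1-style) from forces that report totals only.
ucr1_counts = Counter({"break_and_enter": 120, "theft": 340})

# Convert the incident-level data to summary counts by offence ...
ucr2_as_summary = Counter(record["offence"] for record in ucr2_incidents)

# ... then combine the two sources to obtain national-level summary counts.
national_counts = ucr1_counts + ucr2_as_summary
print(national_counts)  # Counter({'theft': 341, 'break_and_enter': 122})
```

The incident-level detail (victim characteristics, weapons and so on) survives only for the UCR2 portion, which is why the published UCR2 data are analytically richer yet cover only the subset of police forces that report at that level.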

3.53 In our view, these circumstances place a particular onus on the Agency to inform potential users as clearly as possible about the quality of the UCR data and limitations on their use. We note that while the UCR review team made recommendations to strengthen practices for managing the relevance and interpretability of the survey (see Appendix B), its overall conclusion states that "no major changes are being recommended" and labels the recommendations as "fine tuning". Although the meanings of terms such as "major changes" and "fine tuning" are clearly matters of judgment, we believe the recommendations are more important than the self-assessment suggests and deserve the attention of senior management.

3.54 Appendix B includes excerpts that summarize the self-assessments' conclusions about the four surveys under each of the six characteristics of quality, along with our comments. In most cases, we agree with the review teams' conclusions and with the opportunities for improvement that they identified. In some cases, we point to additional areas where we believe that quality management practices, including the reporting of quality, can be improved.

A wider application of the self-assessment technique merits consideration
3.55 Although Statistics Canada carried out the four self-assessments as a one-time exercise for our audit, our review of the process and the results of the assessments suggest that a wider application of this technique merits consideration - especially as the Agency does not now have other independent assessment mechanisms in place. If the Agency were to apply the technique more broadly, it would need to ensure that assessments were carried out rigorously and that their conclusions were clear.

3.56 The constraints on the self-assessments carried out for this audit would need to be removed. The four self-assessments set out to examine the adequacy of quality management practices explicitly within the context of the priorities and resources allocated to the programs by corporate management. Consequently, the assessments looked only at whether the programs did what could reasonably be expected in prevailing circumstances to manage quality. In addition, they did not set out to assess or to comment on the quality actually achieved in the four surveys.

3.57 We believe that a stronger focus on results would enhance the self-assessment technique. The following kinds of issues would need to be assessed:

3.58 We believe that a wider application of self-assessments would likely also require some training of reviewers in assessment techniques, as well as documented guidance for review teams.

3.59 Statistics Canada should review the potential for a wider application of an enhanced self-assessment technique as one component of the formal processes it uses to assess quality.

Agency's response: We will consider the benefits and costs of applying this technique as one component of our quality management approach.

Reporting Performance to Parliament

The Agency's Performance Report to Parliament provides limited information on quality
3.60 The government reformed its Expenditure Management System in 1995. Part of that reform included providing better planning and performance information to Parliament. Departmental Performance Reports are now tabled in the fall to start the cycle of budget and business planning decisions. Among other things, these reports are to provide information on results achieved, important management initiatives and financial performance for consideration by parliamentarians in the Estimates and Supply process. Reports on Plans and Priorities, which reflect decisions based on performance and government priorities, are tabled in the spring.

3.61 As already noted, the Agency has committed itself to providing Canadians with high-quality statistics that are relevant to policy making and responsive to emerging issues. In many of its policies and processes it gives a central place to the quality of statistics. We therefore expected to find quality-related performance indicators in the Agency's Performance Report for the period ended 31 March 1998, which was tabled in October 1998.

3.62 Under the heading "Information Quality", the Performance Report notes that Statistics Canada uses a wide range of quality assurance practices and conducts intensive "institutional" quality verification of all data releases. Under the same heading, the Report notes that indicators of data quality are included in all publications, although our review, and the Agency's own assessments, showed that this is not always the case (see Informing Users About Data Quality and Methodology).

3.63 The Performance Report includes some descriptions of processes used to keep programs relevant - including external advice and user feedback - and provides examples of changes made as a result. With respect to timeliness, the Report shows the time elapsed between the reference period and the release dates for six major surveys; but it provides no rationale for selecting only these surveys from among some 360 statistical programs. For 28 selected outputs in "major subject areas", the Report includes the frequency of publication (for example, monthly or quarterly) and indicates whether they were released on schedule in 1997-98.

3.64 The only other quality-related indicators in the Report are indicators of accessibility. They provide some information about the growing use of the Agency's Internet site, as well as changes to the statistical information available there.

3.65 Overall, we concluded that the Performance Report contains only partial information about the Agency's actual performance in terms of the six characteristics it has identified as central to quality. In particular, the Report provides no performance information at all about the accuracy, interpretability or coherence of statistics.

3.66 We were told that the process used to gather information for the Performance Report was informal and not linked explicitly to other quality-related initiatives. Program Reports were consulted to some extent. However, no use was made of the Statistical Data Documentation System, which could be a key database to generate quality-related indicators for external reporting.

3.67 Statistics Canada should improve the coverage and content of information on statistical quality in its annual Performance Report to Parliament by drawing on quality information available from internal assessment and reporting systems.

Agency's response: We will review the content of the annual Performance Report to Parliament to improve the way in which information on statistical quality is presented. However, we do not believe that it is possible to produce simple summary quality measures across a wide variety of programs in a way that is useful and meaningful for Parliament.

Informing Users About Data Quality and Methodology

3.68 All statistics are, to some extent, estimates of the reality they seek to convey. Therefore, they have to be used in full awareness of their strengths and limitations. Unlike users of many products and services, however, users of statistics cannot readily assess all of the important quality characteristics of the data produced by Statistics Canada. For the most part, users have to rely on the Agency's integrity and professionalism, as well as what it tells them about data quality and the methodology used to collect and compile the data.

3.69 Statistics Canada recognizes the need to provide potential users with information about data quality and the concepts, definitions and methods used, so they can determine whether the statistics fit their purposes and can make informed use of them. Since 1978, the Agency has had a Policy on Informing Users of Data Quality and Methodology.

A 1990 internal audit found inconsistent practices
3.70 As we have noted, in 1990 the Agency carried out an internal audit of its compliance with this policy on informing users. In a sample of 20 "major" and 50 "other" surveys, the audit identified many instances of non-compliance.

3.71 The internal audit report made a number of important observations and recommendations on Statistics Canada's dissemination of quality-related information to users. For example, the report stated that:

The Agency's Policy is clear and well structured
3.72 In 1992 the Agency updated its Policy on Informing Users of Data Quality and Methodology and, in doing so, implemented many of the recommendations of the 1990 internal audit. The key elements of the current Policy are set out in Exhibit 3.5. We assessed the Policy and found it to be well structured. It sets out clearly the expectations for program managers, both mandatory and discretionary.

3.73 We compared Statistics Canada's Policy with the approaches taken by reputable statistical agencies in other countries. We found that while all of these agencies recognized the need to inform users of data quality and methodology and had taken steps to do so, not all of them had documented policies in place. The policies that we saw varied in their structure and content. Although some were quite similar to that of Statistics Canada, none, in our view, were more advanced.

The Agency's practices in informing users continue to be inconsistent
3.74 As noted in paragraph 3.29, the Agency decided not to carry out the follow-up of the 1990 internal audit that had been planned for 1998. We therefore carried out a limited test of its disclosure practices.

3.75 We assessed a selection of 10 products to determine the nature and extent of disclosure about data quality and methodology in both hard-copy publications and electronic media. We used criteria based on the 11 mandatory minimum requirements specified in the Agency's current Policy on Informing Users of Data Quality and Methodology (see Exhibit 3.5).

3.76 Our test showed that disclosure of information on data quality and methodology in the Agency's products did not always comply with the mandatory minimum requirements of the Policy. Therefore, users are not always appropriately informed of the strengths and limitations of statistics. Disclosure was inconsistent across products. Specifically, we noted the following:

3.77 We found also that disclosure was inconsistent across media of dissemination. In particular, information available on or through the Agency's Internet site showed weaker compliance than printed publications.

3.78 To ensure that its Internet site complies with its policy on informing users, the Agency recently began to develop an Integrated Meta Database to replace the SDDS. It is to contain information on the data quality, concepts and underlying methodology of each Agency survey, and is to be accessible to all Internet users. Because the new database was still being developed at the time of our audit, we were unable to assess the nature and extent of the information it contains.

3.79 Our findings on the lack of information and the inconsistencies in disclosure are consistent with the Agency's findings on "interpretability" in the four self-assessments it carried out (see Appendix B).

3.80 Statistics Canada's Methods and Standards Committee has functional responsibility for the Policy on Informing Users of Data Quality and Methodology, and is mandated to monitor its implementation. However, we found that the Committee has not done so, nor has it produced the periodic reports on the state of compliance that the Policy requires. We noted, too, that while program managers can ask the Methods and Standards Committee for exemptions from compliance with the Policy, no such exemptions have been requested or granted.

3.81 The Agency's Guide to Statistics Canada's Programs and Products includes some information on data quality and methodology. However, our review showed that practice with respect to the inclusion of such information is inconsistent across programs. In some cases no information is provided; in others, information on the source and definition of possible errors is included. In still others, actual indicators of data quality are provided. Although the policy on informing users does not apply to the Guide, including information on data quality and methodology could help the many users who would consult the Guide first when seeking information about the Agency's products.

3.82 Officials in the other statistical agencies we visited indicated that their own quality disclosure practices were also inconsistent. Most officials acknowledged that they could, and should, do better. Disseminating statistics by means of new technologies, such as the Internet and compact discs, was widely recognized as a particular challenge.

3.83 Statistics Canada should ensure that its Policy on Informing Users of Data Quality and Methodology is applied consistently across products and dissemination media.

Agency's response: Agreed.

3.84 In informing users of data quality and methodology, Statistics Canada should:

Agency's response: Agreed.

An Integrated Approach to Managing Quality

Many building blocks are in place
3.85 After-the-fact assessments of quality cannot replace measures that build quality into statistical programs - appropriate design and execution, the use of suitably qualified and motivated staff, the effective management of resources and activities, and a corporate culture of integrity. We support the Agency's emphasis on building quality in through these and other means, and its approach to giving managers discretion to use the best techniques available to pursue quality in the Agency's diverse statistical products.

3.86 However, we believe there is a need for each statistical program to demonstrate regularly - through self-reporting, some form of independent assessment, or both - that it has met design parameters or quality targets, such as sample size and response rate (see paragraphs 3.40 and 3.59). The frequency and depth of such reporting would need to be decided with due consideration to the cost, complexity and importance of the statistical programs concerned. Without systematic quality assessments, it is questionable whether the Agency can assure itself, or others, that the systems and practices it uses to build in quality are effective.
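As a minimal sketch of the self-reporting contemplated here, a program could compare achieved values against its design parameters each cycle; the parameter names and figures below are hypothetical, not the Agency's actual targets.

```python
# Hypothetical design parameters for one statistical program, alongside
# the values achieved in one survey cycle. All names and numbers invented.
design_targets = {"sample_size": 54000, "response_rate": 0.90}
achieved = {"sample_size": 52750, "response_rate": 0.93}

for parameter, target in design_targets.items():
    status = "met" if achieved[parameter] >= target else "NOT met"
    print(f"{parameter}: target {target}, achieved {achieved[parameter]} ({status})")
```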

3.87 The Agency has many quality-related policies, guidelines and systems that could become the basis for effective quality assessment and reporting. Many of the necessary building blocks are already in place, and recent experience with the self-assessment approach may point to a useful addition. However, we believe that the Agency needs to reshape some of these building blocks and reorient others to make them more cohesive. A basis for such integration could be the newly documented Quality Assurance Framework, which sets out what the Agency considers to be the key characteristics of quality in statistics, and describes the processes already in place to manage quality.

Better integration and documentation are needed
3.88 The Agency's quality-related initiatives were developed over a long span of time by different units in the organization. For example, the Statistical Data Documentation System was developed by the Standards Division; the Quality Guidelines by a team of professionals drawn from various units; the Policy on Informing Users of Data Quality and Methodology by the Methods and Standards Committee; and the Performance Report and guidelines for Program Reports by the Corporate Planning Division. Although many of these initiatives were developed under the purview of the Methods and Standards Committee, the Agency has not had a focal point with specific responsibility to ensure their co-ordination and integration. Nor does it have such a focal point now.

3.89 We noted that definitions and requirements relating to quality of statistics are not always consistent. For example, the term "quality" itself has taken on new dimensions over time, expanding from the more traditional meaning of accuracy to one that reflects an explicit user orientation. Today, quality is defined or described differently in various policies, guidelines and systems.

3.90 We also noted that internal and external reporting requirements are not co-ordinated and sometimes overlap. Besides being required to disclose information on data quality and methodology to users of each of their products, program managers are currently required to update the Statistical Data Documentation System annually, prepare Program Reports every two years and provide input to the Agency's annual Performance Report to Parliament. Because these different reporting requirements are not co-ordinated, each one leads to additional work and can generate new streams of quality-related information. This may contribute to the difficulties that program managers and the Agency itself face in satisfying all the reporting requirements adequately.

3.91 The Agency's Policy on Informing Users of Data Quality and Methodology establishes expectations for managers that are stronger than those established by the Quality Guidelines. The former clearly sets out "minimum standards" for disclosure by program managers. The latter - a "collection of methods, procedures and practices that govern the pursuit of quality objectives" - includes no mandatory requirements. As already noted, the Quality Guidelines give program managers latitude to select and implement whatever practices they consider appropriate in their particular circumstances.

3.92 The fact that standards for informing users are mandatory, while the guidelines for documenting quality are discretionary, may contribute to program managers overlooking or violating mandatory minimum requirements to inform users. Both our audit and the Agency's self-assessments showed that many of the programs examined did not comply fully with the Policy on Informing Users of Data Quality and Methodology.

3.93 We believe that if quality were defined more consistently, and if quality assessment, documentation and reporting were better integrated, the burden on program managers and the Agency could be reduced. The Integrated Meta Database (the planned successor to the Statistical Data Documentation System) could be used consistently to document, for each survey, key quality-related decisions and the rationales for them, as well as quality indicators (as the Agency's Quality Guidelines already suggest). The Database could then become a central repository of information on quality to support consistent and effective internal reporting and external disclosure.
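To illustrate the kind of per-survey record such a repository might hold, consider the sketch below. The field names and contents are our own invention for illustration; they are not the Agency's design for the Integrated Meta Database.

```python
# A hypothetical metadata record for one survey, sketching the kinds of
# quality-related fields a central repository might hold.
survey_record = {
    "survey": "Hypothetical Monthly Survey",
    "concepts_and_definitions": "what the survey measures and how terms are defined",
    "methodology": {"design": "stratified sample", "collection": "telephone interview"},
    "quality_decisions": [
        {"decision": "reduced sample in 1997", "rationale": "budget constraint"},
    ],
    "quality_indicators": {"response_rate": 0.92, "cv_key_estimate": 0.03},
}

# Internal reports and external disclosure could then draw on the same
# record, instead of generating separate streams of quality information.
print(survey_record["quality_indicators"])
```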

3.94 We believe that co-ordinated quality-related policies, guidelines, systems and processes, and a more disciplined approach to documentation, would help provide more systematic information about quality to support management and reporting. A more disciplined approach means better, not necessarily more, documentation.

3.95 Statistics Canada should make its quality-related policies, guidelines and systems more coherent and cohesive by:

Agency's response: Agreed.

3.96 Statistics Canada should co-ordinate the development of the Integrated Meta Database with other quality-related initiatives and take steps to ensure the ongoing completeness and reliability of the Database.

Agency's response: Agreed. The integration of quality-related information was already recognized as one objective of the development of the Integrated Meta Database, and is under way.

3.97 Statistics Canada should assign to a corporate focal point the responsibility for promoting an integrated, consistent approach to developing and implementing quality-related initiatives throughout the Agency, including the assessment and reporting of quality.

Agency's response: We agree with the objective underlying this recommendation. However, we believe that the most effective arrangement is to have the maintenance of quality as a prime responsibility of every line manager. We would be concerned about introducing any organizational arrangement that suggests to program managers that "someone else" is looking after quality issues. We will consider this issue further.

Conclusion

3.98 In the course of our audit we noted Statistics Canada's commitment to producing high-quality statistics and continuing to improve quality. We noted, too, that the Agency is widely respected among its peers, and has an international reputation second to none for independence, innovation and quality. Indeed, many employees of other well-regarded statistical agencies whom we interviewed indicated that they felt complimented that we would look to their agencies as benchmarks for Statistics Canada, when in fact they were striving to emulate it.

3.99 Statistics Canada has in place a wide range of policies and processes to ensure the ongoing relevance of its programs, to build quality in through design, execution and the use of new technologies, and to maintain an environment that encourages a concern for quality throughout the organization. However, we found that its achievement of quality is not sufficiently assessed and reported either within or outside the Agency.

3.100 Its formal quality assessment mechanisms are not sufficiently co-ordinated and are not applied consistently. Program managers do not always comply with reporting requirements. As a result, these mechanisms do not provide systematic, transparent information about the adequacy of quality management systems and practices in the Agency's statistical programs or on the quality they actually achieve.

3.101 However, the Agency's existing policies, guidelines and systems can become the basis for more effective quality assessment and reporting practices. Integrating them better and taking a more disciplined approach to documentation would improve the nature and extent of information available to support internal decision making and assurance, as well as external reporting to Parliament and the public on performance, and to users on data quality and methodology.

3.102 The four self-assessments Statistics Canada carried out for this audit were well planned and executed. All reached positive conclusions about the adequacy of quality management in the four surveys. For three of the four (the Consumer Price Index, the Labour Force Survey, the Monthly Survey of Manufacturing), we concluded that the self-assessments provided reasonable assurance that quality management systems and practices are adequate. In our judgment, the evidence presented in the self-assessment of the Uniform Crime Reporting Survey could have led to a stronger conclusion about the weaknesses identified and the importance of recommended improvements.

3.103 The quality of statistics figures prominently in Statistics Canada's effectiveness and in its commitments to Parliament for results. We therefore expected to find quality-related performance information in its most recent Performance Report to Parliament, tabled in October 1998. Overall, we concluded that the Report provided only limited information on the Agency's performance with respect to the quality of the statistics that it produces.

3.104 Because statistics have to be used in full awareness of their strengths and limitations, we assessed Statistics Canada's policy and practices for informing users about data quality and methodology. We concluded that the Agency's Policy on Informing Users of Data Quality and Methodology is well structured and sets out clear expectations for program managers. However, the Agency's practices in informing users are inconsistent, and users are not always appropriately informed about the strengths and limitations of the statistics.


About the Audit

Objectives

Our audit objectives were to determine whether:

Scope and Approach

We examined various mechanisms used by the Agency to assess quality in individual statistical programs and to report results externally. We audited the Agency's self-assessments of four selected programs: the Consumer Price Index, the Labour Force Survey, the Monthly Survey of Manufacturing and the Uniform Crime Reporting Survey. We also reviewed the Agency's policy and practices for informing users about data quality and methodology, and reporting to Parliament on performance.

In addition to auditing Statistics Canada's self-assessments to determine whether we could rely on their conclusions, we reviewed documents and interviewed Agency staff. We also interviewed key users of statistics in the federal and provincial governments and the private sector. In addition, we compared the Agency's approach to managing the quality of statistics with practices in a number of respected statistical agencies in other countries - including Australia, the Netherlands, Sweden, the United Kingdom and the United States.

Criteria

Audit Team

Assistant Auditor General: Maria Barrados
Principal: Henno Moenting
Director: Robert W. Chen

Doreen Deveen
Werner J. Müller-Clemm

For information, please contact Henno Moenting.


Appendix A

Approaches to Quality Assessment and Reporting in Selected Statistical Agencies

United Kingdom's Office for National Statistics
The United Kingdom Office for National Statistics (ONS) has an internal review process through which quality practices of major business surveys are audited periodically. A member of the team from outside the ONS helps ensure that the review is objective.

For household surveys, data collection activities are subject to market testing. As part of the process, quality targets are clearly specified and the achievements by contractors (internal and external) are monitored. The ONS equivalents of the Labour Force Survey and the Consumer Price Index were two examples shown to us.

In addition, a separate internal audit process helps assure the quality of the systems that produce statistical outputs and identifies any general risks associated with them for corporate action.

In its annual Compliance Plan (submitted to Parliament), the ONS reports on changes in the response burden on businesses, by individual survey and in total. In the same document, planned and actual response rates for each survey are reported, providing some indication of data quality.

The ONS launched its StatBase in October 1998. It includes a meta-database in which users can find information about the data quality and methodology of each survey. This initiative, we understand, is based on Statistics Canada's Statistical Data Documentation System (SDDS).

Statistics Sweden
In Statistics Sweden, each survey must complete an annual "quality declaration" based on a 24-item self-assessment checklist. The objective is to determine whether the quality of the survey has improved, deteriorated or remained the same, in each of four characteristics: content (relevance), accuracy, timeliness and coherence. Division management reviews the results of the self-assessments in its area of responsibility.

The results of these self-assessments are summarized for the agency's management. They are also included in the agency's annual report to Parliament. The Swedish National Audit Office audits the report and certifies the reliability of information on quality. Statistics Sweden's self-assessment procedure is currently being revised so as to complement it with continuous measurement of key process variables and specific quality indicators.

Statistics Netherlands
Statistics Netherlands adopted an overall quality program in 1996. One of the objectives was to introduce quality systems in all statistical departments. Provisional guidelines were issued in 1997. At the same time, a system of "statistical auditing" was set up. The aim of this system is to check how quality management in statistical departments is functioning and how the quality of statistical products and processes (including the procedures) can be improved. Another objective is to document best practices and to incorporate them into the guidelines for quality management systems.

All audits are carried out by a pool of about 25 agency staff, drawn from various divisions and working part-time on statistical auditing. They receive training from a private consulting firm with experience in auditing and quality management.


Appendix B

Conclusions of Statistics Canada's Self-Assessments

Except where we make specific comments, we concur with the following conclusions of the self-assessments.

Consumer Price Index (CPI)

Relevance . "The CPI has continued to maintain a high degree of relevance in the context of Canada's ever-changing economy, in spite of adverse budgetary conditions and the accompanying major reduction in the sample.

Accuracy . "The constraints imposed by budget reduction have led to sample reductions in the program. However, these reductions were implemented without compromising the accuracy of the most widely used measures."

Timeliness . "The monthly measure of the CPI is produced in a timely fashion."

Accessibility . "Current information on the CPI is freely and widely available."

Interpretability . "The CPI program provides information on concepts and definitions and on issues of data quality through various channels, both through personal contact and through a series of publications designed to meet the needs of various users."

We agree with this statement. However, we noted that quarterly publications do not consistently refer the reader to sources where such information can be found.

Coherence . "The conceptual framework and classifications used in the CPI reflect the needs of its clients."

Labour Force Survey (LFS)

Relevance. "The LFS program has adequate measures to assure relevance, including use of advisory committees, direct client-stakeholder feedback, and careful monitoring and response to media coverage. The program incorporates this feedback through decennial redesign, ongoing quality assurance committees, and a committee to manage major developments between decennial redesigns."

Accuracy . "The LFS is exemplary in terms of regularly monitoring the accuracy of data, primarily through the Data Quality Committee, which meets each month prior to the release of the survey results. A whole range of quality measures is monitored, including coefficients of variation, non-response, slippage, and coding error rates. Information products are reviewed in compliance with bureau policy."

Regional rates of unemployment are used, pursuant to the Employment Insurance Act, to determine the number of hours of work necessary to qualify for employment insurance benefits and the number of weeks of benefit. The LFS allocates its sample to achieve accuracy targets (coefficients of variation - CVs) for statistics on the unemployment rate in each Employment Insurance Region. The target CV, as established by a long-standing agreement between the Agency and Human Resources Development Canada (HRDC), is 15 percent. The Agency told us that occasionally the accuracy targets are not attained. When this happens, the Agency, in agreement with HRDC, may reallocate sample sizes to achieve the desired targets.
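For illustration, a coefficient of variation can be checked against the 15 percent target as sketched below. The sketch assumes simple random sampling, which the LFS does not actually use (its design is far more complex), so the formula and figures are purely illustrative.

```python
import math

def cv_of_proportion(p_hat: float, n: int) -> float:
    """Coefficient of variation of an estimated proportion under simple
    random sampling: CV = SE / estimate, where SE = sqrt(p(1 - p) / n)."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return se / p_hat

# Hypothetical Employment Insurance Region: an estimated unemployment rate
# of 8 percent based on a regional sample of 900 respondents.
cv = cv_of_proportion(0.08, 900)
print(f"CV = {cv:.1%}")  # CV = 11.3%
print("target met" if cv <= 0.15 else "consider reallocating sample")
```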

Our review showed that the LFS quality reports did not include the CVs for the unemployment rates in each region. The Agency informed us that in the future these accuracy measures will be formally reviewed in the LFS quality reports, and that HRDC, the user of the information, will be regularly informed.

Timeliness. "...the LFS results are released in very timely fashion - two weeks after the end of the survey collection period."

Accessibility. "Information on program outputs is widely available to users and the public through Statistics Canada's Daily and the LFS release on the Statistics Canada's world-wide web site, through copies of the publication available free of charge at Regional Offices and depository libraries, and via wide media coverage of the highlights of the survey results."

We agree with this conclusion, but note that users consulted by the Agency have indicated some concerns about accessibility, including the cost of accessing labour market information. During our audit, several users we interviewed also expressed concerns about the high cost of LFS products.

Interpretability. "LFS products conform to the Policy on Informing Users of Data Quality and Methodology."

The review team told us that although individual products may not conform to the requirements of the Policy on Informing Users of Data Quality and Methodology, collectively they are in compliance because they make reference to other documents. With the December 1998 release of a publication on the methodology of the LFS, users now have access to a wide range of quality indicators (for example, vacancy rates, non-response rates, design effects, sample sizes, sampling errors).

Coherence . "The LFS has taken adequate measures to ensure coherence, including use of international standards for definition of key labour market variables, such as unemployment, employment and the unemployment rate. The program also uses standard classification systems for industry, occupation and geography. To further ensure coherence, the LFS program undertakes, in concert with other program areas, analysis of LFS estimates with those from other sources."

We concur generally with the conclusion about coherence. However, two major users told us that the Agency could do more to improve the continuity of the LFS time series when technical changes, such as changes to classification systems, are introduced.

There have been questions about the international comparability of unemployment rates. Although most Western countries follow the International Labour Organization (ILO) guidelines in measuring unemployment, each country may build into its surveys special features for its own needs. For example, while both countries follow the ILO guidelines, Canada includes "passive job seekers" in the unemployed but the United States does not. It is important to take account of such measurement differences when comparing unemployment rates.
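A worked example with purely hypothetical figures shows why the definitional difference matters: counting passive job seekers as unemployed adds them to both the count of unemployed and the labour force, raising the measured rate.

```python
# Hypothetical figures, in millions, for illustration only.
employed = 14.00
unemployed_active = 1.10   # actively searched for work
passive_seekers = 0.15     # want work but did not actively search

# U.S.-style treatment: passive job seekers are outside the labour force.
rate_excluding = unemployed_active / (employed + unemployed_active)

# Canadian-style treatment: passive job seekers are counted as unemployed.
rate_including = (unemployed_active + passive_seekers) / (
    employed + unemployed_active + passive_seekers
)

print(f"excluding passive job seekers: {rate_excluding:.1%}")  # 7.3%
print(f"including passive job seekers: {rate_including:.1%}")  # 8.2%
```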

A discussion of this definitional difference and the resulting unemployment gap between the two countries was featured in the November 1998 issue of the Labour Force Update. LFS management told us that, in the future, key LFS publications would warn users of the dangers of international comparison of unemployment rates.

Monthly Survey of Manufacturing (MSM)

Relevance. "There is no evidence that the absence of a systematic, on-going external mechanism to ensure relevance is adversely affecting the MSM." Nevertheless, the review team notes that "the MSM lacks a systematic communications link with its client community" and suggests a "more formal dialogue mechanism."

Accuracy. "It is the judgement of the [review team] that, given its budget, the accuracy aspect of quality is very well managed in this survey." The self-assessment goes on to state, "The survey produces estimates of high standard as judged by measures of quality that are observed...response rates are usually in the nineties which compares favorably to the best sample surveys."

Discussions with MSM staff confirmed that with the stratified sample used in this survey, a small number of data sources represent a very large proportion of the value of the measured variables. In these circumstances, a more useful indicator of data quality than response rate might be, for example, the coverage achieved in terms of the value of shipments.
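A minimal sketch with invented figures illustrates the distinction being drawn: in a skewed business population, the unit response rate and the share of total shipment value covered by respondents can tell quite different stories.

```python
# Hypothetical shipment values (in $ millions) for sampled manufacturers,
# paired with whether each establishment responded. A few large
# establishments dominate the total, as is typical of business populations.
sample = [
    (950.0, True), (40.0, True), (25.0, False), (12.0, True),
    (8.0, False), (5.0, True), (3.0, False), (2.0, True),
]

unit_response_rate = sum(1 for _, responded in sample if responded) / len(sample)
value_covered = sum(value for value, responded in sample if responded)
value_total = sum(value for value, _ in sample)

print(f"unit response rate: {unit_response_rate:.0%}")                # 62%
print(f"value-weighted coverage: {value_covered / value_total:.0%}")  # 97%
```

Here most large establishments respond, so value coverage far exceeds the unit rate; the reverse can equally occur, which is why the value-based indicator suggested above can be more informative than the response rate alone.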

Timeliness. "It is the judgement of the [review team] that the MSM's timeliness is acceptable and, while there should be a continual striving to improve it, it should not be at the expense of a deterioration in accuracy."

We have no reason to question the judgment of the review team in this regard. However, we note that the report provides no evidence to support conclusions about timeliness versus accuracy.

Accessibility. "...the MSM information is readily available through Statistics Canada's Daily publication, in printed copy and through electronic media."

Interpretability. "The survey largely complies with the agency's Policy on Informing Users of Data Quality and Methodology."

Despite this conclusion, the review team recommends the inclusion of a new "Concepts and Methods" section in the key publication that "more clearly explains the data quality issues" and that includes tables and graphs to "illustrate the size and impact of revisions". We agree with this recommendation.

Coherence. "In the opinion of the [review team] the MSM does as good a job on the coherence dimension of quality as can reasonably be expected from a monthly survey."

Uniform Crime Reporting Survey (UCR)

Relevance. "The UCR program has in place a superior means for both client and respondent liaison, consultation and feedback to ensure the relevance of its data and products."

The review team goes on to recommend, "The establishment of an advisory committee on analytical studies and requirements be considered to enhance the perception of independence of the UCR and to broaden the scope and extent of data use." We agree with this recommendation.

Accuracy. "The data are at most as good as the information available within the policing system itself...effective monitoring and assessment of quality needs an examination of data at the level of police forces. This is done in part through the editing process. Some of the primary checks and analysis at this level are performed by the respondents themselves, who "sign off" on the results. (But this process) does not necessarily assure accuracy and it does not permit an independent assessment of accuracy. It does not account for differences in reporting and enforcement practices."

In 1997 Statistics Canada distributed a self-audit manual to police forces across the country. This initiative was based on the recommendation of a 1989 study, which found that police forces needed to conduct periodic audits of their internal systems to ensure accurate reporting of crime statistics. The Agency's follow-up with a number of major police forces showed that the forces did not have the resources to conduct the audits. In the meantime, there is continuing uncertainty about the quality of the information that is available within the policing system and reported to Statistics Canada. We were told that an internal proposal has recently been submitted within the Agency, requesting resources to perform a data quality audit with selected major police forces.

Timeliness. "Given the nature of the survey, there is strong support for concluding that the data are as timely as is reasonable to expect under the current operational constraints...Any significant improvement in timeliness would have to come in respondent related activities."

Accessibility. "The practices used to disseminate and ensure accessibility of UCR data are consistent with Agency practices in general, and with requirements and views of major clients."

Interpretability. "In all cases (including the releases in the Daily), information on the concepts and definitions is given and amply meets the requirements and the intent of the Policy on Informing Users of Data Quality and Methodology..."

With respect to information on data quality (as distinct from information on concepts and definitions), the review team makes three recommendations for improvement: (1) "An assessment of the impact of the coverage limitations of the UCR2 might be helpful (for example, by examination of aggregate data differences between the UCR1 and UCR2 populations)"; (2) "Cross-references in all products to the report "Canadian Crime Statistics" ... for details on methodology and data quality would also be useful"; and (3) "Response rates and an assessment of the effect of imputation by major variable should be provided." In view of the nature of this survey, we agree with these recommendations and believe them to be important.

Coherence. "There are clear and convincing attempts to provide a broad picture of criminal incidents, victims and perpetrators - integration of aggregate data across the surveys within the Police Services Program, integration of data from a variety of statistical sources (including some international data) and development across statistical programs."