
Depository Services Program

Government Documents Reference Service in Canada: Implications for Electronic Access

Juris Dilevko and Elizabeth Dolan

Faculty of Information and Media Studies
University of Western Ontario
March 1999

Executive Summary

Providing Canadian citizens with free, equitable, timely, and uncomplicated access to federal government information is a worthy national goal. Now that official publications in print form are gradually being replaced by electronic documents increasingly available on the World Wide Web, a systematic examination of the capability of federal depository libraries to provide permanent access and quality reference service to the Canadian public is essential. The study reported here was funded by the Depository Services Program (DSP), Public Works and Government Services Canada, and undertaken by researchers in the Program of Library and Information Science, Faculty of Information and Media Studies, University of Western Ontario.

For this investigation, a set of government documents-related questions was developed and tested for use in an unobtrusive evaluation of full and selective depositories in academic and public libraries located across the five geographic and socio-cultural regions of Canada. Results are based on the analysis of data collected by paid proxies who asked 15 questions a total of 488 times at 104 libraries in some 30 census metropolitan areas as defined by Statistics Canada.

The major conclusions of the study are summarized as follows:

    • Accuracy of answers to test questions. Overall, depository reference staff provided complete answers to test questions 29.3% of the time, a disappointing result. When complete and partially complete answers are taken together, the success rate climbs to 42.2%. There were notable differences among the four types of depositories in providing complete answers, and service levels differed as well according to region and census metropolitan area. Academic full depositories achieved the highest rate of success, followed by public full depositories. Academic and public selective libraries did less well. Among the regions, Ontario performed best, followed by British Columbia and the Atlantic Provinces, then by the Prairie Provinces and Québec. Census areas with populations over one million, or between one quarter and one half million inhabitants, offered the best opportunity for complete or partially complete answers to government documents-related questions. No or incorrect answers were given at a rate of about 38%. The most common explanation was that reference staff did not know how to find the needed information.
    • Degree to which questions were referred. Referrals comprised one-fifth of responses to questions posed by proxies. Half were to government departments and agencies, 36% to other libraries, and 14% to external non-governmental agencies or commercial establishments.
    • Use of electronic sources (WWW). Despite the widespread availability of Canadian federal government Web sites, findings show that print materials were by far the largest single source used to answer questions in this study, even though data suggest that Web-based information sources provide easily located answers to test questions.
    • Difficulty of the questions. Legislative questions were answered at a significantly higher rate than were queries dealing with the executive branch of the federal government. The rate for document-retrieval questions was higher than for those designated as data-retrieval, and data-retrieval questions were referred at more than twice the rate of document-retrieval requests. As the amount of time spent with proxies increased, the number of complete and partially complete answers went up significantly. Results suggest that, with enough time and resources, reference staff are able to achieve a high rate of complete and partially complete answers.
    • Walk-in versus telephone questions. More complete and partially complete answers were given when questions were asked in person, especially in full depositories, while more referrals were given in response to telephone questions. Selective depositories supplied answers to telephone and walk-in questions at roughly equal rates.
    • Level of knowledge of official sources. Reference staff rely much more heavily on print materials than on electronic sources, and use of Web resources is low. When the Web is used, findings suggest that library staff are most familiar with information available on the Parliamentary and DSP sites; there is evidence of limited knowledge of the extensive range of executive branch information on the Internet.

Results of this study reinforce the findings of an earlier examination of federal depositories in Canada. Respondents to the Dolan and Vaughan (1998) study acknowledged the potential of the Internet, but, among other reservations, pointed to the lack of time and resources to provide adequate training for library staff. The Depository Services Program could provide invaluable assistance in this area if it is able to mount effective training programs and press for the development of improved "metadata," indexing, and archiving of its Web-based information, as well as better search engines and enhanced subject access.


TABLE OF CONTENTS

Executive Summary
Figures
Tables
Acknowledgements
PART I - INTRODUCTION
PART II - KEY RESEARCH QUESTIONS
PART III - METHOD
  • Overview
  • Geographic Considerations
  • Types of Depository Libraries
  • Choice of Census Metropolitan Areas
  • Categorization of Questions
  • Survey Packages
  • Recruitment and Training of Proxies
  • Issues of Anonymity and Informed Consent
  • Coding Procedures
PART IV - DEVELOPMENT OF SAMPLE QUESTIONS
PART V - LIMITATIONS OF THE STUDY
PART VI - DATA ANALYSIS AND RESULTS
  • Profile of Question Distribution
  • Variation in Service Levels by External Factors
  • Variation in Service Levels by Internal Factors
  • Variation in Service Levels by Question Variables
  • Referrals and No/Incorrect Answers
  • Sources Used to Answer Questions
  • Efficacy Rates and Source Types
PART VII - NATURE OF PROXY-ADMINISTERED REFERENCE QUESTIONS
  • Relative Degree of Difficulty
  • Question Analysis
PART VIII - CONCLUSIONS AND RECOMMENDATIONS
PART IX - BIBLIOGRAPHY



FIGURES

  1. Geographic Distribution of Depository Libraries where Questions were Asked
  2. Distribution of Type of Depository Libraries where Questions were Asked
  3. Population Distribution of Census Areas of Depository Library Locations
  4. Frequency Distribution of Relative Difficulty of Questions
  5. Daily Distribution of Questions
  6. Time of Day when Questions were Asked
  7. Distribution of Responses Received
  8. Responses Received by Type of Depository Library
  9. Types of Responses Received by Region
  10. Types of Responses by Size of Census Metropolitan Area
  11. Responses Received by Day of Week
  12. Responses Received by Whether Library Had Separate Area for Government Documents Reference Service
  13. Responses Received by Degree of Busyness
  14. Distribution of Time Spent on Each Question
  15. Responses Received by Time Spent with Patron
  16. Responses Received by Method of Question Delivery
  17. Responses Received by Subject Matter of Reference Question
  18. Responses Received by Class of Reference Question
  19. Types of Referral
  20. Reasons for No/Incorrect Answer
  21. No/Incorrect Answers and Referrals by Type of Library
  22. No/Incorrect Answers and Referrals by Geographic Region
  23. Distribution of Sources Used to Answer Questions
  24. Sources Used by Type of Library
  25. Major Sources Used By Region
  26. Use of Sources by Size of Census Metropolitan Area
  27. Individual Questions by Source Use
  28. Sources Used by Subject Matter of Reference Question
  29. Sources Used by Class of Reference Question
  30. Use of Sources by Method of Question Delivery
  31. Responses Received by Type of Source
  32. Major Sources Used by Time Spent Using Source
  33. Responses Received by Individual Question
  34. Rank Descending Order of Questions which were Completely Answered
  35. Rank Descending Order of Questions which Received Most No/Incorrect Answers

TABLES

  1. Comparison of Regional Population Distribution and Distribution of Questions
  2. Coding Scheme for Reference Questions Developed by John V. Richardson, Jr.
  3. Modified Richardson Coding Scheme
  4. Preliminary Questions Given to Pre-testers
  5. Results of Pre-Tested Questions
  6. Final List of Questions
  7. Impact of Separate Area on Complete Answers
  8. Impact of Busyness on Type of Answers
  9. Method of Question Delivery and Depository Library Types
  10. Time Spent on Reference Questions
  11. Comparison of Legislative and Executive Questions
  12. Type of Answer Received by Source
  13. Average Time Spent Finding Answers
  14. Types of Answers Received by Type of Library for CRTC Question
  15. Types of Answers Received by Type of Library for Book Question
  16. Types of Answers Received by Type of Library for Barley Question
  17. Types of Answers Received by Type of Library for Lyrics Question
  18. Types of Answers Received by Type of Library for Fuels Question
  19. Types of Answers Received by Type of Library for Firearms Question
  20. Types of Answers Received by Type of Library for Audgen Question
  21. Types of Answers Received by Type of Library for Crime Question
  22. Types of Answers Received by Type of Library for Magdalen Question
  23. Types of Answers Received by Type of Library for Rules Question
  24. Types of Answers Received by Type of Library for Refugee Question
  25. Types of Answers Received by Type of Library for Garbage Question
  26. Types of Answers Received by Type of Library for Photo Question
  27. Types of Answers Received by Type of Library for Fisheries Question
  28. Types of Answers Received by Type of Library for Africa Question

ACKNOWLEDGEMENTS

As with any research project, this study could not have been completed without the assistance of many individuals. We would, first, like to thank our research assistant, Ms. Moya Mason, who provided indefatigable and patient help. Kevin Carrothers was also of timely assistance at key moments. We would also like to thank Mr. Bruno Gnassi, Manager, Depository Services Program, both for the financial support necessary to conduct a project of this scope and for his unflagging enthusiasm throughout the many stages of this study. Comments from three referees, all of whom are senior management staff at libraries in Canada, were invaluable in clarifying important issues raised in the study. Finally, we would like to thank the proxies who asked unobtrusive reference questions at libraries across Canada. These proxies were: Barbara Adamson, Gillian Akenson, Teri Badiou, Beverly Ball, Christine Brown, Kevin Carrothers, Danielle Deavereux, Peter Duerr, Sara Firmani, Laura Gardner, Janet Goosney, Chris Hogan, Jody Hull, Janet Kozoris, Jeff Kozoris, Elaine Magusin, Sarah Marlowe, Daniel Mason, Moya Mason, Michelle Morris, Lisa Mulak, Rebecca Smith, Chris Thomas, Gerry Vogel, and Mia Yen.


INTRODUCTION

The history of Canada's Depository Services Program (DSP), currently administered by the Communications Coordination Services Branch of Public Works and Government Services Canada (PWGSC), can be traced to a time before Confederation, when the practice began of making selected government publications freely available to the public through members of Parliament and the Queen's Printer. The program was formally inaugurated by an Order-in-Council in 1927 and for more than 70 years has provided an essential link between the federal government and the citizens of Canada. It now provides access to federal government information in all formats. Publications are distributed free of charge to 949 public, academic, and government libraries in Canada and abroad, where they are housed, organized, and used to provide reference service for the public, other governments, businesses, and universities. The DSP was among the first federal services to adopt the Internet as a means of document delivery, and the first to experiment with the provision of priced virtual information and to explore the feasibility of making online publications accessible to depository institutions.

All Canadian federal government departments and agencies subject to the Treasury Board Communications Policy are responsible for participating in the Program; they provide copies of their publications to the DSP for distribution. The Program absorbs all costs of operation and manages the distribution of priced publications to government depositories. Participating libraries are responsible for all subsequent costs of housing and making the information available to the public.1 Full depository libraries, of which there are 48 in Canada, automatically receive all publications listed in the Weekly Checklist of Canadian government publications.2 The 754 selective depositories in Canada choose items they wish to order for their collections from the Weekly Checklist, available both in print and electronically on the DSP Web site.3

After seven decades of distributing printed publications through the Depository Services Program, the federal government is moving rapidly to the electronic dissemination of official information. Print materials are gradually being replaced as government agencies develop and implement procedures to convert their publications to electronic formats. Although the World Wide Web is increasingly seen as a primary means of more timely and broader availability of government information, libraries are facing important challenges in adapting to the new electronic environment.

Systematic examination of the readiness of federal depository libraries to undertake a smooth transition to electronic formats is essential if the public is to benefit from rapid, cost-effective, and timely availability of a profusion of rich resources. To this end, in the fall of 1996, the DSP funded the first extensive examination of the state of readiness of depository libraries in Canada to adopt the new technologies. Dolan and Vaughan (1998), in Electronic Access to Federal Government Documents: How Prepared Are The Depository Libraries?, reported and analyzed the results of a project to investigate the technological capabilities and related services required by depository libraries to provide permanent public access to Canadian federal government information in electronic form. The study was conducted through a self-administered questionnaire that was sent to all full and selective depositories in Canada and abroad in order to collect both quantitative and qualitative data.

Dolan and Vaughan (1998) found that, while a majority of the libraries surveyed consider official publications to be a very important or essential part of their collections, depositories are severely pressed by the demands of developing new methods of handling documents in electronic form, providing help to patrons in the use of the new technologies, and meeting the associated costs. Respondents to the survey acknowledged the potential of the Internet for timely access to government information, but expressed reservations in the following areas: inadequate bibliographic control and archiving; the threat of inequitable access if fees for service are imposed; the transfer of publishing costs from the government to libraries if they are expected to download and print documents available only on the Internet; and the demands of staff training and costs of maintaining and replacing equipment. The study also found a significant degree of uncertainty among depositories about the future use of government information when it is available primarily in electronic form. Recommendations were made for further study of related issues, among them the nature of adequate reference service associated with collections of official publications. In late 1997, the DSP funded a second inquiry, this time focusing on the reference process in Canadian full and selective depository libraries.

Effectiveness in providing accurate answers to reference queries is a central element in the provision of public access to official information. The present study reports the results of an unobtrusive examination of reference encounters carried out in full and selective depository libraries in all five geographic areas (Atlantic Provinces, Québec, Ontario, Prairie Provinces, British Columbia and the northern territories) of Canada. Unlike the analysis of other aspects of the reference procedure such as question negotiation, search strategies, and subject analysis, unobtrusive testing emphasizes the user's perspective and can offer useful insights into the quality of service provided to library patrons (McClure & Hernon, 1983, p. 11).

Librarians have long been told that only 55% of questions asked at a reference desk are answered correctly (Hernon & McClure, 1986). Many library personnel have taken umbrage at this low figure, suggesting that a more qualitative approach to evaluation of reference services is needed (Durrance, 1989; Tyckoson, 1992). Such an approach would take into account the interaction between librarian and user by concentrating on behavioural aspects of the reference process. In general terms, such studies have suggested that reference "success rates" are much higher than the "55 percent rule." For instance, Parker (1996) reports a 72.3% success rate, while Jardine (1995) points to a 99% success rate, as measured by whether the patron would return to the same library staff member with another question. While librarian self-satisfaction may increase as a result of such studies, Hults (1992) observes that responses of this nature "beg the question" because what the library community "really needs to address" is the question of whether a 55% accuracy rate "is acceptable [and] if not, what priority do libraries place on improving that rate" (p. 143).

Unobtrusive testing has been used since the 1960s and is currently in the news in Canada, as attested by a report in The Globe and Mail describing Health Canada's effort to discover whether retailers are complying with a law that forbids the sale of tobacco to minors (McIlroy, 1998, pp. A1, A10). An account in The New York Times offers another example: undercover shoppers, posing as customers, are paid by marketing agencies to grade service in stores so that retailers can evaluate themselves (Steinhauer, 1998, pp. C1, C23). In the library milieu the process involves asking pre-tested questions of library staff members (who are unaware that they are being evaluated) by proxies who have been trained in presenting the queries and recording their observations. The benefits of unobtrusive testing identified by Lancaster (1977) include the following: staff members are observed under operating conditions assumed to be normal; the success with which staff members answer various types of question can be measured; and there is an opportunity to make conjectures about the reasons for incorrect answers (pp. 77-136). Hernon and McClure (1987) note that 22 unobtrusive evaluations of reference service were conducted at various types of libraries between 1968 and 1986. Hults (1992) reports that, since 1986, many public and academic libraries have adopted policies in which unobtrusive testing of the service provided by reference staff is a vital part of self-evaluation studies. Certainly, there are many ways to evaluate the quality of reference service, but "accuracy of information… seems the baseline to work from" (Hults, 1992, p. 143).

For this investigation, 15 government documents-related questions were developed in order to elicit information of the following kinds: the accuracy of the answers; the extent to which library staff used electronic sources such as the World Wide Web; the degree to which staff members engaged in referral; the types of questions that tended to be referred; the impact of asking questions over the telephone; the value of separate government document reference desks; and the level of knowledge of official sources and expertise in using them displayed by the librarians and other staff members to whom the queries were addressed. The test questions cover major categories of Canadian federal documents of interest to various sectors of the public and were modeled after actual queries such as those compiled by the Inquiry Desk of the Transport Canada Library and Information Centre (Canada, 1986).


KEY RESEARCH QUESTIONS

The purpose of this study is to investigate how well library staff members in Canadian federal depository libraries are answering government documents reference questions and whether they are using Internet-accessible and Web-based sources to do so. Research questions were formulated as follows:

  • What is the degree of accuracy of government reference service in Canadian academic and public libraries that participate in the Depository Services Program, as measured by the number of complete answers supplied by library personnel to specific questions?
  • To what extent do staff members in these libraries make use of electronic information sources such as CD-ROMs and the range of Web sites made available by the Canadian federal government?
  • Which categories of government reference questions are the most difficult to answer for library staff personnel at depository libraries?

While there are legislative libraries with full depository status in most provinces, public access to government documents is most readily achieved through public and academic libraries. Accordingly, the research questions developed for this study were examined through the lens of four categories of depository libraries: academic full depositories; academic selective depositories; public full depositories; and public selective depositories.


METHOD

3.1 Overview

This study was conducted using paid proxies in a cross-country unobtrusive evaluation of reference service at academic and public depositories. Quality of reference service was operationally defined as the percentage of complete, or combined complete and partially complete, answers to 15 government documents questions. Selection of tested libraries was based on a proportionally stratified cluster sample. In the first instance, proportional stratification was effected on the basis of the five geographic areas of Canada. On the second level, clusters of cities and towns within the geographic areas were identified, and a sample of public and academic depository libraries was taken to reflect the proportion of these libraries in the depository system as a whole. Fifteen different questions were asked a total of 488 times at 104 libraries in 30 census metropolitan areas as defined by Statistics Canada. Each proxy package consisted of 15 different questions and a brief survey form. Proxies were recruited from students enrolled in the Master of Library and Information Science (MLIS) program of the University of Western Ontario (UWO). Questions were asked from 10 December 1997 to 10 February 1998 – a period during which many students traditionally return to their hometowns for the holiday season. To a certain extent, cities chosen for the study were determined by the travel plans of the proxies, although every attempt was made to adhere to the considerations set out in the following section.
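The first, proportional-stratification stage can be sketched as follows. This is a minimal illustration, not the study's actual procedure: the population shares are the 1996 Census figures reported in Table 1, and the allocation simply rounds each region's proportional share of the 488 questions.

```python
# Strictly proportional allocation of the 488 test questions across Canada's
# five regions, using 1996 Census population shares (see Table 1).
# Illustrative sketch of the stratification stage only; the second (cluster)
# stage, selecting cities and libraries within each region, is not modelled.
pop_share = {
    "Atlantic": 0.081,
    "Québec": 0.247,
    "Ontario": 0.373,
    "Prairies": 0.166,
    "BC/North": 0.132,
}

TOTAL_QUESTIONS = 488

targets = {region: round(TOTAL_QUESTIONS * share)
           for region, share in pop_share.items()}
```

Under strict proportionality, Ontario would receive about 182 questions and the Atlantic region about 40; the actual counts (165 and 75) reflect the deliberate over-representation of smaller regions discussed in Section 3.2.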

3.2 Geographic Considerations

Canada's political complexity demands that a geographical distribution of questions take regional differences into account. Canada consists of five geographically and socio-culturally distinct regions: the Atlantic Provinces of Newfoundland, Prince Edward Island, Nova Scotia, and New Brunswick; Québec; Ontario; the Prairie Provinces of Manitoba, Saskatchewan, and Alberta; and British Columbia. In addition to the ten provinces, Canada also includes the far-northern regions of Yukon, Northwest Territories, and Nunavut (as of 1999).


It was important that the number of questions asked in each of the five geographic areas reflect approximately the population distribution of Canada as determined by the 1996 Census. It was considered equally important to ask at least some questions in each individual province and at least one of the territories. Figure 1 shows the geographical distribution of reference questions asked at depository libraries. Seventy-five questions (15.3%) were asked in the Atlantic region; 105 (21.5%) in Québec; 165 (33.8%) in Ontario; 90 (18.5%) in the Prairie Provinces; and 53 (10.9%) in British Columbia and the northern territories.

Given the desirability of asking questions in each province and in at least one of the territories, the Atlantic and Prairie regions are slightly over-represented. Atlantic Canada is over-represented not only because of the inclusion of libraries in Prince Edward Island, but also because Moncton, New Brunswick, was chosen as a test site in order to take into account the demographic reality of a francophone population outside of Québec. Consequently, British Columbia, Ontario, and Québec are slightly under-represented in relation to their national population percentage. Table 1 shows the extent of this under- and over-representation.

 

Table 1: Comparison of Regional Population Distribution and Distribution of Questions

Region      % of National Population   % of Questions Asked   Under- or Over-Representation
Atlantic             8.1%                     15.3%                    +7.2%
Québec              24.7%                     21.5%                    -3.2%
Ontario             37.3%                     33.8%                    -3.5%
Prairies            16.6%                     18.5%                    +1.9%
BC/North            13.2%                     10.9%                    -2.3%

 


From an overall perspective, then, the distribution of questions asked broadly reflects the regional, cultural, and social mosaic of Canada.
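The final column of Table 1 is simply the difference between each region's share of questions asked and its share of the national population, and can be recomputed directly from the two percentage columns:

```python
# Recompute the "Under- or Over-Representation" column of Table 1 as
# (% of questions asked) - (% of national population), per region.
population_pct = {"Atlantic": 8.1, "Québec": 24.7, "Ontario": 37.3,
                  "Prairies": 16.6, "BC/North": 13.2}
questions_pct = {"Atlantic": 15.3, "Québec": 21.5, "Ontario": 33.8,
                 "Prairies": 18.5, "BC/North": 10.9}

representation = {region: round(questions_pct[region] - population_pct[region], 1)
                  for region in population_pct}
# e.g. Atlantic: 15.3 - 8.1 = +7.2 (over-represented)
```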

3.3 Types of Depository Libraries

Since the sampling frame was confined to public and academic libraries, which together make up 88.9% of the total number of depositories, the proportion of questions asked was made to conform approximately to the proportions of public and academic libraries, respectively, within the sample. Public libraries make up 50.8% of Canadian federal depositories, academic libraries constitute 38.1%, and legislative libraries make up the rest. The latter were excluded from this study because members of the general public do not generally use them. Thus, 296 questions were asked at public libraries, while 192 questions were asked at academic libraries. Some 49% of the questions (237) were asked at public selective depositories, while 26% (127) were asked at academic full depositories. Put another way, 38% of the questions (186) were asked at full depositories (public and academic), while about 62% (302) were asked at selective depositories (public and academic). Figure 2 provides a graphic representation of the distribution of questions by type of depository library.
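The four-way breakdown implied by these figures can be cross-checked with simple arithmetic; the two counts not stated directly in the text (public full and academic selective) follow by subtraction from the stated totals.

```python
# Cross-check the distribution of the 488 questions across the four
# depository types. Public full (59) and academic selective (65) are not
# stated explicitly in the text; they follow from the stated totals.
public_total, academic_total = 296, 192
public_selective, academic_full = 237, 127

public_full = public_total - public_selective            # 296 - 237 = 59
academic_selective = academic_total - academic_full      # 192 - 127 = 65

full_total = public_full + academic_full                 # 59 + 127 = 186
selective_total = public_selective + academic_selective  # 237 + 65 = 302
```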


Since there are 790 depository libraries in Canada, of which only 48 enjoy full depository status, the study disproportionately concentrates on full depositories. But because full depositories, whether public or academic, tend to be concentrated in major population centres, they are accessible to a large percentage of the total Canadian population and thus provide good indicators of the type of reference service that is available to a significant number of Canadians. Conversely, many of the public selective libraries are in small towns serving a very low percentage of the total Canadian population. Often these public selective depositories in smaller towns opt not to carry a wide range of official publications. Therefore, it would not be representative to send proxies to many libraries where sources may not be available.

3.4 Choice of Census Metropolitan Areas

The choice of cities to which proxies were sent was based on the 25 most populous census metropolitan areas as defined by Statistics Canada in the 1996 Census. In selecting cities the following factors were taken into account:

  • the availability of student proxies who were traveling to their hometowns over the holidays;
  • the presence of a full depository library in those 25 most populous census metropolitan areas;
  • the fact that a geographical distribution that approximated the regional diversity of Canada was required; and
  • the necessity of asking questions in all provinces and in one of the territories.

In total, proxies were sent to 30 different census metropolitan areas. Twenty-three of those areas were among the 25 most populous census metropolitan areas as reported by the 1996 Census. The cities to which proxies were sent were: Whitehorse, Victoria, Vancouver, Edmonton, Calgary, Lethbridge, Saskatoon, Regina, Winnipeg, Thunder Bay, Sudbury, Windsor, London, Kitchener, Guelph, Hamilton, Toronto, Kingston, Ottawa, Montréal, Sherbrooke, Trois-Rivières, Québec City, Chicoutimi-Jonquière, Moncton, Saint John, Fredericton, Halifax, Sydney-Cape Breton, Charlottetown, and St. John's. The three largest centres – Montréal, Toronto, and Vancouver – were assigned two proxy packages each. Smaller centres such as Charlottetown and Whitehorse were assigned half of one proxy package. In order to include at least a few small public selective depositories, two students whose holiday itineraries would take them between two major metropolitan centres were asked to make stopovers at some of the public selective libraries in towns along the way. In total, ten questions were asked at such small public selective depositories. The populations of these census metropolitan areas include 61.8% of the total population of Canada. Figure 3 provides a more detailed picture of the size of the metropolitan areas where questions were asked.


Of the total 488 questions, 105 were asked in metropolitan areas having over 1 million inhabitants; 80 were asked in cities having a population of between 500,000 and 999,999; 75 were asked in cities having a population of between 250,000 and 499,999; 172 in cities with between 100,000 and 249,999 inhabitants; and finally, 56 questions were asked in those areas with a population of less than 100,000.

3.5 Categorization of Questions

Fifteen government documents questions were developed and tested before they were given to the proxies. McClure and Hernon (1983) established 20 different types of United States government documents for their unobtrusive study. Some of these types are: statistics; administrative reports; directories; maps; bills; laws; regulations; debates; agencies/boards; and periodicals. Fifteen of their categories were chosen and adapted where necessary to suit the Canadian context. Appropriate questions were then developed for the present study. All questions could be answered using either print or electronic and Web-based sources. The nature of the questions is discussed in more detail in Section 7 below.

Five questions were designated as "phone" questions, that is, questions that would be asked by the proxies over the telephone. The remaining ten questions were designated as "walk-in" or in-person questions. It was thought that this division of questions into two different modes of delivery would provide a broadly accurate representation of actual reference situations at depository libraries. McClure and Hernon (1983) also followed this differentiation between telephone and in-person questions.

In addition, it was considered desirable to divide the questions into two groups: one dealing with documents emanating from the legislative branch of government (i.e. bills, statutes, debates, parliamentary procedure) and one pertaining to those produced by the executive branch (i.e. departmental reports, statistics, directories, periodicals). In this way the level of reference staff familiarity with a wide range of governmental operations would be brought to light. Questions were also classed according to whether they dealt primarily with data retrieval or document retrieval. Katz (1996) writes that this is "[a] useful method of distinguishing types of queries." Data-retrieval queries are those for which individuals ask "specific questions and expect answers in the form of data." Document-retrieval queries are those for which patrons "want information, not just simple answers," and the information is "usually in the form of some type of document" (p. 18). Katz recognizes, nonetheless, the fluid nature of almost all reference questions. "Few situations require, or, indeed even allow opportunity to categorize questions in this manner; and this is just as well. A ready-reference query can quickly turn into a specific-search question, and someone embarked on research may have a few ready-reference questions related to that quest" (Katz, 1996, p. 118).
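The three dimensions along which the 15 questions were classified (delivery mode, branch of government, and retrieval class) could be modelled as follows. The class and field names here are illustrative only, not the study's actual coding scheme.

```python
# Illustrative model of the question typology described above; all names
# are hypothetical and do not come from the study's coding forms.
from dataclasses import dataclass
from enum import Enum

class Delivery(Enum):
    WALK_IN = "walk-in"        # 10 of the 15 questions
    TELEPHONE = "telephone"    # 5 of the 15 questions

class Branch(Enum):
    LEGISLATIVE = "legislative"  # bills, statutes, debates, procedure
    EXECUTIVE = "executive"      # departmental reports, statistics, directories

class RetrievalClass(Enum):
    DATA = "data-retrieval"          # specific answers in the form of data
    DOCUMENT = "document-retrieval"  # information in the form of a document

@dataclass
class TestQuestion:
    text: str
    delivery: Delivery
    branch: Branch
    retrieval: RetrievalClass

# Example: a walk-in, executive branch, data-retrieval question.
q = TestQuestion("Example query", Delivery.WALK_IN,
                 Branch.EXECUTIVE, RetrievalClass.DATA)
```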

3.6 Survey Packages

Proxies were provided with printed forms containing one reference question each; a full proxy package consisted of 15 reference question forms. Information about whether the question was an in-person question, a telephone question, a legislative branch question, or an executive branch question was already printed on the form. In addition to the reference questions themselves, the forms asked the proxies to record information about selected institutional and question variables. Institutional variables included the type of depository library and whether it had a separate area or desk designated for government reference service. Question variables included the day of the week and time of day when the question was asked, the time spent by the library staff member with the proxy, and the degree of busyness at the reference desk where the question was asked. On the reverse side of the form, proxies were asked to indicate whether, in their opinion, they received a complete answer, a partial answer, some type of referral, or, quite simply, no answer at all. Whenever they received an answer, they were asked to state as fully as possible the answer itself and the source used to provide it. Moreover, even if they did not receive an answer or were referred, proxies were asked to write down everything that happened during the reference interview. Proxies did not know the correct answers to the questions they asked; this was a conscious decision by the investigators in order to simulate as closely as possible a real situation in which a reference question is asked by a member of the public. Each proxy package also included three forms without printed questions, to be used in case the proxies made mistakes on the printed forms.

3.7 Recruitment and Training of Proxies

Proxies were recruited between 27 November 1997 and 9 December 1997 in order to take advantage of the traditional holiday season, when many students travel to their hometowns. For several selected cities with full depositories, it was not possible to recruit proxies from among the MLIS students at UWO. In these cases, UWO students were asked to contact friends or family members residing in those cities and to ask whether they would be willing to participate in the study. A $200 honorarium was paid for the completion of each proxy package.

A training session was held on 10 December 1997. The proxies were told about the purpose of the study, asked to fill out consent forms, and provided with extensive instructions about all aspects of the study. Each proxy was given a complete set of printed question forms and a list of libraries at which the questions were to be asked. Beside each named library on this list was a library type designation, that is, whether the library was an academic full depository, an academic selective depository, a public full depository, or a public selective depository. Proxies were repeatedly told not to indicate the actual name of the visited or telephoned library on their question forms; rather, they were merely to indicate the type of library at which each question was asked. Any questions the proxies had about the nature of the study were discussed and answered so that they understood clearly what they were expected to do. Emphasis was placed on the importance of recording as completely as possible the source of any answer to each reference question, i.e., whether it was a CD-ROM product, a book, or a World Wide Web address. Proxies were told that they could visit or telephone the libraries on any day of the week and at any time of day of their choosing between 10 December 1997 and 10 February 1998.

3.8 Issues of Anonymity and Informed Consent

An elaborate system was devised to preserve the anonymity of the libraries at which proxies asked the reference questions, thus allowing results rather than institutions to be the focus of the report. At the training session proxies were provided with two envelopes of different colours and sizes in which to return the completed survey forms. In the bottom right-hand corner of the smaller white envelope, the name of one of the five geographic areas of Canada was printed. Proxies were instructed to place all completed survey forms in this white envelope and seal it. They were then instructed to place the white envelope in a larger brown envelope, on the outside of which was printed the name of the specific census area the proxy had visited, and to seal this envelope as well. Envelopes were either returned in person or mailed back. When a completed package was received, it was recorded that a particular proxy had sent back a completed package from a particular census area; this was done purely to keep track of proxy packages that had yet to be returned. The brown envelope was then discarded. The name of the geographic area printed on the white envelope and the size category of the census area were recorded on each of the 15 forms contained in that envelope. Both envelopes were then discarded and all returned forms sorted into 15 different question groups.

These procedures ensured anonymity: there was no possibility that the jobs, promotions, or salary increments of the library workers who answered questions posed by proxies would be endangered should those staff members perform poorly, or be perceived to perform poorly, when the results of the study were reported. The procedures described here avoid linking specific test sites with results. The issue of informed consent and debriefing was addressed through a message sent by the DSP to all depository libraries.

3.9 Coding Procedures

A research assistant was hired to enter data gathered from the returned proxy packages. For most items, such as constitutional region, day of the week the question was asked, and time spent with the patron, data entry was straightforward. Particular attention, however, was paid to coding the type of answer the proxies received in response to each question. The primary reason for this was that the proxies merely recorded whether they received an answer; since they did not know the correct answers, they could not record whether an answer was complete or incorrect.

The coding scheme adopted for this study is a modified version of a grid developed by Richardson (1998), itself a modification of Gers and Seward (1985) and Elzy, Nourie, Lancaster, and Joseph (1991). Richardson's coding grid is presented in Table 2.

 

Table 2: Coding Scheme for Reference Questions Developed by John V. Richardson, Jr.

Grade | Definitional Description of Reference Question Outcome | Evaluation
5.0 | Referred to single source, complete and correct answer | Excellent
4.0 | Referred to several sources, one of which gave complete and correct answer | Very good
3.0 | Referred to single source that does not lead directly to answer but serves as a preliminary source | Good
2.0 | Referred to several sources, none of which leads directly to answer, but one of which serves as a preliminary source | Satisfactory
1.0 | No direct answer; referred to [external] specific source or person or institution | Fair/poor
0 | No answer; no referral ("I don't know") | Failure
-1.0 | Referred to single inappropriate source | Unsatisfactory
-2.0 | Referred to several inappropriate sources, none of which answers question correctly | Most unsatisfactory


Richardson's definitional descriptions were retained, but his evaluation levels were reworked and simplified into four categories. Richardson's categories of "excellent" and "very good" were collapsed into the category of "complete answer"; his categories of "good" and "satisfactory" were collapsed into the category of "partially complete answer"; his category of "fair/poor" was retained intact, but was renamed "referral"; and finally, Richardson's bottom three categories of "failure," "unsatisfactory," and "most unsatisfactory" were categorized as "no/incorrect answer." Table 3 summarizes the modifications.

 

Table 3: Modified Richardson Coding Scheme

Coding | Definitional Description
Complete answer | Referred to single source, complete and correct answer OR referred to several sources, one of which gave complete and correct answer
Partially complete answer | Referred to single source that does not lead directly to answer but serves as a preliminary source OR referred to several sources, none of which leads directly to answer, but one of which serves as a preliminary source
Referral | No direct answer; referred to external specific source or person or institution
No/incorrect answer | No answer; no referral ("I don't know") OR referred to single inappropriate source OR referred to several inappropriate sources, none of which answers question correctly


All findings in this study are reported according to the coding grid outlined in Table 3. More detailed subcodes were also assigned for the bottom three categories. Investigators wanted to know, for instance, exactly where a proxy was referred and what the reasons were for a no/incorrect answer. Types of referral were coded as follows: another non-government library; government or legislative library; government department; external non-government agency or establishment. Reasons for no/incorrect answer were coded as follows: tried, but got incorrect answer; did not know; sources unavailable; unwilling to answer; and told to telephone or come back. These categories were based on McClure and Hernon (1983). Data were entered into an electronic file (Microsoft Excel, Version 7) and charts were generated in various versions of Microsoft Excel. Results of statistical analyses are reported in aggregate form only.
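The collapsing of Richardson's eight grades into the four reporting categories of Table 3 can be sketched as a simple lookup (an illustrative sketch only; the Python names are ours, not part of the study's coding apparatus):

```python
# Map Richardson's eight grades (Table 2) onto the four reporting
# categories used in this study (Table 3).
RICHARDSON_TO_MODIFIED = {
    5.0: "complete answer",             # Excellent
    4.0: "complete answer",             # Very good
    3.0: "partially complete answer",   # Good
    2.0: "partially complete answer",   # Satisfactory
    1.0: "referral",                    # Fair/poor
    0.0: "no/incorrect answer",         # Failure
    -1.0: "no/incorrect answer",        # Unsatisfactory
    -2.0: "no/incorrect answer",        # Most unsatisfactory
}

def recode(grade):
    """Return the modified Richardson category for a coded grade."""
    return RICHARDSON_TO_MODIFIED[grade]
```

For example, a transaction graded 4.0 (several sources consulted, one of which gave the complete and correct answer) is reported simply as a complete answer.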

Many of the results of the study are analyzed and reported so that separate figures are provided for "complete answers" and for "complete and partially complete answers." This reflects the two types of reference service identified by Katz (1982) and described as "liberal" and "conservative." A liberal philosophy of reference service is defined as one in which the librarian "give[s] the greatest amount of help to people" and where it is understood that "[t]he primary function of a reference librarian is to answer questions [by] giving total service." A conservative philosophy, on the other hand, is characterized by a librarian who "points rather than assists," that is, showing the patron a possible direction and path, and then leaving the patron to locate the final answer (pp. 32-33). Results designated "complete answers" reflect the liberal approach to reference service, while those termed "complete and partially complete" exemplify the conservative philosophy.


4. Development of Sample Questions

A preliminary list of 32 questions was developed and tested by two students enrolled in the MLIS program at UWO. These questions are shown in Table 4 below. Each student was paid $125.00. The students undertook this project when they were approximately halfway through the program; both had some knowledge of Web sources and Internet searching skills. One of the students (Student A) was enrolled in the Government Documents course; the other (Student B) had never taken such a course. The reason for this procedure is as follows. One criticism leveled against McClure and Hernon (1983) was that their proxies did not know whether the people to whom they talked at library reference desks were government documents specialists, generalist reference librarians, or paraprofessionals. McClure and Hernon felt that this criticism was unfair: after all, members of the public do not know about the distinctions between library staff members, nor do they enquire about these differences at the reference desk. Patrons simply want their questions answered. To take this criticism into account, questions were chosen that could be answered both by individuals who had had special training in government documents and by those who had not.

 

Table 4: Preliminary Questions Given to Pre-testers

1. Who is the Chair and who are the other full-time members of the CRTC (Canadian Radio-Television and Telecommunications Commission)?
2. What is the cost of Aboriginal Self-Government by Jill Wherrett, published in 1996?
3. Where can I find the government publication that lists forgotten bank accounts?
4. I'd like to get a copy of the bill that says criminals can't profit from books they may write about their crimes.
5. What did Preston Manning say in Parliament in response to the Speech from the Throne?
6. Can you help me find the regulations attached to the Canada Student Loans Act?
7. I'd like some government figures about circulation and sales of Canadian magazines OR What is the percentage of French lyrics in Canadian-content sound recordings for 1990-1994?
8. I'd like to see the 1997 Commons committee report on draft regulations on firearms.
9. I'd like to get the text of the act that talks about unconventional fuels.
10. I'd like ordering information and price lists for an aerial photograph of our cottage and lake.
11. Who chaired a Royal Commission on Newspapers in the 1980s?
12. How many total workers went on strike in Canada in the 1950s?
13. I'd like a copy of the statement made by the Foreign Affairs Minister this fall about the treaty banning landmines.
14. Has Health Canada produced a factsheet on electromagnetic fields?
15. Has anything been said in the House of Commons about closing the marine radio station on the Magdalen Islands?
16. I'd like the list of everything the Senate did for all its sessions in October 1997.
17. What are the rules for oral questions in the House of Commons?
18. What are the export sales for electricity for Québec in the 1980s?
19. Did the Auditor General say something about forest management practices of natives in the 1992 report?
20. Are there any contracts for work hauling garbage for the federal government?
21. I'd like to see the evidence from December 11, 1996, of the House of Commons subcommittee on sustainable human development.
22. What are the names of all the members of the House of Commons born outside Canada?
23. Where can I rent art works from the government for my company offices?
24. I'd like to know about river drainage into Hudson Bay for about the past 30 years.
25. I'd like some information about an application filed by the Bank of Montréal in 1995 before the Competitions Tribunal.
26. Who were the witnesses that appeared before the Senate Committee on Legal Affairs when it had hearings in 1996 about changes to the names of electoral districts?
27. What were the final payments per bushel of No. 2 Canada Western Amber Durum Wheat for 1995-6?
28. Is there an official document about the possibility of immigrating to Canada because of gender persecution?
29. Did the Senate special committee on euthanasia say anything about palliative care?
30. I'd like the committee minutes for the first 25 meetings of the House of Commons Standing Committee on Canadian Heritage in the 35th Parliament.
31. Are there any regulations attached to the Fisheries Prices Support Act?
32. Does the government publish any newsletters or bulletins about business opportunities in Africa?


The students were told they could use either electronic or print sources to find the answers to these questions; each chose Web resources. In order that the project not take them away from their school work for an overlong period, they were advised to spend no more than 15-20 minutes searching for the answer to each question. Table 5 shows how well each student did on each question and how long each student took to find the answer.

 

Table 5: Results of Pre-Tested Questions

Question | Did Student A Find Right Answer? | Time Spent by Student A (min) | Did Student B Find Right Answer? | Time Spent by Student B (min)
1 | Yes | 2 | Yes | 2
2 | Yes | 15 | Yes | 10
3 | Yes | 2 | No | time out
4 | Yes | 5 | Yes | 15
5 | Yes | 3 | Yes | 5
6 | Yes | 5 | Yes | 5
7 | No | time out | Yes | 11
8 | Yes | 15 | Yes | 15
9 | Yes | 5 | Yes | 5
10 | Yes | 2 | Yes | 13
11 | No | time out | No | time out
12 | No | time out | No | time out
13 | Yes | 5 | Yes | 10
14 | Yes | 15 | Yes | 2
15 | Yes | 5 | Yes | 15
16 | Yes | 2 | Yes | 5
17 | Yes | 3 | Yes | 4
18 | No | time out | No | time out
19 | Yes | 3 | Yes | 4
20 | No | time out | No | time out
21 | Yes | 5 | Yes | 10
22 | Yes | 2 | Yes | 10
23 | Yes | 1 | No | time out
24 | No | time out | No | time out
25 | Yes | 4 | Yes | 5
26 | Yes | 2 | No | time out
27 | Yes | 15 | Yes | 5
28 | Yes | 5 | No | time out
29 | Yes | 2 | Yes | 15
30 | Yes | 2 | Yes | 15
31 | Yes | 10 | Yes | 1
32 | Yes | 5 | Yes | 8


The results indicate that Student A found the answers to 26 of the 32 questions, for a completion rate of 81.25%. Student B did slightly worse, finding answers to 23 of the 32 questions, for a completion rate of 71.9%. Both students found all their answers in Web-based documents. For the 26 questions that Student A answered completely, the average time spent searching was five minutes; for the 23 questions that Student B answered completely, the average was 8.2 minutes. The students' high success rate in finding complete answers in relatively short periods suggested that almost all of these questions could be answered by library personnel regardless of their level of specialization in government information sources.

Questions were chosen to represent various levels of difficulty, based on the time the two students spent searching for answers. The total time spent by both students on each question was divided by two to give an average time per question. Questions for which a student did not find an answer, i.e. questions that "timed out," were arbitrarily assigned a value of 20 minutes. Five levels of difficulty were then created based on the average time needed to answer a question: 1-4 minutes; 5-9 minutes; 10-14 minutes; 15-19 minutes; and 20 or more minutes.
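A minimal sketch of this difficulty calculation, assuming timed-out searches count as 20 minutes and that fractional averages fall into the band containing them (the exact boundary treatment is our assumption, not stated in the study):

```python
TIMEOUT = 20.0  # minutes assigned when a pre-tester could not find the answer

def average_time(time_a, time_b):
    """Average the two pre-testers' times; None means the search timed out."""
    a = TIMEOUT if time_a is None else time_a
    b = TIMEOUT if time_b is None else time_b
    return (a + b) / 2.0

def difficulty_level(avg_minutes):
    """Assign one of the study's five difficulty levels to an average time."""
    if avg_minutes < 5:
        return 1   # 1-4 minutes
    if avg_minutes < 10:
        return 2   # 5-9 minutes
    if avg_minutes < 15:
        return 3   # 10-14 minutes
    if avg_minutes < 20:
        return 4   # 15-19 minutes
    return 5       # 20+ minutes

# Preliminary question 7: Student A timed out, Student B took 11 minutes.
print(average_time(None, 11))                    # 15.5
print(difficulty_level(average_time(None, 11)))  # 4
```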

The choice of the final 15 questions to be used during the study depended on two factors. First, there had to be as close to a statistically normal distribution as possible with respect to the time needed to answer each individual question. Second, a broad cross-section of types or categories of government questions, as defined by McClure and Hernon (1983) and mentioned above in Section 3.5, was required. Figure 4 below represents the frequency distribution of the relative difficulty of the final 15 questions selected for the study, as measured by the time spent on each question by student pre-testers.


As shown in Figure 4, seven of the questions could be answered in less than 10 minutes, five questions could be answered in 10 to 14 minutes, and only three questions required 15 minutes or more to answer. The distribution is very close to normal, with a mean of 9.26 minutes, a median of 10 minutes, and a mode of 10 minutes. In other words, the average time the two student pre-testers spent answering these questions was a little over nine minutes. The median of 10 means that as many questions could be answered in under 10 minutes as required more; three questions could be answered in exactly 10 minutes (the most frequently occurring value). Twelve of the 15 questions were completely answered by both student pre-testers; two questions were answered by one pre-tester but not the other; and only one question could not be answered by either pre-tester.
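These summary statistics can be verified directly from the average pre-test times listed for the 15 final questions in Table 6, counting the single "20+" value as 20 minutes (a quick verification sketch):

```python
from statistics import mean, median, mode

# Average pre-test search times in minutes for the 15 final questions (Table 6).
times = [2, 12.5, 10, 15.5, 5, 15, 3.5, 10, 10, 3.5, 12.5, 20, 7.5, 5.5, 6.5]

print(round(mean(times), 2))             # 9.27 (reported as 9.26 in the text)
print(median(times))                     # 10
print(mode(times))                       # 10
print(sum(t < 10 for t in times))        # 7 questions under 10 minutes
print(sum(10 <= t < 15 for t in times))  # 5 questions in the 10-14 minute range
print(sum(t >= 15 for t in times))       # 3 questions of 15 minutes or more
```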

Table 6 presents the final choices for the 15 questions in the study. The wording of the questions in Table 6 and on the printed survey forms is the same. Questions 1-5 were telephone questions, while questions 6-15 were walk-in questions.

 

Table 6: Final List of Questions

 

Each question is listed below with its number and short form, followed in parentheses by its type of question (document type; branch of government; data or document retrieval) and the average time in minutes spent by the pre-testers, then the full wording used on the survey forms.

1. Crtc (Directory; Executive; Data; avg. 2 min)
Who is the Chair and other full-time members of the CRTC (Canadian Radio-Television and Telecommunications Commission)?

2. Book (Bibliography; Executive; Data; avg. 12.5 min)
I want to order a copy of Aboriginal Self-Government by Jill Wherrett, published in 1996. I'm sure it's a government document, and I specifically want to know how much it costs and any ordering instructions.

3. Barley (Agency or Board Report; Executive; Data; avg. 10 min)
I'd like to know what the total payments were per bushel of barley for 1995-1996? Specifically, I'm interested in the category "select two-row" of designated barley.

4. Lyrics (Statistics; Executive; Data; avg. 15.5 min)
I'd like to know how many new Canadian-content sound recordings (albums, tapes, CD's) released during 1990-1994 have French lyrics?

5. Fuels (Statute; Legislative; Document; avg. 5 min)
I'd like to get the text of the act that requires crown corporations to power their motor vehicles with fuels that do not harm the environment. How many of their vehicles have to use these non-conventional fuels?

6. Fire-arms (Committee Report; Legislative; Document; avg. 15 min)
There was a parliamentary sub-committee on the draft regulations that submitted a report to the House of Commons in January or February of 1997. I'd like to see a copy of this report.

7. Audgen (Administrative Report; Executive; Document; avg. 3.5 min)
I'd like to know if the Auditor-General said something in the 1992 annual report about forest management practices of natives, specifically about the good job done by the Stuart Trembleur Lake Band.

8. Crime (Bill; Legislative; Document; avg. 10 min)
I'd like to see a bill that was introduced into the House of Commons this past fall. It has to do with the profits convicted criminals might make if they were to publish books about their crimes.

9. Magdal (Debates; Legislative; Document; avg. 10 min)
I'm doing a class project about the Magdalen Islands, and there was talk about closing the marine radio station there. I'd like to know if anything was said in the House of Commons about this topic in the last year, and if anything has been decided about its fate.

10. Rules (Procedures; Legislative; Document; avg. 3.5 min)
I'd like to know the complete set of rules that govern Question Period in the House of Commons.

11. Refugee (Admin Guidelines; Executive; Document; avg. 12.5 min)
I want to know if there is any official document about the possibility of immigrating to Canada as a refugee because of persecution based on gender.

12. Garbage (Contracts; Executive; Data; avg. 20+ min)
Someone I know is looking for work hauling garbage. Would there be any specific opportunities to put in bids for contracts in this field with the federal government?

13. Photo (Maps; Executive; Data; avg. 7.5 min)
My mother's birthday is coming soon, and I want to order a color enlargement of an aerial photograph of the lake where my parents have their summer cottage as her present. Could I have a price list for the enlargements, and information about what I need to order such a photograph?

14. Fish (Regulations; Executive; Document; avg. 5.5 min)
Can you help me find any regulations or enabling statutes associated with the Fisheries Prices Support Act?

15. Africa (Periodicals; Executive; Document; avg. 6.5 min)
Does any government department put out any newsletters or bulletins about business opportunities in Africa? If so, I'd like a copy of the latest one.


The "type of question" designation provides three pieces of information: first, the specific type of government document in which the answer can be found; second, whether the question deals with the executive or the legislative branch of government; and third, whether it is primarily a data-retrieval or a document-retrieval question. Note that historical questions are not included here. On the other hand, a number of questions directly pertaining to government services were included: Questions #2 and #13 deal with ordering various government products, while Questions #12 and #15 deal with employment and business possibilities.


5. Limitations of the Study

One limitation of this study derives from the fact that each depository library did not have an equal and independent chance of being selected for inclusion. All public full depositories and academic full depositories in Canada, with the exception of one, were visited by proxies. The inclusion of many public selective and academic selective libraries in the sample therefore depended on the presence of a full depository library in a particular census area. As explained in Section 3.3 above, a fully random sample would not have offered a fair representation of the ability of depository libraries to answer government reference questions, given the small size and limited collections of many public selective libraries. On the positive side, the sampling frame was large and national in scope. Another limitation stems from the fact that there was little control over the exact wording used by individual proxies asking questions at various reference desks. Although proxies were told in each case to stress that questions were government-related and to mention all key concepts in each question, it is reasonable to expect differences in emphasis from one proxy to another. As McClure and Hernon (1983) noted in their study, "it is possible that proxies failed to provide accurate renditions of the test questions" (p. 22).

McClure and Hernon (1983) and Hernon and McClure (1987) have carefully and thoroughly established the validity and reliability of unobtrusive testing in measuring the quality of documents reference service. Yet it must be acknowledged that fact-based questions of the type used in their studies and this one account for a small proportion of the total number of reference queries. Childers (1987) suggests that queries with factual and unambiguous answers make up only about one-eighth of the volume in reference departments. In an obtrusive study of five northern California libraries, Whitlatch (1989) found that factual questions were asked only 11.3% of the time at reference desks, while bibliographical questions were asked at a rate of 18% and subject-instructional questions at a rate of 70.7%. The success rate in Whitlatch's study was 78.6% for factual questions, 70.5% for bibliographic questions, and 62.6% for subject-instructional questions. Compiling the results of 71 Wisconsin-Ohio Reference Evaluation Program surveys, Murfin (1995) reported that fact-based transactions represent about 21% of all in-person reference questions at academic libraries and 18% at public libraries (p. 235).

The choice of time period in which to ask the questions could also be faulted. Levels of expertise may be reduced during the holiday season, since key staff may have priority in release time over this period and thus may not be available for desk duty. On the other hand, holidays may be taken at any time during the calendar year, and so there does not exist one optimum time to conduct a study such as this one. Indeed, the December-January holiday season may be less busy than usual at libraries – a circumstance which might provide more time for staff members to answer reference questions.

One of the central issues in this study is the extent to which depository libraries are able to cope with reference questions by using Internet-based Web resources. Queries requiring the use of retrospective sources were not included, since most Web documents have been produced very recently. This explains the absence of historical questions among those asked by the proxies. In addition, for reasons of anonymity, relationships between institutional variables such as budget, collection size, staffing, and education levels of staff, on the one hand, and success in answering proxy-administered questions, on the other, were not explored.


6. Data Analysis and Results

6.1 Profile of Question Distribution

Proxies were not given instructions about the day of the week or the time of day when they were to ask questions. Yet, as Figure 5 shows, the distribution of questions across the week is relatively uniform. Proxies asked 14% of their questions on Monday (69 questions). Peak times for questions were Tuesday, with 21% of the total (103 questions), and Wednesday, with 18% (87 questions). On Thursday, 14% of the questions were asked (66 questions), while on Friday the figure was 12% (57 questions). On Saturday, 17% of the questions were asked (83 questions). Only 5% of the questions (23 questions) were asked on Sunday. This makes sense, since many libraries are closed on Sundays or open for limited hours with reduced staffing levels.


Data presented in Figure 6 show the times of day when the proxies asked their questions. Just over two-thirds of the questions (67.8%) were asked in the afternoon (331 questions), while about a quarter of the questions (23.2%) were asked in the morning (113 questions). Only about 9% of questions were asked in the evening (44 questions).


Because many libraries stagger shifts for their professional staff, a librarian may work a regular 9:00 a.m. to 5:00 p.m. shift one day but an evening shift of 1:00 p.m. to 9:00 p.m. the next. The afternoon, when these shifts overlap, is therefore the period when reference staff are most likely to be available. In addition, budget cuts in the past several years have reduced the number of weekly hours public libraries are open to users, and some public libraries have elected to close on alternating weekday evenings and mornings. For both of these reasons, optimum access times for reference questions are in the afternoon. The fact that over two-thirds of proxy questions were asked during this period lends confidence to the results.

6.2 Variation in Service Levels by External Factors

Evaluation studies of the efficacy of library reference service have consistently shown that librarians are able to offer complete and satisfactory answers to patrons about 55% of the time (Hernon & McClure, 1986). Only a few studies have focused specifically on the evaluation of government documents reference service. The landmark study of this type is by McClure and Hernon (1983), which dealt entirely with academic libraries located in the Northeastern and Southwestern regions of the United States. Their results indicated that library staff members answered government documents questions with an accuracy rate of 37%. This lower success rate for government reference questions may reflect the more specialized and difficult nature of the subject matter.

Until now no cross-country examination of the quality of government reference service at depository libraries in Canada has been undertaken. Figure 7 displays results of the present investigation by showing the extent to which proxies were given complete and accurate answers to their questions according to the criteria adduced in Table 3. There are significant implications here concerning the quality of depository library reference service.


Complete answers were provided at a rate of 29.3% (143 questions). When the 143 complete and 64 partially complete answers are taken together, reflecting the conservative philosophy of reference service described in Section 3.9 above, the success rate climbs to 42.4% (207 questions). Library staff members referred a full one-fifth of the 488 questions (98 questions). No answers or incorrect answers were received 37.5% of the time (183 questions).
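The percentages above follow directly from the raw counts reported in the text; a quick arithmetic check:

```python
TOTAL = 488  # questions asked in all

counts = {
    "complete": 143,
    "partially complete": 64,
    "referral": 98,
    "no/incorrect": 183,
}
# The four outcome categories partition all 488 questions.
assert sum(counts.values()) == TOTAL

def pct(n):
    """Percentage of all questions, to one decimal place."""
    return round(100.0 * n / TOTAL, 1)

print(pct(counts["complete"]))                                 # 29.3 (liberal success rate)
print(pct(counts["complete"] + counts["partially complete"]))  # 42.4 (conservative success rate)
print(pct(counts["referral"]))                                 # 20.1 (about one-fifth)
print(pct(counts["no/incorrect"]))                             # 37.5
```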

Particularly noteworthy is the figure of 29.3% for complete answers. The level of service and knowledge suggested by this figure is especially disquieting given the emphasis the DSP places on the depositories' role as the public's centre of expertise for finding, accessing, and retrieving federal information. It may be that the complexity and sheer quantity of official documentation from all sources is overwhelming depository libraries. It may also be that depository staff members are not confident enough to move through the labyrinth that many perceive government documents to be. Another explanation for the level of service found in Figure 7 may be that, compared with the offerings of for-profit commercial publishers, the package provided to depository libraries by the DSP is lacking in consistency, indexing, and accompanying training material. It might also be noted that no text or manual giving guidance in the use of federal documents has been published since the appearance of Olga Bishop's Canadian Official Publications in 1981.

In the past decade, libraries have been forced to absorb painful budget cuts. Respondents to the survey conducted by Dolan and Vaughan (1998) reported that libraries are suffering from an absence of funding, a dearth of training programs, and a lack of time available for maintaining and improving staff expertise in the area of official publications. Depositories are especially in need of knowledgeable personnel to assist with electronic access – a finding of particular relevance for the present study.

While the overall rate of complete answers was 29.3%, there were statistically significant differences among the four types of depository libraries (χ²=29.13, df=9, p < .01). Figure 8 summarizes this aspect of the results. The highest rate of complete answers was achieved by academic full depositories, at 39.4% (50 out of 127 questions). Public full depositories provided complete answers at a rate of 32.2% (19 out of 59 questions), while academic selective depositories did so at a rate of 29.2% (19 out of 65 questions). Public selective depositories lagged behind, with a rate of 23.2% for complete answers (55 out of 237 questions). When complete and partially complete answers are taken together, academic and public full depositories display almost identical rates – 51.2% for academic full depositories (65 out of 127 questions) and 50.9% for public full depositories (30 out of 59 questions). Both types of selective libraries also answered questions either completely or partially completely at virtually identical rates – 37.1% for public selectives (88 out of 237 questions) and 36.9% for academic selectives (24 out of 65 questions).

[Figure 8]

Full depositories seem to be performing at a higher level than selective depositories. This should not be surprising given that full depositories have access to the entire range of DSP publications. Moreover, they are typically located in large urban areas or at major universities across the country and benefit from staffing and funding levels much higher than those of selective depositories. This latter circumstance suggests that full depositories may have more specialized government documents reference personnel than selective libraries. Full depositories performed above the national rate of 29.3% for complete answers and the national rate of 42.4% for combined complete and partially complete answers.
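The chi-square test of independence used throughout this section can be illustrated with the complete-answer counts reported above. Note that this is a sketch only: the study's reported statistic (χ²=29.13, df=9) was computed on the full four-way answer classification, which is not fully reproduced in the text, so the illustration below collapses the data to complete vs. all other outcomes (a 4×2 table) and therefore yields a different, smaller statistic.

```python
# Chi-square test of independence, computed from first principles.
# Rows: complete-answer counts by depository type, taken from the text
# (complete vs. all other outcomes). This 4x2 collapse is illustrative
# only; the study's reported chi-square (29.13, df = 9) used the full
# four-way answer classification.

observed = {
    "academic full":      (50, 127 - 50),
    "public full":        (19, 59 - 19),
    "academic selective": (19, 65 - 19),
    "public selective":   (55, 237 - 55),
}

rows = list(observed.values())
row_totals = [sum(r) for r in rows]
col_totals = [sum(c) for c in zip(*rows)]
grand = sum(row_totals)  # 488 questions in all

chi2 = 0.0
for i, row in enumerate(rows):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand
        chi2 += (obs - expected) ** 2 / expected

df = (len(rows) - 1) * (len(rows[0]) - 1)  # (4-1) * (2-1) = 3
print(f"chi-square = {chi2:.2f} on {df} df")  # prints chi-square = 10.70 on 3 df
```

Even under this coarser collapse the statistic (about 10.70) exceeds the df=3 critical value of 7.81, so the difference among library types remains significant at the .05 level.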

Another way of analyzing the data is to see whether there are significant differences among the five geographic regions of Canada with respect to complete and partially complete answers to proxy questions. Figure 9 summarizes these findings. Ontario displays the best performance in this regard, with a rate of 38.2% for complete answers (63 out of 165 questions) and a rate of 57.6% for combined complete and partially complete answers (95 out of 165 questions). Depository libraries in British Columbia (including one location in the northern territories) provided proxies with complete answers 35.9% of the time (19 out of 53 questions), and with complete and partially complete answers at a rate of 45.3% (24 out of 53 questions). Depository libraries in the Atlantic Provinces were able to give complete answers to 28% of proxy-administered reference questions (21 out of 75 questions); combined complete and partially complete answers were given 41.3% of the time (31 out of 75 questions). Ontario and British Columbia thus performed at or above the national rates of 29.3% for complete answers and 42.4% for combined complete and partially complete answers, while Atlantic Canada approximated the national average. Results mentioned in this paragraph are statistically significant (χ²=33.54, df=12, p < .01).

[Figure 9]

The Prairie Provinces and Québec fall below the national average for complete and partially complete answers. In the provinces of Manitoba, Saskatchewan, and Alberta, staff at depository libraries answered proxy-administered questions completely at a rate of 23.3% (21 out of 90 questions), while combined complete and partially complete answers were provided 32.2% of the time (29 out of 90 questions). In Québec, complete answers were given at a rate of 18.1% (19 out of 105 questions), while combined complete and partially complete answers were elicited 26.7% of the time (28 out of 105 questions).

Figures 8 and 9 provide more detail about complete answers and partially complete answers provided by each type of library in the five geographic regions. In the Atlantic Provinces, for example, public selective libraries provided complete and partially complete answers at a rate of 43.1% (19 out of 44 questions), while academic full depositories provided such answers 42.1% of the time (8 out of 19 questions). Academic selective depositories in Atlantic Canada gave complete and partially complete answers at a rate of 33.3% (4 out of 12 questions). No public full depositories exist in Atlantic Canada, but results suggest that in this region equally good service for government reference questions is available at public selective libraries and academic full libraries, while academic selective libraries lag behind.

In Québec, academic full depositories answered completely or partially completely at a rate of 47.8% (11 out of 23 questions). By contrast, academic selectives provided such answers at the much lower rate of 10.5% (2 out of 19 questions). Taken together, public full and public selective libraries gave complete and partially complete answers at a rate of 23.8% (15 out of 63 questions). In Québec, academic full depository libraries were able to answer government reference questions most effectively. Public depositories and academic selective depositories in Québec were notable for poor success rates.

In Ontario, academic full depositories gave complete and partially complete answers at a rate of 55.8% (29 out of 52 questions); for academic selective libraries the figure is 58.3% (7 out of 12 questions); for public full depositories it is 61.1% (22 out of 36 questions); and for public selective depositories the comparable figure is 56.9% (37 out of 65 questions). Proxies in Ontario received markedly similar and relatively high levels of government documents reference service no matter what type of depository library they visited.

In the Prairie Provinces, both academic full and selective depositories provided complete and partially complete answers at a rate of 47.1% (8 out of a total of 17 questions each). Taken together, public full depositories and public selective depositories gave such answers 23.2% of the time (13 out of 56 questions). Results from Québec and the Prairie Provinces are strikingly similar. Academic libraries in the Prairies successfully answered proxy questions at twice the rate of public depository libraries in that region. In British Columbia and the North, academic full depository libraries provided complete or partially complete answers at a rate of 56.3% (9 out of a total of 16 questions), while the rate for academic selective libraries was 60% (3 out of 5 questions). Both types of public libraries provided such answers at a rate of 37.5% (12 out of a total of 32 questions).4

Results show that the level of government documents service varies according to region and type of depository library. In general terms, a patron in Québec, the Prairies, and British Columbia might be well advised to seek out an academic depository library, preferably an academic full depository, for government information. In Atlantic Canada and Ontario, however, similar levels of government information service are provided by all four types of depository libraries, with the exception of academic selective depositories in Atlantic Canada.

Depository libraries are located in census metropolitan areas of varying sizes. Data are displayed in Figure 10 to determine whether the size of a particular census area has an impact on the level of government documents service provided. Results are statistically significant (χ²=30.71, df=12, p < .01). Two main findings emerge. First, the lowest rates of complete answers (7 out of 56 questions, or 12.5%) and of combined complete and partially complete answers (13 out of 56 questions, or 23.2%) were recorded in cities with fewer than 100,000 inhabitants. Second, the best chances of receiving complete or partially complete answers were in census areas with a population either over one million (53 out of 105 questions) or between one quarter and one half million inhabitants (43 out of 75 questions). Census areas of both sizes had complete and partially complete answer rates of over 50%. It may seem counter-intuitive that there is a decrease in service levels in the two largest categories of census areas, but this may be explained in part by the exceptional results obtained at libraries in two of the cities with a census area population of between 250,000 and 499,999. Moreover, one explanation for the low success rate in small localities may be that these small centres generate fewer government documents questions, so library staff are more likely to have only a generalist knowledge of the field.

[Figure 10]

Analysis of external institutional variables concludes with an examination of whether the day of the week on which proxy questions were asked made a difference in the type of answers received. Figure 11 addresses this point. On most days of the week – Monday, Wednesday, Thursday, Friday, and Saturday – the level of service was remarkably similar: complete answers were provided at a rate of between 24.7% and 28.8%, and complete and partially complete answers at a rate of between 39.1% and 43.9%. On Sundays, however, the rate for complete answers fell to 13%, and the rate for complete and partially complete answers was 21.7%. Proxy questions administered on Tuesdays were answered completely 41.7% of the time, and completely or partially completely 53.4% of the time.

[Figure 11]

6.3 Variation in Service Levels by Internal Factors

One likely determinant of the level of government documents reference service is the presence of a specific area or reference desk that deals solely with government reference questions. Having such a special area may be an indication of the availability of specialist librarians who devote some or all of their time to official publications. Figure 12 shows the rate of complete and partially complete answers from the standpoint of the existence of a separate government documents reference area or desk.

Depository libraries without a separate area for government documents reference service answered proxy-administered questions completely at a rate of 24.9% (64 out of 257 questions). Complete or partially complete answers were provided at such libraries 39.3% of the time (101 out of 257 questions). Depository libraries that had a separate area for government documents reference service provided complete answers at a rate of 35.2% (76 out of 216 questions), while combined complete or partially complete answers were given 47.2% of the time (102 out of 216 questions). An approximately 10% difference exists between depository libraries with and without separate government documents reference areas. This difference is statistically significant when complete and partially complete answers are placed in one category and no/incorrect answers and referrals in another (χ²=4.85, df=1, p < .05).

[Figure 12]

Proxies asked questions at depository libraries that had separate government reference areas 44.3% of the time (216 questions); questions were asked at depository libraries without such separate areas 52.6% of the time (257 questions).5 Of the 216 questions asked at libraries with separate areas, 31% (67) were asked at academic full depositories, 13.9% (30) at academic selectives, 22.2% (48) at public full depositories, and 32.9% (71) at public selectives. Of the 257 questions asked at libraries without separate areas, 63.4% (163) were asked at public selective depositories and only 21% (54) at academic full depositories. Table 7 shows the impact of a separate government documents reference area on complete answers by type of library.

 

Table 7: Impact of Separate Area on Complete Answers

                          Complete Answers
                          Separate Area       No Separate Area
  Academic Full           30/67  (44.78%)     19/54  (35.19%)
  Academic Selective       8/30  (26.67%)      9/29  (31.03%)
  Public Full             17/48  (35.42%)      2/11  (18.18%)
  Public Selective        21/71  (29.58%)     34/163 (20.86%)

A trend emerges in both academic full depositories and public full depositories: those with separate reference areas for government questions provided complete answers at higher rates than those without. The tendency was most pronounced in public full depositories, where the difference in success rates was about 17%, although the difference of some 10% in academic full depositories is also noteworthy. One conclusion that may be drawn is that government documents collections in full depository libraries are so extensive and complex that adequate reference service requires a dedicated reference desk. Similarly, public selective depositories with a separate reference area for government documents provided complete answers at a rate of 29.6%, while those without gave complete answers 20.9% of the time. For academic selective libraries, the difference between institutions with and without separate areas is negligible.

Dolan and Vaughan (1998) report that 29% of depository libraries in Canada have separate government documents collections, while 55% have a mixed arrangement. Only 14.7% have integrated their government holdings into either their main collections or their reference collections, although 70 libraries (16.1%) have moved to merge their collections since 1986. An integrated government documents collection does not preclude a separate government documents reference area, and there is not necessarily a relationship between the organization of a collection and the presence or absence of such an area. Nevertheless, the findings presented in Table 7 suggest that, even if the trend to consolidate government documents collections continues, depository libraries should seriously consider retaining separate government documents reference areas. Integration of government collections into the main collections may devalue the specialist knowledge that government documents librarians possess; retaining a separate reference area may serve to maintain and valorize that knowledge.

The present study also examined whether the degree of busyness at government documents reference desks had an impact on the quality of answers received. Proxies were requested to make a judgement about whether the reference desk was busy, very busy, or not busy at the time each question was asked based on "such indicators as the number of people waiting in line to ask questions or whether the library staff member was answering questions by phone or performing other duties." In total, proxies approached reference desks with their questions during busy times at a rate of 29.7% (145 questions), at non-busy times at a rate of 65.9% (322 questions), and at very busy times at a rate of 4.3% (21 questions). Figure 13 summarizes these findings.

[Figure 13]

 

What is immediately apparent here is that depository library reference desks provided the same level of service, as measured by complete, partially complete, or no/incorrect answers, whether they were busy or not. For example, complete answers were provided 29.5% of the time (95 out of 322 questions) when the reference area was not busy, and 31% of the time when it was busy (45 out of 145 questions). Complete and partially complete answers were provided at a rate of 45.5% during busy times (66 out of 145 questions) and 41.6% during non-busy periods (134 out of 322 questions). These variations are not statistically significant (χ²=5.34, df=6). A similar trend is evident for no/incorrect answers: at both busy and not busy times, no/incorrect answers were received 37% of the time. Only when the reference desk became "very busy" did the level of service substantially decline. Complete answers were received 14.3% of the time (3 out of 21 questions) when the reference area was very busy, a drop of some 15% in success rates from busy or not busy periods. A similar 15% drop was observed when complete and partially complete answers are taken together. The rate of no/incorrect answers, however, stayed approximately the same during very busy, busy, and not busy periods. Another interesting point is that, while referral rates were approximately the same during busy and not busy times, they soared to 33.3% during very busy periods.

When complete and partially complete answers are further analyzed by type of depository library and degree of busyness, results as displayed in Table 8 emerge.

 

Table 8: Impact of Busyness on Type of Answers

                          Complete and Partially Complete Answers
                          Not Busy            Busy
  Academic full           48/92  (52.17%)     16/32 (50.00%)
  Academic selective      21/51  (41.18%)      3/14 (21.43%)
  Public full              8/23  (34.78%)     18/29 (62.07%)
  Public selective        57/156 (36.54%)     29/70 (41.43%)

Whether busy or not, academic full depositories provided complete and partially complete answers at a rate of about 50%, and public selectives did so at a rate of between 36% and 41%. The quality of reference service, however, declined significantly at academic selective depositories as the degree of busyness increased, although the small sample size (n=14) of academic selectives during busy periods may be a factor in this finding. Paradoxically, the quality of reference service, as measured by complete and partially complete answers, increased substantially at public full depositories as the degree of busyness rose. This last finding may also be explained in part by the small sample size of public full depositories (n=23 at not busy times; n=29 at busy times).

From an overall perspective, the degree of busyness does not seem to be an important factor in whether a patron receives a complete or partially complete answer. One possible explanation for this is that library staff members at most libraries attempt to accord each reference question a respectful degree of attention no matter how stressful long lineups or ringing phone lines may be. The question, after all, is equally important to the patron asking it when there are few people in the library as when the library is serving hundreds. Although those library staff members who were experiencing busy times at the reference desk achieved high rates for complete and partially complete answers, it might be expected that others who were not so busy might have spent more time with patrons, especially in light of the findings discussed in the following paragraphs.

Proxies gathered information about how long library staff members spent with them in answering their questions. The amount of time spent was precisely recorded, but for the purposes of this study, minutes were grouped into the following categories: 1-4 minutes; 5-9 minutes; 10-14 minutes; 15-19 minutes; and 20 minutes or more. Figure 14 presents an overall picture of how much time was spent in answering proxy-administered questions.

[Figure 14]

It is apparent that, in a plurality of cases (33%), library staff members spent between one and four minutes with each patron (160 questions). Indeed, 57% of the time (160 questions + 119 questions), library staff members spent nine minutes or less dealing with each reference question. Still, in a substantial number of cases (34%), library staff members spent more than 10 minutes with each patron (169 questions), and 18% of the time (89 questions) they spent 15 minutes or more attempting to find a satisfactory answer to a reference question. In fact, collected data show that some librarians spent approximately one hour trying to assist proxies.

Another point of interest was the relationship between time spent with patrons and complete or partially complete answers. Figure 15 summarizes these findings. Differences in types of answers received are statistically significant (χ²=70.29, df=15, p < .01). In those reference encounters where a staff member spent between one and four minutes with a patron, complete answers were received only 11.25% of the time (18 out of 160 questions), while complete and partially complete answers were received at a rate of 21.25% (34 out of 160 questions). As the amount of time spent with a patron increased, however, the number of complete and partially complete answers rose significantly. For instance, spending between five and nine minutes with a patron led to complete answers 31.9% of the time (38 out of 119 questions), and to complete or partially complete answers 43.7% of the time (52 out of 119 questions). In those instances where a library staff member spent more than 10 minutes with a patron, the rate of complete and partially complete answers rose to 56.8% (96 out of 169 questions). When the staff member devoted 20 minutes or more to a reference question, the rate rose still further, to 65.2% (30 out of 46 questions).

[Figure 15]

With referrals, the opposite tendency was observed. In situations where one to four minutes were spent with a patron, the referral rate was 29.4% (47 out of 160 questions). When a library staff member spent more than 10 minutes with a patron, however, the referral rate fell to 11.2% (19 out of 169 questions), and when more than 20 minutes was devoted to a question, the referral rate was a very low 2.2% (1 out of 46 questions). In phoneback situations, where the librarian might be under less pressure in searching for an answer, the rate of complete and partially complete answers was 62.5% (25 out of 40 questions) – approximately the same rate as for questions where the librarian spent more than 20 minutes assisting a patron. In sum, the more time a library staff member is able to spend with a patron, the greater the chances that the patron will receive a complete or partially complete answer.

Results suggest that, given sufficient time and opportunity, library staff members are able to achieve very high rates of complete and partially complete answers. One troubling finding does emerge from a further parsing of the data. When library staff members were judged by proxies to be not busy, and when these library staff members subsequently spent only one to four minutes answering proxy-administered questions, the rates for complete and partially complete answers were very low – 19.1% and 17.9% in academic depositories and public depositories, respectively.

6.4 Variation in Service Levels by Question Variables

Proxies asked their questions either by telephone or by appearing in person at a government documents reference area. A total of 163 telephone questions was asked, while 325 questions were asked by proxies in person.

[Figure 16]

Figure 16 shows the impact of question delivery method on the type of response received. A greater number of complete and partially complete answers was received when reference questions were delivered in person than when they were asked over the telephone. In-person questions generated complete or partially complete answers 45.85% of the time (149 out of 325 questions), approximately 10% more than telephone questions (58 out of 163 questions, or 35.58%). On the other hand, more referrals were given to telephone questions than to walk-in questions. These results are statistically significant (χ²=4.68, df=1, p < .05). And, as Table 9 shows, the findings from Figure 16 hold true when results are broken down by type of depository library.

 

Table 9: Method of Question Delivery and Depository Library Types

                          Complete or Partially Complete Answers
                          Walk-in             Phone
  Academic full           46/80  (57.50%)     19/47 (40.42%)
  Academic selective      18/48  (37.50%)      6/17 (35.29%)
  Public full             23/40  (57.50%)      7/19 (36.84%)
  Public selective        62/157 (39.49%)     26/80 (32.50%)

Telephone questions were answered less successfully than walk-in questions in full depository libraries. The greatest divergence – over 20% – occurs in public full depositories; the difference in academic full depositories is a little more than 17%. Selective depositories, on the other hand, seem able to give complete and partially complete answers to telephone and walk-in questions at an equal or almost equal rate.

The above circumstance is all the more surprising in light of the data contained in Table 10, which shows differences in time spent with patrons according to method of question delivery and type of depository library.

 

Table 10: Time Spent on Reference Questions

                          Walk-in Questions              Phone Questions
                          1-4 min.    > 10 min.          1-4 min.    > 10 min.*
  Academic full           27.50%      43.75%             40.43%      51.06%
  Academic selective      33.33%      43.75%             47.06%      41.17%
  Public full             37.50%      42.50%             31.58%      31.57%
  Public selective        31.21%      40.76%             31.23%      41.25%

  * includes phone back situations

When phone back situations are included in the category "more than 10 minutes," three of the four depository types – the exception being public full depositories – spent 10 or more minutes with patrons who telephoned at approximately the same rate as, or a greater rate than, with patrons who asked their questions in person. Indeed, public selective libraries display the most consistency in this regard: an almost equal percentage of time was devoted to telephone questions and to walk-in questions in both the "one to four minutes" and "greater than 10 minutes" categories. Given the data in Table 10, it is surprising that complete or partially complete answers were provided to walk-in questions at a 10% greater rate than to telephone questions, as noted in Figure 16. It would be logical to expect the efficacy rates to be about the same. One reason may be that library staff members can get a better sense of a question when it is delivered in person.

Another question variable of interest in this study was whether the subject matter of the question dealt with the legislative or the executive branch of government. In total, proxies asked 324 executive questions and 164 legislative questions. As can be seen from Figure 17, complete and partially complete answers were provided to legislative questions at a statistically significantly higher rate than to executive questions (χ²=24.92, df=3, p < .01).

[Figure 17]

While legislative questions were completely or partially completely answered at a rate of 48.2% (79 out of 164 questions), executive branch questions received complete or partially complete answers 39.5% of the time (128 out of 324 questions). Moreover, legislative questions were referred at a substantially lower rate than were executive branch questions. While executive branch questions were referred at a rate of 26.2% (85 out of 324 questions), legislative branch questions were referred only 7.9% of the time (13 out of 164 questions). Table 11 confirms these findings by analyzing the four types of depository libraries and their ability to provide complete and partially complete answers to legislative or executive branch questions.

 

Table 11: Comparison of Legislative and Executive Questions

                          Complete and Partially Complete Answers
                          Executive           Legislative
  Academic full           35/75  (46.67%)     30/52 (57.69%)
  Academic selective      15/40  (37.50%)      9/25 (36.00%)
  Public full             16/38  (42.11%)     14/21 (66.67%)
  Public selective        62/171 (36.26%)     26/66 (39.39%)

Whereas selective depositories provide complete and partially complete answers to legislative and executive questions at about the same rate, the difference between the two types of questions is most readily apparent in public full depositories and academic full depositories. Full depositories provide complete or partially complete answers to legislative branch questions at a rate that exceeds that for executive branch questions by 10%-25%. One reason for this phenomenon may be that legislative questions are more homogeneous than executive branch questions. While the subject matter of legislative questions can be as broad as that of executive branch questions, the locations in which the answers to legislative questions can be found are limited in number. For instance, once a librarian knows how to find one statute, one bill, or one comment in Hansard, the answer to any subsequent question dealing with statutes, bills, or debates will be found in the same location. It should also be noted that the questions themselves often include clear references to appropriate sources, e.g. "Where can I find a bill about Topic X?" Executive branch questions, on the other hand, are heterogeneous not only in subject matter but also in potential locations.

The present study also wanted to know whether a distinction prevalent in the scholarly literature of librarianship between data-retrieval questions and document-retrieval questions was germane to government-based reference questions. As noted above, Katz (1996) defines data retrieval questions as those in which individuals ask "specific questions and expect answers in the form of data." Document retrieval queries are those in which patrons "want information, not just simple answers," and this information is "usually in the form of some type of document" (p. 18). Figure 18 examines whether depository libraries as a group provided a greater number of complete or partially complete answers to data retrieval questions or to document retrieval questions.

[Figure 18]

Data-retrieval questions were asked 195 times, while document-retrieval questions were asked 293 times. Complete answers were provided to document-retrieval questions at a slightly higher rate (31.7%; 93 out of 293 questions) than to data-retrieval questions (25.6%; 50 out of 195 questions). When complete and partially complete answers are combined, the rate for document-retrieval questions is 45.4% (133 out of 293 questions), while the rate for data-retrieval questions is 37.9% (74 out of 195 questions). Data-retrieval questions (61 out of 195 questions, or 31.3%) were referred at more than twice the rate of document-retrieval questions (37 out of 293 questions, or 12.6%). Differences are statistically significant (χ²=25.86, df=3, p < .01).
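The statistic reported for Figure 18 can in fact be reconstructed from the counts given in the paragraph above. The partial-answer and no/incorrect cells are derived by subtraction (e.g., partial = combined complete-and-partial minus complete; the remainder of each row is no/incorrect), an inference from the reported figures rather than a count taken from the study's data file.

```python
# Reconstructing the chi-square reported for Figure 18 (data- vs.
# document-retrieval questions) from the counts given in the text.
# Columns: complete, partial, referral, no/incorrect. Partial and
# no/incorrect cells are derived by subtraction from reported totals.

observed = [
    [50, 74 - 50, 61, 195 - 74 - 61],    # data retrieval (n = 195)
    [93, 133 - 93, 37, 293 - 133 - 37],  # document retrieval (n = 293)
]

row_totals = [sum(r) for r in observed]
col_totals = [sum(c) for c in zip(*observed)]
grand = sum(row_totals)  # 488 questions in all

chi2 = sum(
    (obs - row_totals[i] * col_totals[j] / grand) ** 2
    / (row_totals[i] * col_totals[j] / grand)
    for i, row in enumerate(observed)
    for j, obs in enumerate(row)
)
df = (len(observed) - 1) * (len(observed[0]) - 1)  # (2-1) * (4-1) = 3
print(f"chi-square = {chi2:.2f} on {df} df")  # prints chi-square = 25.86 on 3 df
```

The computed value matches the reported χ²=25.86 (df=3) exactly, which supports the inferred cell decomposition. The largest contributions come from the referral cells, consistent with the observation that data-retrieval questions were referred at more than twice the rate of document-retrieval questions.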

6.5 Referrals and No/Incorrect Answers

The previous sections have concentrated on the various rates of complete and partially complete answers in different categories. This section takes a closer look at the reasons for no/incorrect answers, as well as at the types of referrals made by library staff members. Also presented are comparative data about questions that were referred and questions to which no answer or an incorrect answer was provided. In total, 98 questions (20.1%) were referred to various governmental and non-governmental institutions. As Figure 19 shows, fully half the referrals (49) were to government departments. Proxies were referred to governmental or legislative libraries 7% of the time (7 referrals) and to other non-governmental libraries, usually at a university, 29% of the time (28 referrals). In addition, 14% of the time proxies were referred to external non-governmental agencies or establishments that were not libraries (14 referrals).

[Figure 19]

Of the referrals made to government departments, 65.3% were made by public selective depositories. This should not be surprising, given that many public selective libraries do not collect a wide array of government documents. More interesting is that academic full depositories and public full depositories each referred patrons to government departments at a rate of 16.3%, despite holding the full range of official publications. Of the referrals to other non-governmental libraries, 71.4% were made by public selective and academic selective depositories. Again, this is entirely understandable, given the incomplete nature of the holdings of selective libraries. Yet academic full depositories accounted for the remaining 28.6% of referrals to other non-governmental libraries. Such referrals may be explained by a desire to provide patrons with the most current information, or staff members may not be fully aware of the resources held by their own library.

No/incorrect answers were received by proxies 183 times, a rate of about 38%. As Figure 20 demonstrates, the most common explanation for a no/incorrect answer was that the library staff member simply did not know how to find the needed information; this happened in 37% of such cases (67 out of 183 questions). The next most common reason was that the staff member provided an answer that was inaccurate; this happened 21% of the time (38 out of 183 questions). Ten percent of no/incorrect answers (18 out of 183 questions) resulted from the unwillingness of library staff members to answer proxy questions – a circumstance which may, of course, mean that library policy precludes providing a certain type of service over the phone, or to those not perceived to be primary clients. Indeed, in the case of telephone queries, where the patron cannot see the selection of available resources, some libraries may have a specific policy of referring telephone questions quickly to a government contact. Finally, 27% of the time (49 out of 183 questions), proxies were told either to come into the library, phone back, or come back at a more opportune time. Again, some libraries may have policies that encourage users to come in and decide for themselves which resources best fit their needs, or to work through what a staff member may have thought would be a complicated reference interview. Sources were unavailable nine times (5%), and the remaining two cases involved other reasons.
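
The reason counts quoted above sum exactly to the 183 no/incorrect answers, so the percentages can be checked directly. In this sketch the two-case "other" category is inferred as the remainder:

```python
# Reasons for the 183 no/incorrect answers, with counts from the text.
# The "other" category is inferred: 183 - (67 + 38 + 18 + 49 + 9) = 2.
reasons = {
    "did not know how to find it": 67,
    "tried, but answer inaccurate": 38,
    "unwilling to answer": 18,
    "told to come in / call back": 49,
    "sources unavailable": 9,
    "other": 183 - (67 + 38 + 18 + 49 + 9),
}
total = sum(reasons.values())

for reason, n in reasons.items():
    # Percentages reproduce the rounded figures in the text (37%, 21%, ...).
    print(f"{reason:30s} {n:3d}  {100 * n / total:4.1f}%")
```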

[Figure 20]

Staff members at public selective depositories accounted for 52.2% (35 out of 67 questions) of the "did not know" category. More problematic is that staff members at academic full depositories told proxies that they "did not know" at a rate of 20.9% (14 out of 67 questions). Staff at public full depositories accounted for 9% of this category (6 out of 67 questions). Executive branch questions drew the majority of "did not know" replies (68.7%, or 46 out of 67 questions), compared with legislative branch questions (31.3%, or 21 out of 67 questions).

Staff members at public selective depositories accounted for 47.4% of the "tried but got incorrect answer" category (18 out of 38 questions). Their counterparts at academic full depositories fell into this category at a rate of 21.1% (8 out of 38 questions), and staff at public full depositories at a rate of 13.2% (5 out of 38 questions). Executive branch questions likewise drew the majority of "tried but got incorrect answer" replies (65.8%, or 25 out of 38 questions), compared with legislative branch questions (34.2%, or 13 out of 38 questions).

Figure 21 addresses whether the type of depository library had an impact on the rate of referrals and no/incorrect answers. Academic full depositories, public full depositories, and public selective depositories gave no/incorrect answers at approximately the same rate, hovering between 34% and 36%. Academic selective depositories provided no/incorrect answers at a significantly greater rate – 53.9%. Referrals were made by academic full depositories, academic selective depositories, and public full depositories at an approximately equal rate, ranging from 9% to 15%, while public selectives referred patrons at a rate of almost 30%. Not surprisingly, the depositories with full collections of government documents were the least likely to make referrals.

[Figure 21]

Figure 22 provides a detailed look at referrals and no/incorrect answers by geographical area. Depository libraries in Ontario provided no/incorrect answers at a rate of only 25.4% (42 out of 165 questions) and referred patrons elsewhere only 17% of the time (28 out of 165 questions), clearly the best record in Canada. Depository libraries in the Atlantic Provinces gave no/incorrect answers at a rate of 37.3% (28 out of 75 questions), while these libraries referred questions elsewhere 21.3% of the time (16 out of 75 questions). British Columbia is much like Atlantic Canada in this respect. Depository libraries in British Columbia provided no/incorrect answers at a rate of 39.6% (21 out of 53 questions) and referred at a rate of 15.1% (8 out of 53 questions). The Prairie Provinces and Québec had no/incorrect answer rates of 45.6% (41 out of 90 questions) and 48.6% (51 out of 105 questions), respectively. Referral rates for the Prairies and Québec were 22.2% (20 out of 90 questions) and 24.8% (26 out of 105 questions), respectively.

[Figure 22]

The relatively high levels of no/incorrect answers and referrals revealed in this study are a cause for concern. One explanation may be found in a study conducted by Harris and Marshall (1998). They provide evidence, based on interviews with Canadian library directors, that restructuring imperatives occasioned by budget constraints have led to the increased use of paraprofessionals in reference functions. Since paraprofessionals may not have the necessary background knowledge and specialized skills to answer government documents questions, a certain deskilling of reference work may be taking place. In addition, Harris and Marshall note that library directors envision librarians primarily as managerial material – a circumstance which may result in high-quality reference librarians being taken away from desk duty and placed in exclusively administrative roles. Further study is needed to determine the impact on patrons of no/incorrect responses and referrals. Do patrons become discouraged? Frustrated? Do they follow up on referrals or give up? Do they turn to other sources for the information that they need?

6.6 Sources Used to Answer Questions

An important aspect of this study concerns the kinds of sources library staff members used to answer government documents reference questions. As indicated earlier, all 15 different questions asked by proxies could be answered using Internet-based Web resources. Dolan and Vaughan (1998) report that, by the end of 1996, 89% of depository libraries had Internet access, and that of the 11% that did not have Internet access in December 1996, some 70% were planning to have such access within one year. When the present study was conducted in December, 1997, it was not unreasonable to suppose that Internet access was available in some 96% of federal depository libraries. Dolan and Vaughan (1998) also report that print sources are used much more frequently in depository libraries than are electronic sources. The present study therefore wanted to determine the extent to which library staff turn to various types of sources to answer patron questions. Figure 23 provides a summary of staff source use.

[Figure 23]

Print materials constitute the largest single source (45.7%) used by staff members at depository libraries (223 out of 488 questions). The Web alone was used 11.5% of the time (56 questions), and the Web in combination with another source was used at a rate of 5.5% (27 questions). Thus in whole or in part, Web use to answer government document reference questions hovers around 17% (83 questions). About 23% of the time no sources were consulted (112 questions), and in an additional 9.6% of cases (47 questions), the only source used was a library on-line public access catalogue (OPAC). CD-ROMs or databases were used at a rate of 3.7% (18 questions), while microforms were employed at a rate of just over 1% (5 questions).
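
The source counts reported in this paragraph sum exactly to the 488 test questions, so the quoted shares can be tallied directly, as in this sketch:

```python
# Sources used across all 488 test questions (counts from the text).
sources = {
    "print": 223,
    "Web alone": 56,
    "Web + another source": 27,
    "no sources": 112,
    "OPAC only": 47,
    "CD-ROM / database": 18,
    "microforms": 5,
}
total = sum(sources.values())

for name, n in sources.items():
    print(f"{name:22s} {100 * n / total:5.1f}%")

# "Web in whole or in part" combines the two Web categories: 83 of 488.
web_share = 100 * (sources["Web alone"] + sources["Web + another source"]) / total
print(f"Web in whole or in part: {web_share:.1f}%")  # 17.0%
```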

Can more be said about the use of "no sources"? Of the 112 questions for which no sources were used, 55 were walk-in questions and 57 were telephone questions – an almost equal division. Was there a difference among depository libraries in their use of no sources?6 On the 112 occasions when proxies indicated that no sources were consulted, the library was a public selective depository 48.2% of the time, an academic full depository 30.4% of the time, an academic selective 14.3% of the time, and a public full depository only 7.1% of the time. There were also variations in the sources used depending on the type of depository library; Figure 24 addresses this issue. Web usage at academic full, academic selective, and public full depository libraries ranges from 21.1% to 25.2% – statistically indistinguishable rates. Public selective libraries, however, used Web-based sources at a significantly lower rate – only 12.2% of the time (χ²=9.37, df=9, p<.05). Use of print sources at public full depositories and public selective depositories is around 53%, while in academic full and selective depositories, print use hovers around 40%.

[Figure 24]

One explanation for the lower print use at academic depository libraries, and for their roughly 20% use of Web-based sources, is offered by Dolan and Vaughan (1998), who report that academic libraries tend to have higher-bandwidth connections to the Internet, which facilitate speedy access to Web pages. Since public full depository libraries are located in metropolitan areas that tend to have the telecommunications infrastructure to support high bandwidth, they too use Web sources at a rate of about 20%. As smaller public selectives gain access to higher-bandwidth connections, use of the Web to answer government documents questions in these libraries may be expected to increase.

Are different types of sources used in different regions of Canada? Depositories in Ontario, the Atlantic Provinces, and British Columbia make use of the Web at a statistically significantly greater rate than do depository libraries in Québec and on the Prairies (χ²=49.75, df=16, p<.01). Depositories in Atlantic Canada use the Web at a rate of 22.7% (17 out of 75 questions), those in Ontario at a rate of 21.2% (35 out of 165 questions), and those in British Columbia 18.9% of the time (10 out of 53 questions) – rates that are, for all intents and purposes, indistinguishable. In Québec, however, Web usage in depository libraries drops to 13.3% (14 out of 105 questions), while on the Prairies, use of Web resources declines further still to 7.8% (7 out of 90 questions). Figure 25 summarizes these findings.

[Figure 25]

Depository libraries in all regions of Canada use print sources at approximately double the rate that they use Web-based resources. In Atlantic Canada, print sources are used 41.3% of the time (31 out of 75 questions). Depository libraries in the Prairie Provinces use print sources at a rate of 55.6% (50 out of 90 questions), more than depository libraries in any other region. Depository libraries in Ontario also have a high rate of print use (53.3%) (88 out of 165 questions). In Québec, print sources are used 29.5% of the time (31 out of 105 questions), while in British Columbia and the North the rate of print use is 43.4% (23 out of 53 questions).

Do depositories located in larger census metropolitan areas use Web sources at a higher rate? As shown in Figure 26, there does seem to be a slight trend in this direction (χ²=29.63, df=16, p<.05). In all census areas of up to one million inhabitants, use of Web sources is around the 13%-16% mark, while in metropolitan areas of more than one million people, Web use increases to 22%. Depository libraries in large metropolitan areas may have access to more financial resources and sophisticated Web-based technology than their counterparts in less populous regions.

[Figure 26]

Depository libraries in the smallest census areas used OPACs at twice the rate of depository libraries in census areas with populations greater than 100,000, and used print sources much less than their counterparts in larger census metropolitan areas. The level of service at depository libraries in census areas of fewer than 100,000 people thus appears to lag behind that of larger areas. This may simply reflect a lack of government documents to which patrons may be directed, or it may reflect a lack of knowledge about government documents collections among library personnel at these depositories, whose first step is often to consult their OPAC.

Were there certain types of questions for which Web sources were more popular than print sources? Figure 27 tracks sources by individual questions. In general, it is readily apparent that, for most questions, print sources were much more popular than Web-based sources.

[Figure 27]

A trend may nevertheless be discerned. There are four questions in which either Web use was greater than print use or where print and Web sources were used at approximately the same rate: the Hansard question about the Magdalen Islands; the question about a committee report on firearms legislation; the question about the Alternative Fuels Act; and the bibliographic question about the price of a government-published book. Three of these four questions deal with legislative-branch issues. Relatively high Web use (in comparison with print use) in searching for answers to these questions may indicate that staff members in depository libraries are familiar with the extensive legislatively-based information available on the Canadian Parliamentary Web site. At the same time, library personnel do not appear to be sufficiently familiar with the range of executive-branch information that is also available on the Web, since in eight out of the nine questions dealing with the executive branch they employed print sources to a greater extent than they did Web sources.

Figure 28 substantiates these observations. For the five legislative branch questions taken as a group, staff members at depository libraries used Web sources at a rate of 24.5%, while for the 10 executive branch questions, staff at depository libraries employed Web sources 14.7% of the time. Still, print sources were by far the most preferred source for both types of questions. Indeed, use of print sources was more than double that of Web sources for legislative and executive questions.

[Figure 28]

A similar result occurs when questions are categorized as data-retrieval or document-retrieval. As shown in Figure 29 below, Web sources were used to answer document-retrieval questions 19.2% of the time, and data-retrieval questions 16% of the time. Again, the difference is slight but suggestive of the way library personnel are using the World Wide Web to locate government information. Library staff are more comfortable retrieving documents by electronic means than retrieving isolated facts and statistics. One reason may be that they do not trust the often ephemeral nature of Web-based documentation as a source of reliable data.

[Figure 29]

Figure 30 explores the issue of whether there is a relationship between certain types of source use and the method of question delivery. Web-based resources were used at almost equivalent rates when answering telephone and walk-in questions – 18.5% for telephone questions and 17.5% for walk-in questions. Print sources, however, were used at a greater rate for walk-in questions (55.1% of the time) than they were for telephone questions (34.6% of the time). The most troubling finding here is that "no sources" were used at a significantly higher rate for phone questions (35.2%) than they were for walk-in questions (18.2%).

[Figure 30]

As wireless telecommunications technology becomes even more ubiquitous, a greater proportion of reference questions may come to be asked over the telephone. In addition, experts in demographics have noted that, in the coming decades, a greater proportion of the population will be elderly and, consequently, less mobile. There are also a significant number of people who have permanent or temporary difficulties with protracted physical movement. The information needs of the elderly and differently-abled are no different from those of the general population. Accordingly, depository libraries may wish to pay special attention to the data presented in Figure 30 about the discrepancy in "no source" use between questions asked over the telephone and those asked in person.

When all is said and done, library personnel consistently turn to print sources instead of Web-based resources. Reasons for this could be many. Some library staff may feel that government servers are too slow, that government search engines are ineffective, or that the necessary information is contained in Adobe Acrobat files that are either inaccessible or too large to print. Another reason for this may be the philosophy put forward by Devlin (1997) that the Internet should be chosen as an information source only if the question is unlikely to be answered elsewhere, or if other sources have been unsuccessful, or if a comprehensive search is required. Devlin's approach may be valuable for many general reference questions, but his searching strategy model may not be appropriate for government documents questions. Government information on the Web is usually reliable since it is posted by government departments and agencies themselves. Moreover, Benson (1995) suggests that, if a previously identified credible Internet source has been located, it should be consistently used as an information source. Canadian government documents are readily available on well-established and stable Web-based platforms. Library reference departments may want to consider adopting a service policy that states that, if a question seems to be a government documents question, a staff member should consult Web-based sources early on in the search.

6.7 Efficacy Rates and Source Types

It is clear that depository library personnel favour print sources by a wide margin over Internet-based Web sources. Such practice is no doubt based on long-standing habit and experience, as well as on the belief that complete answers may be found more readily and more quickly in print sources. Is that belief accurate?

Figure 31 shows complete and partially complete answers by type of source used. A good place to begin is with complete answers – perhaps the best indication of the value of individual sources. When print sources alone are used, complete answers are found at a rate of 39.9% (89 out of 223 questions). When Web sources alone are used, the complete-answer rate soars to 60.7% (34 out of 56 questions) – an increase in efficacy of approximately 50%. Complete and partially complete answers in combination are achieved at a rate of 78.6% (44 out of 56 questions) when Web sources are used alone, compared with 60.1% (134 out of 223 questions) when only print sources are employed. These differences are statistically significant (χ²=10.27, df=3, p<.05).
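
The efficacy comparison, including the roughly 50% relative gain for Web-only searching, can be checked from the counts in the text:

```python
# Complete and combined (complete + partially complete) answer counts
# by source, from the text: 223 print-only uses, 56 Web-only uses.
print_n, web_n = 223, 56
complete = {"print": 89, "Web": 34}
combined = {"print": 134, "Web": 44}

print_complete_rate = 100 * complete["print"] / print_n   # 39.9%
web_complete_rate = 100 * complete["Web"] / web_n         # 60.7%

# The "approximately 50% increase in efficacy" quoted in the text:
relative_gain = web_complete_rate / print_complete_rate   # ~1.52

print(f"complete answers: print {print_complete_rate:.1f}%, "
      f"Web {web_complete_rate:.1f}%")
print(f"complete + partial: print {100 * combined['print'] / print_n:.1f}%, "
      f"Web {100 * combined['Web'] / web_n:.1f}%")
```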

[Figure 31]

In sum, what Figure 31 shows is a step-like progression in efficacy rates. Print is the least effective for achieving either complete or partially complete answers; the second most efficacious results are accomplished when the Web is used in combination with another source; and the best results are achieved when the Web is used as the sole source for government information retrieval.

As Table 12 demonstrates, these differences in efficacy rates between Web sources and print sources were most pronounced in public selective depositories, where Web sources led to complete and partially complete answers 84.2% of the time while print sources produced the same type of answers 52.9% of the time. The difference is more muted in academic full depositories and public full depositories, although still present. Web sources led to complete and partially complete answers at a rate of 81% in academic full depositories, while print sources did so at a rate of 70.8%. In public full depositories, Web sources led to complete and partially complete answers 75% of the time, while print sources did so 67.7% of the time. Even though the World Wide Web as a storehouse of information and knowledge was still, in the late 1990s, in its infancy, this study offers some evidence to suggest that it has surpassed print sources in many ways as a means of retrieving complete or partially complete answers to government documents reference questions.

 

Table 12: Type of Answer Received by Source
(Complete and Partially Complete Answers)

                     Web Alone   Print Alone
Academic full         80.95%      70.83%
Academic selective    62.50%      65.22%
Public full           75.00%      67.74%
Public selective      84.21%      52.89%


Are librarians adequately prepared to find answers to reference questions using Web sources? Figure 32 provides some insight into this issue. Library personnel appear much more at ease using print sources than Web sources for questions that take nine minutes or less. The difference is striking: 54.26% of print-source uses (n=121) lasted nine minutes or less, compared with only 28.57% of Web-source uses (n=16). This seems to indicate that library staff members gravitate to print sources when they think they have a good notion of where to find a piece of information. Yet, although print sources were used for nine minutes or less 54.26% of the time, the rate of complete answers found with those print sources was only 34.6%. On the other hand, despite being used only 28.57% of the time in reference situations of nine minutes or less, Web-based sources provided complete answers at a rate of 50%. One explanation may be that Web-based resources are more efficacious than print for answering test questions of short duration. Librarians may need more training and background knowledge about the type of government information available on the Web.

[Figure 32]

Another sign of the need for additional training in Web sources is that, when staff members at depository libraries take 15 minutes or more to respond to patron questions, they are using Web sources at approximately twice the rate (32.14%) of print sources (17.49%) – an indication, perhaps, that they are not yet at ease with the complexities of Web-based government tools. Nonetheless, when used for 15 minutes or more, Web-based sources provided complete answers at a rate of 44.4% (n=18), while print sources led to complete answers at a rate of 38.5% (n=39).

Another explanation of the phenomena described above is that the Web qua Web is not a better tool than print, but that the library staff members who exploit the Web are more skilled and expert than their colleagues. In other words, success in finding government information on the Web is related to the Web-searching knowledge that some library staff possess. It is interesting to note from Table 12 that the difference between Web- and print-based resources becomes muted in the full depositories and is virtually non-existent in academic selective depositories – the types of libraries where expertise might be expected to be vested in a specialist who works daily with both media and who faces a high volume of government questions. Certainly, the Web broadens the base of tools available to library staff, but staff must know how to use it intelligently.

There are other issues which need to be addressed before library staff make full use of Web resources as a matter of course. The authority of Internet-based Web sources is one major concern. Many authoring departments stipulate that print versions are the authoritative versions of texts. One government Web site, for instance, clearly states that "In the event of a discrepancy between the electronic version and a hard copy publication, the hard copy will be considered the accurate version." Another site warns that "inadvertent errors can occur for which no responsibility is accepted." In addition, librarians have reported important elements, especially tables and charts, missing from electronic copies of publications. In these circumstances, it is quite logical to expect some librarians to turn to print sources before Web sources. The authority of Web sites remains a serious issue that should be addressed. Another concern for librarians is the question of use restrictions on government sites, although some sites allow the downloading of "one copy of the materials on any single computer for your personal, non-commercial home use only."

A number of technical issues surrounding government sites may also inhibit librarians from turning to Web-based sources. For instance, the use of frames and graphics, as well as PDF formats and proprietary software such as FOLIO, are problematic, especially where public service sites have multiple functionalities. Librarians may also feel that search engines are less than adequate, given that some departments still use HARVEST and that other engines only search HTML documents and do not pick up PDF documents. Moreover, some librarians, having become accustomed to sophisticated Web-site search engines that contain such features as exact phrase, word proximity, date or database limitation, and truncation features, may find government search engines to be lacking in some of these areas.

Many documents do not contain the all-important accompanying "metadata," despite the existence of basic standards for Internet publication within the government. Archiving policies are not yet in place, with the result that documents appear and then disappear. For librarians trained in the integrity and reliable accessibility of information, this circumstance is disquieting. A final issue is that some Web sites and Web addresses are not stable, resulting in confusion for the library community and much extra work updating electronic bookmarks.

Dolan and Vaughan (1998) present evidence to suggest that staff preparedness to help patrons with "electronic access and competent delivery of electronic government publications" is lacking because of the "absence of funding, the dearth of training programs, and the lack of time available for acquiring and passing on expertise in dealing with electronic sources of government information." The findings of the present study confirm that library personnel are not comfortable using Web-based government information sources. There are many reasons for such discomfort, as the previous paragraphs have tried to suggest. There is overall low use of Web sources in comparison with print sources; library personnel devote more time to finding answers on the Web than they do in locating answers in print sources. At the same time, this study has also shown that, when Web sources are employed, such use translates into a better reference efficacy rate, as measured by complete and partially complete answers.

Should the DSP decide to commit itself to a training and educative mission, it would be logical to stress Web functionality. Nevertheless, it is important that government departments realize that improved "metadata," indexing, and archiving, as well as better search engines and enhanced subject access, will go a long way toward making Web-based federal information easier to find. In addition, library staff members should have a detailed knowledge of the functions of each government department and branch in order to make effective and efficient use of government Web sources – in other words, a grasp of who does what in the federal government. Training courses developed by the DSP should stress this aspect of government documents reference work. Once staff members can readily identify a potential question as falling within the governmental realm through their knowledge of who does what, it becomes much easier to identify the electronic site where the desired information may be found.


7. The Nature of Proxy-Administered Reference Questions

7.1 The Relative Degree of Difficulty of the Questions

Previous sections of this report deal with the proxy-administered questions as a group. Attention is directed in this section to individual questions with a view to determining which were the easiest and which were the most difficult to answer. The individual characteristics of each question are examined with specific focus on the relative ease with which each can be found using Web sources. The average time spent by library staff members in answering each question is compared with the average times that the student pre-testers took to locate answers to the same questions.

Figure 33 shows the percentage of complete and partially complete answers to each of the 15 reference queries. Four questions –"crtc," "audgen," "rules," and "crime"– were supplied with complete or partially complete answers at a rate of 70% or more; three of these four are what Katz characterizes as document-retrieval questions. Nine questions had complete or partially complete answers at a rate of 30% or more; six of these nine –"audgen," "rules," "crime," "magdalen," "fisheries," and "refugee"– are also document-retrieval questions. These findings show that queries for which documents are to be retrieved appear to be easier to answer than data questions, a conclusion supported by an examination of the three questions –"books," "lyrics," and "fuels"– that show complete or partially complete success rates of less than 20%.

[Figure 33]

Two of these questions, "books" and "lyrics," clearly fall into the data-retrieval category. The "books" question asks for a specific price; "lyrics" asks for the percentage of French lyrics in Canadian-content sound recordings. The "fuels" question was classified as a document-retrieval question because it asks specifically for the text of an act, though not by name. However, this question was deliberately written not merely to ask, for instance, "Can I get a copy of the Alternative Fuels Act?," but to request particular data about standards connected to unconventional fuels. The wording of this question therefore tests the hypothesis that data questions are more difficult than document questions, insofar as another question – that dealing with the Fisheries Prices Support Act – mentions its act by name and is among the nine questions most often answered completely or partially completely. Since the "fuels" question is among the three lowest for complete and partially complete answers, this lends additional credence to the conclusion, indicated in Figure 18, that data questions are harder to answer than document questions.

Figure 34 ranks the questions in descending order by rate of complete answers. Three clear groupings emerge: questions answered completely at a rate of 60% or more; questions answered completely at a rate of between 20% and 40%; and questions answered completely less than 20% of the time. By a wide margin, the "audgen" and "crime" questions were those answered completely most frequently. Again, both of these are document questions.


Figure 35 ranks the questions in descending order by rate of no or incorrect answers. While it might be expected that Figure 35 would be the exact reverse of Figure 34, this is not the case, because differing rates of partially complete answers and referrals affect the rates of complete and no/incorrect answers. Five questions had no/incorrect rates of 50% or more; four had rates of between 30% and 50%; six had rates of less than 30%. The two most difficult questions appear to be "fuels" and "firearms." Both are also in the group that received complete answers less than 20% of the time. These consistent results identify "fuels" and "firearms" as the questions library personnel found the most difficult to answer.


Table 13 compares the average time (in minutes) spent finding complete or partially complete answers by student pre-testers and by library personnel in depository libraries. These averages do not include the time spent by library personnel who gave referrals or provided no/incorrect answers. The averages should be read as indications of tendencies only.

Given this caveat, on seven of the questions (indicated with a checkmark) student pre-testers found complete or partially complete answers more than two minutes faster than library personnel. On six questions the difference was two minutes or less; for all intents and purposes, the time spent on these six questions should be considered equivalent. On the other hand, library personnel located complete or partially complete answers significantly faster for two questions (indicated with a checkmark). Since both student pre-testers used Web-based resources to answer questions, these data suggest that library personnel might find answers more efficiently if they were better trained in using Web sources.

 

Table 13: Average Time (in minutes) Spent Finding Answers

Question      Student Pre-Testers*   Library Personnel who found complete
                                     or partially complete answers*
Crtc                         2 ✓                        9
Book                        12.5                       12
Barley                        10                        8
Lyrics                      15.5                       15
Fuels                        5 ✓                       10
Firearms                      15                   10.8 ✓
Audgen                     3.5 ✓                      6.7
Crime                       10 ✓                     12.3
Magdalen                    10 ✓                     12.6
Rules                      3.5 ✓                      9.8
Refugee                     12.5                     11.5
Garbage                no answer                    8.3 ✓
Photo                        7.5                      6.5
Fisheries                  5.5 ✓                     11.6
Africa                       6.5                        8

* all times over 20 minutes recorded as 20 minutes
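The two-minute grouping described above can be sketched as a short classification over the Table 13 values. This is an illustrative reconstruction, not the study's own code; the "no answer" entry for "garbage" is treated as a case where library personnel were faster.

```python
# Sketch of the two-minute grouping applied to Table 13.
# Each entry is (student pre-tester minutes, library personnel minutes);
# None marks the "no answer" entry for the "garbage" question.
times = {
    "crtc": (2, 9), "book": (12.5, 12), "barley": (10, 8),
    "lyrics": (15.5, 15), "fuels": (5, 10), "firearms": (15, 10.8),
    "audgen": (3.5, 6.7), "crime": (10, 12.3), "magdalen": (10, 12.6),
    "rules": (3.5, 9.8), "refugee": (12.5, 11.5), "garbage": (None, 8.3),
    "photo": (7.5, 6.5), "fisheries": (5.5, 11.6), "africa": (6.5, 8),
}

students_faster, equivalent, staff_faster = [], [], []
for q, (student, staff) in times.items():
    if student is None:          # pre-testers found no answer at all
        staff_faster.append(q)
    elif staff - student > 2:    # students more than 2 minutes faster
        students_faster.append(q)
    elif student - staff > 2:    # staff more than 2 minutes faster
        staff_faster.append(q)
    else:                        # within 2 minutes: treated as equivalent
        equivalent.append(q)

print(len(students_faster), len(equivalent), len(staff_faster))  # 7 6 2
```

The counts reproduce the text's tallies: seven questions where pre-testers were faster, six effectively equivalent, and two where library personnel were faster.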


7.2 Question Analysis

Question #1: Who is the Chair and other full-time members of the CRTC?

Table 14: Types of Answers Received by Type of Library for CRTC Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   2                    3          0          2             7
Academic selective              1                    1          0          0             2
Public full                     0                    1          0          2             3
Public selective                3                    9          1          8            21
Grand Total                     6                   14          1         12            33


One reason for asking this question was to ascertain whether staff members would think to update information contained in a directory, or would realize that the directory's information was inadequate for the question asked. In the three months preceding this study, well-publicized personnel changes had occurred within the CRTC. These changes would not appear in commonly used sources such as the Canadian Almanac & Directory, but would appear on the CRTC Web site http://www.crtc.gc.ca/ENG/BACKGRND/g2e.htm or in the Corpus Administrative Index. Credit for a partially complete answer was given to depository libraries whose staff used a single source such as the Canadian Almanac; credit for a complete answer was given for providing the most recent information. One striking aspect of the answers received was that many library staff, especially at public selectives, did not update the answer.

Question #2: I want to order a copy of Aboriginal Self-Government by Jill Wherrett, published in 1996. I'm sure it's a government document, and I specifically want to know how much it costs and any ordering instructions.

Table 15: Types of Answers Received by Type of Library for Book Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   4                    1          2          1             8
Academic selective              1                    0          0          0             1
Public full                     1                    0          3          0             4
Public selective                2                    0         15          3            20
Grand Total                     8                    1         20          4            33


On the surface this is a difficult question, but a complete answer could easily be found by searching the Depository Services Program Web site at http://dsp-psd.communication.gc.ca/search_form-e.html, where a search engine permits searching by author. Three references to a book entitled Aboriginal Self-Government appear; two of these refer to books Wherrett co-authored with Jane Allain. The book for which Wherrett is sole author is the one asked about in this question. Clicking on this title shows that the price of the book is $6.50. A noteworthy aspect of this question is the high number of referrals made to various local bookstores. A number of full depositories had copies of the volume and suggested that the patron come in to view it, but could not provide the price.

Question #3: I'd like to know what the total payments were per bushel of designated barley for 1995-1996 in Canada. Specifically, I'm interested in the category "select two-row" of designated barley.

Table 16: Types of Answers Received by Type of Library for Barley Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   5                    1          0          5            11
Academic selective              3                    2          1          2             8
Public full                     1                    0          1          2             4
Public selective                6                    0          1          3            10
Grand Total                    15                    3          3         12            33


A complete answer to this question could be found in a number of locations, although, again, a relatively easy place to locate it is the Web site of the Canadian Wheat Board at http://www.cwb.ca. Once there, click on Payments. Payments are listed for specified years and are given either in tonnes or in bushels. In print, the answer is available, in tonnes, in a publication entitled Grain Trade of Canada, whose introduction states how many bushels there are in a tonne. On this question, an answer was judged complete if any price in tonnes was provided for a barley product. Only two library staff members referred to the bushel/tonne conversion in print sources, and only two consulted the Canadian Wheat Board Web site. As many no/incorrect answers (15) were received to this question as complete (12) and partially complete (3) answers combined.
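The bushel/tonne conversion mentioned above can be sketched as simple arithmetic. The conversion factor here assumes the standard 48 lb barley bushel (roughly 45.93 bushels per tonne); the exact figure, as the text notes, is stated in the introduction to Grain Trade of Canada, so treat the factor and the sample price as illustrative only.

```python
# Hypothetical sketch of converting a payment quoted per tonne to per bushel.
# Assumes the standard 48 lb barley bushel; the authoritative factor is the
# one printed in Grain Trade of Canada.
LB_PER_TONNE = 2204.62          # pounds in one metric tonne
LB_PER_BUSHEL_BARLEY = 48.0     # assumed weight of one bushel of barley

def tonnes_price_to_bushel_price(price_per_tonne: float) -> float:
    """Convert a payment quoted in $/tonne to $/bushel."""
    bushels_per_tonne = LB_PER_TONNE / LB_PER_BUSHEL_BARLEY  # about 45.93
    return price_per_tonne / bushels_per_tonne

# e.g. an illustrative payment of $160/tonne works out to about $3.48/bushel
print(round(tonnes_price_to_bushel_price(160.0), 2))  # 3.48
```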

Question #4: I'd like to know how many new Canadian-content sound recordings (albums, tapes, CD's) released during 1990-1994 have French lyrics?

Table 17: Types of Answers Received by Type of Library for Lyrics Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   2                    0          3          2             7
Academic selective              2                    0          0          1             3
Public full                     4                    0          0          1             5
Public selective                9                    1          6          1            17
Grand Total                    17                    1          9          5            32


Answers to this question could be found using Statistics Canada sources, either print or electronic. In print, the publication Sound Recordings (#87-202) could be used to locate the required information. The Statistics Canada Web site http://www.statcan.ca also contained the answer: choose the subsection Canadian Statistics, then People, then Culture, Leisure, then Profile of the Sound Recording Industry. Many library staff members were unwilling even to attempt this question; one proxy was referred to a local record store. On six of the 12 occasions when this question was asked at a full depository, proxies received an incorrect answer or no answer. More than half of all answers (17) to this question were no/incorrect.

Question #5: I'd like to get the text of the act that requires crown corporations to power their motor vehicles with fuels that do not harm the environment. How many of their vehicles have to use these non-conventional fuels in fiscal 1998?

Table 18: Types of Answers Received by Type of Library for Fuels Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   8                    0          2          4            14
Academic selective              2                    0          1          0             3
Public full                     2                    0          0          1             3
Public selective                8                    0          3          1            12
Grand Total                    20                    0          6          6            32


This question attempted to test whether library personnel were aware of the search engine available at the Web site of the Justice Department; it deliberately did not name the statute so that the search facilities would have to be used. Full texts of Canadian laws are available at http://canada.justice.gc.ca. Click the search icon, then enter the following search string in the query box: ["motor vehicle*" and "crown corporation*" and fuel*]. All these terms are stated in the text of the question. The very first hit is the Alternative Fuels Act, which contains the complete answer about how many vehicles owned by crown corporations must be powered by non-conventional fuels. The answer is also available in print in the Statutes of Canada 1995. Of interest here is a not untypical response from a staff member at an academic full depository: "I am unable to answer your question. It would take a lot of digging and research to find the stats and the act itself…." Again, more than half the responses received (20) fell into the no/incorrect category.
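The kind of boolean search described above, where every term must match and a trailing "*" acts as a truncation wildcard, can be sketched in a few lines. The document titles below are illustrative stand-ins, not the Justice site's actual index, and the matching logic is a minimal approximation of such a search engine.

```python
# Minimal sketch of an AND-style keyword search with trailing-* truncation,
# approximating the query ["motor vehicle*" and "crown corporation*" and fuel*].
import re

def matches(text: str, terms: list) -> bool:
    """True if every term (with optional trailing * wildcard) occurs in text."""
    for term in terms:
        # Escape the term, then turn an escaped trailing * into \w* truncation.
        pattern = re.escape(term).replace(r"\*", r"\w*")
        if not re.search(pattern, text, re.IGNORECASE):
            return False
    return True

# Illustrative stand-in "documents" (not the real Justice index):
documents = [
    "Alternative Fuels Act: motor vehicles of crown corporations ...",
    "Fisheries Prices Support Act ...",
]
query = ["motor vehicle*", "crown corporation*", "fuel*"]
hits = [d for d in documents if matches(d, query)]
print(len(hits))  # 1 -- only the Alternative Fuels Act matches all terms
```

The truncation wildcard is what lets "fuel*" match both "fuels" and "Fuels", mirroring why all three question terms retrieve the act even though the statute is never named.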

Question #6: There was a parliamentary sub-committee on the draft regulations on firearms that submitted a report to the House of Commons in 1997. I'd like to see a copy of this report.

Table 19: Types of Answers Received by Type of Library for Firearms Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   5                    1          0          6            12
Academic selective              8                    0          1          0             9
Public full                     1                    1          0          0             2
Public selective                6                    1          3          0            10
Grand Total                    20                    3          4          6            33


This question tested whether library staff members could find committee reports. A relatively simple way to locate this report was through Web-based resources. Go to the main parliamentary home page at http://www.parl.gc.ca/36/main-e.html and click on Site Map. From there, scroll down to the sub-section labelled Committees, pick the committees for the House of Commons, and select Reports. Then scroll through the various committees to the Standing Committee on Justice and Legal Affairs. The final report is available at http://www.parl.gc.ca/36/1/parlbus/commbus/house/juri/reports/jurirp04-e.htm. One surprising finding was that some library staff members went immediately to CBCA, the Canadian Business and Current Affairs database. Another worrisome finding was the performance of full depository libraries: on six of the 14 occasions that this question was asked at full depositories, no answer or an incorrect answer was provided.

Question #7: I'd like to know if the Auditor-General said something in the 1992 annual report about forest management practices of natives, specifically about the good job done by the Stuart Trembleur Lake Band.

Table 20: Types of Answers Received by Type of Library for Audgen Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   2                    0          0          7             9
Academic selective              0                    0          0          4             4
Public full                     1                    0          0          2             3
Public selective                2                    1          5          9            17
Grand Total                     5                    1          5         22            33


This was one of the two questions for which the greatest number of complete answers was received. Many libraries had print copies of the 1992 Auditor General's report, and library staff showed patrons how to use it. The answer is also available electronically at the Web site of the Auditor General: http://www.oag-bvg.gc.ca/domino/reports.nsf/html/92menu_e.html. The answer is in Chapter 15, subsection 60, where the Stuart Trembleur Lake Band is praised for sound forest investment and management. Also noteworthy is that very few no/incorrect answers were received to this question.

Question #8: I'd like to see a bill that was introduced into the House of Commons this past fall. It has to do with the profits convicted criminals might make if they were to publish books about their crimes.

Table 21: Types of Answers Received by Type of Library for Crime Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   3                    1          0          7            11
Academic selective              2                    0          0          5             7
Public full                     1                    0          0          3             4
Public selective                4                    0          0          7            11
Grand Total                    10                    1          0         22            33


As with the previous question, 22 complete answers were received. A caveat, however, must be attached to these impressive results: even though the question specified that the bill was introduced in the fall of 1997, an answer was coded as complete if the library staff member located any of the three versions of the bill introduced in the past three years. This private member's bill, entitled "An Act to amend the Criminal Code and the Copyright Act (profit from authorship respecting a crime)," received a great deal of media coverage during the autumn of 1997. It can be found through the Parliament Web page of the federal government: within the 36th Parliament, choose the sub-section headed Private Members' Bills, then scroll down to the appropriate bill, numbered C-220. The final address is http://www.parl.gc.ca/cgi-bin/36/pb_prb.pl?e. When using this Web service, a bill number is not required; some library staff members, however, told patrons that bill numbers were required to locate the information. Other librarians unsuccessfully used Internet search engines such as Infoseek, typing in the keywords "crime" and "profits."

Question #9: I'm doing a class project about the Magdalen Islands, and there was talk about closing the marine radio station there. I'd like to know if anything was said in the House of Commons about this topic in the last year, and if anything has been decided about its fate.

Table 22: Types of Answers Received by Type of Library for Magdalen Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   4                    4          0          2            10
Academic selective              2                    1          0          2             5
Public full                     2                    4          0          0             6
Public selective                7                    1          3          1            12
Grand Total                    15                   10          3          5            33


This question deals with Hansard, the official record of debates in the House of Commons. While the subject matter of the question might seem esoteric, the question is intended to test the ability to identify and use Hansard; its basis would be no different were a question asked about what, for instance, Alexa McDonough or Preston Manning said on any political issue. Go to the House of Commons debates section of the parliamentary home page at http://www.parl.gc.ca/36/1/parlbus/chambus/house/debates/indexe/homepage.htm, where an alphabetical index of subjects and members' names is located. Since information about the Magdalen Islands is required, click on the letter M, then scroll down to the subject heading "Magdalen Islands." Click on any of the three documents until relevant information is found. The answer is that the federal government is "not closing the station but rather it will be operated from Rivière-au-Renard." A decision is still pending about whether to relocate workers from Cap-aux-Meules. As shown in Table 22, very few complete answers were received to this question. Many library staff consulted the compendium publication Ottawa Letter. Many others simply took patrons over to the print issues of Hansard, for which no indexes were available, and suggested that they look through the accumulated issues themselves. One exasperated referral was made to the local Member of Parliament.

Question #10: I'd like to know the complete set of rules that govern Question Period in the House of Commons.

Table 23: Types of Answers Received by Type of Library for Rules Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   0                    4          0          1             5
Academic selective              0                    0          0          1             1
Public full                     1                    3          0          2             6
Public selective                6                    8          0          7            21
Grand Total                     7                   15          0         11            33


The answer to this question can, again, be found from the main parliamentary page. Go to http://www.parl.gc.ca and choose Reference Material from the subjects on the first page, then scroll down to Reference Works – Procedural. The correct answer is contained in the Standing Orders of the House of Commons; a print version of the Standing Orders also exists. One interesting aspect of this question is that only two library staff members chose to use a Web-based source. While 11 depository libraries completely answered this question, those whose answers were recorded as partially complete showed patrons either the Précis of Procedure or a general work about the functioning of Parliament. Of more concern, a number of library staff members showed patrons reference works dealing with procedures in the Congress of the United States. When the questions were being developed, this one was thought to be among the easiest, since it only involves directing the patron to the Standing Orders. It is disquieting, therefore, that two-thirds of the library staff approached with this question failed to identify this major reference tool.

Question #11: I want to know if there is any official document about the possibility of immigrating to Canada as a refugee claimant because of persecution based on gender.

Table 24: Types of Answers Received by Type of Library for Refugee Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   3                    0          1          3             7
Academic selective              4                    0          1          1             6
Public full                     2                    1          1          0             4
Public selective                7                    6          2          0            15
Grand Total                    16                    7          5          4            32


The results received for this question indicate the difficulty government documents reference staff experience when dealing with what appears to them to be a legal question. Fully half of the responses fall into the no/incorrect category, while only four answers were categorized as complete. This is especially disturbing, given the potential importance of the subject matter to a patron. One common place to find the answer in print is an appendix to a report by Margaret Young entitled Gender-related Refugee Claims (1994), published by the Laws & Government Division of the Library of Parliament. The required information is also accessible through the Web site of the Immigration and Refugee Board of Canada at http://www.cisr.gc.ca. From the home page, choose the subject heading Legal References; the very first screen of this hyperlink contains a section entitled Chairperson's Guidelines – Women Refugee Claimants Fearing Gender-Related Persecution. Some library staff members consulted only ready-reference sources such as the Canada Yearbook. Others directed proxies to informational leaflets about immigrating to Canada or to the Self-Counsel series on the same topic.

Question #12: Someone I know is looking for work hauling garbage. Would there be any specific opportunities to put in bids for contracts in this field with the federal government?

Table 25: Types of Answers Received by Type of Library for Garbage Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   0                    0          1          4             5
Academic selective              4                    0          0          1             5
Public full                     2                    0          0          2             4
Public selective                3                    2          8          5            18
Grand Total                     9                    2          9         12            32


Like the question dealing with the Magdalen Islands, the subject matter of this question may at first appear obscure. However, a patron could ask about bidding opportunities in any field of endeavour, and the answer would be found in the same location. This is the type of practical question that may be of great financial importance to certain patrons. Probably the best print source is the weekly or bi-weekly Government Business Opportunities, although this publication has no index. An electronic source is the Public Works Canada Web site at http://contractscanada.gc.ca/en/index.html; from there, go to the Database of Current Government Bidding Opportunities at http://contractscanada.gc.ca/en/tender-e.htm, which is the MERX system at http://www.merx.cebra.com. Type "garbage" under Opportunity Search. This system was monitored throughout the study period; many contracts for hauling garbage were available. A substantial number of library staff members gave only the vaguest possible answers to this question. Some referred the proxy to "government offices that dealt with this field"; others told the proxy "to go and see the minister who deals with garbage disposal"; still others referred the proxy to local municipal authorities.

Question #13: My mother's birthday is coming soon, and I want to order a color enlargement of an aerial photograph of the lake where my parents have their summer house as her present. Could I have a price list for the enlargements, and information about what I need to do to order such a photograph?

Table 26: Types of Answers Received by Type of Library for Photo Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   0                    0          7          0             7
Academic selective              0                    0          2          1             3
Public full                     1                    1          2          1             5
Public selective                4                    2          8          3            17
Grand Total                     5                    3         19          5            32


One reason this question was selected for the study is that it tests the knowledge of library personnel about government services without specifically mentioning that aerial mapping and aerial photography are provided by the federal government. The most complete source is available on the World Wide Web. Go to the Geomatics Canada site at http://www.geocan.nrcan.gc.ca; under Thematic Mapping, there is a link to the National Air Photo Library at http://airphotos.NRCan.gc.ca/main.html. This site is searchable by keywords such as prices, enlargement, or ordering. Price lists for enlargements can be found at http://airphotos.NRCan.gc.ca/prices.html. Many provinces also have departments that provide aerial photography. In addition, using the Yahoo Canada directory at http://www.yahoo.ca, the National Air Photo Library is the first hit returned for the search term "aerial photo." One significant aspect of this question is the large number of referrals to local photography shops. To be sure, some local photographers would be able to direct patrons to the proper government agency; nevertheless, the inability of some library personnel to identify a government service is cause for concern.

Question #14: Can you help me find any regulations or enabling statutes associated with the Fisheries Prices Support Act?

Table 27: Types of Answers Received by Type of Library for Fisheries Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   5                    0          1          4            10
Academic selective              3                    0          0          1             4
Public full                     0                    0          0          2             2
Public selective                9                    0          3          4            16
Grand Total                    17                    0          4         11            32


Unlike Question #5 above, this question specifically mentions a statute by name. But answers to both questions may be located in the same place, namely the Justice Canada Web site at http://canada.justice.gc.ca. From this site, choose Laws, then Text Versions of Statutes and Associated Regulations for Download. Statutes and their associated regulations are listed in alphabetical order. Click on the letter F, then scroll down to the Fisheries Prices Support Act. There are three entries for regulations: the Canned Mackerel Support Order; the Frozen and Cured Herring Price Support Order; and the Small and Extra Small Heavy Salted Dried Cod Price Support Order. These regulations were last updated in 1994. More than half the answers received fell into the no/incorrect category. Some staff members at full depositories tried to use the 1985 version of the Revised Statutes of Canada, clearly an outdated tool. Some library personnel used the Canada Gazette Part II: Consolidated Index of Statutory Instruments to locate the list of regulations, but did not search for the texts of the regulations themselves. In an example of persistent, high-quality service, one library staff member spent slightly more than an hour with a patron in a successful attempt to track down these regulations.

Question #15: Does any government department put out any newsletters or bulletins about business opportunities in Africa? If so, I'd like a copy of the latest one.

Table 28: Types of Answers Received by Type of Library for Africa Question

                     No/incorrect   Partially complete   Referral   Complete   Grand Total
Academic full                   1                    0          1          2             4
Academic selective              3                    1          0          0             4
Public full                     1                    0          2          1             4
Public selective                8                    2          7          3            20
Grand Total                    13                    3         10          6            32


This question tests the ability of library personnel to find periodicals published by the government. Again, this is the type of question that may have immediate practical consequences for some patrons. The answer could readily be found on the Web site of the Department of Foreign Affairs and International Trade at http://www.dfait-maeci.gc.ca, where a number of equally valuable options are available from the main page: News Releases, Statements, and Publications, under the section The Department; Market Information, under the section Trade; and Africa & Middle East, under the section The World. Under News Releases, Statements, and Publications, go to the section labelled Publications, then choose Trade. Under this heading is a publication called "Africa & Middle East Bulletin," which briefly summarizes the potential that African countries hold for Canadian businesses. Clicking on Market Information leads to a page entitled Market Reports: Information by Region and Sector. Choosing the African sector leads to an alphabetical listing of countries; choosing a country leads to detailed information about trade, exploration, and export opportunities. Although some information is password protected, any Canadian citizen is issued a password after completing a basic informational form.

Under "The World" heading, choosing Africa leads to a cornucopia of business information about this region. For instance, there is a publication entitled "The African Development Bank Group: A Guide to Business Opportunities for Canadians." Only six complete answers were provided to this question, while 13 no/incorrect answers and 10 referrals were registered. Some library personnel suggested calling relevant African embassies in Ottawa; others gave out publications listing overseas jobs, most of which were in Asia or South America. One suggested calling UNICEF; another typed in the keywords "Africa" and "business" into an online catalogue and told the proxy to browse the shelves for relevant books. Still another suggested that the proxy consult an Ethiopian newspaper carried by the library.


Conclusions and Recommendations

This study was undertaken to discover: (i) the degree of accuracy of government reference service in Canadian academic and public libraries participating in the Depository Services Program (DSP), as measured by the number of correct answers supplied to test questions asked by proxies; (ii) the extent to which staff members make use of electronic resources, in particular federal government Web sites, in answering the questions; and (iii) which categories of government reference queries depository reference personnel find most difficult to answer.

In regard to external institutional variables, findings show that, overall, depository library staff members provided complete answers to questions 29.3% of the time. When complete and partially complete answers are taken together (reflecting a conservative reference philosophy as outlined in Section 3.9), the success rate climbs to 42.4%. No/incorrect answers were given at a rate of 37.6%, while a full one-fifth of the 488 questions were referred elsewhere, mostly to non-specific external sources (Figure 7).
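The rates above can be checked with simple arithmetic. The sketch below back-calculates approximate counts from the reported percentages and the 488 questions asked; the partially-complete rate of 13.1% is derived as the difference between the 42.4% combined rate and the 29.3% complete rate, and the resulting counts are approximations, not the study's raw tallies.

```python
# Illustrative check of how the reported rates combine over 488 questions.
TOTAL = 488
rates = {
    "complete": 0.293,       # complete answers
    "partial": 0.131,        # derived: 42.4% combined minus 29.3% complete
    "no_incorrect": 0.376,   # no or incorrect answers
    "referral": 0.200,       # "a full one-fifth" referred elsewhere
}

# Approximate counts back-calculated from the percentages.
counts = {k: round(TOTAL * v) for k, v in rates.items()}

success_rate = rates["complete"] + rates["partial"]
print(f"{success_rate:.1%}")  # 42.4%
```

The four categories sum to 100% of responses, and the back-calculated counts sum to 488, which is consistent with the figures reported in the text.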

The rate of 29.3% for complete answers is disappointing. It may be indicative of the difficulties commonly associated with the quantity, variety, and complexity of collections of official publications. It may indicate that library staff members are not at ease in finding their way through the maze of documents, whether print or electronic. It may also reflect the increasing pressures on libraries suffering from budget reductions and the growing demands of the new technologies, as reported in Dolan and Vaughan (1998).

There were notable differences among the four types of depositories in providing complete answers. Academic full depositories achieved the highest rate of success, followed by public full depositories. Academic and public selective libraries did less well. When complete and partially complete answers are taken together, academic and public full depositories performed at an almost identical rate, as did both types of selective depositories (Figure 8). Higher success rates among full depositories are no doubt attributable to their higher staffing levels, more specialized service, and access to the full range of DSP publications.

Service levels differed as well according to region and census metropolitan area. Among the five geographical regions, Ontario performed best, followed by British Columbia and the Atlantic Provinces, then by the Prairie Provinces and Québec (Figure 9). Levels of documents service varied according to type of library in the regions. In Québec, the Prairies, and British Columbia, academic full depositories offered the best service. In Atlantic Canada and Ontario, however, similar degrees of service were provided by all four types of depositories, with the exception of academic selectives in the Atlantic Provinces. With regard to census metropolitan areas, it was found that the lowest number of complete and partially complete answers to reference queries was provided in cities with a population of fewer than 100,000 inhabitants. Census areas with populations of over one million or between one quarter and one half million inhabitants offered the best opportunities for complete or partially complete answers to government-related questions (Figure 10). As for the days of the week on which proxy questions were asked, it was found that on most days service was remarkably similar. The rate for complete and partially complete answers rose on Tuesday and declined on Sunday. These differences may be the result of staffing levels that fluctuate over the week (Figure 11).

Internal institutional variables were an important part of the study reported on here. Full depositories with separate reference desks or areas for official publications provided more complete or partially complete answers than those without such areas. This tendency was most pronounced in public full depositories where the difference in success rates was almost 17%, although the 10% variation in academic full depositories is also noteworthy (Table 7). This finding supports the traditional notion that the size and complexity of government documents collections require special attention and expertise on the part of librarians.

The study also examined the degree of busyness at reference desks and whether it had an impact on the quality of answers. Academic full depositories provided complete and partially complete answers at the same rate of about 50% whether they were busy or not. As for time spent in reference encounters, it was found that as the amount of time spent with proxies increased, the number of complete and partially complete answers went up significantly. With referrals, the opposite tendency was observed; the referral rate was lowest where time spent was greatest (Figure 15). Results suggest that, with enough time and opportunity, library staff members are able to achieve a very high rate of complete and partially complete answers.

A greater number of complete and partially complete answers was received when reference questions were asked in person rather than over the telephone, especially in full depositories; more referrals were given to telephone questions. Selective depositories, on the other hand, were able to supply answers to telephone and walk-in questions at an equal or almost equal rate (Table 9).

One of the most interesting findings in this study shows that legislative questions were answered completely and partially completely at a significantly higher rate than were queries dealing with the executive branch of the federal government. Moreover, executive questions were referred at a substantially higher rate. This was true for all libraries in the study, but the difference was most striking in academic and public full depositories (Table 11). One explanation for this may be that legislative questions are more obvious, and that sources for answers to them are more limited in number.

Referrals comprised one-fifth of responses to proxy-administered questions. Half were to government departments, 36% to other libraries, and 14% to external non-governmental agencies or commercial establishments (Figure 19). Most referrals to government departments were made by public selective libraries, but both academic and public full depositories made referrals 16.3% of the time. No/incorrect answers were given at a rate of about 38%. The most common explanation for this rate was that the staff members did not know how to find the needed information. Inaccurate responses made up 21% of no/incorrect answers, and 10% resulted from the disinclination of library staff members to answer proxy-administered questions. Finally, 27% of the time proxies were told to come into the library, telephone at a later time, or return at a more convenient time (Figure 20).

No/incorrect answers were provided at approximately the same rate (34%-36%) among academic and public full and public selective libraries, and at a significantly higher rate at academic selective libraries. Referrals were made by academic full, academic selective, and public full depositories at an approximately equal rate ranging from 9% to 15%. For public selective libraries, the rate was almost 30% (Figure 21). Despite the widespread availability of Web resources in Canadian depository libraries, findings show that print materials were by far the largest single source used to answer questions in this study. The Web alone was used at a rate of only 11.5%, and the Web in combination with another source a mere 5.5% of the time (Figure 23). Web sources were used at a greater rate in Ontario, the Atlantic Provinces, and British Columbia than in Québec and the Prairie Provinces (Figure 25). There is a slight increase in the use of Web sources in the largest metropolitan census areas (Figure 26).

For most questions, print sources were preferred much more often than those on the Web (Figure 28). Web use was greater in the search for three legislative-branch questions and one bibliographic question. It is reasonable to surmise that library staff are more familiar with legislative information on the Parliamentary and DSP sites and that they have insufficient knowledge about the extensive range of executive branch information on the Internet. There is a slight difference in the degree to which the Web is used to search for document-retrieval questions (19.2%) and data-retrieval questions (16%), suggesting that reference staff members are more at ease retrieving documents electronically than they are searching for isolated facts and statistics. There may be less confidence placed in electronic data, given its often ephemeral nature (Figure 29).

Rates were essentially the same when it came to using Web sources for telephone and walk-in questions, although print sources far surpassed electronic means for walk-in queries as opposed to telephone requests. More troubling is the finding that "no sources" were used at a significantly higher rate for phone questions than they were for walk-ins (Figure 30).

Results demonstrate that reference staff favoured print sources by a wide margin, yet when print alone was used, complete answers to the test questions were found only 39.9% of the time. When Web sources alone were used, the complete answer rate was 60.7% (Figure 31). Data also suggest that Web-based information sources are more effective than print in searches of short duration.

Based on the findings of this study, depository libraries are not turning to the World Wide Web at a great rate to help patrons find government documents. Print sources are the overwhelming first choice of library staff members when faced with government document questions. Yet print sources are not nearly as efficacious at yielding complete answers as are Web-based information sources. As indicated by the data presented in Figure 31, when print sources alone were used, complete answers were found at a rate of 39.9%. When Web-based sources were used, however, complete answers were found at a rate of 60.7%. As indicated earlier, one reason for this may be that the material provided to depository libraries by the DSP is poorly indexed, lacks consistency, and comes with little training support.

This study also found varying rates of complete answer provision between full and selective depository libraries, among different regions of the country, and among varying sizes of metropolitan census areas. Accordingly, one advantage of Web-based government information sources is that they can level the playing field between types of libraries and between different regions of the country. All libraries, whether full or selective or neither, have access to the same body of information and documents provided by the Canadian federal government.

The federal government is moving to implement a plan in which the preferred delivery platform for government information will henceforth be electronic, through the World Wide Web. One consequence of this will be that the distinction between a full depository library and a selective depository library will disappear. Indeed, all computer-owning individuals will have the same access to federal government information as the largest library in the country. However, not everyone will have access to a computer and the World Wide Web. Moreover, those individuals who do have such access may not be very proficient at finding their way around this new information medium in their search for specific government documents and data. Libraries therefore still have an important role to play as intermediaries between government information and the general public, but if and only if they are able to provide superior government documents reference service. And if they are to offer a new and superior level of service, one requirement must be increased attention to training staff members in efficiently locating government documents and data. The DSP can play a central role in the training of library staff members to ensure that staff members are knowledgeable about government functions and are able to efficiently and effectively retrieve government documents. A new manual or text would go a long way toward assisting any training endeavours.

To make use of these Canadian government Web resources effectively, it is vital that library staff members be fully aware of the structures and functions of both the legislative and executive branches of government. Staff members need to know what programs are available and who is responsible for which program in the federal government. In short, library staff should be knowledgeable about who does what and how things work. The DSP might undertake this type of training of library staff as a central part of its mission. The DSP might also consider entering into working agreements with provincial and municipal levels of government in order to develop and implement city- and province-specific learning programs about all available governmental services for a particular census metropolitan area. In this way, the DSP could expand its mission to include "who does what" at the non-federal level.

The DSP might also do more to facilitate electronic access to federal documents. Certainly, the DSP has made great strides in this direction through its own Web site. Recently, it has also worked together with Anita Cannon, Reference/Public Service Librarian at Mount Allison University, Sackville, New Brunswick, to create subject-specific abstracts about various government departments. Cannon's Web site contains valuable information and links. It is available at http://library.uwaterloo.ca/discipline/Government/CanGuide/Federal.html. However, Cannon's site serves only as a broadly descriptive roadmap. The DSP might wish to stress improved "metadata," indexing, and archiving of its Web-based information, as well as improved search engines and enhanced subject access. A carefully conceived and complete blueprint for improving access to government documents is provided in the Whitepaper on Government Information in the Electronic Environment, published by the Ad Hoc Committee on the Internet, Government Documents Roundtable (GODORT), of the American Library Association. This report is available at http://www.lib.berkeley.edu/GODORT/whiteppr.html. Particularly relevant, in the context of the present study, are the extensive sections on "Preserving and Archiving Electronic Government Information" and "Education Issues."

In discussing preservation and archiving, GODORT makes a number of salient recommendations that the DSP may wish to consider. First, there is the question of standards and data loss. Because the hardware on which information is stored and the software used to access it are "constantly upgraded and superseded," it may be difficult in the future to work with "older portable electronic sources." Government agencies should therefore be "responsible for maintaining permanent access to their entire historical electronic publication record." Moreover, because of difficulties experienced with the "file integrity" and "life expectancy" of different types of digital storage formats, governments wishing to maintain digitally-stored information on a long-term basis may want to consider codifying standards of "data readability, data retrievability, and data intelligibility." Another issue, called authentication and data-fixing, is the provision of "consistent access" to digital information "through citation and retrieval over time." It is important "to refer to a source over time and assume with reliability that it has the same content that it did when originally cited." Sophisticated digital signatures may be one answer to preventing modification, transformation, and censorship of original government data.

Whereas in previous decades the DSP provided access to government documents by physically distributing these documents to depository libraries, now, at the dawn of a new century characterized by increasingly powerful and sophisticated electronic information technologies, documents are instantaneously distributed through electronic means. But, without exhaustive indexing, reliable "metadata," and coherent archiving standards, these documents are no more accessible than if they were simply stored in a warehouse. In a very real sense, then, the DSP still needs to provide access to these already-distributed documents. This access is best provided through superior indexing, enhanced subject access, and consistent data retrievability of authenticated and "fixed-in-time" government documents. The mission of the DSP will thus evolve to meet the changing circumstances and needs of the 21st century, although it will still adhere to its historic mission of ensuring egalitarian access to federal government documents.

What should be the future role of depository libraries? Libraries may find it natural to host training programs geared toward making the library the central location for the dissemination of various types of government documentation and information. The DSP may want to become the prime agent for achieving such training and education through frequent regionally-based in-person seminars or through regularly-updated distance-learning modules that library personnel would complete on a mandated timetable. Training would cover "who does what" programs as well as sessions about DSP electronic services, including enhanced subject access, improved "metadata," better search engines, and archiving initiatives.


Bibliography

Benson, Allen C. (1995). The complete Internet companion for librarians. New York: Neal-Shuman Publishers, Inc.

Canada. Transport Canada. Administrative Services. (1986). You asked us: Typical questions and answers handled by the inquiry desk of the Transport Canada Library and Information Centre. Ottawa. [TP7555]

Childers, Thomas. (1987). The quality of reference: Still moot after 20 years. Journal of Academic Librarianship, 13, 73-74.

Devlin, Brendan. (1997). Conceptual models for network literacy. The Electronic Library, 15, 363-368.

Dolan, Elizabeth, & Vaughan, Liwen Q. (1998). Electronic access to federal government documents: How prepared are the depository libraries? Ottawa: Canada Communications Group.

Durrance, Joan C. (1989). Reference success: Does the 55 percent rule tell the whole story? Library Journal, 114 (6), 31-36.

Elzy, Cheryl, Nourie, Alan, Lancaster, F. W., & Joseph, Kurt M. (1991). Evaluating reference service in a large academic library. College & Research Libraries, 52, 454-465.

Gers, Ralph, & Seward, Lillie. (1985). Improving reference performance. Library Journal, 110(17), 32-35.

Harris, Roma M., & Marshall, Victoria. (1998). Reorganizing Canadian libraries: A giant step back from the front. Library Trends, 46, 564-580.

Hernon, Peter, & McClure, Charles R. (1986). Unobtrusive reference testing: The 55 percent rule. Library Journal, 111(6), 37-41.

Hernon, Peter, & McClure, Charles R. (1987). Unobtrusive testing and library reference services. Norwood, NJ: Ablex Publishing Company.

Hults, Patricia. (1992). Reference evaluation: An overview. The Reference Librarian, 38, 141-150.

Jardine, Carolyn W. (1995). Maybe the 55 percent rule doesn't tell the whole story: A user-satisfaction survey. College & Research Libraries, 56, 477-485.

Katz, William. (1982). Introduction to reference work, 5th ed. New York: McGraw-Hill.

Katz, William. (1996). Introduction to reference work, Volume 1: Basic information services, 7th ed. New York: The McGraw-Hill Companies, Inc.

Lancaster, F. W. (1977). The measurement and evaluation of library services. Washington, DC: Information Resources Press.

McClure, Charles R., & Hernon, Peter. (1983). Improving the quality of reference service for government publications. Chicago: American Library Association.

McIlroy, Anne. (1998, February 12). One-third of stores selling tobacco to minors. The Globe and Mail, A1, A10.

Murfin, Marjorie E. (1995). Evaluation of reference service by user report of success. The Reference Librarian, 49/50, 229-241.

Parker, June D. (1996). Evaluating documents reference service and the implications for improvement. Journal of Government Information, 23, 49-70.

Richardson, John V., Jr. (1998). Question Master: An evaluation of a Web-based decision-support system for use in reference environments. College & Research Libraries, 59, 29-37.

Steinhauer, Jennifer. (1998, February 4). The undercover shoppers: Posing as customers, paid agents grade the stores. New York Times, C1, C23 [National].

Tyckoson, David A. (1992). Wrong questions, wrong answers: Behavioral vs. factual evaluation of reference service. The Reference Librarian, 38, 151-173.

Whitlatch, Jo Bell. (1989). Unobtrusive studies and the quality of academic library reference services. College & Research Libraries, 50, 181-194.

Note: All Web addresses mentioned in this report were valid (or operational) during the period that this study was conducted.


1 "Introducing the DSP" http://dsp-psd.communication.gc.ca/dsp-psd/Info/profile-e.html


2 Canadian libraries that are nominated and approved by a committee consisting of representatives of the National Library of Canada and the Depository Services Program are granted full depository status, which is designated as either English, French, or bilingual, depending on the library's clientele.


3 Selective depository status is granted to Canadian public libraries and libraries of educational institutions which are open to the general public at least 20 hours per week and have at least one full-time employee. Twelve selective depositories are not libraries.


4 Public full depository and public selective depository figures are reported together for Québec, British Columbia, and the Prairies because, in each region, there is only one public full depository library. Identification of this library would therefore be possible. Anonymity is preserved by combining results for both types of public depositories.


5 Proxies could not answer with certainty about this issue 3% (15 questions) of the time.


6 The chi-square test mentioned in this paragraph and the next two chi-square tests (all three involving various types of sources) are calculated using the five major types of sources used by depository libraries: print; Web alone; Web combined with another source; OPAC; and no sources. Row totals and column totals thus add up to 465, instead of 488 (total questions). CD-ROM sources (18) and microform sources (5) were not included because of cell size criteria.
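The chi-square test described in this note can be sketched as follows. This is an illustrative example only: the counts below are hypothetical and are NOT the study's actual data, although the grand total (465) matches the note above, and the five source categories are those the note names.

```python
from scipy.stats import chi2_contingency

# Hypothetical contingency table for a chi-square test of independence.
# Rows: answer outcome (complete vs. other); columns: the five source
# types named in note 6 (print, Web alone, Web + other, OPAC, no source).
# Counts are invented for illustration; only the grand total of 465
# mirrors the report.
observed = [
    [140, 31, 15, 20,  0],   # complete answers
    [186, 22, 11, 25, 15],   # partially complete, referred, or no/incorrect
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```

A small p-value would indicate that answer success and source type are not independent; the report's exclusion of CD-ROM and microform reflects the usual requirement that expected cell counts not be too small.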


Last updated: 2002-03-20