Ch. 3: Personal Information    ITSS Legal Issues Working Group    11/8/96

Chapter 3
Collecting and Sharing Personal Information

A. Sharing Personal Information: basic principles
B. Data Matching
C. Personal Identification Information (DNA, Smart Cards)
D. Transborder Data Flows
E. Summary

The government’s “Blueprint for Renewing Government Services using Information Technology” calls for government agencies to collect information electronically, for government as a whole to collect any one item of information only once, and then to share that item of information with whoever needs it. The idea is to eliminate paper records (which are inherently difficult to share and manipulate) and to eliminate duplicate entries (which both duplicate work and contribute to inaccurate information). It is hoped that the government will improve its services, reduce inaccuracies and labour costs, and potentially reduce overpayments and fraud by preventing individuals from supplying different information to different government programs.

From a security perspective, personal information must be kept secure and must be shared only with those persons who need to know it. Special issues arise when there is an automated data matching program and when personal information is moved across national borders. Other security issues relating to personal information include the use of personal information on smart cards, an important technology for securing all forms of electronic information and physical premises and for enabling electronic transactions with the government.

Two other security issues relating to personal information are discussed in other chapters. The first is employee rights to privacy balanced against the employer’s need to ensure its computer systems are secure, which can lead to monitoring of employee use of computers; this is discussed in Chapter 8.
Also, personal information can be disclosed to third parties, raising questions of what controls to place on the third parties; this is discussed in Chapter 2.

A. Sharing Personal Information: basic principles

The protection of personal information is governed by a number of fundamental principles. These principles are set out in somewhat different ways in the Canadian Standards Association’s model privacy code, in the Organization for Economic Cooperation and Development privacy guidelines, in various provincial legislation, and in the federal Privacy Act. Here, we review the basic privacy principles, drawing particularly from the federal Privacy Act. We also comment on how these principles might apply to data warehousing, or the ‘collect once, share many’ model set out in the government’s Blueprint.

Before reviewing the principles that do exist, we must note the one fundamental principle that does not: the right to say no. If government says it needs information for its programs, particularly if it does so by statute or regulation, the citizen generally has no practical ability to refuse. In some cases it is illegal not to provide the information: reporting income for tax purposes, filling out census information, complying with court-ordered subpoenas and, on a more local scale, providing information for birth certificates and other vital statistics. For most government services, citizens can refuse to provide the information, so long as they accept the consequence of not receiving government benefits such as health care, unemployment insurance or old age security, or of not being issued passports, driver licences, automobile registrations, and so on. (It should be noted that this ‘right to refuse’ to provide information is really a right to refuse benefits for which the citizen has already paid through taxes.)
The counter-balance to government’s power to collect information is that government’s use of that information is regulated by law (although this is a relatively recent development). In the private sector, a business can ask for any information it wants, and if the consumer says no, the business can refuse to provide the service. Private sector use is not legislated (except in the province of Quebec), but the consumer’s right to refuse to provide the information and accept the consequence of being denied service provides a balance to the lack of regulation. Of course, some services are essential and provided by monopolies, while other services are nearly essential and are provided by a group of competitors that all demand the same information. In these situations, the right to refuse to provide the information is virtually meaningless, because there is no practical alternative for essential and nearly essential services such as telephone, banking, hydro and some transportation services. Even when a citizen or consumer agrees to provide information, it will be difficult to control the use of that information once it is provided. This is another way the right to say no is limited.

Here are the basic privacy principles:

Minimal collection

Government institutions should only collect personal information that “relates directly” to one of their activities. In some cases, one institution may collect information on behalf of another institution (for example, Justice providing divorce information to Human Resources Development for pension purposes). In such a situation, if the institution collecting the information has no need for the information being passed on, it should ensure that it passes the information on without keeping a copy.

Another issue that arises with the minimal collection principle is that some kinds of information, once collected, provide both the minimum and the maximum information at the same time.
For example, using DNA as a tool for identifying criminals results in far more information being collected (though not necessarily accessed) than is necessary for criminal identification purposes. Another example is monitoring computers to ensure their security: effective monitoring and security require access to all parts of the computer system and all messages conveyed, even though almost none of the information on the computers represents a security problem.

In response to the federal Blueprint’s ‘collect once, share many’ model, the Privacy Commissioner asked this question: “How will governments reconcile sharing personal databases with that fundamental privacy tenet - collecting only the minimum personal details needed to administer a program?” A number of answers suggest themselves.

First, it is a waste of time, effort and money for a government program to seek more information than it needs. This administrative concern is complementary to the privacy concern in most, if not all, cases.

Second, it should be possible to restrict an institution’s access privileges to a central database so that it has access only to the information in the database that it needs for its programs. As an alternative, an institution might not be given direct electronic access to the database but would be required to request that the computer operator of the database search it for the desired information and disclose the relevant information.

Third, it is not clear that the Privacy Act vision of separate institutions exists for the purpose of segregating information. It is more likely that the vision exists to facilitate management of the various parts of government, to provide Ministerial accountability to Parliament, and as a reflection of reality at the time the Privacy Act was drafted.
In any event, the current model of separate institutions does not ensure information is segregated, because of the many different authorizations for sharing information already in the Privacy Act, discussed in more detail below.

Fourth, while data warehousing may be a threat to the ‘minimum collection’ principle, it will probably enhance other privacy principles discussed below, such as complete, accurate, up-to-date information and rights of access, correction and notation, which can be communicated effectively to everyone who uses that information.

Finally, wherever one big federal database is located, it will undoubtedly be under the control of a government institution, and therefore all of the Privacy Act protections will apply.

Collect information directly from the individual ...

The basic principle that institutions should collect information directly from the individual has a very significant exception: information should be collected directly from the individual unless doing so might result in the collection of inaccurate information or might defeat the purpose or prejudice the use for which the information is collected. In a ‘collect once, share many’ model, institutions would collect their information from a central database rather than directly from the individual. It would likely be the case that the information in the database was collected directly from the individual (but usually by some other government institution). Thus, a central database might be a model of direct collection rather than indirect collection. Even if a central database is seen as indirect collection, because the institution seeking the information went to the central database instead of the individual, this could be justified in the interests of accuracy: the fewer times the same information is collected and entered, the fewer discrepancies and errors there will be.
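Returning to the second answer above, the idea of restricting each institution’s access privileges to only the fields it needs can be illustrated with a minimal sketch. All institution names, field names and data below are hypothetical, and a real system would of course be far more elaborate:

```python
# Hypothetical sketch of field-level access restriction to a central
# personal-information database ("collect once, share many").
# Institution names, fields and values are illustrative assumptions.

CENTRAL_RECORD = {
    "name": "J. Citizen",
    "address": "123 Main St",
    "income": 45000,
    "marital_status": "divorced",
}

# Each institution is granted access only to the fields that relate
# directly to its programs (the minimal collection principle).
ACCESS_PRIVILEGES = {
    "pensions": {"name", "marital_status"},
    "taxation": {"name", "address", "income"},
}

def fetch(institution: str, record: dict) -> dict:
    """Return only the fields the institution is authorized to see."""
    allowed = ACCESS_PRIVILEGES.get(institution, set())
    return {k: v for k, v in record.items() if k in allowed}

print(fetch("pensions", CENTRAL_RECORD))
```

An institution not listed in the privilege table receives nothing, which corresponds to the alternative in which access must be mediated by the database operator.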
The Privacy Commissioner himself has written: “The more users who can access and manipulate the data, the more dangerously unreliable it becomes.”1 Indirect collection might also be justified because direct collection might prejudice the use for which information is collected: one purpose of data warehousing of personal information is to avoid government over-payments and fraud, which can result when many government institutions collect information from an individual who may be giving different information to different institutions.

However one justifies a ‘collect once, share many’ model, the end result is that government behaves as a single enterprise rather than as a series of separate government institutions. The Access to Information Act and Privacy Act are based on a model of separate government institutions, and their provisions do not fit neatly into a single enterprise model. As the Privacy Commissioner has said, “The very reason for segregating personal information is to prevent governments from amassing detailed dossiers about individuals.”2 However, despite the fact that the Acts are based on a model of separate institutions, the Privacy Act already provides numerous authorizations for institutions to share information with each other (although the ‘need to know’ principle must be observed; requirements for sharing information from one institution to other parties are discussed in more detail later in this Chapter). It is not clear that data warehousing is a substantial departure from the sharing of information already permitted.

Tell individuals how the collected information will be used

There is no requirement that notice of how information will be used must be given directly to the individual at the time the information is collected. Although individuals may frequently be told the primary purpose for which the information is collected, they will almost never be told directly about possible secondary uses, as authorized in s.
8 of the Privacy Act. One way government informs Canadians how it will use the personal information it collects from them is by stating the intended use on the forms used to collect the information. However, this will not fully capture the various consistent uses and authorized disclosures under the Privacy Act. Another vehicle the government uses is listing its information holdings and the uses of the information in a 1,000-page index called InfoSource, which is reprinted and updated annually. InfoSource does give information about how information is used for purposes consistent with the original use. (InfoSource is now available on the Internet at “http://www.qlsys.ca/TBS/Info_Source” and is for sale by a licensee on CD-ROM.)

In a data warehousing model, InfoSource would still have relevance. The descriptions of separate “personal information banks” might need to be changed to accurately reflect the centrally combined information, but there would still need to be a description of how the information is used by each institution. An alternative approach to current InfoSource practice might require institutions to reproduce their InfoSource entries on the forms on which they collect information. Where information is collected electronically, this could be done by hyper-linking the InfoSource entry to the electronic form requesting the information.

In practice, citizens who want government services have only a limited right to refuse to provide personal information to government. Therefore, knowing how the information is used has little bearing on whether or not citizens will provide it. In some cases (the Canadian census, income tax reporting), citizens are required by law to provide information to the government.

Use personal information only for the purpose for which it was collected, and disclose it only with the consent of the individual ...
It might be argued that of all the privacy principles, this is the one most eroded by its exceptions. In fact, the principle is “use information only for the purpose for which it was collected, or use and disclose it as authorized by any of 13 separate exceptions in the Privacy Act.” Here are the ways that personal information can be disclosed or used without an individual’s consent and separate from the reason the information was collected in the first place (the following paraphrase the paragraphs in s. 8(2) of the Privacy Act):

1. for a purpose “consistent with” the purpose for which the information was originally collected;
2. for any purpose where disclosure is authorized in any Act of Parliament or any regulation made thereunder;
3. to comply with a subpoena, warrant or rule of court issued by any court, person or body with jurisdiction to compel production of information;
4. to the Attorney General for use in any legal proceedings involving the government;
5. to an investigative body for the purpose of enforcing any law or carrying out a lawful investigation, if there is a written request specifying the purpose and describing the information to be disclosed;
6. to any provincial, territorial or foreign government, or to an international organization established by states, for the purpose of administering or enforcing any law or carrying out any lawful investigation, if there is an agreement or arrangement between Canada and the other jurisdiction;
7. to a member of Parliament for the purpose of assisting the individual to whom the information relates in resolving a problem;
8. for audit purposes;
9. for archival purposes;
10. for research or statistical purposes;
11. for researching or validating any claims, disputes or grievances of any of the Aboriginal peoples of Canada;
12. for locating an individual to collect a debt owing to the government or to make a payment owing by the government to that individual;
13.
for any purpose where the public interest clearly outweighs the privacy interest, or where disclosure would clearly benefit the individual to whom the information relates.

‘Consistent use’ disclosures must be described in InfoSource, and the Privacy Commissioner must be notified of new consistent uses and of each ‘public interest’ disclosure. The Privacy Commissioner may notify the affected individual of a ‘public interest’ disclosure. Institutions are required to keep a record of all other disclosures, but they are not required to give individuals specific notice of any such disclosures.

Any number of the above exceptions could justify a data warehousing or ‘collect once, share many’ model, particularly “consistent use” or conformance with a law or regulation if one is passed to deal with data warehousing. Where the information is shared with a provincial government or an investigative agency, or is used to locate an individual to collect a debt owing to the government or to make a payment owing to the individual, there are already express authorizations.

Access, accuracy, corrections and notations

The fundamental privacy principles include the following:
· provide citizens with reasonable access to personal information by retaining, preserving, securing and disposing of their personal information according to legislation and directives;
· ensure the information is accurate, complete and kept up-to-date; and
· allow persons to request corrections, to have notations of their requests added to the files where the requests are refused, and to have anyone to whom the information was disclosed in the past two years notified of the correction or notation.

As discussed earlier under the principle “collect directly from the individual unless doing so might result in the collection of inaccurate information,” a ‘collect once, share many’ model may provide the best chance for accurate, complete and up-to-date personal information.
Data warehousing may also provide the best opportunity for giving effective notice to institutions about changes or notations to personal information. Further, the central database may be able to trace disclosures better, giving individuals more information about how their personal information was used. In addition, a central database will make it much easier for individuals to access the information government holds about them.

B. Data-Matching

The ‘collect once, share many’ model creates the opportunity for data-matching. Treasury Board already has a policy on data-matching, which involves consultation with the Privacy Commissioner. Data-matching is defined as

the comparison of personal data obtained from different sources ... for the purposes of making decisions about the individuals to whom the data pertains. (Privacy and Data Protection, Ch. 1-1, p. 11)

A matching program is used to compare a set or sets of records containing personal information held by a matching institution with another set or sets of records held by the matching source. ... This matching activity may or may not generate a new body of personal information. Data matching generally involves the use of computer rather than manual means, owing to the volume of data and the frequency of transactions. Included in the definition of data matching is data linkage, also known as data profiling. This form of data matching involves the use of a computer and personal data obtained from a variety of sources, including personal information banks, to merge and compare files on identifiable individuals or categories of individuals for administrative purposes. This linkage or profiling activity generates a new body of personal information. (Ch. 2-5, p. 10)

There is nothing in the Privacy Act that specifically addresses data-matching, although all of its provisions apply to data-matching as they do to any other use of personal information by the government.
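The definition above, comparing records held by a matching institution against records held by a matching source in order to make decisions about individuals, can be sketched in a few lines. The identifiers, field names and figures are purely hypothetical:

```python
# Illustrative sketch of a matching program as defined in the Treasury
# Board policy: records from a "matching institution" are compared
# against records from a "matching source", keyed on a shared
# identifier. All identifiers, fields and amounts are hypothetical.

benefits_records = {  # held by the matching institution
    "1001": {"declared_income": 0},
    "1002": {"declared_income": 12000},
}

tax_records = {  # held by the matching source
    "1001": {"reported_income": 38000},
    "1002": {"reported_income": 12000},
}

def match(institution: dict, source: dict) -> list:
    """Return identifiers whose records disagree between the two sets."""
    hits = []
    for ident, rec in institution.items():
        src = source.get(ident)
        if src is not None and rec["declared_income"] != src["reported_income"]:
            hits.append(ident)
    return hits

print(match(benefits_records, tax_records))
```

Note that under the policy a hit like this could not by itself trigger administrative action: verification against authoritative sources, and an opportunity for the individual to refute the result, would come first.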
The government’s policy on data-matching is “to account for and give public notice of data-matching carried out by or on behalf of the government.” (Privacy and Data Protection, Ch. 1-1, p. 1) Chapter 2-5 of the Privacy and Data Protection Policy sets out the policy requirements for data matching. The policy requires the following: a preliminary assessment, a cost-benefit analysis, notification to the Privacy Commissioner before beginning the data-matching program, approval from the head of the institution, placing a notation of the program in InfoSource, and ensuring that matching between institutions is sanctioned by a written agreement setting out further conditions as necessary. Matching performed by an independent contractor must be based on a contract stipulating that the contracted activities will be conducted in accordance with the Privacy Act and this policy.

The data-matching policy sets out a number of security provisions. There must be “a verification process involving original and additional authoritative sources ... before the information is used in a decision-making process that directly involves the individual.” Individuals “should be given an opportunity to refute the information produced by a matching program before any administrative action concerning the individual is initiated.” The Security Policy applies: “personal information and computer systems should be safeguarded from accidental and deliberate threats to confidentiality and integrity as it relates to authenticity, accuracy, currency and completeness. Security safeguards implemented by the matching institution should be at least equivalent to those initiated by the matching source.” Finally, retention and disposal standards also apply to “keys used in matching programs.”

The data-matching policy demonstrates that data-matching and data-linkage are already approved practices within government.
The policy provides good guidance for how any further data linkage or data warehousing should take place. Importantly, the policy makes explicit the requirement to secure the personal information and computer systems involved.

C. Personal Identification Information

It is hard to imagine how the government can adopt a ‘collect once, share many’ or data warehouse model without using a universal identification number of some kind. All of us have many kinds of identifiers, some public, some private. Our names, addresses and telephone numbers are generally our most public identifiers. Our bodies carry identifiers in many different ways: physical appearance, blood type, fingerprints, DNA, etc. We also have many numbers generated for specific purposes: we have bank account and credit card numbers, we have secret Personal Identification Numbers for our bank cards, and we have a very public Social Insurance Number. (The SIN is public in the sense that we routinely consent to disclosing this number to banks and other institutions, who then submit it to credit reporting agencies, making it available to a wide variety of interested parties.) We have account numbers, employee numbers, dental plan numbers, etc. almost everywhere we do business.

The government’s policy on the Social Insurance Number is as follows:

It is the policy of the government ... to prevent the SIN from becoming a universal identifier by:
· limiting collection and use of the SIN by institutions to specific acts, regulations and programs; and
· notifying individuals clearly as to the purposes for collecting the SIN and whether any right, benefit or privilege could be withheld or any penalty imposed if the number is not disclosed to a federal institution requesting it. (Privacy and Data Protection, Ch. 1-1, p.
2)

Although the government has limited its use of the SIN, there are currently 16 statutes and regulations authorizing its use (the Income Tax Act, Unemployment Insurance Act, Canada Student Loans Regulations, Canada Pension Plan Regulations and Family Orders and Agreements Enforcement Assistance Act are a few examples). Beyond these, however, the SIN is a number used in all sectors of life, whether that use is warranted or not. No law prohibits a business from requesting the social insurance number of a consumer, nor is there any prohibition on refusing to provide a service to someone who will not give his or her SIN to the business asking for it. Anyone seeking a credit card or a mortgage loan will be asked for the SIN and will likely be refused service unless it is provided. Our SIN is requested in so many different contexts that we are generally unable to know who has it or what it is being used for. One thing we do know: credit bureaus collect information from a large number of sources and combine it into one record on us. In many ways, the SIN is already a universal identifying number. If the government wanted to use the SIN as a universal identifier for a data warehouse, it would only need to change its 1988 policy restricting the use of the SIN. Even if it wanted to create a different identifying number, there is no legal impediment to doing so.

In addition to identifying numbers used to keep track of personal information, some personal identifiers serve to provide confidentiality and liability. PINs are issued by entities to users who accept, or are deemed to accept, responsibility for transactions made with the use of their number. The banks were the first major users of this technology, distributing bank cards that allow their customers to do their banking at an automatic teller machine upon inserting the card and entering a PIN.
The legal basis upon which such an arrangement may be made in the banking context is the contract signed by the customer before being issued the card, in which the customer assumes responsibility for all transactions carried out at a machine with the use of the card and for which the PIN has been entered. This arrangement makes it in the card-holder’s best interest to keep the PIN secret. For the arrangement to be acceptable, there must also be an understanding between the issuer of the PIN and its user that the user is the only one who should know the number and that even the issuer has no easy access to it.

The above comments make it fairly evident that social insurance numbers (SINs) would not be useful as personal identification numbers for confidentiality or liability purposes. PINs are by definition secret and must be the subject of utmost security by their holder. A person’s SIN, on the other hand, is a widely known and used personal identifier.

If the government were to issue PINs, perhaps associated with government service cards or passwords to access government services at kiosks or from home computers, some of the principles that have governed the use of SINs should perhaps also apply to PINs. For example, the use of government PINs might be limited to statutory authorizations. There might also be a principle that government services cannot be refused to citizens who refuse to disclose their PINs except where such a refusal is specifically authorized. Individuals should be clearly informed of the purpose for which a PIN is issued and the potential consequences of disclosing the number.

For example, recently enacted amendments to the Unemployment Insurance Regulations3 allow unemployment insurance claimants in certain localities of New Brunswick to make a claim for benefit (file their bi-weekly report cards) by means of an interactive voice response system, using a telephone.
One of the requirements for making a claim by phone is that the claimant enter, at the beginning of the telephone call, the ‘phone access code’ previously given to him by the Commission. This phone access code is this project’s particular name for the PIN. The regulation then prescribes that a claimant who has provided the required information, including the phone access code, will be deemed to have signed, made and executed a claim for benefit and to have supplied the information recorded on the computer system in answer to the questions posed by the interactive voice response system. Such a deeming provision will allow the Department to tie any declaration made (Q.: ‘Did you work during the last two weeks?’ A.: ‘No’) to the holder of the phone access code and, for example, impose a penalty as authorized by the Unemployment Insurance Act should a false declaration be made knowingly. It is suggested, however, that such a provision must be well publicized if it is to survive legal challenges.

To summarize, there is a policy impediment, but not a regulatory or statutory one, to using the SIN as a universal identifier for the ‘collect once, share many’ objective. There are no impediments or rules to regulate government use of PINs (other than the Social Insurance Number) for the purposes of enabling confidential and legally binding electronic transactions between government and individual citizens.

Protecting Biometric Information, including DNA

Security of electronic information depends on secret identifiers: passwords, login names and personal identification numbers are the basics. Encryption depends on secret codes. No system is completely secure, because they all depend on the secrecy of these numbers, codes and words, yet that secrecy cannot be guaranteed. People use passwords and numbers that are easy to remember and thus more susceptible to guesswork or password-breaking software.
The more difficult the password, the more likely it is to be written down and kept close to the computer. Access codes can be found in lists retrieved from garbage containers. They can be discovered by tricking the user into revealing them: for example, a person might receive a phone call or even a visit from someone claiming to be with the technology division of an organization who, in order to conduct necessary computer changes, repairs or diagnostics, needs to know the person’s access codes. All of this can be irrelevant if the user simply leaves the computer unattended after he or she has logged on.

Thus, there is a constant search for better secret access codes. The best access codes, from a computer security point of view, are those that are so unique to the individual that they cannot be forgotten, lost or stolen. Generally, this means biometric information: not a word or number, but biological information about the individual. Biometric information used in the past on such things as driver licences and medical records includes sex, height, skin, eye and hair colour, blood type, fingerprints, birth marks, tattoos and scars. Among these, fingerprints are the only ones sufficiently unique to serve as a useful electronic security tool. (As a law enforcement tool, they have limited utility: usually fingerprint evidence is unavailable, and in any event, fingerprints usually only become useful after other evidence leads to the arrest of the individual.) Now we are beginning to use DNA, an acronym for a molecule called deoxyribonucleic acid, which is described as the basic building block of life, the blueprint of the body. DNA provides uniquely identifying information and can be taken from any number of sources from the body, including hair. As voice recognition computers and computers equipped with cameras become increasingly available, they offer the possibility of using other biometric information to provide electronic security.
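For the secret access codes discussed above, one standard safeguard, consistent with the earlier observation that even the issuer of a PIN should have "no easy access to it", is for the verifying system to store only a salted one-way hash of the code rather than the code itself. A minimal sketch follows; the PIN, salt size and iteration count are illustrative assumptions, not a recommendation of specific parameters:

```python
# Hedged sketch: an issuer can verify a PIN or password without being
# able to read it back, by storing only a salted one-way hash.
# The PIN value and parameters below are illustrative assumptions.

import hashlib
import hmac
import os

def enroll(pin: str):
    """Record a salted hash of the PIN rather than the PIN itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
    return salt, digest

def verify(pin: str, salt: bytes, digest: bytes) -> bool:
    """Check an entered PIN against the stored salt and hash."""
    candidate = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = enroll("2468")
print(verify("2468", salt, digest))
```

Even under this scheme, the weaknesses described above (guessable codes, written-down codes, social engineering, unattended terminals) remain, which is the motivation for biometric identifiers.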
Widespread use of biometric information for the purposes of electronic security raises important privacy questions. Who will have access to our most personal information? Who will store that information? How can we have confidence that the access and storage are limited exactly as we want or as the law requires? To date, our society has resisted providing everyone’s fingerprints to a central authority simply to facilitate law enforcement interests. Are we now willing to require the large majority of people to register their fingerprints or equivalent biometric information for the purposes of electronic security? What access will law enforcement have to this information? What safeguards will apply if our biometric information is to be held primarily by private companies? This may happen because their services depend on biometrically secured transactions, because our employers require it, or because we use private companies for data storage, management, testing and analysis.

Of the various kinds of biometric information, DNA poses the most serious problems. This is because DNA issues are with us now: DNA is being used by law enforcement, it is being tested in private laboratories, and it provides far more information than is needed for any one purpose (so much for the minimal collection principle). In addition, there is every indication that DNA research is only beginning. Some observers have commented that the information age is almost over and the biotechnology age has already begun. Others speculate on being able to put human life into digital form. Biometric information is the greatest possible threat to our privacy, which is exactly why it is so useful for electronic security measures. Nonetheless, we now face the prospect of a society collecting DNA: law enforcement, employers, health and life insurers, adoption, immigration, prison and child welfare agencies, pharmaceutical research and medical centres may all have legitimate desires to collect DNA information.
How will this information be regulated? Given the Auditor General's findings about how successful the federal government has been to date in information security, is there any reason to expect the government is up to the task posed by this most sensitive information? DNA testing is currently done by the Royal Canadian Mounted Police and has been advanced as a uniquely effective means to link a suspect to a crime, or to exonerate a wrongly accused suspect. Virtually every cell composing the human body contains a nucleus, i.e., a compartment within which are 46 chromosomes, divided into 23 pairs, inherited maternally and paternally. The DNA molecule is arranged in these chromosomes and is the same in each cell. The accepted theory is that no two people, except identical twins, have the same DNA. While the present technology does not allow a scientist to look at the entire DNA chain contained in the 23 pairs of chromosomes of the cell, British scientist Alec Jeffreys determined in 1985 that individuals could be differentiated by examining only certain sections of these chemical combinations. Since its forensic introduction in Canada in 1988-89, DNA has been instrumental in securing convictions in hundreds of violent crimes. The most prominent application of DNA is identifying perpetrators by comparing biological samples of suspects against biological specimens that perpetrators have left at, or that have been taken from, crime scenes. Important concerns have been raised about the application of DNA to forensics. They can be summarized under two headings: (a) the need for legislation clearly authorizing the taking of biological samples from suspects for DNA testing, and (b) the need to regulate and safeguard the use of DNA evidence obtained. This paper will deal only with concerns about how to handle DNA evidence once it is obtained. 
It must be stressed at the outset that forensic use of DNA should occur only where the techniques are designed and used for identification purposes. DNA evidence banking divides into two sub-components: · DNA Banking, i.e., the use and retention of the samples; · DNA Data Banking, i.e., the storage and dissemination of the identification information. DNA Banking DNA Banking raises two basic issues: · what measures can be taken to ensure that only identification information is derived from the sample? · is there a need to retain the actual samples after the identification information is recorded? In answer to the first question, it is known that DNA technology can identify genetic traits and trace inheritable diseases. To prevent unauthorized use of the technology, any legislative scheme will require specific provisions guarding against any use other than identification. This can be accomplished by identifying the purpose behind the tests in the enabling legislation and by defining the type of tests in subordinate legislation. The concept of ‘consistent use’ and other disclosures authorized by the Privacy Act need to be closely examined for such sensitive information. As to the second question, although there would appear, at first blush, to be no reason to retain the sample once typed, the present state of the science and resulting technology seems to require it, at least for the short term. Indeed, DNA technology is changing rapidly (we may not even know how sensitive the information is, or what potential information could be derived from DNA). It is expected that profiles produced with today's methods could be incompatible with tomorrow's methods. Redrawing samples for retyping would prove very expensive and inefficient, and thus it is preferable to retain samples, despite the security problems this creates. However, once the technology stabilizes, samples should be destroyed promptly after typing. 
DNA Data Banking For DNA Data Banking, the obvious privacy concern is to ensure that the information is released only to those for whom it was originally gathered, and used only for the purpose for which it was gathered. DNA Data Banking raises concerns about the security of electronic information because, although matches are often declared on the basis of visual inspection of the scientific information, the preferred method involves the use of a computer to convert band positions into numerical codes, and a comparison of the codes to determine the closeness of the match. The data resulting from a DNA analysis is usually stored in various forms, which may consist of the following: · copies of the autoradiographs and laboratory books; · a laboratory report, including a description of match criteria, band size measurements, and allele frequency calculation methods; · a copy of the data pool for each locus examined; · statements setting forth observed contaminants, instances of degradation, and laboratory errors, together with a description of control tests performed to determine their origin and effects; and · chain of custody documents. This information is voluminous and may be recorded both on paper and digitally. Although the Privacy Act has general provisions addressing the disclosure of personal information, strong reservations have been expressed about its effectiveness in dealing with new biotechnologies. It would appear reasonable to strengthen the Privacy Act and provide, as was done in one U.S. statute, specific penalties for wilful violations. Linked to these privacy concerns is the concern that samples must be destroyed and the data expunged should the person from whom it was taken not be convicted, or should the conviction be overturned. This obligation has been alluded to in some court cases in relation to fingerprints. It should be part of a legislative scheme. 
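As a purely illustrative sketch of the computer comparison described above (the band sizes, the coding resolution and the match window are all invented for this example; real forensic match criteria are laboratory-specific):

```python
def to_codes(band_positions, resolution=50):
    """Convert measured band positions (in base pairs) to coarse numerical codes."""
    return [round(p / resolution) for p in band_positions]

def is_match(sample_a, sample_b, tolerance=1):
    """Declare a match when every pair of coded bands differs by at most `tolerance`."""
    codes_a, codes_b = to_codes(sample_a), to_codes(sample_b)
    if len(codes_a) != len(codes_b):
        return False
    return all(abs(a - b) <= tolerance for a, b in zip(codes_a, codes_b))

# Hypothetical band measurements from a crime-scene specimen and a suspect sample:
# the small measurement differences disappear once the positions are coded.
scene = [1510, 2975, 4030, 6480]
suspect = [1498, 2990, 4045, 6475]
print(is_match(scene, suspect))
```

The point of the conversion is that two measurements of the same band rarely agree exactly, so the comparison is made on coarser numerical codes rather than on raw band sizes.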
Several questions remain unresolved, such as the extent to which samples of material used as testing information in cases involving DNA analysis must be preserved for disclosure to an accused person.4 To what extent must statistical information relating to DNA be protected, and eventually made available to the public? In theory, the information is subject to the Access to Information Act, and must be disclosed unless it can be withheld under one of the exemptions. However, the personal information and investigative body exemptions would appear to provide adequate protection, although if the data is retained for a long period of time, these exemptions would not prevent disclosure to the public. The Security Policy requires that the degree of protection provided be commensurate with the level of sensitivity of the information and the associated threats and risks. Without appropriate safeguards, the confidentiality, integrity and availability of the information could be adversely affected. Because of the extremely sensitive nature of this information and possible infringements of the Charter of Rights and Freedoms, strict compliance with government policy on security may not, however, be a shelter against claims of violations of privacy from concerned individuals.5 To date, the Supreme Court of Canada has not dealt with a challenge to the use of evidence such as DNA, although it has had to examine the reasonableness of a search conducted in the absence of prior judicial authorization; several decisions and statutes have already validated its use in a number of jurisdictions in the United States.6 Members of the legal profession have raised concerns regarding the storage of DNA, procedures for obtaining blood and biometric samples, the right to complete discovery and the regulation of laboratories.7 Contrary to the situation in the United States, where numerous private laboratories conduct DNA analysis, most of the analysis in Canada is currently done by government-sponsored 
laboratories. Nonetheless, the necessary implications of the storage and protection of data8 and material related to DNA, blood testing and other biometric information in the private sector cannot be ignored in Canada. New definitions and answers will be necessary to define the scope of the state's involvement in the management of novel scientific information and advanced technology. Biometric information raises numerous problems which are not adequately addressed by current legislation. As DNA becomes more important for law enforcement, and as biometric information becomes more important for electronic security, we believe that an incomplete approach to this growing reality could result in a loss of credibility for the government.9 A failure to exercise diligence in this field could also result in litigation that could become unmanageable and prove very costly for everyone. Therefore, we recommend that the government adopt strict measures to protect the confidentiality of DNA, blood testing and other biometric information. Smart cards An important technology for promoting the security of electronic transactions and computer systems is the smart card. Smart cards are only one of a number of different card technologies. The legal issues created by smart cards are the same as those created by the other kinds of cards, or indeed by any kind of ‘token’ which is inserted into a computer to identify the user. (See the discussion of personal identification numbers above.) Smart cards are often confused with other advanced card technologies such as the magnetic stripe card, the optical card, the large memory card or the memory card.10 (For definitions of various card technologies, see the glossary.) Smart cards are credit-card-sized plastic cards of varying thicknesses, containing one or more microchips that allow the card to function as a computer with logic and memory; thus it is capable of storing, processing and retrieving information. 
There is a broad range of smart card capabilities: differences in processing power, memory type, memory size (up to 64 kilobytes of information), erasable and non-erasable varieties, programmable and non-programmable varieties, even contactless smart cards that use radio frequencies to read and write to the chip. These options vary and depend on the manufacturer. The card also has microprocessors embedded in it that are capable of encrypting information while the card is being used, can maintain a time-on and time-off log in memory, and can handle biometric identification and passwords to allow only authorized persons to read or alter particular parts of the data on the card. A smart card can even monitor access attempts and invalidate the card when the number of incorrect access attempts reaches a predefined number. In 1994, Secure Dynamics Inc. introduced the “SecureID smart card,” which contains a device that computes and displays an unpredictable new six-digit password every minute (a similar device is implemented in software or hardware on a network server or host, so the smart card and host machine are always in sync in terms of the password used). Combined with a personal identification number, the odds of defeating a SecureID card are less than one in 14 billion.11 Certain types of smart cards contain information that cannot be read or altered by the cardholder. This functionality is known as EPROM (electrically programmable read-only memory), which can be programmed only once and whose contents cannot be changed. Other types are totally reprogrammable according to the cardholder's intentions; these use EEPROM (electrically erasable programmable read-only memory). France, where the smart card originated, is a heavy user of the smart card in many areas, such as telephone networks, financial transactions, parking meters and vending machines. 
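The two security behaviours just described, a time-synchronized six-digit password and invalidation after repeated failed access attempts, can be sketched as follows. This is an illustration only: the actual SecureID algorithm is proprietary, and the HMAC construction, seed value and attempt limit used here are assumptions standing in for it.

```python
import hashlib
import hmac

def one_time_code(seed: bytes, clock: int, interval: int = 60) -> str:
    """Six-digit code derived from a shared seed and the current minute.

    Card and host hold the same seed and the same clock, so both compute
    the same code independently, without transmitting it in advance.
    """
    counter = (clock // interval).to_bytes(8, "big")
    digest = hmac.new(seed, counter, hashlib.sha1).digest()
    return f"{int.from_bytes(digest[:4], 'big') % 1_000_000:06d}"

class Card:
    """Invalidates itself after a predefined number of bad PIN attempts."""
    MAX_ATTEMPTS = 3  # assumed threshold

    def __init__(self, pin: str, seed: bytes):
        self._pin, self._seed = pin, seed
        self._failed = 0
        self.invalidated = False

    def code_for(self, pin: str, clock: int):
        if self.invalidated:
            return None
        if pin != self._pin:
            self._failed += 1
            self.invalidated = self._failed >= self.MAX_ATTEMPTS
            return None
        self._failed = 0  # a correct PIN resets the counter
        return one_time_code(self._seed, clock)

card = Card("1234", b"per-card seed")
host = one_time_code(b"per-card seed", clock=1_000_000)
assert card.code_for("1234", clock=1_000_000) == host  # card and host in sync
```

Because card and host derive each code independently from the shared seed and clock, no password is ever transmitted or stored in advance, and an intercepted code is useless a minute later.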
Other countries, such as Japan, have followed this lead. To date, there are only a few smart card applications in Canada and the United States, although given the wide acceptance of smart cards in Europe and Japan, we can expect their use to increase in North America, and electronic security may be an important impetus for that increase. The smart card's generic functions include the following:12 · Administration: records management, inventory management, material acquisition from stockrooms, fleet management. For these activities, any time an employee borrows, removes, buys or uses an item, the use is registered both on the employee's card and on the institution's computer, by requiring the employee to insert his or her smart card to use or remove the item. This eliminates paper forms and greatly facilitates inventory control and retrieval. It also makes it possible to substantially increase the number of individual items which are monitored. (Conceivably, it could be used to determine how many pens we use.) · Personnel: leave system on smart cards, interdepartmental system, performance appraisals, electronic overtime tracking. Smart cards can be used to keep a precise record of hours worked and available leave credits. Moreover, some records, such as an employee's personnel file, could be stored directly on the smart card. Every time the employee changes employment within the federal public service, transferring the file is virtually automatic. · Finance: access control to financial systems, delegation of authority, electronic signatures, electronic wallet. A smart card can contain a code indicating whether an individual has particular signing authority. Electronic transactions can be designed so that only persons with the appropriate codes will be permitted to authorize the transaction. In addition, monetary credits can be stored on a smart card. 
· Security: access to buildings, computer access, maintenance and administration of multiple passwords. Access to buildings and computers can be controlled by inserting a smart card, creating an electronic record of who was given access and when. This would replace requirements for sign-in and sign-out paper forms. Smart cards could make it much easier for federal employees to visit colleagues in other departments, reducing the need for front-desk sign-in procedures. The smart card can be used to give employees access to only the database information they need to know for their jobs. It can also keep all of the various passwords an employee needs for various computer resources on the card and activate them automatically; the employee would only need to remember his or her Personal Identification Number. · Other: electronic filing of tax returns, electronic unemployment insurance benefit claims, electronic vote casting for elections, carrying medical records. To date, there is no legislation or regulation in Canada specifically governing either smart card or debit card transactions. In his article “The E.F.T. Debit Card,”13 Benjamin Geva concluded that in Canada contract law primarily governs debit card transactions vis-à-vis the relationships involving the cardholder, card issuer and merchant or service provider. Regulations and the common law can also apply, to the extent that they address issues that may arise relating to the use of smart cards. He stated that in some countries, the cardholder/card issuer relationship is subject to a detailed regulatory scheme which either supersedes or imposes contract terms.14 In addition, he states: In New Zealand there is a binding Code of Practice negotiated by the government. In Australia the banks voluntarily abide by a set of procedures recommended by the government. 
These procedures provide standards for the availability and disclosure of terms and conditions applicable to the use of debit cards, as well as for changing such terms and conditions; the availability of paper records and periodic statements for debit card transactions; the cardholder's liability for an unauthorized transaction; the card issuer's liability in cases of technical malfunction; error and dispute resolution procedures; deposits in electronic terminals; the inability of networking arrangements to deny cardholders direct remedies against card issuers; audit trails; and privacy.15 By analogy, it appears that, in the absence of any legislation or regulation in Canada on smart cards, contract law and related areas of law would govern the many relationships among cardholders, card issuers and merchants in relation to smart card transactions. In some cases, other specific legislation and regulation would also apply to particular areas or transactions. For example, section 239(2.3) of the Income Tax Act provides that where the Act requires a private sector organization to obtain Social Insurance Numbers for the purposes of reporting income tax, the organization cannot use the Number for other purposes without the consent of the individual. As another example, the Quebec Act respecting the protection of personal information in the private sector may govern the kinds of information that may be put on the smart card and who will have access to that information. If and when the federal government implements smart cards, whether for its own employees or for persons seeking to use federal government services, the information put on the card and who may have access to it may be subject to the Access to Information Act and the Privacy Act. Generally, the smart card raises issues that do not fit neatly within the present scheme of those two Acts. 
Nonetheless, the principles and requirements in those Acts can be applied to the smart card context in the absence of anything more tailored to it. In part, the application of the Access to Information Act and the Privacy Act will depend on who has control of the information. If the information on the smart card is not kept anywhere else but on the smart card, and only the individual employee or citizen has control of the smart card, then it might be said that the institution does not have control of the card or its contents. On the other hand, if, as is likely, special hardware and software are required for an individual to review the information on the card, and that technology is under the control of the institution, then it could be argued that the institution is in control of the information. At the very least, there would be a moral obligation on the institution to assist the individual in accessing and reviewing the information on the card. Beyond the question of who has control of the card or the technology to read the card, presumably every use of the card results in a collection of personal information by the institution that issued the card. The information may be collected for only a very short period of time (such as to open a locked door), but there is, nonetheless, a collection of information. It would also be necessary to define which collections of smart card information must be retained for two years after an “administrative use” of the information. What is important is that the information collected for a purpose be limited to no more than is necessary for that purpose, that the institution notify the Privacy Commissioner and amend the institution's entry in InfoSource to describe the collection, and that individuals be given effective access to their personal information and opportunities to correct the information. Aside from the Privacy Act considerations, there would be Security Policy considerations. 
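If every use of the card is indeed a collection of personal information, the retention and access questions raised above could be handled along the following lines. This is a hypothetical design sketch, not an existing system: the two-year period reflects the "administrative use" retention rule mentioned above, and all names and entries are invented.

```python
import datetime

# Hypothetical sketch: each use of the card is logged as a collection of
# personal information; entries are expunged after an assumed two-year
# retention period, and the individual can review his or her own entries.

RETENTION = datetime.timedelta(days=2 * 365)

class CardAuditTrail:
    def __init__(self):
        self._entries = []  # (timestamp, cardholder_id, purpose)

    def record_use(self, cardholder_id, purpose, when):
        self._entries.append((when, cardholder_id, purpose))

    def expunge_expired(self, now):
        """Drop entries older than the assumed two-year retention period."""
        self._entries = [e for e in self._entries if now - e[0] < RETENTION]

    def disclosures_for(self, cardholder_id):
        """Give the individual an account of how his or her card was used."""
        return [(t, p) for t, c, p in self._entries if c == cardholder_id]

trail = CardAuditTrail()
t0 = datetime.datetime(1996, 11, 8, 9, 0)
trail.record_use("E123", "opened door 4B", t0)
trail.record_use("E123", "logged into payroll system", t0 + datetime.timedelta(days=1))
print(trail.disclosures_for("E123"))  # the individual's own usage record
```

Such a trail would serve both purposes discussed in this chapter: it limits retention, and it gives individuals effective access to how their information has been used.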
Some of the information on the cards might need to be protected at the Protected B or C levels, perhaps with a mandatory requirement that the information be encrypted. In addition, it is likely that smart cards will hold a variety of information. It is important that the cards be designed to prevent institutions from accessing any more information than they need to know for a given purpose. The legal framework governing smart cards in Canada is difficult to establish at this point in time due to the lack of legislation, regulations and jurisprudence on the subject. However, any future legislation, regulations, rules or procedures which may apply to a smart card will depend extensively on the following factors: the functions of the card, ownership of the card, the card issuer, the type (EPROM vs. EEPROM), etc. D. Transborder data flows Multi-national corporations are increasingly moving data around the world. Data management services do not have to be geographically close to where the data came from. Corporations set up data management headquarters far away from major population centres. Government increasingly uses multi-national corporations to manage the installation of its information systems and, increasingly, to manage the data itself. (See comments in Chapter 11 on Value Added Networks (VANs).) Government has always relied on multi-national software products. As those software products become more intelligent and more connected internationally, the prospect of direct connections between the software and the software manufacturer increases (with the software manufacturer having the ability to enter into the software system). What privacy protection measures exist with respect to the export of personal information from Canada? Recall that the private sector in Canada (except for Quebec) is not regulated with respect to how it handles personal information. 
(However, see the comments in Chapter 2 on the obligations of banks to maintain and process information in Canada, subject to the rules set out in the various federal Acts legislating financial institutions.) As discussed above concerning government disclosures to third parties, there are virtually no mandatory requirements on the government to build in privacy protections when it discloses information to third parties, and therefore none concerning transborder data flows from the independent third party contractors to other locations outside of Canada. Why are there transborder flows of personal information from Canada, and by whom? An in-depth study by the Computer Science and Law Research Group on transborder flows of personal data from Canada identified the following reasons for exporting personal data: 1) banking transactions (banks and financial institutions); 2) financial information systems (credit bureau networks); 3) credit card services (issuing companies); 4) health care (health care institutions); 5) personnel and payroll records (all sectors); 6) airline reservations (air transport); and 7) information transferred by the government (departments and public organizations) and unions (associations sector). The twenty-five (25) main countries of destination of personal information from Canada are: U.S.A., U.K., France, Federal Republic of Germany, Japan, Australia, Hong Kong, Italy, New Zealand, Switzerland, Mexico, Jamaica, Singapore, Portugal, Brazil, India, Belgium, China, Greece, Malaysia, Thailand, Barbados, Colombia, Pakistan and the Philippines. (Note that these transborder data flows occur despite the protection offered by Canada's Privacy Act, and OECD and European Community directives.) 
OECD Guidelines The development of automatic data processing, which enables vast quantities of data to be transmitted within seconds across national frontiers, and indeed across continents, has made it necessary to consider privacy protection in relation to personal data. For this reason, members of the Organization for Economic Co-operation and Development (OECD) (of which Canada is a member)16 considered it necessary to develop Guidelines which would help to harmonize national privacy legislation and, while upholding such human rights, would at the same time prevent interruptions in international flows of data. The Guidelines, in the form of a Recommendation by the Council of the OECD, were developed by a group of government experts. The Recommendation was adopted and became applicable on September 23, 1980. On June 29, 1984, the government of Canada announced that it had formally adhered to the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. Briefly, the OECD Guidelines are a set of recommendations for minimum standards for the treatment of personal data by the twenty-four (24) OECD member countries (i.e., Australia, Austria, Belgium, Canada, Denmark, Finland, France, the Federal Republic of Germany, Greece, Iceland, Ireland, Italy, Japan, Luxembourg, the Netherlands, New Zealand, Norway, Portugal, Spain, Sweden, Switzerland, Turkey, the U.K. and the U.S.A.). The Guidelines take the form of eight core principles which constitute a code of fair information practices (upon which Canada's Privacy Act is based). They apply to manual and automated forms of data handling in both the public and private sectors. The Guidelines are accompanied by an Explanatory Memorandum and a Recommendation of the OECD Council, which make it clear that the Guidelines constitute a general framework for concerted actions which member countries may pursue and implement in different ways. 
These include both adopting appropriate domestic legislation and encouraging and supporting self-regulation. The OECD Council recommended that member countries take account, in their domestic legislation, of the principles enunciated in the Guidelines. The Guidelines themselves are brief and are divided into five parts. Part one includes definitions and a discussion of the scope of the Guidelines. The definitions provide, for example, that a “data controller,” who may be a legal or natural person, public authority, agency or any other body, should be responsible for activities concerned with the processing of personal data. “Personal data” means any information about an identifiable person, in either manual or automated form and in either the public or private sectors. Part two contains the eight (8) principles of fair information practice that are at the heart of the Guidelines. They are a direct reflection of principles commonly agreed upon in national data protection legislation and in voluntary privacy codes since the 1970s. The basic principles require the following limits on the contents and methods of personal data collection: 1. informed consent from individuals for use of information about them, where appropriate; 2. the collection of only relevant, accurate and timely data, related to the purpose for which they are to be used; 3. advance identification of the purposes for data collection; 4. restrictions on the re-use of data for new purposes without the consent of the data subject or without legal authority; 5. reasonable security safeguards; 6. openness about practices with respect to the collection, storage or use of personal data; 7. a right of access for individuals to information about themselves; and 8. the accountability of the data controller for compliance with data protection measures. The right of the individual to access and challenge personal data is viewed as the most important safeguard for privacy (consent is not). 
Part three contains the principles which underlie the application of the Guidelines. Member countries of the OECD are asked to ensure that transborder flows of personal data (including data in transit through a member country) are uninterrupted and secure, and to refrain from restricting transborder flows between themselves and other member countries, except where another member country “does not yet substantially observe these Guidelines or where the re-export of such data would circumvent its domestic privacy legislation.” This exception is particularly significant at present, because OECD member countries could restrict the flow of personal data to Canada (although they have not yet done so). The standard for measuring compliance is equivalent protection; that is, protection substantially similar in effect to that of the exporting country, although not necessarily identical in form or in all respects. Part four establishes a general framework for the national implementation of the Guidelines by member countries, leaving the determination of appropriate measures to individual countries. Part five encourages continued international cooperation in implementing the Guidelines, a process the OECD itself continues to promote. The OECD Guidelines suggest that member countries adopt fairly minimal safeguards to protect personal information. Although Canada's Privacy Act meets those safeguards, Canada runs the risk of OECD countries restricting transborder data flows to and from Canada because Canada's private sector (other than Quebec's) is not regulated. Proposed European Community Data Protection Law On September 13, 1990, the European Community Commission proposed a directive “concerning the protection of individuals in relation to the processing of personal data.” The Council of Ministers adopted a common position on the revised version of the draft directive on February 20, 1995. The European Parliament is now being consulted for second reading. 
The directive is expected to be implemented in two years' time. A principal purpose of the directive is to assure the untrammelled flow within the Community of personal data necessary for and ancillary to trade in goods and services, while protecting the rights of individuals. Also, the Commission believes that common EC rules would assist the development of the data processing industry. A second major purpose is to facilitate the flow of information between public authorities in different member states that will be required when border controls are replaced by cooperative measures applied within the various EC countries. The impact of the EC directive is not something the federal government takes lightly because, once adopted, it could have adverse consequences if it is judged that Canada has not adopted adequate privacy protection measures. The directive would prohibit the transfer of personal data to non-member countries that cannot ensure an “adequate” level of privacy protection. The EC draft directive could therefore have serious economic implications for Canadian companies doing business in Europe. As Canada does not have a data protection scheme applying to the private sector, an EC country could refuse to transmit personal data to a Canadian destination if there is no evidence of the company having adequate safeguards to protect the data when it arrives. For that reason, the Department of Justice has been trying to convince the private sector, short of imposing legislation, to voluntarily adopt privacy protection codes that are tailored to each industry's particular situation and needs. Such codes have been adopted in many federally regulated sectors, such as banking, insurance, transportation and communications, but many other sectors have not responded to the federal government's urgings. 
Faced with this situation, the Department of Justice has entered into an agreement with the Canadian Standards Association to develop a model privacy code for Canadian companies. The federal government hopes that the existence of such a code will motivate private sector organizations to adopt and implement it in their sectors. E. Summary The Government's Blueprint on Renewing Government Services using Information Technology suggests a model where information is collected only once and shared among the departments that need it. This model raises a number of legal issues relating to basic privacy principles. The model represents a significant shift away from current practices, which are based on a model that sees government as a series of discrete institutions rather than as a single enterprise, and which segregate the information holdings of the various institutions. It may be appropriate to invite a public debate on privacy protections under this new model, even though there are good arguments available to suggest that the present Privacy Act supports the ‘collect once, share many’ model. Government is making it easier for citizens to find out what information government holds about them, particularly by making InfoSource available on the Internet. However, InfoSource presents government as a series of discrete, segregated information banks (which may not accurately reflect a move to data warehousing and multi-purpose data banks). It does not provide information about departments' security policies and practices, about legislated protections of information (e.g., the Income Tax Act), about standard terms of agreements and contracts when information is shared with third parties, or about the nature of the access to information available to those persons who operate government computer systems or who are responsible for the security of electronic information. More importantly, InfoSource is an indirect way to notify citizens about what information is collected and how it is used. 
The principle of telling people how information is used can be further enhanced by providing the above information both in InfoSource and directly at the time that government collects the information. As government increasingly conducts its affairs electronically with citizens, hypertext links make providing this information directly at the time of collection much more practical. If the government moves towards data warehousing and multi-purpose databases for its own reasons of cost efficiency, accuracy of information, and reduction of overpayments and fraud, it should take the opportunity to use available technology to create audit trails of disclosures of personal information, both to help citizens find out how their information is used and to help ensure that corrections or notations to the information are passed along to the appropriate agencies. As the government moves to more electronic transactions, and especially to a ‘collect once, share many’ model of collecting information, and as it moves to provide greater security of electronic information, the question arises as to what kinds of personal identification information will be required of individual citizens and employees. This is particularly important for advanced card technologies. Currently, the government has a policy on the use of the Social Insurance Number that restricts its use and provides that the number should not become a universal identification number. As passwords, encryption keys and smart cards proliferate, they will generate new debates about the use of personal identifiers. At the moment, no law restricts the establishment of a universal identification number. Moreover, some information is so sensitive, whether it is DNA, other biometric information or a universal identifier, that the government’s right to ask for such information, and the procedures for securing that information and limiting its disclosure, may be appropriate for specific treatment in the Privacy Act. 
In addition, providing a meaningful right of access to one’s own information might require a government institution that obliges citizens or employees to use smart cards, for example, to provide the technology necessary to enable individuals to read and correct the information on their own cards. If the technology permits, it might be appropriate to include an audit trail of the uses of the smart card on the card itself, to give the individual more information about how the government collects and uses information about him or her.
ENDNOTES
1. Privacy Commissioner Annual Report 1993-94, p. 12.
2. Privacy Commissioner Annual Report 1993-94, p. 11.
3. SOR/94-601, Canada Gazette, Part II, vol. 128, no. 20, p. 3249.
4. See “Discovery and inspection of prosecution evidence under Federal Rule 16 of criminal procedure,” C. P. Jhong, 5 ALR 3d 819.
5. Black’s Law Dictionary defines personal security as “a person’s legal and uninterrupted enjoyment of his life, his limbs, his body, his health and his reputation.” The taking of a blood sample is a violation of the person, an assault upon the person. If, as held by the Supreme Court of Canada in Therens, the element of psychological compulsion in the making of a demand for a breath sample leads to an involuntary deprivation of liberty, it follows that the taking of a blood sample in the same circumstances must lead to an involuntary deprivation of the right to security of the person: R. v. Chatham et al. (B.C.S.C., November 25, 1985); see also R. v. Dyment, [1988] 2 S.C.R. 417; R. v. Dersch, [1993] 3 S.C.R. 768; and R. v. Colarusso, [1994] 1 S.C.R. 20.
6. In R. v. Borden, [1994] 3 S.C.R. 145, at pp. 
169-70, the Supreme Court held that “In the absence of a statutory regime whereby the police can demand a blood sample in cases such as these (a scheme that may raise Charter concerns), the police require the true consent of an accused.” For Canadian cases, see “Obtaining and Banking DNA Forensic Evidence, Discussion Paper,” Department of Justice (1994), pp. 28-30. A large number of American cases have held DNA identification evidence to be admissible to aid in determining the identity of a criminal perpetrator. For the view that DNA test results are generally admissible evidence but that the statistical calculations associated with them are not, see: People v Wallace (1993, Cal App 1st Dist) 17 Cal Rptr 2d 721, 93 CDOS 2207, 93 Daily Journal DAR 3817, rehearing denied; People v Wallace (1993, Cal App 1st Dist) 1993 Cal App LEXIS 446, review denied; People v Wallace (1993, Cal) 1993 Cal LEXIS 3381; People v Lipscomb (1991, Ill App 4th Dist) 215 Ill App 3d 413, 158 Ill Dec 952, 574 NE2d 1345; Rivera v State (1992, Wyo) 840 P2d 933, rehearing denied (Wyo) 1993 Wyo LEXIS 3.
7. See State v Schwartz (1989, Minn) 447 NW2d 422. The court commented that where an independent test on DNA samples cannot be conducted due to their destruction, as in the instant case, due process requires that the defendant have access to the underlying information in order to prepare and conduct his cross-examination. 
8. The court in Andrews v State (1988, Fla App D5) 533 So 2d 841, 13 FLW 2364, later proceeding (Fla App D5) 533 So 2d 851, 13 FLW 2475, review denied (Fla) 542 So 2d 1332, held admissible in a rape prosecution a forensic scientist’s opinion that there was less than 1 chance in 800 million of finding someone other than the defendant with the same DNA print as was yielded by RFLP analysis of the defendant’s blood and of semen found in the victim’s vagina, despite the defendant’s complaint that the database of 710 samples, used to determine the coincidental match probability, was too small to be statistically significant.
9. See State v Schwartz (1989, Minn) 447 NW2d 422 (note 7 above), which cited State v Pennell (1989, Del Super) 1989 Del Super LEXIS 520, holding that statistical calculations of DNA band pattern frequency in the population, based on RFLP analysis of bloodstains on carpeting in the defendant’s van to determine whether the stains came from a murder victim, could not be admitted at the defendant’s trial because the state failed to supply the defendant with the laboratory’s calculations used to estimate Hardy-Weinberg expectations for the probes used, or with the statistical input regarding allele frequencies for two of the probes.
10. “Smart Cards in the Public Service: The Department of Communications Experience,” Department of Communications Smart Card Group, Communications Canada, Ottawa. See also “The Smart Card: A Tool,” Dimensions in Health Service, April 1991, vol. 68, no. 3, pp. 15-18.
11. “Secure ID keeps passwords a’changing,” Gary H. Anthes, Computerworld, March 28, 1994, vol. 28, no. 13, p. 51.
12. “Super smart cards: you can take it with you after all,” Doug Powell, Computing Canada, Dec. 21, 1989, vol. 15, no. 26, p. 32. See also: “Smart Cards: pocket power,” Michael Rogers, Newsweek, July 31, 1989, vol. 114, no. 5, p. 54; “Smart card technology serves many business needs,” Bottom Line, Feb. 1988, p. 16.
13. 
Geva, Benjamin, “The E.F.T. Debit Card (Canada),” Canadian Business Law Journal, September 1989, vol. 15, no. 14, pp. 406-440.
14. United States: Electronic Fund Transfer Act, 15 U.S.C.S. (Supp. 1978), 1693 et seq.; Regulation E, 12 C.F.R. Part 205.
15. Geva, Benjamin, “The E.F.T. Debit Card (Canada),” Canadian Business Law Journal, September 1989, vol. 15, no. 14, p. 431.
16. The OECD was set up under a Convention signed in Paris on December 14, 1960, which provides that the OECD shall promote policies designed:
· to achieve the highest sustainable economic growth and employment and a rising standard of living in Member countries, while maintaining financial stability, and thus to contribute to the development of the world economy;
· to contribute to the expansion of world trade on a multilateral, non-discriminatory basis in accordance with international obligations.