Hate Speech, Public Communication and Emerging Communications Technologies
by Andrea Slane
Andrea Slane is an Associate at Osler, Hoskin & Harcourt LLP, where she practices intellectual property and technology law, as well as an Adjunct Professor at the University of Toronto.  

Criminal Code section 319(1) makes it an offence to incite hatred by communicating a statement in a public place; Criminal Code section 319(2) outlaws wilfully promoting hatred "other than in private conversation"; and the purpose of Canadian Human Rights Act (CHRA) section 13 was recently articulated by Canadian Human Rights Tribunal (CHRT) member Dr. Paul Groarke to be "to remove dangerous elements of speech from the public discourse."1 Since only publicly communicated hate content is illegal in Canada, courts and the CHRT must evaluate new and evolving communications technologies and applications used to convey hate content to determine whether the resulting communication is public enough to be offside the Criminal Code or the CHRA.

This article sets out the features of the assessment of public communication via technology as developed to date by the CHRT. The analysis summarizes the general principles that have so far allowed the CHRT to assess new technologies and applications, and that should guide the Tribunal going forward as purveyors of hate speech using newer technologies come before it. Establishing the scope of public communication will also allow companies that provide services related to these technologies and applications to consider their role in reducing public exposure to hate content communicated or accessed via their services.

CHRT principles of public communication via technology

CHRA section 13 was made a part of the original human rights legislation in order to deal with the "dial a hate message" practice of the Western Guard Party.2 While drafted to address a fairly specific technological phenomenon, section 13 has proven itself to be remarkably adaptable to the evolving means of communicating with members of the public, as has the CHRT’s interpretation of the scope of the section. The following four principles grow out of the CHRT’s earliest consideration of section 13 and are adaptable enough to deal with a broad range of technological innovations.

  1. Repetition: Section 13 prohibits repeated telephonic communication of hate messages, which initially served to distinguish the "dial a hate message" practice from a private telephone conversation. In the Supreme Court of Canada’s review of section 13 in light of the Charter of Rights and Freedoms in Canada (Human Rights Commission) v. Taylor, the Court upheld the section in large part because of this requirement of repetition, and endorsed the CHRT’s view that it is not the technology itself that determines whether communication is private or public, but rather how it is used. In other words, while radio and television broadcasts clearly address large audiences all at once, even technologies designed for one-on-one communications can be part of a campaign of public communication, since repeated one-on-one communication adds up to mass communication.3 The first guiding principle is therefore that it is the manner in which technology is used to reach the public, whether all at once or successively, that amounts to public communication, rather than the characteristics of the technology itself.
  2. Access by the public and passivity of the communicator: A second important principle set down by the CHRT and endorsed by the SCC in Taylor is that a person remains responsible for communicating a hate message to the public even when the message is merely made available and a member of the public must actively seek out or access the message. In the "dial a hate message" context, the message could only be heard if a person called the telephone number that the Western Guard Party had advertised. In Schnell v. Machiavelli and Associates Emprize Inc., the CHRT followed this principle in rejecting the respondent’s argument that merely uploading a website is not communicating to the public because, technically, transmission of information does not occur upon posting but only when someone visits the website.4 This CHRT finding is in keeping with the definition of "public place" in section 319 of the Criminal Code, which includes "any place to which the public have access as of right or by invitation, express or implied."5 Posting a message on a website is therefore a public communication, since the public has access to websites, regardless of whether a person must choose to go there to be exposed to the hate materials.
  3. Public access through search services: The respondent in Machiavelli also tried to argue that a content provider cannot be considered to communicate website content to the public if he did not advertise or otherwise provide the person accessing the site with the address. The respondent thereby attempted to distinguish the situation of posting messages on a website from the facts in Taylor, where the Western Guard Party had advertised the telephone number both in telephone directories and by handing out leaflets on street corners. The CHRT did not find this argument persuasive, stating that, with the general availability of search engines, websites can be found by members of the public without the owner of a site having to publicize its location. A third guiding principle is therefore that, if there are publicly available means of finding a message, the message is a public communication.
  4. Open access to membership or subscription: In Warman v. Kyburz, the CHRT considered a situation where members of the public would have to subscribe or sign up for a forum in order to be exposed to the hate messages at issue.6 The respondent in Kyburz had created a web forum after his website was shut down and had posted hate messages there (as well as retaliatory messages against the complainant). Because the forum was open, meaning anyone could sign up and read the messages, the CHRT considered the communication going on in the forum to be public. A fourth guiding principle is therefore that, where membership is generally open to the public, a message conveyed within a membership-only forum is a public communication. In other words, a message is not converted into a private communication merely because recipients are required to sign up for a membership or subscription.

Evaluation of new communications technologies and applications

The above CHRT decisions offer several solid principles from which to evaluate other communications technologies and applications that have not yet been the subject of a Tribunal decision:

  • A message that is repeatedly conveyed to members of the public is public, regardless of the one-to-one or one-to-many nature of the medium;
  • A message is communicated by the originator regardless of whether a recipient of that message must actively do something to access it;
  • Where a message can be located through publicly available means, that message is accessible to the public, regardless of whether the originator actively disseminated the message or advertised its location; and
  • Where messages can only be accessed by people who join or subscribe to a forum, the communication is public if membership is broadly available to the public.

Some newer technologies and applications are easier to classify as public than others, and as with the telephone message service that inspired section 13, most of the time a determination will depend on the facts. However, applying the above principles to a few of these technologies and applications will demonstrate the analytical process these principles enable.

(a) Blogs, bulletin boards, newsgroups: Any communication through an application that enables an individual to post messages to be read by the general Internet public is clearly public communication, since the posted messages are publicly accessible.7 Following Kyburz, even where a person must subscribe to a blog, bulletin board or newsgroup in order to read these messages, if subscription is generally open to the public, the communication will still be a public communication.

(b) Hate spam: Mass unsolicited e-mailing of hate messages (i.e. hate spam) or mass unsolicited text messaging on cell phones would qualify as public communication, following the analysis of telephone answering machine messages in Taylor, regardless of the one-on-one nature of e-mail and text messaging.

(c) Secure websites: Websites that require a password to gain access may or may not be public communication, depending on the process by which a member of the public obtains a password. Following Kyburz, if passwords are generally available to the public with few barriers to membership, then the communication going on within the secure website would also be public communication.

(d) Podcasts: Podcasts work on a publisher/subscriber model, so again following Kyburz, a podcast would be public communication if subscription is generally open to the public. In both the secure website and the podcast situation, the determination is murkier the more restricted the subscriber base.

(e) Peer-to-peer file sharing: Whether making hate materials available through a peer-to-peer file-sharing service is public communication will depend in part on whether the file-sharing software contains a search feature that allows other users of the program to find hate materials, or whether there are otherwise search services available which make it possible for the public to find the files. Popular file-sharing programs like Kazaa and Grokster have search features, and so users who put hate materials in their shared folders communicate them to the public, since, following the CHRT’s analysis in Machiavelli, access to materials in shared folders is available to anyone who searches for them.

Newer distributed file-sharing programs like BitTorrent do not include search features in the way that Kazaa-type programs do, so people looking for content have to find where the material is hosted in some other way, including through third-party search services. The determination of whether BitTorrent file sharing is public communication will therefore depend on the manner of disseminating information on how to locate the material and to whom that information is made available.

Reducing public access to hate materials: the role of private industries

Given the parameters for public communication established by the guiding principles above, private industry can clearly help reduce public access to hate materials. In addition to the ongoing centrality of Internet service providers (ISPs), who ultimately serve as the gateway to Internet-based communications, providers of information location tools and search services, as well as those who run blogs or subscriber-based services, are also in a position to remove materials, bar members or ensure that certain locations do not turn up in search results.8

As with ISPs, these third parties should generally not be considered liable for the materials made available or located through their services, unless they knowingly and intentionally make such materials available. But just as ISPs have been responsive to public pressure to remove hate materials quickly, on the grounds that these materials violate the acceptable use policies contained in their contractual agreements with subscribers, so too should these other service providers be encouraged to do their part to make hate materials less publicly accessible and to include their willingness to take such action in their terms of service or acceptable use policies.

While these measures are voluntary and will be implemented on a company-by-company basis, terms of use and acceptable use policies serve to educate users of these services about the limits of acceptable behaviour and the consequences of exceeding those limits. Companies that assist in reducing public exposure to hate materials will help the Canadian human rights regime achieve the goal of taking such materials out of the public discourse, and will help the overall effort to stay ahead of the purveyors of hate speech, who will surely continue to use ever more diversified technologies to communicate their messages.

Conclusion

By evaluating each new means of communicating messages according to the guiding principles, the CHRT should be able to consistently assess the degree to which a hate message is a public communication. Section 13 will therefore be able to accommodate future technologies and applications, just as the section has so far proven to be adaptable to technological change.

Endnotes

1. Warman v. Warman (2005), CHRR Doc. 05-531, 2005 CHRT 36.
2. Canadian Human Rights Act (R.S. 1985, c. H-6), s. 13(1) states "It is a discriminatory practice for a person or a group of persons acting in concert to communicate telephonically or to cause to be so communicated, repeatedly, in whole or in part by means of the facilities of a telecommunication undertaking within the legislative authority of Parliament, any matter that is likely to expose a person or persons to hatred or contempt by reason of the fact that that person or those persons are identifiable on the basis of a prohibited ground of discrimination."
3. Canada (Human Rights Comm.) v. Taylor (1979), CHRR Doc. 79-001 (C.H.R.T.); Canada (Human Rights Commission) v. Taylor, [1990] 3 S.C.R. 892.
4. Schnell v. Machiavelli and Associates Emprize Inc. (No. 2), (2002), 43 C.H.R.R. D/453 (C.H.R.T.).
5. Criminal Code s. 319(7).
6. Warman v. Kyburz (No. 2) (2003), 46 C.H.R.R. D/425, 2003 CHRT 18.
7. Blog postings have been subject to prosecution as hate speech in several jurisdictions, including France and Singapore. Further, blog hosts may be liable if they refuse to remove offending postings, since libel law indicates that blog hosts are publishers, not just intermediaries.
8. Search services are already subject to legislation requiring them to respond to content issues related to copyright infringement in some jurisdictions (and will likely soon be subject to such requirements when Canada enacts the next round of Copyright Act amendments). The inventor of BitTorrent, the file-sharing program, recently struck a deal with the Motion Picture Association of America to prevent searching for copyrighted films via his search tool, indicating the ability to manipulate search results in this context as well.

 
