Teresa Scassa - Blog

The recent scandal regarding the harvesting of the personal information of millions of Facebook users, which was then used to target them with content aimed at influencing their voting behavior, raises some interesting questions about the robustness of our data protection frameworks. In this case, a UK-based professor collected personal information via an app, ostensibly for non-commercial research purposes. In doing so he was bound by terms of service with Facebook. The data collection was in the form of an online quiz. Participants were paid to answer a series of questions, and in this sense they consented to and were compensated for the collection of this personal information. However, their consent was to the use of this information only for non-commercial academic research. In addition, the app was able to harvest personal information from the Facebook friends of the study participants – something which took place without the knowledge or consent of those individuals. The professor later sold his app and his data to Cambridge Analytica, which used them to target individuals with propaganda aimed at influencing their votes in the 2016 US Presidential Election.

A first issue raised by this case is that it may be only the tip of the iceberg. Social media platforms – not just Facebook – collect significant amounts of very rich data about users. They have a number of strategies for commercializing these treasure troves of data, including providing access to the platform to app developers or providing APIs on a commercial basis that give access to streams of user data. Users typically consent to some secondary uses of their personal information under the platform’s terms of service (TOS). Social media platform companies also have TOS that set the terms and conditions under which developers or API users can obtain access to the platform and/or its data. What the Cambridge Analytica case reveals is what may (or may not) happen when a developer breaches these TOS.

Because developer TOS are a contract between the platform and the developer, a major problem is the lack of transparency and the grey areas around enforcement. I have written about this elsewhere in the context of another ugly case involving social media platform data – the Geofeedia scandal (see my short blog post here, full article here). In that case, a company under contract with Twitter and other platforms misused the data it contracted for by transforming it into data analytics for police services that allowed police to target protesters against police killings of African American men. This was a breach of contractual terms between Twitter and the developer. It came to public awareness only because of the work of a third party (in that case, the ACLU of California). In the case of Cambridge Analytica, the story also only came to light because of a whistleblower (albeit one who had been involved with the company’s activities). In both instances it is important to ask whether, absent third party disclosure, the situation would ever have come to light. Given that social media companies provide, on a commercial basis, access to vast amounts of personal information, it is important to ask what, if any, proactive measures they take to ensure that developers comply with their TOS. Does enforcement only take place when there is a public relations disaster? If so, what other unauthorized exploitations of personal information are occurring without our knowledge or awareness? And should platform companies that are sources of huge amounts of personal information be held to a higher standard of responsibility when it comes to their commercial dealings with this personal information?

Different countries have different data protection laws, so in this instance I will focus on Canadian law, to the extent that it applies. Indeed, the federal Privacy Commissioner has announced that he is looking into Facebook’s conduct in this case. Under the Personal Information Protection and Electronic Documents Act (PIPEDA), a company is responsible for the personal information it collects. If it shares those data with another company, it is responsible for ensuring proper limitations and safeguards are in place so that any use or disclosure is consistent with the originating company’s privacy policy. This is known as the accountability principle. Clearly, in this case, if the data of Canadians was involved, Facebook would have some responsibility under PIPEDA. What is less clear is how far this responsibility extends. Clause 4.1.3 of Schedule I to PIPEDA reads: “An organization is responsible for personal information in its possession or custody, including information that has been transferred to a third party for processing. The organization shall use contractual or other means to provide a comparable level of protection while the information is being processed by a third party.” [My emphasis]. One question, therefore, is whether it is enough for Facebook to simply have in place a contract that requires its developers to respect privacy laws, or whether Facebook’s responsibility goes further. Note that in this case Facebook appears to have directed Cambridge Analytica to destroy all improperly collected data. And it appears to have cut Cambridge Analytica off from further access to its data. Do these steps satisfy Facebook’s obligations under PIPEDA? It is not at all clear that PIPEDA places any responsibilities on organizations to actively supervise or monitor companies with which they have shared data under contract.
It is fair to ask, therefore, whether in cases where social media platforms share huge volumes of personal data with developers, the data-sharing framework in PIPEDA is sufficient to protect the privacy interests of the public.

Another interesting question arising from the scandal is whether what took place amounts to a data breach. Facebook has claimed that it was not a data breach – from their perspective, this is a case of a developer that broke its contract with Facebook. It is easy to see why Facebook would want to characterize the incident in this way. Data breaches can trigger a whole other level of enforcement, and can also give rise to liability in class action lawsuits for failure to properly protect the information. In Canada, new data breach notification provisions (which have still not come into effect under PIPEDA) would impose notification requirements on an organization that experienced a breach. It is interesting to note, though, that the data breach notification requirements are triggered where there is a “real risk of significant harm to an individual” [my emphasis]. Given what has taken place in the Cambridge Analytica scandal, it is worth asking whether the drafters of this provision should have included a real risk of significant harm to the broader public. In this case, the personal information was used to subvert democratic processes, something that is a public rather than an individual harm.

The point about public harm is an important one. In both the Geofeedia and the Cambridge Analytica scandals, the exploitation of personal information was on such a scale and for such purposes that although individual privacy may have been compromised, the greater harms were to the public good. Our data protection model is based upon consent, and places the individual and his or her choices at its core. Increasingly, however, protecting privacy serves goals that go well beyond the interests of any one individual. Not only is the consent model broken in an era of ubiquitous and continuous collection of data, it is inadequate to address the harms that come from improper exploitation of personal information in our big data environment.

In February 2018 the Standing Committee on Access to Information, Privacy and Ethics (ETHI) issued its report based on its hearings into the state of Canada’s Personal Information Protection and Electronic Documents Act. The Committee hearings were welcomed by many in Canada’s privacy community who felt that PIPEDA had become obsolete and unworkable as a means of protecting the personal information of Canadians in the hands of the private sector. The report, titled Towards Privacy by Design: Review of the Personal Information Protection and Electronic Documents Act seems to come to much the same conclusion. ETHI ultimately makes recommendations for a number of changes to PIPEDA, some of which could be quite significant.

This blog post is the first in a series that looks at the ETHI Report and its recommendations. It addresses the issue of consent.

The enactment of PIPEDA in 2001 introduced a consent-based model for the protection of personal information in the hands of the private sector in Canada. The model has at its core a series of fair information principles that are meant to guide businesses in shaping their collection, use and disclosure of personal information. Consent is a core principle; other principles support consent by ensuring that individuals have adequate and timely notice of the collection of personal information and are informed of the purposes of collection.

Unfortunately, the principle of consent has been drastically undermined by advances in technology and by a dramatic increase in the commercial value of personal information. In many cases, personal information is now actual currency and not just the by-product of transactions, changing the very fundamentals of the consent paradigm. In the digital environment, the collection of personal information is also carried out continually. Not only is personal information collected with every digital interaction, it is collected even while people are not specifically interacting with organizations. For example, mobile phones and their myriad apps collect and transmit personal information even while not in use. Increasingly networked and interconnected appliances, entertainment systems, digital assistants and even children’s toys collect and communicate steady streams of data to businesses and their affiliates.

These developments have made individual consent something of a joke. There are simply too many collection points and too many privacy policies for consumers to read. Most of these policies are incomprehensible to ordinary individuals; many are entirely too vague when it comes to information use and sharing; and individuals can easily lose sight of consents given months or years previously to apps or devices that are largely forgotten but that nevertheless continue to harvest personal information in the background. Managing consent in this environment is beyond the reach of most. To add insult to injury, the resignation felt by consumers without meaningful options for consent is often interpreted as a lack of interest in privacy. As new uses (and new markets) for personal information continue to evolve, it is clear that the old model of consent is no longer adequate to serve the important privacy interests of individuals.

The ETHI Report acknowledges the challenges faced by the consent model; it heard from many witnesses who identified problems with consent and many who proposed different models or solutions. Ultimately, however, ETHI concludes that “rather than overhauling the consent model, it would be best to make minor adjustments and let the stakeholders – the Office of the Privacy Commissioner (OPC), businesses, government, etc. – adapt their practices in order to maintain and enhance meaningful consent.”(at p. 20)

The fact that the list of stakeholders does not include the public – those whose personal information and privacy are at stake – is telling. It signals ambivalence about the importance of privacy within the PIPEDA framework. In spite of being an interest hailed by the Supreme Court of Canada as quasi-constitutional in nature, privacy is still not approached by Parliament as a human right. The prevailing legislative view seems to be that PIPEDA is meant to facilitate the exchange of personal information with the private sector; privacy is protected to the extent that it is necessary to support public confidence in such exchanges. The current notion of consent places a significant burden on individuals to manage their own privacy and, by extension, places any blame for oversharing on poor choices. It is a cynically neo-liberal model of regulation in which the individual ultimately must assume responsibility for their actions notwithstanding the fact that the deck has been so completely and utterly stacked against them.

The OPC recently issued a report on consent which also recommended the retention of consent as a core principle, but recognized the need to take concrete steps to maintain its integrity. The OPC recommendations included using technological tools, developing more accessible privacy policies, adjusting the level of consent required to the risk of harm, creating no-go zones for the use of personal information, and enhancing privacy protection for children. ETHI’s rather soft recommendations on consent may be premised on an understanding that much of this work will go ahead without legislative change.

Among the minor adjustments to consent recommended by ETHI is that PIPEDA be amended to make opt-in consent the default for any use of personal information for secondary purposes. This means that while there might be opt-out consent for the basic services for which a consumer is contracting (in other words, if you provide your name and address for the delivery of an item, it can be assumed you are consenting to the use of the information for that purpose), consumers must agree to the collection, use or disclosure of their personal information for secondary or collateral purposes. ETHI’s recommendation also indicates that opt-in consent might eventually become the norm in all circumstances. Such a change may have some benefits. Opt-out consent is invidious. Think of social media platform default settings that enable a high level of personal information sharing, leaving consumers to find and adjust these settings if they want greater protection for their privacy. An opt-in consent requirement might be particularly helpful in addressing such problems. Nevertheless, it will not be much use in the context of long, complex (and largely unread) privacy policies. Many such policies ask consumers to consent to a broad range of uses and disclosures of personal information, including secondary purposes described in the broadest of terms. A shift to opt-in consent will not help if agreeing to a standard set of unread terms amounts to opting in.

ETHI also considered whether and how individuals should be able to revoke their consent to the collection, use or disclosure of their personal information. The issues are complex. ETHI gave the example of social media, where information shared by an individual might be further disseminated by many others, making it challenging to give effect to a revocation of consent. ETHI recommends that the government “study the issue of revocation of consent in order to clarify the form of revocation required and its legal and practical implications”.

ETHI also recommended that the government consider specific rules around consent for minors, as well as the collection, use and disclosure of their personal information. Kids use a wide range of technologies, but may be particularly vulnerable because of a limited awareness of their rights and recourses, as well as of the long-term impacts of personal information improvidently shared in their youth. The issues are complex and worthy of further study. It is important to note, however, that requiring parental consent is not an adequate solution if the basic framework for consent is not addressed. Parents themselves may struggle to understand the technologies and their implications and may be already overwhelmed by multiple long and complex privacy policies. The second part of the ETHI recommendation which speaks to specific rules around the collection, use and disclosure of the personal information of minors may be more helpful in addressing some of the challenges in this area. Just as we have banned some forms of advertising directed at children, we might also choose to ban some kinds of collection or uses of children’s personal information.

In terms of enhancing consent, these recommendations are thin on detail and do not provide a great deal of direction. They seem to be informed by a belief that a variety of initiatives to enhance consent through improved privacy policies (including technologically enhanced policies) may suffice. They are also influenced by concerns expressed by business about the importance of maintaining the ‘flexibility’ of the current regime. While there is much that is interesting elsewhere within the ETHI report, the discussion of consent feels incomplete and disappointing. Minor adjustments will not make a major difference.

Up next: One of the features of PIPEDA that has proven particularly challenging when it comes to consent is the ever-growing list of exceptions to the consent requirement. In my next post I will consider ETHI’s recommendations that would add to that list, and that also address ‘alternatives’ to consent.

The Office of the Privacy Commissioner of Canada has released its Draft Position on Online Reputation. It’s an important issue and one that is of great concern to many Canadians. In the Report, the OPC makes recommendations for legislative change and proposes other measures (education, for example) to better protect online reputation. However, the report has also generated considerable controversy for the position it has taken on how the Personal Information Protection and Electronic Documents Act currently applies in this context. In this post I will focus on the Commissioner’s expressed view that PIPEDA applies to search engine activities in a way that would allow Canadians to request the de-indexing of personal information from search engines, with the potential to complain to the Commissioner if these demands are not met.

PIPEDA applies to the collection, use and disclosure of personal information in the course of commercial activity. The Commissioner reasons, in this report, that search engines are engaged in commercial activity, even if search functions are free to consumers. An example is the placement of ads in search results. According to the Commissioner, because search engines can provide search results that contain (or lead to) personal information, these search engines are collecting, using and disclosing personal information in the course of commercial activity.

With all due respect, this view seems inconsistent with current case law. In 2010, the Federal Court in State Farm Mutual Automobile Insurance Co. v. Canada (Privacy Commissioner) ruled that an insurance company that collected personal information on behalf of an individual it was representing in a lawsuit was not collecting that information in the course of commercial activity. This was notwithstanding the fact that the insurance company was a commercial business. The Court was of the view that, in essence, the information was being collected on behalf of a private person (the defendant) so that he could defend a legal action (a private and non-commercial matter to which PIPEDA did not apply). Quite tellingly, at para 106, the court stated: “if the primary activity or conduct at hand, in this case the collection of evidence on a plaintiff by an individual defendant in order to mount a defence to a civil tort action, is not a commercial activity contemplated by PIPEDA, then that activity or conduct remains exempt from PIPEDA even if third parties are retained by an individual to carry out that activity or conduct on his or her behalf.”

The same reasoning applies to search engines. Yes, Google makes a lot of money, some of which comes from its search engine functions. However, the search engines are there for anyone to use, and the relevant activities, for the purposes of the application of PIPEDA, are those of the users. If a private individual carries out a Google search for his or her own purposes, that activity does not amount to the collection of personal information in the course of commercial activity. If a company does so for its commercial purposes, then that company – and not Google – will have to answer under PIPEDA for the collection, use or disclosure of that personal information. The view that Google is on the hook for all searches is not tenable. It is also problematic for the reasons set out by my colleague Michael Geist in his recent post.

I also note with some concern the way in which the “journalistic purposes” exception is treated in the Commissioner’s report. This exception is one of several designed to balance privacy with freedom of expression interests. In this context, the argument is that a search engine facilitates access to information, and is a tool used by anyone carrying out online research. This is true, and for the reasons set out above, PIPEDA does not apply unless that research is carried out in the course of commercial activities to which the statute would apply. Nevertheless, in discussing the exception, the Commissioner states:

Some have argued that search engines are nevertheless exempt from PIPEDA because they serve a journalistic or literary function. However, search engines do not distinguish between journalistic/literary material. They return content in search results regardless of whether it is journalistic or literary in nature. We are therefore not convinced that search engines are acting for “journalistic” or “literary” purposes, or at least not exclusively for such purposes as required by paragraph 4(2)(c).

What troubles me here is the statement that “search engines do not distinguish between journalistic and literary material”. They don’t need to. The nature of what is sought is not the issue. The issue is the purpose. If an individual uses Google in the course of non-commercial activity, PIPEDA does not apply. If a journalist uses Google for journalistic purposes, PIPEDA does not apply. The nature of the content that is searched is immaterial. The quote goes on to talk about whether search engines act for journalistic or literary purposes – that too is not the point. Search engines are tools. They are used by actors. It is the purposes of those actors that are material, and it is to those actors that PIPEDA will apply – if they are collecting, using or disclosing personal information in the course of commercial activity.

The Report is open for comment until April 19, 2018.

In October 2016, the data analytics company Geofeedia made headlines when the California chapter of the American Civil Liberties Union (ACLU) issued the results of a major study which sought to determine the extent to which police services in California were using social media data analytics. These analytics were based upon geo-referenced information posted by ordinary individuals to social media websites such as Twitter and Facebook. Information of this kind is treated as “public” in the United States because it is freely contributed by users to a public forum. Nevertheless, the use of social media data analytics by police raises important civil liberties and privacy questions. In some cases, users may not be aware that their tweets or posts contain additional meta data including geolocation information. In all cases, the power of data analytics permits rapid cross-referencing of data from multiple sources, permitting the construction of profiles that go well beyond the information contributed in single posts.

The extent to which social media data analytics are used by police services is difficult to assess because there is often inadequate transparency both about the actual use of such services and the purposes for which they are used. Through a laborious process of filing freedom of information requests the ACLU sought to find out which police services were contracting for social media data analytics. The results of their study showed widespread use. What they found in the case of Geofeedia went further. Although Geofeedia was not the only data analytics company to mine social media data and to market its services to government authorities, its representatives had engaged in email exchanges with police about their services. In these emails, company employees used two recent sets of protests against police as examples of the usefulness of social media data analytics. These protests were those that followed the death in police custody of Freddie Gray, a young African-American man who had been arrested in Baltimore, and the shooting death by police of Michael Brown, an eighteen-year-old African-American man in Ferguson, Missouri. By explicitly offering services that could be used to monitor those who protested police violence against African Americans, the Geofeedia emails aggravated a climate of mistrust and division, and confirmed a belief held by many that authorities were using surveillance and profiling to target racialized communities.

In a new paper, just published in the online, open-access journal SCRIPTed, I use the story around the discovery of Geofeedia’s activities and the backlash that followed to frame a broader discussion of police use of social media data analytics. Although this paper began as an exploration of the privacy issues raised by the state’s use of social media data analytics, it shifted into a paper about transparency. Clearly, privacy issues – as well as other civil liberties questions – remain of fundamental importance. Yet, the reality is that without adequate transparency there simply is no easy way to determine whether police are relying on social media data analytics, on what scale and for what purposes. This lack of transparency makes it difficult to hold anyone to account. The ACLU’s work to document the problem in California was painstaking and time consuming, as was a similar effort by the Brennan Center for Justice, also discussed in this paper. And, while the Geofeedia case provided an important example of the real problems that underlie such practices, it only came to light because Geofeedia’s employees made certain representations by email instead of in person or over the phone. A company need only direct that email not be used for these kinds of communications for the content of these communications to disappear from public view.

My paper examines the use of social media data analytics by police services, and then considers a range of different transparency issues. I explore some of the challenges to transparency that may flow from the way in which social media data analytics are described or characterized by police services. I then consider transparency from several different perspectives. In the first place I look at transparency in terms of developing explicit policies regarding social media data analytics. These policies are not just for police, but also for social media platforms and the developers that use their data. I then consider transparency as a form of oversight. I look at the ways in which greater transparency can cast light on the activities of the providers and users of social media data and data analytics. Finally, I consider the need for greater transparency around the monitoring of compliance with policies (those governing police or developers) and the enforcement of these policies.

A full text of my paper is available here under a CC Licence.

Canada’s Federal Court of Appeal has handed down a decision that addresses important issues regarding control over commercially valuable data. The decision results from an appeal of an earlier ruling of the Competition Tribunal regarding the ability of the Toronto Real Estate Board (TREB) to limit the uses to which its compilation of current and historical property listings in the Greater Toronto Area (GTA) can be put.

Through its operations, the TREB compiles a vast database of real estate listings. Information is added to the database on an ongoing basis by real estate brokers who contribute data each time a property is listed with them. Real estate agents who are members of TREB in turn receive access to a subset of this data via an electronic feed. They are permitted to make this data available through their individual websites. However, the TREB does not permit all of its data to be shared through this feed; some data is available only through other means such as in-person consultation, or communications of snippets of data via email or fax.

The dispute arose after the Competition Commissioner applied to the Competition Tribunal for a ruling as to whether the limits imposed by the TREB on the data available through the electronic feed inhibited the ability of “virtual office websites” (VOWs) to compete with more conventional real estate brokerages. The tribunal ruled that they did, and the matter was appealed to the Federal Court of Appeal. Although the primary focus of the Court’s decision was on the competition issues, it also addressed questions of privacy and copyright law.

The Federal Court of Appeal found that the TREB’s practices of restricting available data – including information on the selling price of homes – had anticompetitive effects that limited the range of broker services that were available in the GTA, limited innovation, and had an adverse impact on entry into and expansion of relevant markets. This aspect of the decision highlights how controlling key data in a sector of the economy can amount to anti-competitive behavior. Data are often valuable commercial assets; too much exclusivity over data may, however, pose problems. Understanding the limits of control over data is therefore an important and challenging issue for businesses and regulators alike.

The TREB had argued that one of the reasons why it could not provide certain data through its digital feed was because these data were personal information and it had obligations under the Personal Information Protection and Electronic Documents Act to not disclose this information without appropriate consent. The TREB relied on a finding of the Office of the Privacy Commissioner of Canada that the selling price of a home (among those data held back by TREB) was personal information because it could lead to inferences about the individual who sold the house (e.g.: their negotiating skills, the pressure on them to sell, etc.). The Court noted that the TREB already shared the information it collected with its members. Information that was not made available through the digital feed was still available through more conventional methods. In fact, the Court noted that the information was very widely shared. It ruled that the consent provided by individuals to this sharing of information would apply to the sharing of the same information through a digital feed. It stated: “PIPEDA only requires new consent where information is used for a new purpose, not where it is distributed via new methods. The introduction of VOWs is not a new purpose – the purpose remains to provide residential real estate services [. . .].” (at para 165) The Court’s decision was influenced by the fact that the consent form was very broadly worded. Through it, TREB obtained consent to the use and dissemination of the data “during the term of the listing and thereafter.” This conclusion is interesting, as many have argued that the privacy impacts are different depending on how information is shared or disseminated. In other words, it could have a significant impact on privacy if information that is originally shared only on request, is later published on the Internet. Consent to disclosure of the information using one medium might not translate into consent to a much broader disclosure. 
However, the Court’s decision should be read in the context of both the very broad terms of the consent form and the very significant level of disclosure that was already taking place. The court’s statement that “PIPEDA only requires new consent where information is used for a new purpose, not where it is distributed via new methods” should not be taken to mean that new methods of distribution can never amount to new purposes that go beyond the original consent.

The Federal Court of Appeal also took note of the Supreme Court of Canada’s recent decision in Royal Bank of Canada v. Trang. In the course of deciding whether to find implied consent to a disclosure of personal information, the Supreme Court of Canada had ruled that while the balance owing on a mortgage was personal information, it was less sensitive than other financial information because the original amount of the mortgage, the rate of interest and the due date for the mortgage were all publicly available information from which an estimate of the amount owing could be derived. The Federal Court of Appeal found that the selling price of a home was similarly capable of being derived from other publicly available data sources and was thus not particularly sensitive personal information.

In addition to finding that there would be no breach of PIPEDA, the Federal Court of Appeal seemed to accept the Tribunal’s view that the TREB was using PIPEDA in an attempt to avoid wider sharing of its data, not because of concerns for privacy, but in order to maintain its control over the data. It found that TREB’s conduct was “consistent with the conclusion that it considered the consents were sufficiently specific to be compliant with PIPEDA in the electronic distribution of the disputed data on a VOW, and that it drew no distinction between the means of distribution.” (at para 171)

Finally, the Competition Tribunal had ruled that the TREB did not have copyright in its compilation of data because the compilation lacked sufficient originality in the selection or arrangement of the underlying data. Copyright in a compilation depends upon this originality in selection or arrangement because facts themselves are in the public domain. The Federal Court of Appeal declined to decide the copyright issue since the finding that the VOW policy was anti-competitive meant that copyright could not be relied upon as a defence. Nevertheless, it addressed the copyright question in obiter (meaning that its comments are merely opinion and not binding precedent).

The Federal Court of Appeal noted that the issue of whether there is copyright in a compilation of facts is a “highly contextual and factual determination” (at para 186). The Court of Appeal took note of the Tribunal’s findings that “TREB’s specific compilation of data from real estate listings amounts to a mechanical exercise” (at para 194), and agreed that the threshold for originality was not met. The Federal Court of Appeal dismissed the relevance of TREB’s arguments about the ways in which its database was used, noting that “how a ‘work’ is used casts little light on the question of originality.” (at para 195) The Court also found no relevance to the claims made in TREB’s contracts to copyright in its database. Claiming copyright is one thing; establishing it in law is quite another.

 

Note that leave to appeal this decision to the Supreme Court of Canada was denied on August 23, 2018.

 

Last year I attended a terrific workshop at UBC’s Allard School of Law. The workshop was titled ‘Property in the City’, and panelists presented work on a broad range of issues relating to law in the urban environment. A special issue of the UBC Law Review has just been published featuring some of the output of this workshop. The issue contains my own paper (discussed below and available here) that explores skirmishes over access to and use of Airbnb platform data.

Airbnb is a ‘sharing economy’ platform that facilitates the booking of short-term accommodation. The company is premised on the idea that many urban dwellers have excess space – rooms in homes or apartments – or have space they do not use at certain periods of the year (entire homes or apartments while on vacation, for example) – and that a digital marketplace can maximize efficient use of this space by matching those seeking temporary accommodation with those having excess space. The Airbnb web site claims that it “connects people to unique travel experiences at any price point” and at the same time “is the easiest way for people to monetize their extra space and showcase it to an audience of millions.”

This characterization of Airbnb is open to challenge. Several studies, including ones by the Canadian Centre for Policy Alternatives, the City of Vancouver, and the NY State Attorney General suggest that a significant number of units for rent on Airbnb are offered as part of commercial enterprises. The description also belies Airbnb’s disruptive impact. The re-characterization and commodification of ‘surplus’ private spaces neatly evades the regulatory frameworks designed for the marketing of short-term accommodation and leaves licensed short-term accommodation providers complaining that their highly regulated businesses are being undermined by competition from those not bearing the same regulatory burdens. At the same time, many housing advocates and city officials are concerned about the impact of platforms such as Airbnb on the availability and affordability of long-term housing.

These challenges are made more difficult to address by the fact that the data needed to understand the impact of platform companies, along with data about short-term rentals that would otherwise be captured through regulatory processes, are effectively privatized in the hands of Airbnb. Data deficits of this kind pose a challenge to governments, civil society and researchers.

My paper explores the impact of a company such as Airbnb on cities from the perspective of data. I argue that platform-based, short-term rental activities have a fundamental impact on what data are available to municipal governments who struggle to regulate in the public interest, as well as to civil society groups and researchers that attempt to understand urban housing issues. The impacts of platform companies are therefore not just disruptive of incumbent industries; they disrupt planning and regulatory processes by masking activities and creating data deficits. My paper considers some of the currently available solutions to the data deficits, which range from self-help type recourses such as data scraping to entering into data-sharing agreements with the platform companies. Each of these solutions has its limits and drawbacks. I argue that further action may be required by governments to ensure their data needs are adequately met.

Although this paper focuses on Airbnb, it is worth noting that the data deficits discussed in the paper are merely a part of a larger context in which evolving technologies shift control over some kinds of data from public to private hands. Ensuring the ability of governments and civil society to collect, retain, and share data of a sufficient quality to both enable and to enhance governance, transparency, and accountability should be a priority for municipal governments, and should also be supported by law and policy at the provincial and federal levels.

 

 

An Ontario small claims court judge has found in favour of a plaintiff who argued that her privacy rights were violated when a two-second video clip of her jogging on a public path was used by the defendant media company in a sales video for a real-estate development client. The plaintiff testified that she had been jogging so as to lose the weight that she had gained after having children. She became aware of the video when a friend drew her attention to it on YouTube, and the image “caused her discomfort and anxiety” (para 5). Judge Leclaire noted that the “image of herself in the video is clearly not the image she wished portrayed publicly”.

At the time of the filming, the defendant’s practice was to seek consent to appear in its videos from people who were filmed in private spaces, but not to do so where people were in public places. The defendant’s managing associate testified that if people in public places “see the camera and continue moving, consent is implied.” (at para 9) The judge noted that it was not established how it could be known whether individuals saw the camera. The plaintiff testified that she had seen the camera, and had attempted to shield her face from view; she believed that this demonstrated that she did not wish to be filmed.

Although the defendant indicated that the goal was to capture the landscape and not the people, the judge found that “people are present and central to the location and the picture.” (at para 10) The judge found that the photographer deliberately sought to include an image of someone engaging in the activity of jogging alongside the river. Although the defendant argued that it would not be practical to seek consent from the hundreds of people who might be captured in a video of a public space, the judge noted that in the last two years, the defendant company had “tightened up” its approach to seeking consent, and now approached people in public areas prior to filming to seek their consent to appear in any resulting video.

The plaintiff argued that there had been a breach of the tort of intrusion upon seclusion, which was first recognized in Ontario by the Ontario Court of Appeal in Jones v. Tsige in 2012. Judge Leclaire stated that the elements of the tort require 1) that the defendant’s actions are intentional or reckless; 2) that there is no lawful justification for the invasion of the plaintiff’s private affairs or concerns; and 3) that the invasion is one that a reasonable person would consider to be “highly offensive causing distress, humiliation or anguish.” (Jones at para 71) Judge Leclaire found that these elements of the tort were made out on the facts before him. The defendant’s conduct in filming the video was clearly intentional. He also found that a reasonable person “would regard the privacy invasion as highly offensive”, noting that “the plaintiff testified as to the distress, humiliation or anguish that it caused her.” (at para 16)

Judge Leclaire clearly felt that the defendant had crossed a line in exploiting the plaintiff’s image for its own commercial purposes. Nevertheless, there are several problems with his application of the tort of intrusion upon seclusion. Not only does he meld the objective “reasonable person” test with a subjective test of the plaintiff’s own feelings about what happened, his decision that capturing the image of a person jogging on a public pathway is an intrusion upon seclusion is in marked contrast to the statement of the Ontario Court of Appeal in Jones v. Tsige, that the tort is relatively narrow in scope:

 

A claim for intrusion upon seclusion will arise only for deliberate and significant invasions of personal privacy. Claims from individuals who are sensitive or unusually concerned about their privacy are excluded: it is only intrusions into matters such as one's financial or health records, sexual practises and orientation, employment, diary or private correspondence that, viewed objectively on the reasonable person standard, can be described as highly offensive. (at para 72)

 

Judge Leclaire provides relatively little discussion about how to address the capture of images of individuals carrying out activities in public spaces. Some have suggested that there is simply no privacy in public space, while others have called for a more contextual inquiry. Such an inquiry was absent in this case. Instead, Judge Leclaire relied upon Aubry v. Vice-Versa, a decision of the Supreme Court of Canada, even though that decision was squarely based on provisions of Quebec law which have no real equivalent in common law Canada. The right to one’s image is specifically protected by art. 36 of the Quebec Civil Code, which provides that it is an invasion of privacy to use a person’s “name, image, likeness or voice for a purpose other than the legitimate information of the public”. There is no comparable provision in Ontario law, although the use of one’s name, image or likeness in an advertisement might amount to the tort of misappropriation of personality. In fact, with almost no discussion, Judge Leclaire also found that this tort was made out on the facts and awarded $100 for the use of the plaintiff’s image without permission. It is worth noting that the tort of misappropriation of personality has typically required that a person have acquired some sort of marketable value in their personality in order for there to be a misappropriation of that value.

Judge Leclaire awarded $4,000 in damages for the breach of privacy, which seems an exorbitant amount given the range of damages normally awarded in privacy cases in common law Canada. In this case, the plaintiff was featured in a two-second clip in a two-minute video that was taken down within a week of being posted. While there might be some basis to argue that other damage awards have been too low, this one seems surprisingly high.

It is also worth noting that the facts of this case might constitute a breach of the Personal Information Protection and Electronic Documents Act (PIPEDA) which governs the collection, use or disclosure of personal information in the course of commercial activity. PIPEDA also provides recourse in damages, although the road to the Federal Court is a longer one, and that court has been parsimonious in its awards of damages. Nevertheless, given that Judge Leclaire’s preoccupation seems to be with the unconsented-to use of the plaintiff’s image for commercial purposes, PIPEDA seems like a better fit than the tort of intrusion upon seclusion.

Ultimately, this is a surprising decision and seems out of line with a growing body of case law on the tort of intrusion upon seclusion. As a small claims court decision, it will carry little precedential value. The case is therefore perhaps best understood as one involving a person who was jogging at the wrong place at the wrong time, but who sued in the right court at the right time. Nevertheless, it should serve as a warning to those who make commercial use of footage filmed in public spaces, as it reflects a perspective that not all activities in public spaces are ‘public’ in the fullest sense of the word. It highlights as well the increasingly chaotic privacy legal landscape in Canada.

 

Metrolinx is the Ontario government agency that runs the Prestocard service used by public transit authorities in Toronto, Ottawa and several other Ontario municipalities. It ran into some trouble recently after the Toronto Star revealed that the organization shared Prestocard data from its users with police without requiring warrants (judicial authorization). The organization has now published its proposals for revising its privacy policies and is soliciting comment on them. (Note: Metrolinx has structured its site so that you can only view one of the three proposed changes at a time and must indicate your satisfaction with it and/or your comments before you can view the next proposal. This is problematic because the changes need to be considered holistically. It is also frankly annoying).

The new proposals do not eliminate the sharing of rider information with state authorities without a warrant. Under the new proposals, information will be shared without a warrant in certain exigent circumstances. It will also be shared without a warrant “in other cases, where we are satisfied it will aid in an investigation from which a law enforcement proceeding may be undertaken or is likely to result.” The big change is thus apparently in the clarity of the notice given to users of the sharing – not the sharing itself.

This flabby and open-ended language is taken more or less directly from the province’s Freedom of Information and Protection of Privacy Act (FOIPPA), which governs the public sector’s handling of personal information. As a public agency, Metrolinx is subject to FOIPPA. It is important to note that the Act permits (but does not require) government entities to share information with law enforcement in precisely the circumstances outlined in the policy. However, by adapting its policy to what it is permitted to do, rather than to what it should do, Metrolinx is missing two important points. The first is that the initial outrage over its practices was about information sharing without a warrant, and not about poor notice of such practices. The second is that doing a good job of protecting privacy sometimes means aiming for the ceiling and not the floor.

Location information is generally highly sensitive information as it can reveal a person’s movements, activities and associations. Police would normally need a warrant to obtain this type of information. It should be noted that police are not relieved of their obligations to obtain warrants when seeking information that raises a reasonable expectation of privacy just because a statute permits the sharing of the information. It would be open to the agency to require that a warrant be obtained prior to sharing sensitive customer location data. It is also important to note that some courts have found that the terms of privacy policies may actually alter the reasonable expectation of privacy – particularly when clear notice is given. In other words, even though we might have a reasonable expectation of privacy in location data about our movements, a privacy policy that tells us clearly that this information is going to be shared with police without a warrant could substantially undermine that expectation of privacy. And all of this happens without any ability on our part to negotiate for terms of service,[1] and in the case of a monopoly service such as public transportation, to choose a different provider.

Metrolinx no doubt expects its users to be comforted by the other changes to its policies. It already has some safeguards in place to minimize the information provided to police and to log any requests and responses. It plans to require, in addition, a sign-off by the requesting officer and a supervisor. Finally, it plans to issue voluntary transparency reports as per the federal government’s Transparency Reporting Guidelines. Transparency reporting is certainly important, as it provides a window onto the frequency with which information sharing takes place. However, these measures do not correct for an upfront willingness to share sensitive personal information without judicial authorization – particularly in cases where there are no exigent circumstances.

As we move more rapidly towards sensor-laden smart cities in which the consumption of basic services and the living of our daily lives will leave longer and longer plumes of data exhaust, it is important to reflect not just on who is collecting our data and why, but on the circumstances in which they are willing to share that data with others – including law enforcement officials. The incursions on privacy are many and from all directions. Public transit is a basic municipal service. It is also one that is essential for lower-income residents, including students.[2] Transit users deserve more robust privacy protections.

Notes:

[1] A recent decision of the Ontario Court of Appeal does seem to consider that the inability to negotiate for terms of service should be taken into account when assessing the impact of those terms on the reasonable expectation of privacy. See: R. v. Orlandis-Habsburgo.

[2] Some universities and colleges have U-Pass agreements which require students to pay additional fees in exchange for Prestocard passes. Universities and colleges should, on behalf of their students, be insisting on more robust privacy protections.




In the 2010-2011 school year, a teacher at a London, Ontario high school used a pen camera to make surreptitious video recordings of female students, with a particular emphasis on their cleavage and breasts. A colleague noticed his activity and reported it to the principal, who confiscated the pen camera and called the police. The police found 19 videos on the camera’s memory card, featuring 30 different individuals, 27 of whom were female. A warrant was obtained a week later to search the teacher’s home – the police found nothing beyond a computer mysteriously missing its hard drive. The teacher was ultimately charged with voyeurism.

The offence of voyeurism requires that there be a surreptitious observation (recorded or not) of a “person who is in circumstances that give rise to a reasonable expectation of privacy”. It also requires that the “observation or recording is done for a sexual purpose” (Criminal Code, s. 162(1)(c)). The trial judge had found that the students had a reasonable expectation of privacy in the circumstances, but he inexplicably found that the Crown had not met its burden of showing, beyond a reasonable doubt, that the recordings of their cleavage and breasts were made for a sexual purpose. He stated: “While a conclusion that the accused was photographing the student’s [sic] cleavage for a sexual purpose is most likely, there may be other inferences to be drawn that detract from the only rationale [sic] conclusion required to ground a conviction for voyeurism.” (Trial Decision at para 77) He did not provide any information about what those other inferences might conceivably be.

On appeal, the Crown argued that the trial judge had erred in finding that the filming was not done for a sexual purpose. All of the appellate judges agreed that the judge had indeed erred. The majority noted that the trial judge had failed to identify any other possible inferences in his reasons. They also noted that his description of the teacher’s behavior as “morally repugnant” was “inconsistent with the trial judge’s conclusion that the videos might not have been taken for a sexual purpose.” (Court of Appeal decision at para 47) The majority noted that “[t]his was an overwhelming case of videos focused on young women’s breasts and cleavage” (at para 53), and they concluded that there was no reasonable inference other than that the videos were taken for a sexual purpose. Clearly, the teacher was not checking for skin cancer.

However, the accused had appealed the trial judge’s finding that the students had a reasonable expectation of privacy. The majority of the Court of Appeal agreed with the accused on this point, with the result that the appeal of his acquittal was dismissed. The majority’s reasoning is disturbing, and has implications for privacy more broadly. In determining what a ‘reasonable expectation of privacy’ entailed, the majority relied on a definition of privacy from the Oxford English Dictionary. That learned non-legal tome defines privacy as “a state in which one is not observed or disturbed by other people; the state of being free from public attention.” (at para 93). From this, the majority concluded that location was a key component of privacy. They stated: “A person expects privacy in places where the person can exclude others, such as one’s home or office, or a washroom. It is a place where a person feels confident that they are not being observed.” (at para 94) The majority accepted that there might be some situations in which a person has an expectation of privacy in a public setting, but these would be limited. They gave the example of upskirting as one “where a woman in a public place had a reasonable expectation of privacy that no one would look under her skirt” (at para 96). Essentially, the tent of a woman’s skirt is a private place within a public one.

The trial judge had found a reasonable expectation of privacy in the circumstances on the basis that a student would expect that a teacher would not “breach their relationship of trust by surreptitiously recording them without there [sic] consent.” (at para 103). According to the majority, this conflated the reasonable expectation of privacy with the act of surreptitious recording. They stated: “Clearly students expect that a teacher will not secretly observe or record them for a sexual purpose at school. However, that expectation arises from the nature of the required relationship between students and teachers, not from an expectation of privacy.” (at para 105) This approach ignores the fact that the nature of the relationship is part of the context in which the reasonableness of the expectation of privacy must be assessed. The majority flattened the concept of reasonable expectation of privacy to one consideration – location. They stated that “if a person is in a public place, fully clothed and not engaged in toileting or sexual activity, they will normally not be in circumstances that give rise to a reasonable expectation of privacy.” (at para 108)

Justice Huscroft, in dissent, is rightly critical of this impoverished understanding of the reasonable expectation of privacy. He began by situating privacy in its contemporary and technological context: “Technological developments challenge our ability to protect privacy: much that was once private because it was inaccessible is now easily accessible and capable of being shared widely.” (at para 116). He observed that “whether a person has a reasonable expectation of privacy is a normative or evaluative question rather than a descriptive or predictive one. It is concerned with identifying a person’s legitimate interests and determining whether they should be given priority over competing interests. To say that a person has a reasonable expectation of privacy in some set of circumstances is to conclude that his or her interest in privacy should be prioritized over other interests.” (at para 117)

Justice Huscroft was critical of the majority’s focus on location as a means of determining reasonable expectations of privacy. He found that the majority’s approach – defining spaces where privacy could reasonably be expected – was both over and under-inclusive. He noted that there are public places in which people have an expectation of privacy, even if that expectation is attenuated. He gave the example of a woman breastfeeding in public. He stated: “Privacy expectations need not be understood in an all-or-nothing fashion. In my view, there is a reasonable expectation that she will not be visually recorded surreptitiously for a sexual purpose. She has a reasonable expectation of privacy at least to this extent.” (at para 125) Justice Huscroft also noted that the majority’s approach was over-inclusive, in that while a person has a reasonable expectation of privacy in their home, it might be diminished if they stood in front of an open window. While location is relevant to the privacy analysis, it should not be determinative.

Justice Huscroft found that the question to be answered in this case was “should high school students expect that their personal and sexual integrity will be protected while they are at school?” (at para 131). He noted that schools were not fully public in the sense that school officials controlled access to the buildings. While the school in question had 24-hour video surveillance, the cameras did not focus on particular students or particular body parts. No access was permitted to the recordings for personal use. The school board had a policy in place that prohibited teachers from making the types of recordings made in this case. All of these factors contributed to the students’ reasonable expectation of privacy. He wrote:

No doubt, students will be seen by other students, school employees and officials while they are at school. But this does not mean that they have no reasonable expectation of privacy. In my view, the students' interest in privacy is entitled to priority over the interests of anyone who would seek to compromise their personal and sexual integrity while they are at school. They have a reasonable expectation of privacy at least to this extent, and that is sufficient to resolve this case. (at para 133)

Justice Huscroft observed that the majority’s approach that requires the reasonable expectation of privacy to be considered outside of the particular context in which persons find themselves would unduly limit the scope of the voyeurism offence.

This case provides an ugly and unfortunate window on what women can expect from the law when it comes to voyeurism and other related offences. In the course of his reasons, the trial judge stated that “[i]t may be that a female student’s mode of attire may attract a debate about appropriate reactions of those who observe such a person leading up to whether there is unwarranted and disrespectful ogling” (Trial decision, at para 46). The issue is not just about public space; it is about the publicness of women’s bodies. The accused was acquitted at trial because of the trial judge’s baffling conclusion that the teacher might have had some motive – other than a sexual one – in making the recordings of female students’ breasts and cleavage. Although the Court of Appeal corrected this error, the majority found that female students at high school do not have a reasonable expectation of privacy when it comes to having their breasts surreptitiously filmed by their teachers (who are not allowed, under school board policies, to engage in such activities). The majority fixates on location as the heart of the reasonable expectation of privacy, eschewing a more nuanced approach that would consider those things that actually inform our expectations of privacy.

 

The Ontario Court of Appeal has just handed down its decision in Keatley Surveying Ltd. v. Teranet Inc. The case involved a copyright dispute between land surveyors and the private company retained by the Province of Ontario to run its land titles registry. There are relatively few court decisions that discuss Crown copyright in Canada, and so this case has been an interesting one to watch.

It has long been accepted that land survey plans are works in which copyright subsists and that the author of a plan of survey is the surveyor. Under the Copyright Act, this creates a default presumption that the surveyor is the owner of copyright in the work. The dispute in this case is about what happens when that plan is deposited in the provincial land titles registry. While such deposits have been taking place for decades, the issue only became controversial after Ontario moved from its old paper-based registry to an electronic system run by a private company on behalf of the province. Under the electronic system, Teranet, the private company, charges fees for access and for the downloading of documents, including plans of survey. The plaintiff, representing the class of surveyors, objected to what it saw as Teranet profiting from the commercial reproduction and dissemination of their copyright-protected works.

For the surveyors to succeed with their action, they had to establish that they owned the copyright in their works. Section 12 of the Copyright Act reads:

12. Without prejudice to any rights or privileges of the Crown, where any work is, or has been, prepared or published by or under the direction or control of Her Majesty or any government department, the copyright in the work shall, subject to any agreement with the author, belong to Her Majesty and in that case shall continue for the remainder of the calendar year of the first publication of the work and for a period of fifty years following the end of that calendar year.

The trial judge found that since they did not create the works under the direction or control of Her Majesty, the Crown could not be said to be the owner of copyright in the plans. However, he was unwilling to find that copyright remained with the surveyors, since to do so might jeopardize the land titles system. Instead, he found that copyright in the plans of survey is “transferred to the province” when plans are deposited. This conclusion is somewhat problematic. As I pointed out in my post on this earlier decision, the Copyright Act requires a signed assignment in writing in order for a transfer of ownership to take place. If the provincial legislation effected a transfer of ownership other than according to the terms of the federal Copyright Act, then this would seem to be a potentially unconstitutional interference with federal jurisdiction over copyrights.

Although constitutional issues were raised before the Court of Appeal, the Court of Appeal arrived at its decision in a way that managed to evade them. The Court agreed that surveyors were the authors of their plans and were thus the original copyright owners. It also agreed that the Crown in right of the Province of Ontario ended up as the copyright owner once the plans became part of the registry. However, Justice Doherty, writing for the unanimous court, disagreed with the approach taken by the trial judge, and rejected the idea that there was a transfer of ownership when plans were deposited in the land titles registry. Instead he adopted a rather interesting interpretation of Crown copyright.

Section 12 of the Copyright Act provides that the Crown is the owner of copyright in any work that “is, or has been, prepared or published by or under the direction or control of Her Majesty […]”. Justice Doherty agreed that the plans were not prepared under the direction or control of Her Majesty, but focused instead on the “or published” part of s. 12. In his view, “[m]ere publication” by the Crown does not give rise to Crown copyright – the publication has to be “by or under the direction or control of Her Majesty”. Justice Doherty reviewed the legislation and regulations that related to the land titles system. He noted that the legislation provides for deposit of plans of survey with the province’s Land Registry Office. The statutory scheme also sets strict parameters for the form and content of any plans of survey that are to be deposited. The plans are subject to review, and the Examiner of Surveys can raise questions about the plans with the surveyors, and can require changes to be made before the plans are finally accepted. Justice Doherty noted that this review process did not constitute the “direction or control” necessary to give rise to Crown copyright on the basis that the works were prepared under the direction or control of Her Majesty. However, he found it relevant to the question of whether the “subsequent publication of the registered or deposited plans occurs under the ‘direction or control’ of the Crown.” (at para 37)

Justice Doherty also noted that once a survey plan is deposited in the register, the surveyor is no longer able to make any changes to it without permission from the Examiner of Surveys. He observed that s. 145(6) of the Land Titles Act also permits the Examiner to make changes at the behest of a third party. Both the Land Titles Act and the Registry Act provide that “certified copies of registered or deposited plans of survey must be made available to members of the public upon payment of the prescribed fee.” (at para 39) Justice Doherty found that the statutory obligation to provide copies of a work “is fundamentally inconsistent with the claim by the document’s author to a right to control the making of copies of the document.” (at para 40) He observed as well that O. Reg. 43/96 under the Registry Act provides that no plan deposited in the registry can include “any notes, words or symbols that indicate that the right to make or distribute copies is in any way restricted.” (s. 9(1)(e))

Justice Doherty found that this combination of provisions created a context in which the Crown has “complete control over registered or deposited plans of survey and complete control over the “publication” of those plans of survey within the meaning of the Copyright Act.” (at para 44) As a result, the plans are works that are published under the direction or control of the Crown, giving rise to Crown copyright in the documents. He stated:

Considered as a whole, the provisions demonstrate that plans of survey registered or deposited in the ELRS are held and published entirely under the Crown’s direction and control. Ownership of copyright does not, however, flow from the provincial land registration scheme. It is s. 12 of the Copyright Act that vests the copyright in the Crown by virtue of the publication of those plans under the “direction or control” of the Crown. (at para 45)

The solution arrived at by the Court of Appeal is certainly more elegant than that proposed by the trial judge. Nevertheless, it does raise important questions. The first is what actually happens to the original copyright of the surveyors. The Court accepts that they are the first owners of copyright, and that the legislative system does not effect a transfer of rights. Yet at the end of the day, the Court finds that the Crown has copyright in the works. Presumably this extinguishes the copyright of the surveyors, but on what basis? If it is not a transfer, is it an expropriation? What level of statutory/regulatory control is required to trigger such a shift in ownership?

It might not have been necessary for the court to go so far as to find that the Crown assumed copyright over these works. At one point Justice Doherty states that: “The copyright rests in either the Province or the land surveyor who prepared the plan of survey. If the land surveyor has copyright, the making and distribution of paper or digital copies of the plan of survey is a breach of copyright whether done by an employee of the Province or by a third party hired by the Province to perform that function.” (at para 19). What this statement overlooks is the possibility of a licence – one that might well be implied once a surveyor deposits a plan with the land titles registry. Essentially, the same provisions of the statutory regimes governing the registration of plans of survey could be used to support the view that a surveyor who deposits a plan with the registry provides a broad, perpetual licence to the government to reproduce and disseminate the plans as part of the land titles system.

Crown copyright has been a thorn in the side of many who see it as unnecessary at its most benign and a threat to open government at its worst. This decision may breathe complicated new life into this controversial fixture of the Canadian copyright regime.

