Teresa Scassa - Blog


The Federal Court has issued its decision in a reference case brought by the Privacy Commissioner of Canada regarding the interpretation of his jurisdiction under the Personal Information Protection and Electronic Documents Act (PIPEDA). The reference relates to a complaint against Google about its search engine, and implicates the so-called ‘right to be forgotten’. Essentially, the complainant in that case seeks an order requiring Google to de-index certain web pages that show up in searches for his name and that contain outdated and inaccurate sensitive information. Google’s response to the complaint was to challenge the jurisdiction of the Commissioner to investigate. It argued that its search engine functions were not a ‘commercial activity’ within the meaning of PIPEDA and that PIPEDA therefore did not apply. It also argued that its search engine performed a journalistic or literary function, which is excluded from the application of PIPEDA under s. 4(2)(c). The Canadian Broadcasting Corporation (CBC) and the Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic (CIPPIC) both intervened.

Associate Chief Justice Gagné ruled that the Commissioner has jurisdiction to deal with the complaint. In this sense, this ruling simply enables the Commissioner to continue with his investigation of the complaint and to issue his Report of Findings – something that could no doubt generate fresh fodder for the courts, since a finding that Google should de-index certain search results would raise interesting freedom of expression issues. Justice Gagné’s decision, however, focuses on whether the Commissioner has jurisdiction to proceed. Her ruling addresses 1) the commercial character of Google’s search engine activity; 2) whether Google’s activities are journalistic in nature; and 3) the relevance of the quasi-constitutional status of PIPEDA. I will consider each of these in turn.

1) The Commercial Character of Google’s Search Engine

Largely for division of powers reasons, PIPEDA applies only to the collection, use or disclosure of personal information in the course of “commercial activity”. Thus, if an organization can demonstrate that it was not engaged in commercial activity, it can escape the application of the law.

Justice Gagné found that Google collected, used and disclosed information in offering its search engine functions. The issue, therefore, was whether it engaged in these practices “in the course of commercial activity”. Justice Gagné noted that Google is one of the most profitable companies in existence, and that most of its profits came from advertising revenues. Although Google receives revenues when a user clicks on an ad that appears in search results, Google argued that not all search results generate ads – this depends on whether other companies have paid to have the particular search terms trigger their ads. In the case of a search for an ordinary user’s name, it is highly unlikely that the search will trigger ads in the results. However, Justice Gagné noted that advertisers can also target ads to individual users of Google’s search engine based on data that Google has collected about that individual from their online activities. According to Justice Gagné, “even if Google provides free services to the content providers and the user of the search engine, it has a flagrant commercial interest in connecting these two players.” (at para 57) She found that search engine users trade their personal data in exchange for the search results that are displayed when they conduct a search. Their data is, in turn, used in Google’s profit-generating activities. She refused to ‘dissect’ Google’s activities into those that are free to users and those that are commercial, stating that the “activities are intertwined, they depend on one another, and they are all necessary components of that business model.” (at para 59) She also noted that “unless it is forced to do so, Google has no commercial interest in de-indexing or de-listing information from its search engine.” (at para 59)

2) Is Google’s Search Engine Function Journalistic in Nature?

PIPEDA does not apply to activities that are carried out exclusively for journalistic purposes. This is no doubt to ensure that PIPEDA does not unduly interfere with the freedom of the press. Google argued that its search engine allowed users to find relevant information, and that in providing these services it was acting for journalistic purposes.

Justice Gagné observed that, depending upon the person, a search by name can reveal a broad range of information from multiple and diverse sources. In this way, Google facilitates access to information, but, in her view, it does not perform a journalistic function. She noted: “Google has no control over the content of search results, the search results themselves express no opinion, and Google does not create the content of the search results.” (at para 82) She adopted the test set out in the earlier decision in A.T. v. Globe24h.com, whereby an activity qualifies as journalism if “its purpose is to (1) inform the community on issues the community values, (2) it involves an element of original production, and (3) it involves a ‘self-conscious discipline calculated to provide an accurate and fair description of facts, opinion and debate at play within a situation’.” (at para 83) Applying the test to Google’s activities, she noted that Google did more than just inform a community about matters of interest, and that it did not create or produce content. She observed as well that “there is no effort on the part of Google to determine the fairness or the accuracy of the search results.” (at para 85) She concluded that the search engine functions were not journalistic activity – or that, if they were, they were not exclusively so. As a result, the exemption for journalistic purposes did not shield Google from the application of PIPEDA.

3) The Relevance of the Quasi-Constitutional Status of PIPEDA

The Supreme Court of Canada has ruled that both public and private sector data protection laws in Canada have quasi-constitutional status. What this means in practical terms is less clear. Certainly it means that they are recognized as laws that protect rights and/or values that are of fundamental importance to a society. For example, in Lavigne, the Supreme Court of Canada stated that the federal Privacy Act served as “a reminder of the extent to which the protection of privacy is necessary to the preservation of a free and democratic society” (at para 25). In United Food and Commercial Workers, the Supreme Court of Canada found that Alberta’s private sector data protection law also had quasi-constitutional status and stated: “The ability of individuals to control their personal information is intimately connected to their individual autonomy, dignity and privacy. These are fundamental values that lie at the heart of a democracy.” (at para 19)

What this means in practical terms is increasingly important as questions are raised about the approach to take in upcoming reforms of private sector data protection laws. For example, the Privacy Commissioner of Canada has criticized Bill C-11 (a bill to reform PIPEDA) for not adopting a human rights-based approach to privacy – one that is explicitly grounded in human rights values. By contrast, Ontario, in its White Paper proposing a possible private sector data protection law for the province, indicates that it would adopt a human rights-based approach. One issue at the federal level might be the extent to which the quasi-constitutional nature of a federal data protection law does the work of a human rights-based approach when it comes to shaping interpretation of the statute. The decision in this reference case suggests that the answer is ‘no’. In fact, the Attorney General of Canada specifically intervened on this point, arguing that “[t]he quasi-constitutional nature of PIPEDA does not transform or alter the proper approach to statutory interpretation”. (at para 30) Justice Gagné agreed. The proper approach is set out in this quote from Driedger, cited in Lavigne (at para 25): “the words of an Act are to be read in their entire context and in their grammatical and ordinary sense harmoniously with the scheme of the Act, the object of the Act, and the intention of Parliament.”

In this case, the relevant words of the Act – “commercial activity” and “journalistic purposes” – were interpreted by the Court in accordance with ordinary interpretive principles. I do not suggest that these interpretations are wrong or problematic. I do find it interesting, though, that this decision makes it clear that an implicit human rights-based approach is far inferior to making such an approach explicit through actual wording in the legislation. This is a point that may be relevant as we move forward with the PIPEDA reform process.

Next Steps

Google may, of course, appeal this decision to the Federal Court of Appeal. If it does not, the next step will be for the Commissioner to investigate the complaint and to issue his Report of Findings. The Commissioner has no order-making powers under PIPEDA. If an order is required to compel Google to de-index any sites, this will proceed via a hearing de novo in Federal Court. We are still, therefore, a long way from a right to be forgotten in Canada.

Published in Privacy


This post is the third in a series that considers the extent to which the Digital Charter Implementation Act (Bill C-11), by overhauling Canada’s federal private sector data protection law, implements the principles contained in the government’s Digital Charter. This post addresses the fourth principle of the Charter: Transparency, Portability and Interoperability, which provides that “Canadians will have clear and manageable access to their personal data and should be free to share or transfer it without undue burden.”

Europe’s General Data Protection Regulation (GDPR) introduced the concept of data portability (data mobility) as part of an overall data protection framework. The essence of the data portability right in article 20 of the GDPR is:

(1) The data subject shall have the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided [...]

In this version, the data flows from one controller to another via the data subject. There is no requirement for data to be in a standard, interoperable format – it need only be in a common, machine-readable format.

Data portability is not a traditional part of data protection; it largely serves consumer protection and competition law interests. Nevertheless, it is linked to data protection through the concept of individual control over personal information. For example, consider an individual who subscribes to a streaming service for audiovisual entertainment. The service provider acquires considerable data about that individual and their viewing preferences over a period of time. If a new company enters the market, it might offer a better price, but the consumer may be put off by the lack of accurate or helpful recommendations or special offers/promotions tailored to their tastes. The difference in the service offered lies in the fact that the incumbent has much more data about the consumer. A data mobility right, in theory, allows an individual to port their data to the new entrant. The resulting more level playing field serves both the individual’s interest and the broader public interest by stimulating competition.

The fourth pillar of the Digital Charter clearly recognizes the idea of control that underlies data mobility, suggesting that individuals should be free to share or transfer their data “without undue burden.” Bill C-11 contains a data mobility provision that is meant to implement this pillar of the Charter. However, this provision is considerably different from what is found in the GDPR.

One of the challenges with the GDPR’s data portability right is that not all data will be seamlessly interoperable from one service provider to another. This could greatly limit the usefulness of the data portability right. It could also impose a significant burden on SMEs, which might face demands for the production and transfer of data that they are not sufficiently resourced to meet. It might also place individuals’ privacy at greater risk, potentially spreading their data to multiple companies, some of which might be ill-equipped to provide the appropriate privacy protection.

These concerns may explain why Bill C-11 takes a relatively cautious approach to data mobility. Section 72 of the Consumer Privacy Protection Act portion of Bill C-11 provides:

72 Subject to the regulations, on the request of an individual, an organization must as soon as feasible disclose the personal information that it has collected from the individual to an organization designated by the individual, if both organizations are subject to a data mobility framework provided under the regulations. [My emphasis]

It is important to note that in this version of mobility, data flows from one organization to another rather than through the individual, as is the case under the GDPR. The highlighted portion of s. 72 makes it clear that data mobility will not be a universal right. It will be available only where a data mobility framework is in place. Such frameworks will be provided for in regulations. Section 120 of Bill C-11 states:

120 The Governor in Council may make regulations respecting the disclosure of personal information under section 72, including regulations

(a) respecting data mobility frameworks that provide for

(i) safeguards that must be put in place by organizations to enable the secure disclosure of personal information under section 72 and the collection of that information, and

(ii) parameters for the technical means for ensuring interoperability in respect of the disclosure and collection of that information;

(b) specifying organizations that are subject to a data mobility framework; and

(c) providing for exceptions to the requirement to disclose personal information under that section, including exceptions related to the protection of proprietary or confidential commercial information.

The contemplated regulations will provide for frameworks that impose security safeguards on participating organizations and ensure data interoperability. Paragraph 120(b) also suggests that not all organizations within a sector will automatically be entitled to participate in a mobility framework; they may have to qualify by demonstrating that they meet certain security and technical requirements. A final (and interesting) limitation on the mobility framework relates to exceptions to disclosure where information that might otherwise be considered personal information is also proprietary or confidential commercial information. This gets at the distinction between raw and derived data – data collected directly from individuals might be subject to the mobility framework, but profiles or analytics based on that data might not be – even if they pertain to the individual.

It is reasonable to expect that open banking (now renamed ‘consumer-directed finance’) will be the first experiment with data mobility. The federal Department of Finance released a report on open banking in January 2020, and has since been engaged in a second round of consultations. Consumer-directed finance is intended to address the burgeoning fintech industry, which offers many new and attractive financial management digital services to consumers but which relies on access to consumer financial data. Currently (and alarmingly), this need for data is met by fintechs asking individuals to share account passwords so that they can regularly scrape financial data from multiple sources (accounts, credit cards, etc.) in order to offer their services. A regulated framework for data mobility is seen as much more secure, since safeguards can be built into the system, and participants can be vetted to ensure they meet security and privacy standards. Data interoperability between all participants will also enhance the quality of the services provided.

If financial services is the first area for the development of data mobility in Canada, what other areas for data mobility might Canadians expect? The answer is: not many. The kind of scheme contemplated for open banking has already required a considerable investment of time and energy, and it is not yet ready to launch. Of course, financial data is among the most sensitive of personal data; other schemes might be simpler to design and create. But they will still take a great deal of time. One sector where some form of data mobility might eventually be contemplated is telecommunications. (Note that Australia’s comparable “consumer data right” is being rolled out first with open banking and will be followed by initiatives in the telecommunications and energy sectors.)

Data mobility in the CPPA will also be limited by the stringency of its framework approach. It is no accident that banking and telecommunications fall within federal jurisdiction. The regulations contemplated by s. 120 go beyond simple data protection and impact how companies do business. The federal government will face serious challenges if it attempts to create data mobility frameworks within sectors or industries under provincial jurisdiction. Leadership on this front will have to come from the provinces. Those with their own private sector data protection laws could choose to address data mobility on their own terms. Quebec has already done this in Bill 64, which would amend its private sector data protection law to provide:

112 [. . .] Unless doing so raises serious practical difficulties, computerized personal information collected from the applicant must, at his request, be communicated to him in a structured, commonly used technological format. The information must also be communicated, at the applicant’s request, to any person or body authorized by law to collect such information.

It remains to be seen what Alberta and British Columbia might decide to do – along with Ontario, if in fact it decides to proceed with its own private sector data protection law. As a result, while there might be a couple of important experiments with data mobility under the CPPA, the data mobility right within that framework is likely to remain relatively constrained.

Published in Privacy


This post is the second in a series that considers the extent to which the Digital Charter Implementation Act, by overhauling Canada’s federal private sector data protection law, implements the principles contained in the government’s Digital Charter. It addresses the tenth principle of the Charter: Strong Enforcement and Real Accountability. This principle provides that “There will be clear, meaningful penalties for violations of the laws and regulations that support these principles.”

Canada’s current data protection law, the Personal Information Protection and Electronic Documents Act (PIPEDA), has been criticized for the relatively anemic protection it provides for personal information. Although complaints may be filed with the Commissioner, the process ends with a non-binding “report of findings”. After receiving a report, a complainant who seeks either a binding order or compensation must make a further application to the Federal Court. Recourse to the Federal Court is challenging for unrepresented plaintiffs, yet awards of damages have been low enough to make it distinctly not worth anyone’s while to hire a lawyer to assist them with such a claim. As a result, the vast majority of cases going to the Federal Court have been brought by unrepresented plaintiffs, damage awards have remained low, and nobody has been particularly impressed. It is now far more likely that privacy issues – at least where data breaches are concerned – will be addressed through class action lawsuits, which have proliferated across the country.

Of course, the protection of personal information is not all about seeking compensation or court orders. In fact, through the complaints process over the years, the Commissioner has worked to improve data protection practices through a variety of soft compliance measures, including investigating complaints and making recommendations for changes. The Commissioner also uses audit powers and, more recently, compliance agreements, to ensure that organizations meet their data protection obligations. Nevertheless, high profile data breaches have left Canadians feeling vulnerable and unprotected. There is also a concern that some data-hungry companies are making fortunes from personal data and that weak legislative sanctions provide no real incentive to limit their rampant collection, use and disclosure of personal data. Public unease has been augmented by overt resistance to the Commissioner’s rulings in some instances. For example, Facebook was defiant in response to the Commissioner’s findings in relation to the Cambridge Analytica scandal. Even more recently, in an inquiry into the use of facial recognition technologies in shopping malls, the respondent politely declined to accept the Commissioner’s findings that certain of their practices were in breach of PIPEDA.

The Digital Charter Implementation Act is meant to address PIPEDA’s enforcement shortfall. It provides for the enactment of two statutes related to personal data protection: the Consumer Privacy Protection Act (CPPA) and the Personal Information and Data Protection Tribunal Act (PIDPTA). A government Fact Sheet describes this combination as providing a “Comprehensive and accessible enforcement model”. The CPPA, the revamped version of PIPEDA, would give the Commissioner the power to order an organization to comply with its obligations under the CPPA or to stop collecting or using personal information. This is an improvement, although the order-making powers are subject to a right of appeal to the new Tribunal created by the PIDPTA. At least the Tribunal will owe some deference to the Commissioner on questions of fact or of mixed law and fact – proceedings before the Federal Court under PIPEDA were entirely de novo.

Under the CPPA, the Commissioner will also be able to recommend that the Tribunal impose a fine. Fines are available only for certain breaches of the legislation. These are ones that involve excessive collection of personal information; use or disclosure of personal information for new purposes without consent or exception; making consent to personal data collection a condition of the provision of a product or service (beyond what is necessary to provide that product or service); obtaining consent by deception; improper retention or disposal of personal information; failure to dispose of personal information at an individual’s request; breach of security safeguards; or failure to provide breach notification. The fines can be substantial, with a maximum penalty of the higher of $10,000,000 or 3% of the organization’s gross global revenue for the preceding financial year. Of course, that is the upper end. Fines are discretionary, subject to a number of considerations, and explicitly not meant to be punitive.

Within this structure, the Tribunal will play a significant role. It was no doubt created to provide greater distance between the Commissioner and the imposition of fines on organizations. In this respect, it is a good thing. The Commissioner still plays an important role in encouraging organizations to comply voluntarily with the legislation. This role is fairer and easier to perform when there is greater separation between the ombuds functions of the Commissioner and the ability to impose penalties. More problematically, the Tribunal will hear appeals of both findings and orders made by the Commissioner. The appeal layer is new and will add delays to the resolution of complaints. An alternative would be to have left orders subject to judicial review, with no appeals. In theory, going to the Tribunal will be faster and perhaps less costly than a trip to Federal Court. But in practice, the Tribunal’s value will depend on its composition and workload. Under the PIDPTA, the Tribunal will have only six members, not necessarily full-time, and only one of these is required to have experience with privacy. Decisions of the Tribunal cannot be appealed, but they will be subject to judicial review by the Federal Court.

The CPPA also creates a new private right of action. Section 106 provides that an individual affected by a breach of the Act can sue for damages for “loss or injury that the individual has suffered”. However, in order to do so, the individual must first make a complaint. That complaint must be considered by the Commissioner. The Commissioner’s findings and order must either not be appealed or any appeal must have been dealt with by the Tribunal. Note that not all complaints will be considered by the Commissioner. The Commissioner can decline to deal with complaints for a number of reasons (see s. 83) or can discontinue an investigation (see s. 85). There is also a right of action for loss or injury where an organization has been convicted of an offence under the legislation. An offence requires an investigation, a recommendation, and consideration by the Tribunal. All of these steps will take time. It will be a truly dogged individual who pursues the private right of action under the CPPA.

Ultimately, then, the question is whether this new raft of enforcement-related provisions is an improvement. To get a better sense of how these provisions might work in practice, consider the example of the massive data breach at Desjardins that recently led to a Commissioner’s report of findings. The data breach was a result of employees not following internal company policies, flawed training and oversight, as well as certain employees going ‘rogue’ and using personal data for their own benefit. In the Report of Findings, the Commissioner makes a number of recommendations, most of which have already been implemented by the organization. As a result, the Commissioner has ruled the complaint well-founded and conditionally resolved. Class action lawsuits related to the breach have already been filed.

How might this outcome be different if the new legislation were in place? A complaint would still be filed and investigated. The Commissioner would issue his findings as to whether any provisions of the CPPA were contravened. He would have order-making powers and could decide to recommend that a penalty be imposed. However, if his recommendations are all accepted by an organization, there is no need for an order. The Commissioner might, given the nature and size of the breach, decide to recommend that a fine be imposed. However, considering the factors in the legislation and the organization’s cooperation, he might decide it was not appropriate.

Assuming a recommendation were made to impose a penalty, the Tribunal would have to determine whether to do so. It must consider a number of factors, including the organization’s ability to pay the fine, any financial benefit derived by the organization from the activity, whether individuals have voluntarily been compensated by the organization, and the organization’s history of complying with the legislation. The legislation also specifically provides that “the purpose of a penalty is to promote compliance with this Act and not to punish.” (s. 94(6)) In a case where the organization was not exploiting the data for its own profit, took steps quickly to remedy the issues by complying with the Commissioner’s recommendations, and provided credit monitoring services for affected individuals, it is not obvious that a fine would be imposed. As for the private right of action in the legislation, it is not likely to alter the fact that massive data breaches of this kind will be addressed through class action lawsuits.

The reworking of the enforcement provisions may therefore not be hugely impactful in the majority of cases. This is not necessarily a bad thing, if the lack of impact is due to the fact that the goals of the legislation are otherwise being met. Where it may make a difference is in cases where organizations resist the Commissioner’s findings or where they act in flagrant disregard of data protection rights. It is certainly worth having more tools for enforcement in these cases. Here, the big question mark is the Tribunal – and more particularly, its composition.

But there may also be consequences felt by individuals as a result of the changes. The Commissioner’s findings – not just any orders he might make – are now subject to appeal to the Tribunal. This will likely undermine his authority and might undercut his ability to achieve soft compliance with the law. It is also likely to delay resolution of complaints, thus also delaying access to the private right of action contemplated under the legislation. It shifts power regarding what constitutes a breach of the legislation from the Commissioner to the new Tribunal. This may ultimately be the most concerning aspect of the legislation. So much will depend on who is appointed to the Tribunal, and the Bill does not require demonstrable privacy expertise as a general pre-requisite for membership. At the very least, this should be changed.

Published in Privacy


It’s been a busy privacy week in Canada. On November 16, 2020 Canada’s Department of Justice released its discussion paper as part of a public consultation on reform of the Privacy Act. On November 17, the Minister of Industry released the long-awaited bill to reform Canada’s private sector data protection legislation. I will be writing about both developments over the next while. But in this initial post, I would like to focus on one overarching and obvious omission in both the Bill and the discussion paper: the failure to address privacy as a human right.

Privacy is a human right. It is declared as such in international instruments to which Canada is a signatory, such as the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights. Data protection is only one aspect of the human right to privacy, but it is an increasingly important one. The modernized Convention 108 (Convention 108+), a data protection treaty originating with the Council of Europe but open to any country, puts human rights front and centre. Europe’s General Data Protection Regulation also directly acknowledges the human right to privacy, and links privacy to other human rights. Canada’s Privacy Commissioner has called for Parliament to adopt a human rights-based approach to data protection, in both the public and private sectors.

In spite of all this, the discussion paper on reform of the Privacy Act is notably silent with respect to the human right to privacy. In fact, it reads a bit like the script for a relationship in which one party dances around commitment, but just can’t get out the words “I love you”. (Or, in this case “Privacy is a human right”). The title of the document is a masterpiece of emotional distancing. It begins with the words: “Respect, Accountability, Adaptability”. Ouch. The “Respect” is the first of three pillars for reform of the Act, and represents “Respect for individuals based on well established rights and obligations for the protection of personal information that are fit for the digital age.” Let’s measure that against the purpose statement from Convention 108+: “The purpose of this Convention is to protect every individual, whatever his or her nationality or residence, with regard to the processing of their personal data, thereby contributing to respect for his or her human rights and fundamental freedoms, and in particular the right to privacy.” Or, from article 1 of the GDPR: “This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data.” The difference is both substantial and significant.

The discussion paper almost blurts it out… but again stops short in its opening paragraph, which refers to the Privacy Act as “Canada’s quasi-constitutional legal framework for the collection, use, disclosure, retention and protection of personal information held by federal public bodies.” This is the romantic equivalent of “I really, really, like spending time with you at various events, outings and even contexts of a more private nature.”

The PIPEDA reform bill that dropped in our laps on November 17 does mention the “right to privacy”, but the reference is in the barest terms. Note that Convention 108+ and the GDPR identify the human right to privacy as being intimately linked to other human rights and freedoms (which it is). Section 5 of Bill C-11 (the Consumer Privacy Protection Act) talks about the need to establish “rules to govern the protection of personal information in a manner that recognizes the right to privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.” It is pretty much what was already in PIPEDA, and it falls far short of the statements quoted from Convention 108+ and the GDPR. In the PIPEDA context, the argument has been that “human rights” are not within exclusive federal jurisdiction, so talking about human rights in PIPEDA just makes the issue of its constitutionality more fraught. Whether this argument holds water or not (it doesn’t), the same excuse does not exist for the federal Privacy Act.

The Cambridge Analytica scandal (in which personal data was used to subvert democracy), concerns over uses of data that will perpetuate discrimination and oppression, and complex concerns over how data is collected and used in contexts such as smart cities all demonstrate that data protection is about more than a narrow conception of privacy. Privacy is a human right that is closely linked to the enjoyment of other human rights and freedoms. Recognizing privacy as a human right does not mean that data protection will not require some balancing. However, it does mean that in a data-driven economy and society we keep fundamental human values strongly in focus. We’re not going to get data protection right if we cannot admit these connections and clearly state that data protection is about the protection of fundamental human rights and freedoms.

There. Is that so hard?

Published in Privacy

 

The BC Court of Appeal has handed down a decision that shakes up certain assumptions about recourse for privacy-related harms in that province – and perhaps in other provinces as well.

The decision relates to a class action lawsuit filed after a data breach. The defendant had stored an unencrypted copy of a database containing customer personal information on its website. The personal information included: “names, addresses, email addresses, telephone numbers, dates of birth, social insurance numbers, occupations, and, in the case of credit card applicants, their mothers' birth names.” (at para 4) This information was accessed by hackers. By the time of this decision, some of the information had been used in phishing scams but the full extent of its use is still unknown.

As is typical in privacy class action lawsuits, the plaintiffs sought certification on multiple grounds. These included: “breach of contract, negligence, breach of privacy, intrusion upon seclusion, breach of confidence, unjust enrichment and waiver of tort.” (at para 6) The motions judge certified only claims in contract, negligence, and the federal common law of privacy.

The defendants appealed, arguing that the remaining grounds were not viable and that the action should not have been certified. They also argued that a class action lawsuit was not the preferable procedure for the resolution of the common issues. While the plaintiffs cross-appealed the dismissal of the claim for breach of confidence, they did not appeal the decision that there was no recourse for breach of privacy or the tort of intrusion upon seclusion under BC law.

This post focuses on what I consider to be the three most interesting issues in the case. These are: whether there is recourse for data breaches other than via data protection legislation; whether the tort of breach of privacy exists in B.C.; and whether there is a federal common law of privacy.

1. Is PIPEDA a complete code?

The defendants argued that the class action lawsuit was not the preferred procedure because the federal Personal Information Protection and Electronic Documents Act (PIPEDA) constituted a “complete code in respect of the collection, retention, and disclosure of personal information by federally-regulated businesses, and that no action, apart from the application to the Federal Court contemplated by the Act can be brought in respect of a data breach.” (at para 18) Justice Groberman, writing for the unanimous Court, noted that while it was possible for a statute to constitute a complete code intended to fully regulate a particular domain, it is not inevitable. He observed that the Ontario Court of Appeal decision in Hopkins v. Kay had earlier determined that Ontario’s Personal Health Information Protection Act (PHIPA) did not constitute a complete code when it came to regulating personal health information, allowing a lawsuit to proceed against a hospital for a data breach. In Hopkins, the Ontario Court of Appeal noted that PHIPA was primarily oriented towards addressing systemic issues in the handling of personal health information, rather than dealing with individual disputes. Although there was a complaints mechanism in the statute, the Commissioner had the discretion to decline to investigate a complaint if a more appropriate procedure were available. Justice Groberman noted that PIPEDA contained a similar provision in s. 12. 
He observed that “[t]his language, far from suggesting that the PIPEDA is a complete code, acknowledges that other remedies continue to be available, and gives the Commissioner the discretion to abstain from conducting an investigation where an adequate alternative remedy is available to the complainant.” (at para 28) In his view, PIPEDA is similarly oriented towards addressing systemic problems and preventing future breaches, and that “[w]hile there is a mechanism to resolve individual complaints, it is an adjunct to the legislative scheme, not its focus.” (at para 29) He also found it significant that PIPEDA addressed private rather than public sector data protection. He stated: “[w]ithin a private law scheme, it seems to me that we should exercise even greater caution before concluding that a statute is intended to abolish existing private law rights.” (at para 30) He concluded that nothing in PIPEDA precluded other forms of recourse for privacy harms.

2. Do common law privacy torts exist in BC?

In 2012 the Ontario Court of Appeal recognized the privacy tort of intrusion upon seclusion in Jones v. Tsige. However, since British Columbia has a statutory privacy tort in its Privacy Act, the motions judge (like other BC judges before him) concluded that the statutory tort displaced any possible common law tort in BC. Justice Groberman was clearly disappointed that the plaintiffs had chosen not to appeal this conclusion. He stated: “In my view, the time may well have come for this Court to revisit its jurisprudence on the tort of breach of privacy.” (at para 55) He proceeded to review the case law usually cited as supporting the view that there is no common law tort of breach of privacy in BC. He distinguished the 2003 decision in Hung v. Gardiner on the basis that in that case the judge at first instance had simply stated that he was not convinced by the authorities provided that such a tort existed in BC. On appeal, the BCCA agreed with the judge’s conclusion on an issue of absolute privilege, and found it unnecessary to consider any of the other grounds of appeal.

The BCCA decision in Mohl v. University of British Columbia is more difficult to distinguish because in that case the BCCA stated “[t]here is no common-law claim for breach of privacy. The claim must rest on the provisions of the [Privacy] Act.” (Mohl at para 13) Nevertheless, Justice Groberman indicated that while this statement was broad, “it is not entirely clear that it was intended to be a bold statement of general principle as opposed to a conclusion with respect to the specific circumstances of Mr. Mohl's case. In any event, the observation was not critical to this Court's reasoning.” (at para 62)

Justice Groberman concluded that “The thread of cases in this Court that hold that there is no tort of breach of privacy, in short, is a very thin one.” (at para 64) He also noted that the privacy context had considerably changed, particularly with the Ontario Court of Appeal’s decision in Jones v. Tsige. He stated:

It may be that in a bygone era, a legal claim to privacy could be seen as an unnecessary concession to those who were reclusive or overly sensitive to publicity, though I doubt that that was ever an accurate reflection of reality. Today, personal data has assumed a critical role in people's lives, and a failure to recognize at least some limited tort of breach of privacy may be seen by some to be anachronistic. (at para 66)

He indicated that the Court of Appeal might be inclined to reconsider the issue were it to be raised before them, although he could not do so in this case since the plaintiffs had not appealed the judge’s ruling on this point.

3. There is no federal common law of privacy

However keen Justice Groberman might have been to hear arguments on the common law tort of privacy, he overturned the certification of the privacy claims as they related to the federal common law of privacy. He characterized this approach as ‘creative’, but inappropriate. He noted that while common law principles might evolve in areas of federal law (e.g. maritime law), in cases where there was shared jurisdiction such as in privacy law, there was no separate body of federal common law distinct from provincial common law. He stated “there is only a single common law, and it applies within both federal and provincial spheres.” (at para 76) More specifically, he stated:

Where an area of law could be regulated by either level of government, it is not sensible to describe the situation in which neither has enacted legislation as being a situation of either "federal" or "provincial" common law. It is simply a situation of the "common law" applying. The plaintiffs cannot choose whether to bring their claims under "federal" or "provincial" common law as if these were two different regimes. (at para 86)

Because the claim advanced by the plaintiff had nothing to do with any specific area of federal jurisdiction, Justice Groberman rejected the idea that a cause of action arose under “federal” common law.

Overall, this decision is an interesting one. Clearly the Court of Appeal is sending strong signals that it is time to rethink recourse for breach of privacy in the province. It may now be that there is both a statutory and a common law action for breach of privacy. If this is so, it will be interesting to see what scope is given to the newly recognized common law tort. “Complete code” arguments have arisen in other lawsuits relating to breach of privacy; the BCCA’s response in this case adds to a growing body of jurisprudence that rejects the idea that data protection laws provide the only legal recourse for the mishandling of personal data. Finally, a number of class action lawsuits have asserted the “federal common law of privacy”, even though it has been entirely unclear what this is. The BCCA suggests that it is a fabrication and that no such distinct area of common law exists.


 

The Ontario Government has just launched a public consultation and discussion paper to solicit input on a new private sector data protection law for Ontario.

Currently, the collection, use and disclosure of personal information in Ontario is governed by the Personal Information Protection and Electronic Documents Act (PIPEDA). This is a federal statute overseen by the Privacy Commissioner of Canada. PIPEDA allows individual provinces to pass their own private sector data protection laws so long as they are ‘substantially similar’. To date, Quebec, B.C. and Alberta are the only provinces to have done so.

Critics of this move by Ontario might say that there is no need to add the cost of overseeing a private sector data protection law to the provincial budget when the federal government currently bears this burden. Some businesses might also balk at having to adapt to a new data protection regime. While many of the rules might not be significantly different from those in PIPEDA, there are costs involved simply in reviewing and assessing compliance with any new law. Another argument against a new provincial law might relate to the confusion and uncertainty that could be created around the application of the law, since it would likely only apply to businesses engaged in intra-provincial commercial activities and not to inter-provincial or international activities, which would remain subject to PIPEDA. Although these challenges have been successfully managed in B.C., Alberta and Quebec, there is some merit in having a single, overarching law for the whole of the private sector in Canada.

Nevertheless, there are many reasons to enthusiastically embrace this development in Ontario. First, constitutional issues limit the scope of application of PIPEDA to organizations engaged in the collection, use or disclosure of personal information in the course of commercial activity. This means that those provinces that rely solely on PIPEDA for data protection regulation have important gaps in coverage. PIPEDA does not apply to employees in provincially regulated sectors; non-commercial activities of non-profits and charities are not covered, nor are provincial (or federal, for that matter) political parties. The issue of data protection and political parties has received considerable attention lately. B.C.’s private sector data protection law applies to political parties in B.C., and this has recently been interpreted to include federal riding associations situated in B.C. Bill 64, a bill to amend data protection laws in Quebec, would also extend the application of that province’s private sector data protection law to provincial political parties. If Ontario enacts its own private sector data protection law, it can (and should) extend it to political parties, non-commercial actors or activities, and provide better protection for employee personal data. These are all good things.

A new provincial law will also be designed for a digital and data economy. A major problem with PIPEDA is that it has fallen sadly out of date and is not well adapted to the big data and AI environment. For a province like Ontario that is keen to build public trust in order to develop its information economy, this is a problem. Canadians are increasingly concerned about the protection of their personal data. The COVID-19 crisis appears to have derailed (once again) the introduction of a bill to amend PIPEDA and it is not clear when such a bill will be introduced. Taking action at the provincial level means no longer being entirely at the mercy of the federal agenda.

There is something to be said as well for a law, and a governance body (in this case, it would be the Office of the Ontario Information and Privacy Commissioner) that is attuned to the particular provincial context while at the same time able to cooperate with the federal Commissioner. This has been the pattern in the other provinces that have their own statutes. In Alberta and B.C. in particular, there has been close collaboration and co-operation between federal and provincial commissioners, including joint investigations into some complaints that challenge the boundaries of application of federal and provincial laws. In addition, Commissioners across the country have increasingly issued joint statements on privacy issues of national importance, including recently in relation to COVID-19 and contact-tracing apps. National co-operation combined with provincial specificity in data protection could offer important opportunities for Ontario.

In light of this, this consultation process opens an exciting new phase for data protection in Ontario. The task will not simply be to replicate the terms of PIPEDA or even the laws of Alberta and B.C. (all of which can nonetheless provide useful guidance). None of these laws is particularly calibrated to the big data environment (B.C.’s law is currently under review), and there will be policy choices to be made around many of the issues that have emerged in the EU’s General Data Protection Regulation. This consultation is an opportunity to weigh in on crucially important data protection issues for a contemporary digital society, and a made-in-Ontario statute.


A recent story in iPolitics states that both the Liberals and the Conservatives support strengthening data protection laws in Canada, although it also suggests they may differ as to the best way to do so.

The Liberals have been talking about strengthening Canada’s data protection laws – both the Privacy Act (public sector) and the Personal Information Protection and Electronic Documents Act (PIPEDA) (private sector) – since well before the last election, although their emphasis has been on PIPEDA. The mandate letters of both the Ministers of Justice and Industry contained directions to reform privacy laws. As I discuss in a recent post, these mandate letters speak of greater powers for the Privacy Commissioner, as well as some form of “appropriate compensation” for data breaches. There are also hints at a GDPR-style right of erasure, a right to withdraw consent to processing of data, and rights of data portability. With Canada facing a new adequacy assessment under the EU’s General Data Protection Regulation (GDPR) it is perhaps not surprising to see this inclusion of more EU-style rights.

Weirdly, though, the mandate letters of the Minister of Industry and the Minister of Heritage also contain direction to create the new role of “Data Commissioner” to serve an as-yet unclear mandate. The concept of a Data Commissioner comes almost entirely out of the blue. It seems to have been first raised before the ETHI Committee on February 7, 2019 by Dr. Jeffrey Roy of Dalhousie University. He referenced in support of this idea a new Data Commissioner role being created in Australia as well as the existence of a UK Chief Data Officer. How it got from an ETHI Committee transcript to a mandate letter is still a mystery.

If this, in a nutshell, is the Liberals’ plan, it contains the good, the worrisome, and the bizarre. Strengthening PIPEDA – both in terms of actual rights and enforcement of those rights – is a good thing, although the emphasis in the mandate letters seems very oriented towards platforms and other issues that have been in the popular press. This is somewhat worrisome. What is required is a considered and substantive overhaul of the law, not a few colourful and strategically-placed band-aids.

There is no question that the role of the federal Privacy Commissioner is front and centre in this round of reform. There have been widespread calls to increase his authority to permit him to issue fines and to make binding orders. These measures might help address the fundamental weakness of Canada’s private sector data protection laws, but they will require some careful thinking about the drafting of the legislation to ensure that some of the important advisory and dispute resolution roles of the Commissioner’s office are not compromised. And, as we learned with reform of the Access to Information Act, there are order-making powers and then there are order-making powers. It will not be a solution to graft onto the legislation cautious and complicated order-making powers that increase bureaucracy without advancing data protection.

The bizarre comes in the form of the references to a new Data Commissioner. At a time when we clearly have not yet managed to properly empower the Privacy Commissioner, it is disturbing that we might be considering creating a new bureaucracy with apparently overlapping jurisdiction. The mandate letters suggest that the so-called data commissioner would oversee (among other things?) data and platform companies, and would have some sort of data protection role in this regard. His or her role might therefore overlap with both those of the Privacy Commissioner and the Competition Bureau. It is worth noting that the Competition Bureau has already dipped its toe into the waters of data use and abuse. The case for a new bureaucracy is not evident.

The Conservatives seem to be opposed to the creation of the new Data Commissioner, which is a good thing. However, Michelle Rempel Garner was reported by iPolitics as rejecting “setting up pedantic, out of date, ineffectual and bloated government regulatory bodies to enforce data privacy.” It is not clear whether this is simply a rejection of the new Data Commissioner’s office, or also a condemnation of the current regulatory approach to data protection (think baby and bath water). Instead, the Conservatives seem to be proposing creating a new data ownership right for Canadians, placing the economic value of Canadians’ data in their hands.

This is a bad idea for many reasons. In the first place, creating a market model for personal data will do little to protect Canadians. Instead, it will create a context in which there truly is no privacy because the commercial exchange of one’s data for products and services will include a transfer of any data rights. It will also accentuate existing gaps between the wealthy and those less well off. The rich can choose to pay extra for privacy; others will have no choice but to sell their data. Further, the EU, which has seriously studied data ownership rights (and not just for individuals), has walked away from them each time. Data ownership rights are just too complicated. There are too many different interests in data to assign ownership to just one party. If a company uses a proprietary algorithm to profile your preferences for films or books, is this your data which you own, or theirs because they have created it?

What is much more important is the recognition of different interests in data and the strengthening, through law, of the interests of individuals. This is what the GDPR has done. Rights of data portability and erasure, the right to withdraw consent to processing, and many other rights within the GDPR give individuals much stronger interests in their data, along with enforcement tools to protect those interests. Those strengthened interests are now supporting new business models that place consumers at the centre of data decision-making. Open banking (or consumer-directed banking), currently being studied by the Department of Finance in Canada, is an example of this, but there are others as well.

The fix, in the end, is relatively simple. PIPEDA needs to be amended to both strengthen and expand the existing interests of individuals in their personal data. It also needs to be amended to provide for appropriate enforcement, compensation, and fines. Without accountability, the rights will be effectively meaningless. It also needs to happen sooner rather than later.

 

(With thanks to my RA Émilie-Anne Fleury who was able to find the reference to the Data Commissioner in the ETHI Committee transcripts)


The year 2020 is likely to bring with it significant legal developments in privacy law in Canada. Perhaps the most important of these at the federal level will come in the form of legislative change. In new mandate letters, the Prime Minister has charged both the Minister of Justice and the Minister of Innovation, Science and Industry with obligations to overhaul public and private sector data protection laws. It is widely anticipated that a new bill to reform the Personal Information Protection and Electronic Documents Act (PIPEDA) will be forthcoming this year, and amendments to the Privacy Act are also expected at some point.

The mandate letters are interesting in what they both do and do not reveal about changes to come in these areas. In the first place, both mandate letters contain identical wording around privacy issues, requiring the two Ministers to work with each other:

. . . to advance Canada’s Digital Charter and enhanced powers for the Privacy Commissioner, in order to establish a new set of online rights, including: data portability; the ability to withdraw, remove and erase basic personal data from a platform; the knowledge of how personal data is being used, including with a national advertising registry and the ability to withdraw consent for the sharing or sale of data; the ability to review and challenge the amount of personal data that a company or government has collected; proactive data security requirements; the ability to be informed when personal data is breached with appropriate compensation; and the ability to be free from online discrimination including bias and harassment. [my emphasis]

A first thing to note is that the letters reference GDPR-style rights in the form of data portability and the right of erasure. If implemented, these should give individuals considerably more control over their personal information and will strengthen individual interests in their own data. It will be interesting to see what form these rights take. A sophisticated version of data portability has been contemplated in the context of open banking, and a recent announcement makes it clear that work on open banking is ongoing (even though open banking is notably absent from the mandate letter of the Minister of Finance). GDPR-style portability is a start, though it is much less potent as a means of empowering individuals.

The right of erasure is oddly framed. The letters describe it as “the ability to withdraw, remove and erase basic personal data from a platform” (my emphasis). It is unclear why the right of erasure would be limited to basic information on platforms. Individuals should have the right to withdraw, remove and erase personal data from all organizations that have collected it, so long as that erasure is not inconsistent with the purposes for which it was provided and for which it is still required.

Enhancements to rights of notice and new rights to challenge the extent of data collection and retention will be interesting reforms. The references to “appropriate compensation” suggest that the government is attuned to well-publicized concerns that the consequences of PIPEDA breaches are an insufficient incentive to improve privacy practices. Yet it is unclear what form such compensation will take and what procedures will be in place for individuals to pursue it. It is not evident, for example, whether compensation will only be available for data security breaches, or whether it will extend to breaches of other PIPEDA obligations. It is unclear whether the right to adequate compensation will also apply to breaches of the Privacy Act. The letters are mum as to whether it will involve statutory damages linked to a private right of action, or some other form of compensation fund. It is interesting to note that although the government has talked about new powers for the Commissioner including the ability to levy significant fines, these do not appear in the mandate letters.

Perhaps the most surprising feature of the Minister of Industry’s mandate letter is the direction to work with the Minister of Canadian Heritage to “create new regulations for large digital companies to better protect people’s personal data and encourage greater competition in the digital marketplace.” This suggests that new privacy obligations that are sector-specific and separate from PIPEDA are contemplated for “large digital companies”, whatever that might mean. These rules are to be overseen by a brand new Data Commissioner. Undoubtedly, this will raise interesting issues regarding duplication of resources, as well as divided jurisdiction and potentially different approaches to privacy depending on whether an organization is large or small, digital or otherwise.


Class action lawsuits for privacy breaches are becoming all the rage in Canada – this is perhaps unsurprising given the growing number of data breaches. However, a proceeding certified and settled in October 2019 stands out as significantly different from the majority of Canadian privacy class action suits.

Most privacy class action lawsuits involve data breaches. Essentially, an entity trusted with the personal information of large numbers of individuals is sued because it lost the data stored on an unsecured device, a rogue employee absconded with the data or repurposed it, a hacker circumvented its security measures, or it simply allowed information to be improperly disclosed due to lax practices or other failings. In each of these scenarios, the common factor is a data breach and improper disclosure of personal information. Haikola v. Personal Insurance Co. is notably different. In Haikola, the alleged misconduct is the over-collection of personal information in breach of the Personal Information Protection and Electronic Documents Act (PIPEDA).

The legal issues in this case arose after the representative class plaintiff, Mr. Haikola, was involved in a car accident. In settling his claim, his insurance company asked him to consent to providing them access to his credit score with a credit reporting agency. Mr. Haikola agreed, although he felt that he had had no choice but to do so. He followed up with the insurance company on several occasions, seeking more information about why the information was required, but did not receive a satisfactory explanation. He filed a complaint with the Office of the Privacy Commissioner. The subsequent investigation led to a Report of Findings that concluded, in the words of Justice Glustein, that the insurance company’s “collection and use of credit scores during the auto insurance claim assessment process is not something that a reasonable person would consider to be appropriate.” (at para 13) The company eventually changed its practices.

Under PIPEDA, the Commissioner’s findings are not binding. Once a complainant has received a Report of Findings, they can choose to bring an application under s. 14 of PIPEDA to Federal Court for an order and/or an award of damages. After receiving his Report of Findings, Mr. Haikola took the unusual step of seeking to commence a class action lawsuit under s. 14 of PIPEDA. The defendants argued that the Federal Court had no jurisdiction under s. 14 to certify a class action lawsuit. There is no case law on this issue, and it is not at all clear that class action recourse is contemplated under s. 14.

The parties, in the meantime, negotiated a settlement agreement. However, quite apart from the issue of whether a class action suit could be certified under s. 14 of PIPEDA, it was unclear whether the Federal Court could “make an enforceable order in a PIPEDA class action against a non-governmental entity.” (at para 28) With advice from the Federal Court case management judge, the parties agreed that Mr. Haikola would commence an action in Ontario Superior Court, requesting certification of the class action lawsuit and approval of the settlement. The sole cause of action in the suit initiated in Ontario Superior Court was for breach of contract. The argument was that in the contract between the insurance company and its customers, the insurance company undertook to “‘act as required or authorized by law’ in the collection, use, and disclosure of the Class Members’ personal information – including information from credit reporting agencies.” (at para 56) This would include meeting its PIPEDA obligations.

The class included persons whose credit history was used as part of a claim settlement process. The insurance company identified 8,525 people who fell into this category. The settlement provided for the paying out of $2,250,000. The court estimated that if every member of the class filed a valid claim, each would receive approximately $150.

In considering whether a class action lawsuit was the preferable procedure, Justice Glustein noted that generally, for this type of privacy complaint, the normal recourse was under PIPEDA. The structure of PIPEDA is such that each affected individual would have to file a complaint; the filing of a complaint and the issuance of a report were both prerequisites to commencing an action in Federal Court. Justice Glustein considered this to be a barrier to access to justice, particularly since most individuals would have claims “of only a very modest value”. (at para 66) He found that “The common law claim proposed is preferable to each Class Member making a privacy complaint, waiting for the resolution of the complaint from the Privacy Commissioner with a formal report, and then commencing a Federal Court action.” (at para 67)

Justice Glustein certified the proceedings and approved the settlement agreement. He was certainly aware of the potential weaknesses of the plaintiff’s case – these were factors he took into account in assessing the reasonableness of the amount of the settlement. Not only were there real issues as to whether a class action lawsuit was a possible recourse for breach of PIPEDA, but a proceeding under s. 14 is also de novo, meaning the court would not be bound by the findings of the Privacy Commissioner. Further, the Federal Court has been parsimonious with damages under PIPEDA, awarding them only in the most “egregious” circumstances. It is, in fact, rare for a Federal Court judge to award damages unless there has been an improper disclosure of personal information. In this case, the insurance company was found to have collected too much information, but there had been no breach or loss of personal data.

This case is interesting because it raises the possibility of class action lawsuits being used for privacy complaints other than data security breaches. This should put fear into the heart of any company whose general practices or policies have led it to collect too much personal information, obtain insufficient consent, or retain data for longer than necessary (to name just a few possible shortcomings). Perhaps the facts in Haikola are exceptional enough to avoid a landslide of litigation. Justice Glustein was clearly sympathetic towards a plaintiff who had doggedly pursued his privacy rights in the face of an insufficiently responsive company, and who had been vindicated by the OPC’s Report of Findings. Justice Glustein noted as well that it was the plaintiff who had sought to initiate the class action lawsuit – he had not been recruited by class counsel.

There is clearly also an element in this decision of frustration and dissatisfaction with the current state of Canadian data protection law. Justice Glustein observed: “If systemic PIPEDA breaches are not rectified by a class procedure, it is not clear what incentive large insurers and others will have to avoid overcollection of information.” (at para 88) Justice Glustein also observed that “While the Privacy Commissioner may encourage or require changes to future practices, it [sic] has very limited powers to enforce compliance through strong regulatory penalties.” (at para 88) This is certainly true, and many (including the Privacy Commissioner) have called for greater enforcement powers to strengthen PIPEDA. This comment, taken together with Justice Glustein’s additional comment that the settlement imposes on the Defendants a “meaningful business cost” for the overcollection of personal information, is nothing short of a condemnation of Canada’s private sector data protection regime.

The government has heard such condemnations from the Commissioner himself, as well as from many other critics of PIPEDA. It is now hearing them from the courts. Hopefully it is paying attention. This is not just because PIPEDA obligations need stronger and more diverse enforcement options to provide meaningful privacy protection, but also because class action lawsuits are a blunt tool, ill-designed to serve carefully-tailored public policy objectives in this area.

Published in Privacy

A recent Finding from the Office of the Privacy Commissioner of Canada contains a consideration of the meaning of “publicly available information”, particularly as it relates to social media profiles. This issue is particularly significant given a recent recommendation by the ETHI committee in its Report on PIPEDA reform. PIPEDA currently contains a very narrowly framed exception to the requirement of consent for “publicly available information”. ETHI had recommended amending the definition to make it “technologically neutral”. As I argued here, such a change would make it open season for the collection, use and disclosure of the social media profiles of Canadians.

The Finding, issued on June 12, 2018, came after multiple complaints were filed by Canadians about the practices of a New Zealand-based social media company, Profile Technology Ltd (PTL). The company had obtained Facebook user profile data from 2007 and 2008 under an agreement with Facebook. While their plan might have originally been to create a powerful search engine for Facebook, in 2011 they launched their own social media platform. They used the Facebook data to populate their platform with profiles. Individuals whose profiles were created on the site had the option of ‘claiming’ them. PTL also provided two avenues for individuals who wished to delete the profiles. If an email address had been part of the original data obtained from Facebook and was associated with the PTL profile, a user could log in using that email address and delete the account. If no email address was associated with the profile, the company required individuals to set up a helpdesk ticket and to provide copies of official photo identification. A number of the complainants to the OPC indicated that they were unwilling to share their photo IDs with a company that had already collected, used and disclosed their personal information without their consent.

The complainants’ concerns were not simply that their personal information had been taken and used to populate a new social media platform without their consent. They also felt harmed by the fact that the data used by PTL was from 2007-2008, and did not reflect any changes or choices they had since made. One complaint received by the OPC related to the fact that PTL had reproduced a group that had been created on Facebook, but that had since been deleted from Facebook. Within this group, allegations had been made about the complainant that he/she considered defamatory and bullying. The complainant objected to the fact that the group persisted on PTL and that the PTL platform did not permit changes to public groups at the behest of single individuals, on the basis that it treated the group description “as part of the profile of every person who has joined that group, therefore modifying the group would be like modifying all of those people’s profiles and we cannot modify their profiles without their consent.” (at para 55)

It should be noted that although the data was initially obtained by PTL under licence from Facebook, Facebook’s position was that PTL had used the data in violation of the licence terms. Facebook had commenced proceedings against PTL in 2013 which resulted in a settlement agreement. There was some back and forth over whether the terms of the agreement had been met, but no information was available regarding the ultimate resolution.

The Finding addresses a number of interesting issues. These include the jurisdiction of the OPC to consider this complaint about a New Zealand based company, the sufficiency of consent, and data retention limits. This post focuses only on the issue of whether social media profiles are “publicly available information” within the meaning of PIPEDA.

PTL argued that it was entitled to benefit from the “publicly available information” exception to the requirement for consent for collection and use of personal information because the Facebook profiles of the complainants were “publicly available information”. The OPC disagreed. It noted that the exception for “publicly available information”, found in ss. 7(1)(d) and 7(2)(c.1) of PIPEDA, is defined by regulation. The applicable provision is s. 1(e) of the Regulations Specifying Publicly Available Information, which requires that “the personal information must appear in a publication, the publication must be available to the public, and the personal information has to have been provided by the individual.”(at para 87) The OPC rejected PTL’s argument that “publication” included public Facebook profiles. In its view, the interpretation of “publicly available information” must be “in light of the scheme of the Act, its objects, and the intention of the legislature.” (at para 89) It opined that neither a Facebook profile nor a ‘group’ was a publication. It noted that the regulation makes it clear that “publicly available information” must receive a restrictive interpretation, and reflects “a recognition that information that may be in the public domain is still worthy of privacy protection.” (at para 90) The narrow interpretation of this exception to consent is consistent with the fact that PIPEDA has been found to be quasi-constitutional legislation.

In finding that the Facebook profile information was not publicly available information, the OPC considered that the profiles at issue “were created at a time when Facebook was relatively new and its policies were in flux.” (at para 92) Thus it would be difficult to determine that the intention of the individuals who created profiles at that time was to share them broadly and publicly. Further, at the time the profiles were created, they were indexable by search engines by default. In an earlier Finding, the OPC had determined that this default setting “would not have been consistent with users’ reasonable expectations and was not fully explained to users” (at para 92). In addition, the OPC noted that Facebook profiles were dynamic, and that their ‘owners’ could update or change them at will. In such circumstances, “treating a Facebook profile as a publication would be counter to the intention of the Act, undermining the control users otherwise maintain over their information at the source.” (at para 93) This is an interesting point, as it suggests that the dynamic nature of a person’s online profile prevents it from being considered a publication – it is more like an extension of a user’s personality or self-expression.

The OPC also noted that even though the profile information was public, to qualify for the exception it had to be contributed by the individual. This is not always the case with profile information – in some cases, for example, profiles will include photographs that contain the personal information of third parties.

This Finding, which is not a decision and is not binding on anyone, shows how the OPC interprets the “publicly available information” exception in its home statute. A few things are interesting to note:

· The OPC finds that social media profiles (in this case from Facebook) are different from “publications” in the sense that they are dynamic and reflect an individual’s changing self-expression

· Allowing the capture and re-use, without consent, of self-expression from a particular point in time robs the individual not only of control over their personal information but also of control over how they present themselves to the public. This too makes profile data different from other forms of “publicly available information” such as telephone or business directory information, or information published in newspapers or magazines.

· The OPC’s discussion of Facebook’s problematic privacy practices at the time the profiles were created muddies the discussion of “publicly available information”. A finding that Facebook had appropriate rules of consent should not change the fact that social media profiles should not be considered “publicly available information” for the purposes of the exception.

It is also worth noting that a complaint against PTL to the New Zealand Office of the Privacy Commissioner proceeded on the assumption that PTL did not require consent because the information was publicly available. In fact, the New Zealand Commissioner ruled that no breach had taken place.

Given the ETHI Report’s recommendation, it is important to keep in mind that the definition of “publicly available information” could be modified (although the government’s response to the ETHI report indicates some reservations about the recommendation to change the definition). Because the definition is found in a regulation, a modification would not require legislative amendment. As is clear from the ETHI report, there are a number of industries and organizations that would love to be able to harvest and use personal information from social media platforms without needing to obtain consent. Vigilance is required to ensure that these regulations are not altered in a way that dramatically undermines privacy protection.

Published in Privacy
