Tags
access to information
AI
AIDA
AI governance
AI regulation
Ambush Marketing
artificial intelligence
big data
bill c11
Bill c27
copyright
data governance
data protection
data strategy
freedom of expression
Geospatial
geospatial data
intellectual property
Internet
internet law
IP
open courts
open data
open government
personal information
pipeda
Privacy
smart cities
trademarks
transparency
Displaying items by tag: bill c11
Monday, 04 July 2022 06:10
Bill C-27’s Take on Consent: A Mixed Review
Note: this is the first in a series of blog posts on Bill C-27, also known as An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act. Bill C-27 is a revised version of the former Bill C-11, which died on the order paper just prior to the 2021 federal election. The former Privacy Commissioner called Bill C-11 ‘a step backwards’ for privacy, and issued a series of recommendations for its reform. At the same time, industry was also critical of the bill, arguing that it risked making the use of data for innovation too burdensome. Bill C-27 takes steps to address the concerns of both privacy advocates and industry with a series of revisions, although much remains unchanged from Bill C-11. Further, it adds an entirely new statute – the Artificial Intelligence and Data Act (AIDA) – meant to govern some forms of artificial intelligence. This series of posts will assess a number of the changes found in Bill C-27. It will also consider the AIDA.

_________________________________

The federal government has made it clear that it considers consent to be a cornerstone of Canadian data protection law. It has said so in the Digital Charter, in Bill C-11 (the one about privacy), and in the recent reincarnation of data protection reform legislation in Bill C-27. On the one hand, consent is an important means by which individuals can exercise control over their personal information; on the other hand, it is widely recognized that the consent burden has become far too high for individuals, who are confronted with long, complex and often impenetrable privacy policies at every turn. At the same time, organizations that see new and emerging uses for already-collected data seek to be relieved of the burden of obtaining fresh consents. The challenge in privacy law reform has therefore been to make consent meaningful while at the same time reducing the consent burden and enabling greater use of data by private and public sector entities.

Bill C-11 received considerable criticism for how it dealt with consent (see, for example, my post here, and the former Privacy Commissioner’s recommendations to improve consent in C-11 here). Consent is back, front and centre, in Bill C-27, although with some important changes.

Section 15 of Bill C-27 reaffirms that consent is the default rule for the collection, use or disclosure of personal information, although the statute creates a long list of exceptions to this general rule. One criticism of Bill C-11 was that it removed the definition of consent in s. 6.1 of PIPEDA, which provides that consent “is only valid if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.” Instead, Bill C-11 simply relied upon a list of information that had to be provided to individuals prior to consent. Bill C-27’s compromise is found in a new s. 15(4), which requires that the information provided to individuals to obtain their consent be “in plain language that an individual to whom the organization’s activities are directed would reasonably be expected to understand.” This has the added virtue of ensuring, for example, that privacy policies for products or services directed at youth or children must take into account the sophistication of their audience.
The added language is not as exigent as s. 6.1 (for example, s. 6.1 requires an understanding of the nature, purpose and consequences of the collection, use and disclosure, while s. 15(4) requires only an understanding of the language used), so it is still a downgrading of consent from the existing law. It is, nevertheless, an improvement over Bill C-11.

A modified s. 15(5) and a new s. 15(6) also muddy the consent waters. Subsection 15(5) provides that consent must be express unless it is appropriate to imply consent. The exception to this general rule is the new subsection 15(6), which provides:

(6) It is not appropriate to rely on an individual’s implied consent if their personal information is collected or used for an activity described in subsection 18(2) or (3).

Subsections 18(2) and (3) list business activities for which personal data may be collected or used without an individual’s knowledge or consent. At first glance, it is unclear why it is necessary to provide that implied consent is inappropriate in such circumstances, since no consent is needed at all. However, because s. 18(1) sets out certain criteria for collection without knowledge or consent, it is likely that the goal of s. 15(6) is to ensure that no organization circumvents the limited guardrails in s. 18(1) by relying instead on implied consent. The potential breadth of s. 18(3) (discussed below), combined with s. 2(3), makes it difficult to distinguish between the two; the cautious organization will comply with s. 18(3) rather than rely on implied consent in any event.

The list of business activities for which no knowledge or consent is required for the collection or use of personal information is pared down from that in Bill C-11. The list in C-11 was controversial, as it included some activities that were so broadly stated that they would have created gaping holes in any consent requirement (see my blog post on consent in C-11 here). The worst of these have been removed. This is a positive development, although the provision creates a backdoor through which other exceptions can be added by regulation. Further, Bill C-27 has added language to s. 12(1) to clarify that the requirement that the collection, use or disclosure of personal information be “only in a manner and for purposes that a reasonable person would consider appropriate in the circumstances” applies “whether or not consent is required under this Act.” [Note that although the exceptions in s. 18 are to knowledge as well as consent, s. 62(2)(b) of Bill C-27 will require an organization to provide plain-language information about how it makes use of personal information and how it relies upon exceptions to consent, “including a description of any activities referred to in subsection 18(3) in which it has a legitimate interest”.]

Bill C-27 does, however, contain an entirely new exception permitting the collection or use of personal data without knowledge or consent. It is found in s. 18(3):

18 (3) An organization may collect or use an individual’s personal information without their knowledge or consent if the collection or use is made for the purpose of an activity in which the organization has a legitimate interest that outweighs any potential adverse effect on the individual resulting from that collection or use and
(a) a reasonable person would expect the collection or use for such an activity; and
(b) the personal information is not collected or used for the purpose of influencing the individual’s behaviour or decisions.
So as not to leave this as open-ended as it seems at first glance, a new s. 18(4) sets conditions precedent for the collection or use of personal information for ‘legitimate interests’:

(4) Prior to collecting or using personal information under subsection (3), the organization must
(a) identify any potential adverse effect on the individual that is likely to result from the collection or use;
(b) identify and take reasonable measures to reduce the likelihood that the effects will occur or to mitigate or eliminate them; and
(c) comply with any prescribed requirements.

Finally, a new s. 18(5) requires the organization to keep a record of its assessment under s. 18(4), and it must be prepared to provide a copy of this assessment to the Commissioner at the Commissioner’s request.

It is clear that industry had the ear of the Minister when it comes to the addition of s. 18(3). A ‘legitimate interest’ exception was sought in order to enable the use of personal data without consent in a broader range of circumstances. Such an exception is found in the EU’s General Data Protection Regulation (GDPR). Here is how it is worded in the GDPR:

6(1) Processing shall be lawful only if and to the extent that at least one of the following applies: [. . .]
(f) processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.

Under the GDPR, an organization that relies upon legitimate interests instead of consent must take into account, among other things:

6(4) [. . .]
(a) any link between the purposes for which the personal data have been collected and the purposes of the intended further processing;
(b) the context in which the personal data have been collected, in particular regarding the relationship between data subjects and the controller;
(c) the nature of the personal data, in particular whether special categories of personal data are processed, pursuant to Article 9, or whether personal data related to criminal convictions and offences are processed, pursuant to Article 10;
(d) the possible consequences of the intended further processing for data subjects;
(e) the existence of appropriate safeguards, which may include encryption or pseudonymisation.

Bill C-27’s ‘legitimate interests’ exception is different in important respects from that in the GDPR. Although Bill C-27 gives a nod to the importance of privacy as a human right in a new preamble, the human rights dimensions of privacy are not particularly evident in the body of the bill. The ‘legitimate interests’ exception is available unless there is an “adverse effect on the individual” that is not outweighed by the organization’s legitimate interest (as opposed to the “interests or fundamental rights and freedoms of the data subject” under the GDPR). Presumably it will be the organization that does this initial calculation.

One of the problems in data protection law has been quantifying adverse effects on individuals. Data breaches, for example, are shocking and distressing to those impacted, but it is often difficult to show actual damages flowing from the breach, and moral damages have been considerably restricted by courts in many cases. Some courts have even found that the ordinary stress and inconvenience of a data breach is not compensable harm, since it has become such a routine part of life.
If ‘adverse effects’ on individuals are reduced to quantifiable effects, the ‘legitimate interests’ exception will be far too broad. This is not to say that the ‘legitimate interests’ provision in Bill C-27 is incapable of facilitating data use while at the same time protecting individuals. There is clearly an attempt to incorporate some checks and balances, such as reasonable expectations and a requirement to identify and mitigate any adverse effects. But what C-27 does is take something that, in the GDPR, was meant to be quite exceptional to consent and make it a potentially more mainstream basis for the use of personal data without knowledge or consent. It is able to do this because, rather than reinforce the centrality and importance of privacy rights, it places privacy on an uneasy par with commercial interests in using personal data. The focus on ‘adverse effects’ runs the risk of equating privacy harm with quantifiable harm, thus trivializing the human and social value of privacy.
Published in
Privacy
Monday, 21 June 2021 12:32
Ontario White Paper Seeks Input on a Private Sector Data Protection Law
In June 2021, Ontario issued a White Paper that sets out some proposals, including suggested wording, for a new private sector data protection law for the province. This is part of its overall digital and data strategy. Input on the White Paper is sought by August 3, 2021.

I have prepared a table that compares the Ontario proposal with the federal government’s Bill C-11 (which will not make it through Parliament in the present sitting, and which may get some necessary attention over the summer). It makes sense to compare the proposal to C-11 because, if it passes, any Ontario law would have to be found substantially similar to it. The Ontario proposal has clearly been drafted with Bill C-11 in mind. That said, the idea is not to simply copy Bill C-11. The White Paper shows areas where Bill C-11 may be largely copied, but others where Ontario plans to modify it, add something new, or go in a different direction. Of course, feedback is sought on the contents of the White Paper, and a bill, if and when it is introduced in the Legislature, may look different from what is currently proposed – depending on what feedback the government receives.

The table, with some added commentary, can be found here – with the caveat that the commentary is preliminary and was generated quite quickly. Please be sure to respond to the consultation by the August 3 deadline!
Published in
Privacy
Thursday, 04 February 2021 10:06
How Might Bill C-11 Affect the Outcome of a Clearview AI-type Complaint?
A joint ruling from the federal Privacy Commissioner and his provincial counterparts in Quebec, B.C. and Alberta has found that U.S.-based company Clearview AI breached Canadian data protection laws when it scraped photographs from social media websites to create the database it used to support its facial recognition technology. According to the report, the database contained the biometric data of “a vast number of individuals in Canada, including children.” Investigations of complaints under public sector data protection laws about police use of Clearview AI’s services are still ongoing.

The Commissioners’ findings are unequivocal. The information collected by Clearview AI is sensitive biometric data. Express consent was required for its collection and use, and Clearview AI did not obtain consent. The company’s argument that consent was not required because the information was publicly available was firmly rejected. The Commissioners described Clearview AI’s actions as constituting “the mass identification and surveillance of individuals by a private entity in the course of commercial activity.” (at para 72)

In defending itself, Clearview AI put forward arguments that were clearly at odds with Canadian law. It also resisted the jurisdiction of the Canadian Commissioners, notwithstanding the fact that it collected the personal data of Canadians and offered its commercial services to Canadian law enforcement agencies. Clearview AI did not accept the Commissioners’ findings, and “has not committed to following” the recommendations.

At the time of this report, Bill C-11, a bill to reform Canada’s current data protection law, is before Parliament. The goal of this post is to consider what difference Bill C-11 might make to the outcome of complaints like this one, should it be passed into law. I consider both the substantive provisions of the bill and its new enforcement regime.

Consent

Like the current Personal Information Protection and Electronic Documents Act (PIPEDA), Bill C-11 makes consent a core requirement. To collect, use or disclose personal information, an organization must either obtain valid consent, or its activities must fall into one of the exceptions to consent. In the Clearview AI case, there was no consent, and the disputed PIPEDA exception to the consent requirement was the one for ‘publicly available personal information’. While this exception seems broad on its face, to qualify, the information must fall within the parameters set out in the Regulations Specifying Publicly Available Personal Information. These regulations focus on certain categories of publicly available information – such as registry information (land titles, for example), court registries and decisions, published telephone directory information, and public business information listings. In most cases, the regulations provide that the use of the information must also relate directly to the purposes for which it was made public. The regulations also contain an exception for “personal information that appears in a publication, including a magazine, book or newspaper, in printed or electronic form, that is available to the public, where the individual has provided the information.” The interpretation of this provision was central to Clearview AI’s defence of its practices. It argued that social media postings were “personal information that appears in a publication.” The Commissioners adopted a narrow interpretation, consistent with this being an exception in quasi-constitutional legislation.
They distinguished between the types of publications mentioned in the exception and uncurated, dynamic social media sites. The Commissioners noted that, unlike with newspapers or magazines, individuals retain a degree of control over the content of their social media sites. They also observed that to find that all information on the internet falls within the publicly available information exception “would create an extremely broad exemption that undermines the control users may otherwise maintain over their information at the source.” (at para 65) Finally, the Commissioners observed that the exception applied to information provided by the data subject, but that photographs were scraped by Clearview AI regardless of whether they were posted by the data subject or by someone else.

Would the result be any different under Bill C-11? In section 51, Bill C-11 replicates the ‘publicly available information’ exception for the collection, use or disclosure of personal information. Like PIPEDA, it also leaves the definition of this term to regulations. However, Canadians should be aware that there has been considerable pressure to expand the regulations so that personal information shared on social media sites is exempted from the consent requirement. For example, in past hearings into PIPEDA reform, the House of Commons ETHI Committee at one point appeared swayed by industry arguments that PIPEDA should be amended to include websites and social media within this exception. Bill C-11 does not resolve this issue; but if passed, it might well be on the table in the drafting of regulations. If nothing else, the Clearview AI case provides a stark illustration of just how important this issue is to the privacy of Canadians.

However, data scrapers may be able to look elsewhere in Bill C-11 for an exception to consent. Bill C-11 contains new exceptions to consent for “business operations”, which I have criticized here. One of these exceptions would almost certainly be relied upon by a company in Clearview AI’s position if the bill were passed. The exceptions allow for the collection and use of personal information without an individual’s knowledge or consent if, among other things, it is for “an activity in the course of which obtaining the individual’s consent would be impracticable because the organization does not have a direct relationship with the individual” (s. 18(2)(e)). A company that scrapes data from social media sites to create a facial recognition database would find it impracticable to get consent because it has no direct relationship with any of the affected individuals. The exception seems to fit.

That said, s. 18(1) does set some general guardrails. The one that seems relevant in this case is that the exceptions to consent are only available where “a reasonable person would expect such a collection or use for that activity”. Hopefully, the collection of images from social media websites to fuel facial recognition technology would not be something that a reasonable person would expect; certainly, the Commissioners would not find it to be so. In addition, section 12 of Bill C-11 requires that information be collected or used “only for purposes that a reasonable person would consider appropriate in the circumstances” (a requirement carried over from PIPEDA, s. 5(3)). In their findings, the Commissioners ruled that the collection and use of images by Clearview AI was for a purpose that a reasonable person would find inappropriate. The same conclusion could be reached under Bill C-11.
There is reason to be cautiously optimistic, then, that Bill C-11 would lead to the same result on a similar set of facts: the conclusion that the wholesale scraping of personal data from social media sites to build a facial recognition database without consent is not permitted. However, the scope of the exception in s. 18(2)(e) is still a matter of concern. The more exceptions that an organization pushing the boundaries feels it can wriggle into, the more likely it is to engage in privacy-compromising activities. In addition, there may be a range of different uses for scraped data, and “what a reasonable person would expect” is a rather squishy buffer between privacy and wholesale data exploitation.

Enforcement

Bill C-11 is meant to substantially increase enforcement options when it comes to privacy. Strong enforcement is particularly important in cases where organizations are not interested in accepting the guidance of regulators. This is certainly the case with Clearview AI, which expressly rejected the Commissioners’ findings. Would Bill C-11 strengthen the regulator’s hand?

The Report of Findings in this case reflects the growing trend of having the federal and provincial commissioners that oversee private sector data protection laws jointly investigate complaints involving issues that affect individuals across Canada. This cooperation is important, as it ensures consistent interpretation of what is meant to be substantially similar legislation across jurisdictions. Nothing in Bill C-11 would prevent the federal Commissioner from continuing to engage in this cross-jurisdictional collaboration – in fact, subsection 116(2) expressly encourages it.

Some will point to the Commissioner’s new order-making powers as another way to strengthen his enforcement hand. Under Bill C-11, the Commissioner could direct an organization to take measures to comply with the legislation or to cease activities that are in contravention of the legislation (s. 92(2)). This is a good thing. However, these orders are subject to appeal to the new Personal Information and Data Protection Tribunal (the Tribunal). By contrast, orders of the Commissioners of B.C. and Alberta are final, subject only to judicial review. In addition, it is not just the orders of the Commissioner that are appealable under C-11, but also his findings. This raises questions about how the new structure under Bill C-11 might affect cooperative inquiries like the one in this case. Conclusions shared with other Commissioners can be appealed by respondents to the Tribunal, which owes no deference to the Commissioner on questions of law. As I and others have already noted, the composition of the Tribunal is somewhat concerning; Bill C-11 would require only a minimum of one member of the Tribunal to have expertise in privacy law. While it is true that proceedings before the Federal Court under PIPEDA are de novo, and thus the Commissioner is afforded no formal deference in that context either, access to the Federal Court is more limited than the wide-open appeals route to the Tribunal. The Bill C-11 structure really seems to shift the authority to interpret and apply the law away from the Commissioner and toward the mysterious and not necessarily expert Tribunal.

Bill C-11 also provides a much-touted new power to impose substantial fines for breach of the legislation. Interestingly, however, this does not seem to be the kind of case in which a fine would be available.
Fines, provided for under s. 93(1) of Bill C-11, are available only with respect to the breach of certain obligations under the statute (these are listed in s. 93(1)). Playing fast and loose with the requirement to obtain consent is not one of them. This is interesting, given the supposedly central place consent plays within the Bill. Further thought might need to be given to the list of ‘fine-able’ contraventions.

Overall, then, although C-11 could lead to a very similar result on similar facts, the path to that result may be less certain. It is also not clear that there is anything in the enforcement provisions of the legislation that will add heft to the Commissioner’s findings. In practical terms, the decisions that matter will be those of the Tribunal, and it remains to be seen how well this Tribunal will serve Canadians.
Published in
Privacy
Tuesday, 12 January 2021 15:20
Data Mobility (Portability) in Canada's Bill C-11
This post is the third in a series that considers the extent to which the Digital Charter Implementation Act (Bill C-11), by overhauling Canada’s federal private sector data protection law, implements the principles contained in the government’s Digital Charter. This post addresses the fourth principle of the Charter, Transparency, Portability and Interoperability, which provides that “Canadians will have clear and manageable access to their personal data and should be free to share or transfer it without undue burden.”

Europe’s General Data Protection Regulation (GDPR) introduced the concept of data portability (data mobility) as part of an overall data protection framework. The essence of the data portability right in article 20 of the GDPR is:

(1) The data subject shall have the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided [...]

In this version, the data flows from one controller to another via the data subject. There is no requirement for the data to be in a standard, interoperable format – it need only be in a common, machine-readable format.

Data portability is not a traditional part of data protection; it largely serves consumer protection and competition law interests. Nevertheless, it is linked to data protection through the concept of individual control over personal information. For example, consider an individual who subscribes to a streaming service for audiovisual entertainment. The service provider acquires considerable data about that individual and their viewing preferences over a period of time. If a new company enters the market, it might offer a better price, but the consumer may be put off by the lack of accurate or helpful recommendations or of special offers and promotions tailored to their tastes. The difference in the service offered lies in the fact that the incumbent has much more data about the consumer. A data mobility right, in theory, allows an individual to port their data to the new entrant. The more level playing field serves the individual’s interest, and serves the broader public interest by stimulating competition.

The fourth pillar of the Digital Charter clearly recognizes the idea of control that underlies data mobility, suggesting that individuals should be free to share or transfer their data “without undue burden.” Bill C-11 contains a data mobility provision that is meant to implement this pillar of the Charter. However, this provision is considerably different from what is found in the GDPR.

One of the challenges with the GDPR’s data portability right is that not all data will be seamlessly interoperable from one service provider to another. This could greatly limit the usefulness of the data portability right. It could also impose a significant burden on SMEs, which might face demands for the production and transfer of data that they are not sufficiently resourced to meet. It might also place individuals’ privacy at greater risk, potentially spreading their data to multiple companies, some of which might be ill-equipped to provide the appropriate privacy protection. These concerns may explain why Bill C-11 takes a relatively cautious approach to data mobility.
Section 72 of the Consumer Privacy Protection Act portion of Bill C-11 provides:

72 Subject to the regulations, on the request of an individual, an organization must as soon as feasible disclose the personal information that it has collected from the individual to an organization designated by the individual, if both organizations are subject to a data mobility framework provided under the regulations. [My emphasis]

It is important to note that in this version of mobility, data flows from one organization to another rather than through the individual, as is the case under the GDPR. The highlighted portion of s. 72 makes it clear that data mobility will not be a universal right. It will be available only where a data mobility framework is in place. Such frameworks will be provided for in regulations. Section 120 of Bill C-11 states:

120 The Governor in Council may make regulations respecting the disclosure of personal information under section 72, including regulations
(a) respecting data mobility frameworks that provide for
(i) safeguards that must be put in place by organizations to enable the secure disclosure of personal information under section 72 and the collection of that information, and
(ii) parameters for the technical means for ensuring interoperability in respect of the disclosure and collection of that information;
(b) specifying organizations that are subject to a data mobility framework; and
(c) providing for exceptions to the requirement to disclose personal information under that section, including exceptions related to the protection of proprietary or confidential commercial information.

The regulations provide for frameworks that will impose security safeguards on participating organizations and ensure data interoperability. Paragraph 120(b) also suggests that not all organizations within a sector will automatically be entitled to participate in a mobility framework; they may have to qualify by demonstrating that they meet certain security and technical requirements. A final (and interesting) limitation on the mobility framework relates to exceptions to disclosure where information that might otherwise be considered personal information is also proprietary or confidential commercial information. This gets at the distinction between raw and derived data – data collected directly from individuals might be subject to the mobility framework, but profiles or analytics based on that data might not be – even if they pertain to the individual.

It is reasonable to expect that open banking (now renamed ‘consumer-directed finance’) will be the first experiment with data mobility. The federal Department of Finance released a report on open banking in January 2020 and has since been engaged in a second round of consultations. Consumer-directed finance is intended to address the burgeoning fintech industry, which offers many new and attractive digital financial management services to consumers but which relies on access to consumer financial data. Currently (and alarmingly), this need for data is met by fintechs asking individuals to share account passwords so that they can regularly scrape financial data from multiple sources (accounts, credit cards, etc.) in order to offer their services. A regulated framework for data mobility is seen as much more secure, since safeguards can be built into the system and participants can be vetted to ensure they meet security and privacy standards. Data interoperability between all participants will also enhance the quality of the services provided.
If financial services is the first area for development of data mobility in Canada, what other areas for data mobility might Canadians expect? The answer is: not many. The kind of scheme contemplated for open banking has already required a considerable investment of time and energy, and it is not yet ready to launch. Of course, financial data is among the most sensitive of personal data; other schemes might be simpler to design and create. But they will still take a great deal of time. One sector where some form of data mobility might eventually be contemplated is telecommunications. (Note that Australia’s comparable “consumer data right” is being rolled out first with open banking and will be followed by initiatives in the telecommunications and energy sectors.)

Data mobility in the CPPA will also be limited by federalism. It is no accident that banking and telecommunications fall within federal jurisdiction. The regulations contemplated by s. 120 go beyond simple data protection and impact how companies do business. The federal government will face serious challenges if it attempts to create data mobility frameworks within sectors or industries under provincial jurisdiction. Leadership on this front will have to come from the provinces. Those with their own private sector data protection laws could choose to address data mobility on their own terms. Quebec has already done so in Bill 64, which would amend its private sector data protection law to provide:

112 [. . .] Unless doing so raises serious practical difficulties, computerized personal information collected from the applicant must, at his request, be communicated to him in a structured, commonly used technological format. The information must also be communicated, at the applicant’s request, to any person or body authorized by law to collect such information.

It remains to be seen what Alberta and British Columbia might decide to do – along with Ontario, if in fact it decides to proceed with its own private sector data protection law. As a result, while there might be a couple of important experiments with data mobility under the CPPA, the data mobility right within that framework is likely to remain relatively constrained.
Published in
Privacy
Monday, 04 January 2021 12:12
How do new data protection enforcement provisions in Canada's Bill C-11 measure up?
This post is the second in a series that considers the extent to which the Digital Charter Implementation Act, by overhauling Canada’s federal private sector data protection law, implements the principles contained in the government’s Digital Charter. It addresses the tenth principle of the Charter: Strong Enforcement and Real Accountability. This principle provides that “There will be clear, meaningful penalties for violations of the laws and regulations that support these principles.”

Canada’s current data protection law, the Personal Information Protection and Electronic Documents Act (PIPEDA), has been criticized for the relatively anemic protection it provides for personal information. Although complaints may be filed with the Commissioner, the process ends with a non-binding “report of findings”. After receiving a report, a complainant who seeks either a binding order or compensation must make a further application to the Federal Court. Recourse to the Federal Court is challenging for unrepresented plaintiffs, yet awards of damages have been low enough to make it distinctly not worth anyone’s while to hire a lawyer to assist them with such a claim. As a result, the vast majority of cases going to the Federal Court have been brought by unrepresented plaintiffs, damage awards have remained low, and nobody has been particularly impressed. It is now far more likely that privacy issues – at least where data breaches are concerned – will be addressed through class action lawsuits, which have proliferated across the country.

Of course, the protection of personal information is not all about seeking compensation or court orders. In fact, through the complaints process over the years, the Commissioner has worked to improve data protection practices through a variety of soft compliance measures, including investigating complaints and making recommendations for changes. The Commissioner also uses audit powers and, more recently, compliance agreements to ensure that organizations meet their data protection obligations. Nevertheless, high-profile data breaches have left Canadians feeling vulnerable and unprotected. There is also a concern that some data-hungry companies are making fortunes from personal data and that weak legislative sanctions provide no real incentive to limit their rampant collection, use and disclosure of personal data. Public unease has been augmented by overt resistance to the Commissioner’s rulings in some instances. For example, Facebook was defiant in response to the Commissioner’s findings in relation to the Cambridge Analytica scandal. Even more recently, in an inquiry into the use of facial recognition technologies in shopping malls, the respondent politely declined to accept the Commissioner’s findings that certain of its practices were in breach of PIPEDA.

The Digital Charter Implementation Act is meant to address PIPEDA’s enforcement shortfall. It provides for the enactment of two statutes related to personal data protection: the Consumer Privacy Protection Act (CPPA) and the Personal Information and Data Protection Tribunal Act (the PIDPTA). A government Fact Sheet describes this combination as providing a “Comprehensive and accessible enforcement model”. The CPPA, the revamped version of PIPEDA, would give the Commissioner the power to order an organization to comply with its obligations under the CPPA or to stop collecting or using personal information.
This is an improvement, although the order-making powers are subject to a right of appeal to the new Tribunal created by the PIDPTA. At least the Tribunal will owe some deference to the Commissioner on questions of fact or of mixed law and fact – proceedings before the Federal Court under PIPEDA are entirely de novo.

Under the CPPA, the Commissioner will also be able to recommend that the Tribunal impose a fine. Fines are available only for certain breaches of the legislation: excessive collection of personal information; use or disclosure of personal information for new purposes without consent or an applicable exception; making consent to personal data collection a condition of the provision of a product or service (beyond what is necessary to provide that product or service); obtaining consent by deception; improper retention or disposal of personal information; failure to dispose of personal information at an individual’s request; breach of security safeguards; or failure to provide breach notification. The fines can be substantial, with a maximum penalty of the higher of $10,000,000 or 3% of the organization’s gross global revenue for the preceding financial year. Of course, that is the upper end. Fines are discretionary, subject to a number of considerations, and explicitly not meant to be punitive.

Within this structure, the Tribunal will play a significant role. It was no doubt created to provide greater distance between the Commissioner and the imposition of fines on organizations. In this respect, it is a good thing. The Commissioner still plays an important role in encouraging organizations to comply voluntarily with the legislation. This role is fairer and easier to perform when there is greater separation between the ombuds functions of the Commissioner and the ability to impose penalties. More problematically, the Tribunal will hear appeals of both findings and orders made by the Commissioner. The appeal layer is new and will add delays to the resolution of complaints. An alternative would have been to leave orders subject to judicial review, with no appeals. In theory, going to the Tribunal will be faster and perhaps less costly than a trip to the Federal Court. But in practice, the Tribunal’s value will depend on its composition and workload. Under the PIDPTA, the Tribunal will have only six members, not necessarily full-time, and only one of these is required to have experience with privacy. Decisions of the Tribunal cannot be appealed, but they will be subject to judicial review by the Federal Court.

The CPPA also creates a new private right of action. Section 106 provides that an individual affected by a breach of the Act can sue for damages for “loss or injury that the individual has suffered”. However, in order to do so, the individual must first make a complaint. That complaint must be considered by the Commissioner, and the Commissioner’s findings and order must either not be appealed or any appeal must have been dealt with by the Tribunal. Note that not all complaints will be considered by the Commissioner, who can decline to deal with complaints for a number of reasons (see s. 83) or discontinue an investigation (see s. 85). There is also a right of action for loss or injury where an organization has been convicted of an offence under the legislation. An offence requires an investigation, a recommendation, and consideration by the Tribunal. All of these steps will take time.
It will be a truly dogged individual who pursues the private right of action under the CPPA.

Ultimately, then, the question is whether this new raft of enforcement-related provisions is an improvement. To get a better sense of how these provisions might work in practice, consider the example of the massive data breach at Desjardins that recently led to a Commissioner’s report of findings. The data breach was a result of employees not following internal company policies, flawed training and oversight, as well as certain employees going ‘rogue’ and using personal data for their own benefit. In the Report of Findings, the Commissioner makes a number of recommendations, most of which have already been implemented by the organization. As a result, the Commissioner has ruled the complaint well-founded and conditionally resolved. Class action lawsuits related to the breach have already been filed.

How might this outcome be different if the new legislation were in place? A complaint would still be filed and investigated. The Commissioner would issue his findings as to whether any provisions of the CPPA were contravened. He would have order-making powers and could decide to recommend that a penalty be imposed. However, if his recommendations are all accepted by an organization, there is no need for an order. The Commissioner might, given the nature and size of the breach, decide to recommend that a fine be imposed. However, considering the factors in the legislation and the organization’s cooperation, he might decide it was not appropriate. Assuming a recommendation were made to impose a penalty, the Tribunal would have to determine whether to do so. It must consider a number of factors, including the organization’s ability to pay the fine, any financial benefit derived by the organization from the activity, whether individuals have voluntarily been compensated by the organization, and the organization’s history of complying with the legislation. The legislation also specifically provides that “the purpose of a penalty is to promote compliance with this Act and not to punish.” (s. 94(6)) In a case where the organization was not exploiting the data for its own profit, took steps quickly to remedy the issues by complying with the Commissioner’s recommendations, and provided credit monitoring services for affected individuals, it is not obvious that a fine would be imposed. As for the private right of action in the legislation, it is not likely to alter the fact that massive data breaches of this kind will be addressed through class action lawsuits.

The reworking of the enforcement provisions may therefore not be hugely impactful in the majority of cases. This is not necessarily a bad thing, if the lack of impact is due to the fact that the goals of the legislation are otherwise being met. Where it may make a difference is in cases where organizations resist the Commissioner’s findings or act in flagrant disregard of data protection rights. It is certainly worth having more tools for enforcement in these cases. Here, the big question mark is the Tribunal – and, more particularly, its composition.

But there may also be consequences felt by individuals as a result of the changes. The Commissioner’s findings – not just any orders he might make – are now subject to appeal to the Tribunal. This will likely undermine his authority and might undercut his ability to achieve soft compliance with the law.
It is also likely to delay the resolution of complaints, thus also delaying access to the private right of action contemplated under the legislation. It shifts power regarding what constitutes a breach of the legislation from the Commissioner to the new Tribunal. This may ultimately be the most concerning aspect of the legislation. So much will depend on who is appointed to the Tribunal, and the Bill does not require demonstrable privacy expertise as a general prerequisite for membership. At the very least, this should be changed.
Published in
Privacy
Monday, 21 December 2020 08:03
The Gutting of Consent in Bill C-11
Bill C-11, the bill to reform Canada’s private sector data protection regime, is titled the Digital Charter Implementation Act. The Digital Charter is a 10-point plan set out by the federal government to frame its digital agenda. This is the first in a series of posts that considers Bill C-11 in light of some of the principles of the Digital Charter.

A key pillar of the Digital Charter is “consent”. It states: “Canadians will have control over what data they are sharing, who is using their personal data and for what purposes, and know that their privacy is protected.” A “Fact Sheet” published about Bill C-11 explains: “Modernized consent rules would ensure that individuals have the plain-language information they need to make meaningful choices about the use of their personal information.” How well does this describe what Bill C-11 actually does?

It is now generally well accepted that individuals face an enormous consent burden when it comes to personal information. Personal data is collected from every digitally enabled transaction; it is collected when we use online platforms and as we surf the internet; it is harvested from our phones and from every app on our phones; home appliances are increasingly co-opted to harvest data about us; our cars collect and transmit data – the list is endless. There are privacy policies somewhere that govern each of these activities, but we do not have the time to read them. If we did, we would likely struggle to grasp their full meaning. And, in any event, these policies are basically take-it-or-leave-it. Add to this the fact that most people’s preoccupation is necessarily with the actual product or service, and not with the many different ways in which collected data might be used or shared. They are unlikely to be able to fully grasp how all of this might at some future point affect them. Consent is thus largely a fiction.

How does Bill C-11 address this problem? It starts by requiring consent at or before the time that personal information is collected. This consent must be “valid”, and validity will depend on plain-language information being provided to the individual about the purpose for the collection, use or disclosure of the information; the way in which it will be collected, used or disclosed; any “reasonably foreseeable consequences” of this collection, use or disclosure; the specific type of personal information to be collected; and the names of any third parties or types of third parties with whom the information may be shared. The bill requires express consent, unless the organization “establishes that it is appropriate to rely on an individual’s implied consent”. The organization cannot make the provision of a product or service conditional on granting consent to the collection, use or disclosure of personal information, unless that information is necessary to the provision of the product or service. Consent cannot be obtained by fraud or deception. And, finally, individuals have the right to withdraw consent, on reasonable notice, subject to a raft of other exceptions which include the “reasonable terms of a contract”.

It sounds good until you realize that none of this is actually particularly new. Yes, the law has been tightened up a bit around implied consent, and the overall wording has been tweaked. But the basic principles are substantially the same as those in PIPEDA. Some of the tweaks are not necessarily for the better.
The plain-language list of information required for “valid consent” under Bill C-11 abandons PIPEDA’s focus on the ability of the target audience for a product or service to properly grasp the nature, purposes and consequences of the collection, use and disclosure of personal data. By considering the target audience, the PIPEDA language is likely better adapted to things like protecting children’s privacy.

If, as the government seems to suggest, there is a new implementation of the “consent” principle in Bill C-11, it is not to be found in the main consent provisions. These are largely a rehash of PIPEDA, and, to the extent they are different, they are not obviously better. What has changed – and ever so much for the worse – are the exceptions to consent, particularly those found in sections 18 to 21 of Bill C-11. These are not the long laundry-list of exceptions to consent already found in PIPEDA (those have all made their way into Bill C-11 as well). Sections 18 and 19, in particular, are new in Bill C-11, and they can only be seen as enhancing consent if you conceive of consent as a burden that should be largely eliminated.

Essentially, the government has tackled two different public policy issues in one set of provisions. The first issue is the consent burden described above. This can be summed up as: privacy policies are too long and complex, and no one has time to read them. The legislative solution is to make them shorter by reducing the information they must contain. The second public policy goal is to make it easier for organizations to use the personal data they have collected in new ways without having to go back to individuals for their consent. The solution, though, is to carve out exceptions that address not just new uses of data already collected, but that are broad enough to include the initial collection of data. When these two solutions are combined, the result is, quite frankly, a data protection disaster.

A first problem is that these exceptions are not just to consent, but to knowledge and consent. In other words, not only does an organization not need to seek consent for the listed activities, it does not even need to inform the individual about them. It is very hard to hold an organization to account for things about which one has no knowledge.

The first set of exceptions to knowledge and consent, in section 18, is for “business activities”. Perhaps recognizing that this provision creates a kind of open season on personal data, it begins with important limitations. The exception to knowledge and consent created by this provision is available only where the collection or use of the data is for one of the listed business activities; a reasonable person “would expect such a collection or use for that activity”; and “the personal information is not collected or used for the purpose of influencing the individual’s behaviour or decision.” These are important guardrails. But, as noted above, without knowledge of the collection, use or disclosure, it will be difficult to hold organizations to account. The list of consent-free activities is also open-ended – it can be added to by regulation. No doubt this is to make the legislation more responsive to changing practices or circumstances, but it is a mechanism by which the list can expand and grow with relative ease. And some of the listed activities have the potential for dramatic privacy impacts.
For example, organizations can collect or use personal data without an individual’s knowledge or consent to reduce their commercial risk. This suggests, shockingly, that financial profiling of individuals without their knowledge or consent is fair game. Organizations may also collect personal data without knowledge or consent where it “is necessary for the safety of a product or service that the organization provides or delivers”. In an era of connected cars, appliances, medical devices and home alarm systems, to give just a few examples, the kinds of information that might fall into this category could be surprising. Even more troubling, though, is the provision that allows for the collection and use of personal data for activities “in the course of which obtaining the individual’s consent would be impracticable because the organization does not have a direct relationship with the individual.” No one knows what this really means – because it could mean all kinds of things. I will give just one example below.

The next exception, in section 19, allows an organization to transfer an individual’s personal information to a service provider without the individual’s knowledge or consent. Let’s say you go to a company’s website and you need customer service. There is a chatbot available on the site to assist you. The chatbot is part of a suite of digital customer services provided to the company by a service provider, and your personal information is transferred to that service provider without your knowledge or consent to enable it to deliver these services. The service provider, on its own behalf, also wants to improve its chatbot AI by recording the chat transcripts, and possibly by collecting other data from you. Based on the exception mentioned above (where knowledge and consent would be impracticable because the service provider does not have a direct relationship with you), it can do this without your knowledge or consent. And you don’t even know about the service provider in the first place, because of the exception in section 19. From a service point of view, it’s all very smooth and seamless. But let’s go back to the Digital Charter statement: “Canadians will have control over what data they are sharing, who is using their personal data and for what purposes, and know that their privacy is protected.” How are you feeling about this now?

In fairness, there are other provisions of the Act that govern transfers of data to service providers to ensure privacy protection and accountability. (I may draw a road map in a later post… you will need one to find these provisions, which are scattered throughout the Bill.) And, in fairness, there is a ‘transparency’ provision in s. 62(2)(b) that requires organizations to “make available” a “general account of how the organization makes use of personal information”. This explicitly includes “how the organization applies the exceptions to the requirement to obtain consent under this Act.” It is difficult to know what this might look like. But a “general account” that is “made available” is not the same as a requirement to provide clear information and obtain consent at or before the time that the data is collected and used.

There are ways to reduce the consent burden and to facilitate legitimate uses of already-collected data other than removing the requirements of knowledge and consent in a broad and potentially open-ended list of circumstances. One of these is the concept of “legitimate interests” in art. 6(1) of the EU’s GDPR.
Of course, the legitimate interests of organizations under the GDPR are carefully balanced against the “interests or fundamental rights and freedoms of the data subject.” As noted in an earlier post, recognizing the human rights implications of data protection is something that the federal government is simply not prepared to do.

The bottom line is that Bill C-11, in its current form, does not enhance consent. Instead, it will directly undermine it. At the very least, section 18 must be drastically overhauled.
Published in
Privacy
Sunday, 06 December 2020 15:05
Data for Good?: An Assessment of the Proposed Exception in Canada’s Private Sector Data Protection Law Reform Bill
Bill C-11 – the Act to reform Canada’s private sector data protection legislation – contains a new provision that has no equivalent in the current Personal Information Protection and Electronic Documents Act. Section 39 will permit the disclosure of an individual’s personal information without their knowledge or consent where the disclosure is for “socially beneficial purposes.” This post examines the proposed exception.

In the course of their commercial activities, many private sector organizations amass vast quantities of personal data. In theory, these data could be used for a broad range of purposes – some of them in the public interest. There are a growing number of instances where organizations offer to share data with governments or with other actors for public purposes. For example, some organizations have shared data with governments to assist in their research or modeling efforts during the pandemic. There may also be instances where data sharing is part of the quid pro quo for a company’s partnership with the public sector. Los Angeles County, for example, has sought to require data sharing in exchange for a licence to operate dockless scooter rental businesses. The ill-fated Sidewalk Toronto project raised issues around data collected in public spaces, including who would be able to access and use such data and for what purposes. This led to debates about “data trusts”, and whether an entity could be charged with the oversight and licensing of ‘smart city’ data. It is into this context that the proposed exception for “socially beneficial purposes” is introduced. Section 39 of Bill C-11 reads:

39 (1) An organization may disclose an individual’s personal information without their knowledge or consent if
(a) the personal information is de-identified before the disclosure is made;
(b) the disclosure is made to
(i) a government institution or part of a government institution in Canada,
(ii) a health care institution, post-secondary educational institution or public library in Canada,
(iii) any organization that is mandated, under a federal or provincial law or by contract with a government institution or part of a government institution in Canada, to carry out a socially beneficial purpose, or
(iv) any other prescribed entity; and
(c) the disclosure is made for a socially beneficial purpose.

The first thing to note about this provision is that it reflects a broader ambivalence within the Bill about de-identified data. The ambivalence is evident in the opening words of section 39: an organization “may disclose an individual’s personal information without their knowledge or consent” if it is first de-identified. Yet, arguably, de-identified information is not personal information. Many maintain that it should therefore be usable outside the constraints of data protection law, as is the case under Europe’s General Data Protection Regulation. Canada’s government is no doubt sensitive to concerns that de-identified personal information poses a re-identification risk, leaving individuals vulnerable to privacy breaches. Even properly de-identified data could lead to re-identification as more data and enhanced computing techniques become available. Bill C-11 therefore extends its regulatory reach to de-identified personal data, even though the Bill contains other provisions that prohibit attempts to re-identify de-identified data and provide potentially stiff penalties for doing so (sections 75 and 125).
The Bill defines "de-identify" as "to modify personal information – or create information from personal information – by using technical processes to ensure that the information does not identify an individual or could not be used in reasonably foreseeable circumstances, alone or in combination with other information, to identify an individual". The inclusion of information created from personal information makes the definition surprisingly broad. Consider that in the early days of the pandemic, a number of companies, including Google and Fitbit, released mobility data in the form of charts as we moved into lockdown. These visualizations might well fit the category of 'information created from personal information' (a short sketch of this kind of derived data follows at the end of this discussion). If so, the release of such data, if Bill C-11 were passed in its current form, might constitute a breach, since under section 39 a disclosure without knowledge or consent must be made to a specified entity and must also be for a socially beneficial purpose. Perhaps Bill C-11 intends to restrain this self-publishing of data visualizations or analyses based on personal information. It is just not clear that this is the case, or that, if it is, it would not violate the right to freedom of expression.

Under section 39, de-identified data may be disclosed without knowledge or consent to specific actors, including government and health care institutions, public libraries and post-secondary institutions. Data may also be disclosed to any other "prescribed entity", allowing further entities to be added to the list by regulation. In the current list, the most interesting category, in light of debates and discussions around data trusts, is "any organization that is mandated, under a federal or provincial law or by contract with a government institution or part of a government institution in Canada, to carry out a socially beneficial purpose". This category allows for a range of different kinds of "data trusts": ones created by law or by contract. They may be part of government, operate under a mandate from government, or be engaged by contract with government. Such arrangements must be for a "socially beneficial purpose", defined in subsection 39(2) as "a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose."

While a data trust-type exception to facilitate data sharing is intriguing, the proposed definition of "socially beneficial purpose" may be too limiting. Consider a private sector company that wishes to provide de-identified data from personal fitness devices to a university for research purposes. If these data are used for health-related research, there is no problem under section 39. But what if a social scientist seeks to examine other phenomena revealed by the data? What if a business scholar seeks to use the data to understand whether counting steps leads to more local shopping or dining? If the research is not about health, the provision or improvement of public amenities or infrastructure, or the protection of the environment, then it does not appear to fall within the exception. This might mean that some researchers can use the data while others cannot. There is a separate exception to the requirements of knowledge and consent for research or statistical purposes, but it is not specific to de-identified personal information and its requirements are more complex as a result.
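Returning to the mobility-chart example above, a short sketch may help show how easily 'information created from personal information' arises. The records, identifiers and function below are invented for illustration; real mobility reports involve far more sophisticated aggregation (Google's, for instance, reportedly applied differential privacy).

```python
from collections import defaultdict

# Invented toy records of the kind a mapping or fitness app might hold:
# each ping ties an identifiable user to a date and a place category.
pings = [
    ("user-1", "2020-03-02", "retail"),
    ("user-2", "2020-03-02", "retail"),
    ("user-1", "2020-03-02", "transit"),
    ("user-3", "2020-03-09", "retail"),
]

def mobility_trend(records):
    """Aggregate individual location pings into daily counts per category.

    The output contains no user identifiers, yet it is plainly
    'information created from personal information' -- the kind of
    derived data the Bill's definition of 'de-identify' appears
    broad enough to capture.
    """
    counts = defaultdict(int)
    for _user, date, category in records:
        counts[(date, category)] += 1
    return dict(counts)

print(mobility_trend(pings))
# {('2020-03-02', 'retail'): 2, ('2020-03-02', 'transit'): 1,
#  ('2020-03-09', 'retail'): 1}
```

If even aggregates this coarse count as de-identified personal information, then publishing them as a chart is a "disclosure", and section 39 permits such disclosures only to listed entities and only for socially beneficial purposes, which is precisely the interpretive puzzle described above.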
There are also some rather odd potential consequences to this scheme. What if a short-term rental company is willing to share de-identified data with a provincial government that is looking for better ways to target its tourism marketing efforts? Or what if the government seeks to use the data to better regulate short-term accommodation? It is not clear that either of these purposes would fit within the "improvement of public amenities or infrastructure" category of a socially beneficial purpose. And although Bill C-11 sets out to regulate what private sector companies do with their data, not what data provincial or municipal governments are entitled to use, these provisions could nevertheless limit the access of provincial public sector actors to data that might otherwise be made available to them. By allowing private sector actors to share de-identified data without knowledge or consent in some circumstances, the implication is that such data cannot be shared in other circumstances, even if appropriate safeguards are in place.

Finally, it seems that the de-identification of the data and a reference to socially beneficial purposes are the only safeguards mandated for personal data under this scheme. The wording of section 39 suggests that shared data cannot simply be made available as open data (since they can only be shared with a specific entity for a specific purpose). Yet there is no further requirement that the new custodians of the data – the public sector or prescribed entities – allow access to the data only under licences that ensure that any downstream use is for the prescribed socially beneficial purposes, or that impose any other necessary limitations. For entities such as post-secondary institutions, public libraries and 'data trusts', use by third parties must surely be contemplated. Section 39 should therefore require appropriate contractual terms for data-sharing.

Overall, the concept behind s. 39 of Bill C-11 is an important one, and the effort to facilitate data sharing by the private sector for public purposes in privacy-friendly ways is laudable. It is also important to consider how to place limits on such sharing in order to protect against privacy breaches that might flow from the re-identification of de-identified data. However, section 39 as drafted raises a number of questions about its scope, not all of which are easily answered. It would benefit from a better definition of 'de-identify', a more flexible definition of 'socially beneficial purpose', and a further requirement that any data-sharing arrangements be subject to appropriate contractual limitations. And even though individual knowledge of the sharing arrangements may not be feasible, there should be some form of transparency (such as notice to the Commissioner) so that individuals know when their de-identified personal data are being shared, by whom, and for what socially beneficial purposes.
Monday, 23 November 2020 13:42
With a New Federal Bill Before Parliament, Is There Still a Case for Ontario to Enact its Own Private Sector Data Protection Law?
The federal government's new Bill C-11, which would reform its antiquated private sector data protection law, has landed on Parliament's Order Paper at an interesting moment for Ontario. Earlier this year, Canada's largest province launched a consultation on whether it should enact its own private sector data protection law that would apply instead of the federal law to intraprovincial activities.

The federal Personal Information Protection and Electronic Documents Act was enacted in 2000, a time when electronic commerce was on the rise, public trust was weak, and transborder data flows were of growing economic importance. Canada faced an adequacy assessment under the European Union's Data Protection Directive in order to keep data flowing to Canada from the EU. At the time, only Quebec had its own private sector data protection law. Because a federal law in this area rested on somewhat shaky constitutional footing, PIPEDA's compromise was that it would apply nationally to private sector data collection, use or disclosure in the course of commercial activity, unless a province had enacted "substantially similar" legislation. In such a case, the provincial statute would apply within the province, although not to federally regulated industries or where data flowed across provincial or national borders. British Columbia and Alberta enacted their own statutes in 2004; along with Quebec's law, these were declared substantially similar to PIPEDA. The result is a somewhat complicated private sector data protection framework made workable by co-operation between federal and provincial privacy commissioners. Those provinces without their own private sector laws have seemed content with PIPEDA, and with letting Ottawa pick up the tab for its oversight and enforcement.

Twenty years after PIPEDA's enactment, data-thirsty technologies such as artificial intelligence are in the ascendant, public trust has been undermined by rampant over-collection, breaches and scandals, and transborder data flows are ubiquitous. The EU's 2018 General Data Protection Regulation (GDPR) has set a new and higher standard for data protection, and Canada must act to satisfy a new adequacy assessment. Bill C-11 is the federal response.

There are provisions in Bill C-11 that tackle the challenges posed by the contemporary data environment. For example, organizations will have to provide upfront a "general account" of their use of automated decision systems that "make predictions, recommendations or decisions about individuals that could have significant impacts on them" (s. 62(1)(c)). The right of access to one's personal information will include a right to an explanation of any prediction, recommendation or decision made using an automated decision system (s. 63(3)). There are also new exceptions to consent requirements for businesses that seek to use their existing stores of personal information for new internal purposes, and C-11 will facilitate some sharing of de-identified data for "socially beneficial purposes". These are among the Bill's innovations.

There are, however, things that the Bill does not do. Absent from Bill C-11 is anything specifically addressing the privacy of children or youth. In fact, the Bill reworks the meaning of "valid consent" so that it is no longer assessed in terms of the ability of those at whom a product or service is directed to understand the consequences of their consent. This undermines privacy, particularly for youth. Ontario could set its own course in this area.
More importantly, perhaps, there are some things that a federal law simply cannot do. It cannot tread on provincial jurisdiction, which leaves important data protection gaps. These include employee privacy in provincially regulated sectors, the non-commercial activities of provincial organizations, and provincial political parties. The federal government clearly has no stomach for including federal political parties under the CPPA (the Consumer Privacy Protection Act that Bill C-11 would enact). Yet the province could act, as BC has done, to impose data protection rules on provincial parties. There is also the potential to build more consistent norms, as well as some interoperability where necessary, across the provincial public, health and private sectors under a single regulator.

The federal bill may also not be best suited to the spectrum of needs of Ontario's provincially regulated private sector. Many of the bill's reforms target the data practices of large corporations, including those that operate transnationally. The enhanced penalties and enforcement mechanisms in Bill C-11 are much needed, but they are oriented towards penalizing bad actors whose large-scale data abuses cause significant harm. Make no mistake – we need C-11 to firmly regulate the major data players. And while a provincial data protection law must also have teeth, it would be easier to scale such a law to the broad diversity of small and medium-sized enterprises in the Ontario market, not just in terms of penalties but also in terms of the compliance burden. Ontario's Information and Privacy Commissioner could play an important role here, as a conduit for information and education and as a point of contact for guidance.

Further, as the failed Sidewalk Toronto project demonstrated, the province offers abundant opportunities for public-private technology partnerships. Having a single regulator and an interoperable set of public and private sector data protection laws could offer real advantages, simplifying compliance and making the environment more attractive to innovators, while providing clear norms and a single point of contact for affected individuals. In theory, as well, the provincial government would be able to move quickly, if need be, to update or amend the law. The wait for PIPEDA reform has been excruciating, and it is not over yet: Bill C-11 may not be passed before we have to go to the polls again. That said, timely updating has not been a hallmark of either BC's or Alberta's regime.

Drawbacks of a new Ontario private sector data protection law would include further multiplication of the number of data protection laws in Canada, and the regulatory complexity this can create. A separate provincial law would also mean that Ontario would assume the costs of administering a private sector data protection regime. This entails the further risk that budget measures could be used by future governments to undermine data protection in Ontario. Still, the same risks – combined with considerably less control – exist with federal regulation. There remains a strong and interesting case for Ontario to move forward with its own legislation.
Wednesday, 18 November 2020 11:29
It’s not you, it’s me? Why does the federal government have a hard time committing to the human right to privacy?
It's been a busy privacy week in Canada. On November 16, 2020, Canada's Department of Justice released its discussion paper as part of a public consultation on reform of the Privacy Act. On November 17, the Minister of Industry released the long-awaited bill to reform Canada's private sector data protection legislation. I will be writing about both developments over the next while. But in this initial post, I would like to focus on one overarching and obvious omission in both the Bill and the discussion paper: the failure to address privacy as a human right.

Privacy is a human right. It is declared as such in international instruments to which Canada is a signatory, such as the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights. Data protection is only one aspect of the human right to privacy, but it is an increasingly important one. The modernized Convention 108 (Convention 108+), a data protection convention originating with the Council of Europe but open to any country, puts human rights front and centre. Europe's General Data Protection Regulation also directly acknowledges the human right to privacy and links privacy to other human rights. Canada's Privacy Commissioner has called for Parliament to adopt a human rights-based approach to data protection in both the public and private sectors.

In spite of all this, the discussion paper on reform of the Privacy Act is notably silent with respect to the human right to privacy. In fact, it reads a bit like the script for a relationship in which one party dances around commitment but just can't get out the words "I love you" (or, in this case, "privacy is a human right"). The title of the document is a masterpiece of emotional distancing. It begins with the words "Respect, Accountability, Adaptability". Ouch. "Respect" is the first of three pillars for reform of the Act, and represents "Respect for individuals based on well established rights and obligations for the protection of personal information that are fit for the digital age."

Let's measure that against the purpose statement of Convention 108+: "The purpose of this Convention is to protect every individual, whatever his or her nationality or residence, with regard to the processing of their personal data, thereby contributing to respect for his or her human rights and fundamental freedoms, and in particular the right to privacy." Or against article 1 of the GDPR: "This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data." The difference is both substantial and significant.

The discussion paper almost blurts it out... but again stops short in its opening paragraph, which refers to the Privacy Act as "Canada's quasi-constitutional legal framework for the collection, use, disclosure, retention and protection of personal information held by federal public bodies." This is the romantic equivalent of "I really, really like spending time with you at various events, outings and even contexts of a more private nature."

The PIPEDA reform bill that dropped into our laps on November 17 does mention the "right to privacy", but the reference is in the barest terms. Note that Convention 108+ and the GDPR identify the human right to privacy as intimately linked to other human rights and freedoms (which it is).
Section 5 of Bill C-11 (the Consumer Privacy Protection Act) speaks of the need to establish "rules to govern the protection of personal information in a manner that recognizes the right to privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances." This is pretty much what was already in PIPEDA, and it falls far short of the statements quoted from Convention 108+ and the GDPR.

In the PIPEDA context, the argument has been that "human rights" are not within exclusive federal jurisdiction, so talking about human rights in PIPEDA would make the issue of its constitutionality more fraught. Whether or not this argument holds water (it doesn't), the same excuse does not exist for the federal Privacy Act.

The Cambridge Analytica scandal (in which personal data were used to subvert democracy), concerns over uses of data that will perpetuate discrimination and oppression, and complex concerns over how data are collected and used in contexts such as smart cities all demonstrate that data protection is about more than a narrow view of a person's privacy. Privacy is a human right that is closely linked to the enjoyment of other human rights and freedoms. Recognizing privacy as a human right does not mean that data protection will not require some balancing. It does mean, however, that in a data-driven economy and society we keep fundamental human values strongly in focus. We're not going to get data protection right if we cannot admit these connections and clearly state that data protection is about the protection of fundamental human rights and freedoms. There. Is that so hard?