Teresa Scassa - Blog

The furore in Canada over the cancellation of the long-form census and the subsequent elation over its reinstatement in 2016 illustrates that – well – Canadians get excited about odd things, such as being counted for statistical purposes. Of course, not all Canadians are enthusiastic about the census. Each census period a few objectors refuse to complete the long-form census, and some are even prosecuted for their refusal. While some opposition has been based on the past involvement of defense contractor Lockheed Martin in conducting the census (this involvement apparently ended for the 2016 census), other objections have been linked to privacy concerns. Perhaps because of the extensive measures in place to protect census privacy, these concerns have gained little traction either publicly or in the courts, although they did provide the former Conservative government with an excuse to cancel the long-form census.

A recent Federal Court decision considers issues of privacy and the census in a somewhat different context. In O’Grady v. Canada (Attorney General), the objection was not to the census itself, but rather to the secondary use of census data for medical research. The applicant, Kelly O’Grady, objected to an agreement that had been entered into between Statistics Canada and McGill University’s Faculty of Medicine in 2011. This agreement, like others of its kind, provided the legal framework by which medical researchers could use Stats Canada data in population health research. The McGill project seeks to assess infant mortality and newborn health in Canada by linking perinatal outcomes with risk factors related to socioeconomic status, ethno-cultural background, and environmental conditions. The researchers needed to link a sample of births from the national birth record database with data from the 1996 and 2006 national censuses.

The collection and maintenance of census data is governed by the Statistics Act, which also establishes Statistics Canada. Stats Canada does not simply hand over data of this kind to researchers. Under the terms of the agreement with McGill, Stats Canada would make the linkages between the records, and then would provide researchers with access only to de-identified information. Further, only those researchers who were either employees or deemed employees of Stats Canada would have access to the data. Under the Statistics Act, “deemed employees” are individuals who are brought under the umbrella of the Act, who must swear oaths of office that affirm that they will comply with the Act and maintain confidentiality, and who are subject to penalties under the Act for any breaches of their obligations.

The applicant objected to the use of the census data under the terms of the Agreement. She argued that it violated the Statistics Act and the federal Privacy Act. She argued that census data could only be shared with the express consent of those who had shared their personal information, and this had not been obtained. Further, she maintained that under the Privacy Act government institutions can only share information without consent in narrowly limited circumstances, and only where the disclosure is consistent with the purposes for which the information had been collected. She argued that the census information had not been collected for medical or public health research, and therefore could not be disclosed for these purposes.

The applicant had complained to the Office of the Privacy Commissioner in 2012, arguing that her personal information had been improperly used in the study. In a 2014 decision, the Privacy Commissioner agreed that the applicant’s census data constituted her personal information, and also found that census information was being used in the study for purposes that went beyond those for which it was collected. However, the Commissioner had noted that the Statistics Act expressly permitted Stats Canada to use its data in this way. Perhaps more importantly, the Commissioner found that the applicant’s own personal information had not been used in the study. The applicant had given birth within a period that would have been captured by the study, but she did so in Ontario, and the Ontario data had been excluded from the study because of concerns regarding its quality. The Commissioner concluded that the applicant’s complaint was not well-founded.

The fact that the applicant’s personal information had been excluded from the study was an important factor. The Federal Court found that the exclusion of her data meant that she had not been – nor could she ever be – personally affected by the study, and ruled that she did not have standing to bring this application. Further, Justice Russell noted that “[t]he issues she raises and argues can only really be decided on a set of facts that includes an applicant or applicants who were directly affected, or who may be directly affected by the Study when it is eventually released” (at para 52). He noted that there was, as yet, simply no indication that any personal information had been or would be improperly disclosed as a result of the study. He also observed that there was “no indication that the Applicant’s position is anything more than her own personal position, born of her academic interests and her social activism” (at para 52).

Despite ruling that the applicant had no standing in the matter, Justice Russell nevertheless considered the merits of the application. He found that it was clear that Stats Canada had not disclosed any personal information – whether of the applicant or any other person. Only employees and deemed employees of Stats Canada had access to the raw data for the purposes of creating the data linkages. The linked data was accessible only to employees or deemed employees of Stats Canada. Other members of the McGill research team only saw non-confidential aggregate data. Justice Russell noted that the applicant had provided no evidence to show how the aggregate data could be linked to specific individuals. Although the applicant had argued that postal code data was going to be provided to the researchers in order to enable them to assess environmental factors, Justice Russell ruled that the applicant’s claim that the postal code data could be used to re-identify individuals was nothing more than an assertion. Further, he noted that there was no evidence that any postal code data had been revealed to anyone who was not an employee or deemed employee of Stats Canada.

Justice Russell also considered the argument that the disclosure of the data violated the Privacy Act because it was not for a purpose for which it had been collected. He agreed that the census data was personal information. However, he found that while the specific purpose of using the data for this study was not formed at the time of its collection during the 1996 or 2006 censuses, the purpose of the study “is to compile and analyse statistics related to the health and welfare of Canadians”, and this was consistent with both the purpose of the census and the mandate of Stats Canada. There was therefore no inconsistency with the terms of the Privacy Act.

Although he dismissed the application, Justice Russell cautioned that this was primarily because it both involved an applicant with no standing and was premature. It was premature in the sense that it was too early to know if any personal information might be improperly disclosed. He stated that his decision “should not prevent anyone whose personal information is inappropriately used or disclosed from bringing the matter before the Court in the future” (at para 86). The bottom line, therefore, is that individuals whose interests are directly affected by inappropriate actions by Stats Canada or by researchers will have recourse to the courts. However, there is little room to raise broader privacy arguments about the use in principle of Stats Canada data in appropriate research.


Published in Privacy

Note: The following are my speaking notes for my appearance on February 23, 2017 before the House of Commons Standing Committee on Access to Information, Privacy and Ethics (ETHI). ETHI is currently engaged in a review of PIPEDA. My colleague Dr. Florian Martin-Bariteau also appeared before the same committee. His remarks are found here.

Thank you for the invitation to meet with you today and to contribute to your study of the Personal Information Protection and Electronic Documents Act. I am a professor at the University of Ottawa, Faculty of Law, where I hold the Canada Research Chair in Information Law. I am appearing in my personal capacity.

We are facing a crisis of legitimacy when it comes to personal data protection in Canada. Every day there are new stories about data hacks and breaches, and about the surreptitious collection of personal information by devices in our homes and on our persons that are linked to the Internet of Things. There are stories about how big data profiling impacts the ability of individuals to get health insurance, obtain credit or find employment. There are also concerns about the extent to which state authorities access our personal information in the hands of private sector companies. PIPEDA, as it currently stands, is inadequate to meet these challenges.

My comments are organized around the theme of transparency. Transparency is fundamentally important to data protection and it has always played an important role under PIPEDA. At a fundamental level, transparency means openness and accessibility. In the data protection context it means requiring organizations to be transparent about the collection, use and disclosure of personal information; and it means the Commissioner must be transparent about his oversight functions under the Act. I will also argue that it means that state actors (including law enforcement and national security organizations) must be more transparent about their access to and use of the vast stores of personal information in the hands of private sector organizations.

Under PIPEDA, transparency is at the heart of the consent-based data protection scheme. Transparency is central to the requirement for companies to make their privacy policies available to consumers, and to obtain consumer consent to collection, use or disclosure of personal information. Yet this type of transparency has come under significant pressure and has been substantially undermined by technological change on the one hand, and by piecemeal legislative amendment on the other.

The volume of information that is collected through our digital, mobile and online interactions is enormous, and its actual and potential uses are limitless. The Internet of Things means that more and more, the devices we have on our person and in our homes are collecting and transmitting information. They may even do so without our awareness, and often on a continuous basis. The result is that there are fewer clear and well-defined points or moments at which data collection takes place, making it difficult to say that notice has been provided and consent obtained in any meaningful way. In addition, the number of daily interactions and activities that involve data collection have multiplied beyond the point at which we are capable of reading and assessing each individual privacy policy. And, even if we did have the time, privacy policies are often so long, complex, and vague that reading them does not provide much of an idea of what is being collected and shared, with or by whom, or for what purposes.

In this context consent has become a joke, although unfortunately the joke is largely on the consumer. The only parties capable of saying that our current consent-based model still works are those that benefit from consumer resignation in the face of this ubiquitous data harvesting.

The Privacy Commissioner’s recent consultation process on consent identifies a number of possible strategies to address the failure of the current system. There is no quick or easy fix – no slight change of wording that will address the problems around consent. This means that on the one hand, there need to be major changes in how organizations achieve meaningful transparency about their data collection, use and disclosure practices. There must also be a new approach to compliance that gives considerably more oversight and enforcement powers to the Commissioner. The two changes are inextricably linked. The broader public protection mandate of the Commissioner requires that he have the necessary powers to take action in the public interest. The technological context in which we now find ourselves is so profoundly different from what it was when this legislation was enacted in 2001 that to talk of only minor adjustments to the legislation ignores the transformative impacts of big data and the Internet of Things.

A major reworking of PIPEDA may in any event be well overdue, and it might have important benefits that go beyond addressing the problems with consent. I note that if one were asked to draft a statute as a performance art piece that evokes the problems with incomprehensible, convoluted and contorted privacy policies and their effective lack of transparency, then PIPEDA would be that statute. As unpopular as it might seem to suggest that it is time to redraft the legislation so that it no longer reads like the worst of all privacy policies, this is one thing that the committee should consider.

I make this recommendation in a context in which all those who collect, use or disclose personal information in the course of commercial activity – including a vast number of small businesses with limited access to experienced legal counsel – are expected to comply with the statute. In addition, the public ideally should have a fighting chance of reading this statute and understanding what it means in terms of the protection of their personal information and their rights of recourse. As it is currently drafted, PIPEDA is a convoluted mishmash in which the normative principles are not found in the law itself, but are rather tacked on in a Schedule. To make matters worse, the meaning of some of the words in the Schedule, as well as the principles contained therein, are modified by the statute so that it is not possible to fully understand rules and exceptions without engaging in a complex connect-the-dots exercise. After a series of piecemeal amendments, PIPEDA now consists in large part of a growing list of exceptions to the rules around collection, use or disclosure without consent. While the OPC has worked hard to make the legal principles in PIPEDA accessible to businesses and to individuals, the law itself is not accessible. In a recent case involving an unrepresented applicant, Justice Roy of the Federal Court expressed the opinion that for a party to “misunderstand the scope of the Act is hardly surprising.”

I have already mentioned the piecemeal amendments to PIPEDA over the years as well as concerns over transparency. In this respect it is important to note that the statute has been amended so as to increase the number of exceptions to the consent that would otherwise be required for the collection, use or disclosure of personal information. For example, paragraphs 7(3)(d.1) and (d.2) were added in 2015, and permit organizations to share personal information between themselves for the purposes of investigating breaches of an agreement or actual or anticipated contraventions of the laws of Canada or a province, or to detect or suppress fraud. These are important objectives, but I note that no transparency requirements were created in relation to these rather significant powers to share personal information without knowledge or consent. In particular, there is no requirement to notify the Commissioner of such sharing. The scope of these exceptions creates a significant transparency gap that undermines personal information protection. This should be fixed.

PIPEDA also contains exceptions that allow organizations to share personal information with government actors for law enforcement or national security purposes without notice or consent of the individual. These exceptions also lack transparency safeguards. Given the huge volume of highly detailed personal information, including location information, that is now collected by private sector organizations, the lack of mandatory transparency requirements is a glaring privacy problem. The Department of Industry, Science and Economic Development has created a set of voluntary transparency guidelines for organizations that choose to disclose the number of requests they receive and how they deal with them. It is time for there to be mandatory transparency obligations around such disclosures, whether it be public reporting or reporting to the Commissioner, or a combination of both. Such obligations should also apply to both public and private sector actors.

Another major change that is needed to enable PIPEDA to meet the contemporary data protection challenges relates to the powers of the Commissioner. When PIPEDA was enacted in 2001 it represented a fundamental change in how companies were to go about collecting, using and disclosing personal information. This major change was made with great delicacy; PIPEDA reflected an ombuds model which allowed for a light touch with an emphasis on facilitating and cajoling compliance rather than imposing and enforcing it. Sixteen years later and with exabytes of personal data under the proverbial bridge, it is past time for the Commissioner to be given a new set of tools in order to ensure an adequate level of protection for personal information in Canada.

First, the Commissioner should have the authority to impose fines on organizations in circumstances where there has been substantial or systemic non-compliance with privacy obligations. Properly calibrated, such fines can have an important deterrent effect, which is currently absent in PIPEDA. They also represent transparent moments of accountability that are important in maintaining public confidence in the data protection regime.

The toolbox should also include the power for the Commissioner to issue binding orders. I am sure that you are well aware that the Commissioners in Quebec, Alberta and British Columbia already have such powers. As it stands, the only route under PIPEDA to a binding order runs through the Federal Court, and then only after a complaint has passed through the Commissioner’s internal process. This is an overly long and complex route to an enforceable order, and it requires an investment of time and resources that places an unfair burden on individuals.

I note as well that PIPEDA currently does not provide any guidance as to damage awards. The Federal Court has been extremely conservative in damage awards for breaches of PIPEDA, and the amounts awarded are unlikely to have any deterrent effect other than to deter individuals who struggle to defend their personal privacy. Some attention should be paid to establishing parameters for non-pecuniary damages under PIPEDA. At the very least, these will assist unrepresented litigants in understanding the limits of any recourse available to them.

Thank you for your attention, and I welcome any questions.

Published in Privacy

The Federal Court of Canada has ordered a Romanian company and its sole proprietor to cease publishing online any Canadian court or tribunal decisions containing personal information. It has also awarded damages against the company’s owner. The decision flows from an application made pursuant to s. 14 of the Personal Information Protection and Electronic Documents Act (PIPEDA). The applicant had complained to the Privacy Commissioner of Canada regarding the activities of the defendant and his website Globe24h.com. The Commissioner ruled the complaint well-founded (my comment on this finding is here). However, since the Commissioner has no power to make binding orders or to award damages, the applicant pursued the matter in court. (Note that the lack of order-making powers is considered by many to be a weakness of PIPEDA, and the Commissioner has suggested to Parliament that it might be time for greater enforcement powers.)

Globe24h.com is a Romania-based website operated by the respondent Radulescu. The site re-publishes public documents from a number of jurisdictions, including Canada. The Canadian content is scraped from CanLII and from court and tribunal websites. This scraping is contrary to the terms of use of those sites. The Canadian court websites and CanLII also prevent the indexing of their websites by search engines; this means that a search for an individual by name will not turn up court or tribunal decisions in which that individual is named. This practice is meant to balance the privacy of individuals with the public interest in having broad access to court and tribunal decisions. Such decisions may contain considerable amounts of personal information as they may relate to any kind of legal dispute including family law matters, employment-related disputes, discrimination complaints, immigration proceedings, bankruptcy cases, challenges to decisions on pensions or employment insurance, criminal matters, disputes between neighbors, and so on. In contrast, the Globe24h.com website is indexed by search engines; as a result, the balance attempted to be struck by courts and tribunals in Canada is substantially undermined.

The applicant in this case was one of many individuals who had complained to the Office of the Privacy Commissioner (OPC) after finding that a web search for their names returned results containing personal information from court decisions. The applicant, like many others, had sought to have his personal information removed from the Globe24h website. However, the “free removal” option offered by the site could take half a year or more to process. The alternative was to pay to have the content removed. Those who had opted to pay for removal found that they might have to pay again and again if the same information was reproduced in more than one document or in multiple versions of the decision hosted on the Globe24h web site.

The first issue considered by the Federal Court was whether PIPEDA could apply extraterritorially to Globe24h.com. In general, a country’s laws are not meant to apply outside its boundaries. Although the Federal Court referred to the issue as one of extraterritorial application of laws, it is more akin to what my co-authors and I have called extended territoriality. In other words, PIPEDA will apply to activities carried out in Canada and with impacts in Canada – even though the actors may be located outside of Canada. The internet makes such situations much more common. In this case, Radulescu engaged in scraping data from websites based in Canada; the information he collected included personal information of Canadians. He then, through his company, charged individuals fees to have their personal information removed from his website. The Court found that in these circumstances, PIPEDA would apply.

It was clear that the respondent had collected, used and disclosed the personal information of the applicant without his consent. Although Radulescu did not appear before the Federal Court, he had interacted with the OPC during the course of the investigation of the complaint against Globe24h. In that context, he had argued that he was entitled to benefit from the exception in PIPEDA which permits the collection, use and disclosure of personal information without consent where it is for journalistic purposes. There is little case law that addresses head-on the scope of the “journalistic purposes” exception under PIPEDA. Justice Mosley found that the criteria proposed by the Canadian Association of Journalists, and supported by the OPC, provide a “reasonable framework” to define journalistic purposes:


. . . only where its purpose is to (1) inform the community on issues the community values, (2) it involves an element of original production, and (3) it involves a “self-conscious discipline calculated to provide an accurate and fair description of facts, opinion and debate at play within a situation.” (at para 68)

Justice Mosley found that “journalistic purposes” required something more than making court decisions available for free over the internet without any value-added content. He also noted that the statutory exception applies only where the collection, use or disclosure of personal information is for journalistic purposes and for no other purpose. Here, he found that the respondent had other purposes – namely to profit from charging people to remove their personal information from the website.

The respondent had also argued that he was entitled to benefit from the exception to the consent requirement because the information he collected, used and disclosed was ‘publicly available’. This exception is contained in PIPEDA and in regulations pertaining to publicly available information. While court and tribunal decisions fall within the definition of publicly available information, the exception to the consent requirement is only available where the collection, use or disclosure of the information relates “directly to the purpose for which the information appears in the record or document.” (Regs, s. 1(d)). In this case, Justice Mosley found that the respondent’s purpose did not relate directly to the reasons why the personal information was included in the decisions. Specifically, the inclusion of personal information in court decisions is to further the goals of the open courts principle, whereas, in the words of Justice Mosley, the respondent’s purpose “serves to undermine the administration of justice by potentially causing harm to participants in the justice system.” (at para 78)

PIPEDA contains a requirement that limits data collection, use or disclosure by an organization to only where it is “for purposes that a reasonable person would consider are appropriate in the circumstances.” (s. 5(3)). Justice Mosley noted that the Canadian Judicial Council’s policies on the online publication of court decisions strongly discourage the indexing of such decisions by search engines in order to strike a balance between open courts and privacy. This led Justice Mosley to conclude that the respondent did not have a bona fide business interest in making court decisions available in a way that permitted their indexing by search engines. Therefore the collection, use and disclosure of this information was not for purposes that a reasonable person would consider to be appropriate.

Having found that the respondent had breached PIPEDA, Justice Mosley next considered the issue of remedies. The situation was complicated in this case by the fact that the respondent is based in Romania. This raised issues of whether the court should make orders that would have an impact in Romania, as well as the issue of enforceability. The applicant was also pursuing separate remedies in Romania, and Justice Mosley noted that a court order from Canada might assist in these objectives. The OPC argued that it would be appropriate for the Court to make an order with a broader impact than just the applicant’s particular circumstances. The number of other complaints received by both CanLII and the OPC about personal information contained in decisions hosted on the Romanian site was indicative of a systemic issue. Justice Mosley was also influenced by the OPC’s argument that a broad order could be used by the applicant and by others to persuade search engines to de-index the pages of the respondent’s websites. Accepting that PIPEDA enabled him to address systemic and not just individual problems, Justice Mosley issued a declaration that the respondent had violated PIPEDA, and ordered that he remove all Canadian court and tribunal decisions that contain personal information. He also ordered that the respondent take steps to ensure that these decisions are removed from search engine caches. The respondent was also ordered to refrain from any further copying or publishing of Canadian court or tribunal decisions containing personal information in a manner that would violate PIPEDA.

The applicant also sought damages for breach of PIPEDA. Damages awards have been a weak spot under PIPEDA. The Federal Court has been extremely conservative in awarding damages; this tendency has not been helped by the fact that the overwhelming majority of applications have been brought by self-represented litigants. In this case, Justice Mosley accepted that the breach was egregious, and noted the practice of the respondent to profit from exploiting the personal information of Canadians. He also noted that the level of disclosure of personal information was extensive because of the bulk downloading and publishing of court decisions. Finally, he noted that the respondent “has also acted in bad faith in failing to take responsibility and rectify the problem” (at para 103). In the circumstances, one might have expected an order of damages far in excess of the modest $5000 ultimately ordered by Justice Mosley. This amount seems disproportionate to the nature of the breach, as well as to the impact it had on the applicant and the extensive steps he has had to take to try to address the problem. Even though recovering any amount from the respondent might be no more than a pipe dream in the circumstances, the amount set in this case would seem to lack any deterrent effect and is hardly proportionate to the nature of the breach.

Overall, this decision is an important one. It confirms the application of PIPEDA to the collection, use or disclosure of personal information of Canadians that is linked to Canada, even where the respondent is located in another country. It also provides clarification of the exceptions to consent for journalistic purposes and for publicly available information. In this regard, the court’s careful reading of these exceptions prevents them from being used as a broad licence to exploit personal information. The court’s reasoning with respect to its declaration and its order is also useful, particularly as it applies to the sanctioning of offshore activities. The only weakness is in the award of damages; this is a recurring issue with PIPEDA and one that may take legislative intervention to address.

Published in Privacy

Note: the following are my speaking notes for my appearance before the Standing Committee on Transport, Infrastructure and Communities, February 14, 2017. The Committee is exploring issues relating to Infrastructure and Smart Communities. I have added hyperlinks to relevant research papers or reports.

Thank you for the opportunity to address the Standing Committee on Transport, Infrastructure and Communities on the issue of smart cities. My research on smart cities is from a law and policy perspective. I have focused on issues around data ownership and control and the related issues of transparency, accountability and privacy.

The “smart” in “smart cities” is shorthand for the generation and analysis of data from sensor-laden cities. The data and its accompanying analytics are meant to enable better decision-making around planning and resource-allocation. But the smart city does not arise in a public policy vacuum. Almost in parallel to the development of so-called smart cities is the growing open government movement that champions open data and open information as keys to greater transparency, civic engagement and innovation. My comments speak to the importance of ensuring that the development of smart cities is consistent with the goals of open government.

In the big data environment, data is a resource. Where the collection or generation of data is paid for by taxpayers, it is surely a public resource. My research has considered the location of rights of ownership and control over data in a variety of smart-cities contexts, and raises concerns over the potential loss of control over such data, particularly rights to re-use the data, whether for innovation, civic engagement or transparency purposes.

Smart cities innovation will result in the collection of massive quantities of data and these data will be analyzed to generate predictions, visualizations, and other analytics. For the purposes of this very brief presentation, I will characterize this data as having three potential sources: 1) newly embedded sensor technologies that become part of smart cities infrastructure; 2) already existing systems by which cities collect and process data; and 3) citizen-generated data (in other words, data that is produced by citizens as a result of their daily activities and captured by some form of portable technology).

Let me briefly provide examples of these three situations.

The first scenario involves newly embedded sensors that become part of smart cities infrastructure. Assume that a municipal transit authority contracts with a private sector company for hardware and software services for the collection and processing of real-time GPS data from public transit vehicles. Who will own the data that is generated through these services? Will it be the municipality that owns and operates the fleet of vehicles, or the company that owns the sensors and the proprietary algorithms that process the data? The answer, which will be governed by the terms of the contract between the parties, will determine whether the transit authority is able to share this data with the public as open data. This example raises the issue of the extent to which ‘data sovereignty’ should be part of any smart cities plan. In other words, should policies be in place to ensure that cities own and/or control the data they collect in relation to their operations? To go a step further, should federal funding for smart infrastructure be tied to obligations to make non-personal data available as open data?

The second scenario is where cities take their existing data and contract with the private sector for its analysis. For example, a municipal police service provides its crime incident data to a private sector company that offers analytics services such as publicly accessible crime maps. Opting to use the pre-packaged private sector platform may have implications for the availability of the same data as open data (which in turn has implications for transparency, civic engagement and innovation). It may also result in the use of data analytics services that are not appropriately customized to the particular Canadian local, regional or national contexts.

In the third scenario, a government contracts for data that has been gathered by sensors owned by private sector companies. The data may come from GPS systems installed in cars, from smart phones or their associated apps, from fitness devices, and so on. Depending upon the terms of the contract, the municipality may not be allowed to share the data upon which it is making its planning decisions. This will have important implications for the transparency of planning processes. There are also other issues. Is the city responsible for vetting the privacy policies and practices of the app companies from which they will be purchasing their data? Is there a minimum privacy standard that governments should insist upon when contracting for data collected from individuals by private sector companies? How can we reconcile private sector and public sector data protection laws where the public sector increasingly relies upon the private sector for the collection and processing of its smart cities data? Which normative regime should prevail and in what circumstances?

Finally, I would like to touch on a different yet related issue. This involves the situation where a city that collects a large volume of data – including personal information – through its operation of smart services is approached by the private sector to share or sell that data in exchange for either money or services. This could be very tempting for cash-strapped municipalities. For example, a large volume of data about the movement and daily travel habits of urban residents is collected through smart card payment systems. Under what circumstances is it appropriate for governments to monetize this type of data?

How does one balance transparency with civil liberties in the context of election campaigns? This issue is at the core of a decision just handed down by the Supreme Court of Canada.

B.C. Freedom of Information and Privacy Association v. Attorney-General (B.C.) began as a challenge by the appellant organization to provisions of B.C.’s Election Act that required individuals or organizations who “sponsor election advertising” to register with the Chief Electoral Officer. Information on the register is publicly available. The underlying public policy goal is to allow the public to see who is sponsoring advertising campaigns during the course of elections. The Supreme Court of Canada easily found this objective to be “pressing and substantial”.

The challenge brought by the B.C. Freedom of Information and Privacy Association (BCFIPA) was based on the way in which the registration requirement was framed in the Act. The Canada Elections Act also contains a registration requirement, but the requirement is linked to a spending threshold. In other words, under the federal statute, those who spend more than $500 on election advertising are required to register; others are not. The B.C. legislation is framed instead in terms of a general registration requirement for all sponsors of election advertising. BCFIPA’s concern was that this would mean that any individual who placed a handmade sign in their window, who wore a t-shirt with an election message, or who otherwise promoted their views during an election campaign would be forced to register. Not only might this chill freedom of political expression in its own right, it would raise significant privacy issues for individuals since they would have to disclose not just their names, but their addresses and other contact information in the register. Thus, the BCFIPA sought to have the registration requirement limited by the Court to only those who spent more than $500 on an election campaign.

The problem in this case was exacerbated by the position taken by B.C.’s Chief Electoral Officer. In a 2010 report to the B.C. legislature, he provided his interpretation of the application of the legislation. He expressed the view that it did not “distinguish between those sponsors conducting full media campaigns and individuals who post handwritten signs in their apartment windows.” (at para 19). This interpretation of the Election Act was accepted by both the trial judge and at the Court of Appeal, and it shaped the argument before those courts as well as their decisions.

The Supreme Court of Canada took an entirely different approach. It interpreted the language “sponsor election advertising” to mean something other than the expression of political views by individuals. In other words, the statute applied only to those who sponsored election advertising – i.e., those who paid for election advertising to be conducted or who received such services as a contribution. The Court was of the view that the public policy behind registration requirements was generally sound. It found that a legislature could mitigate the impact on freedom of expression by either setting a monetary threshold to trigger the requirement (as is the case at the federal level) or by defining sponsorship to exclude individual expression (as was the case in B.C.). While it is true that the B.C. statute could still capture organized activities involving expenditures of less than $500, and might thus have some limiting effect, the Court found that this would not be significant for a number of reasons, and that such impacts were easily reconcilable with the benefits of the registration scheme.

The decision of the Supreme Court of Canada will be useful in clarifying the scope and impact of the Election Act and in providing guidance for similar statutes. It should be noted however, that the case traveled to the Supreme Court of Canada at great cost both to BCFIPA and to the taxpayer because of either legislative inattention to the need to clarify the scope of the legislation or because of an over-zealous interpretation of the statute by the province’s Chief Electoral Officer. The situation highlights the need for careful attention to be paid at the outset of such initiatives to the balance that must be struck between transparency and other competing values such as civil liberties and privacy.

Published in Privacy

Many Canadians are justifiably concerned that the vast amounts of information they share with private sector companies – simply by going about their day-to-day activities – may end up in the hands of law enforcement or national security officials without their knowledge or consent. The channels through which vast amounts of personal data can flow from private sector hands to law enforcement with little transparency or oversight can turn the companies we do business with into informers and make us unwittingly complicit in our own surveillance.

A recent Finding of the Office of the Privacy Commissioner of Canada (OPC) illustrates how the law governing the treatment of our personal information in the hands of the private sector has been adapted to the needs of the surveillance state in ways that create headaches for businesses and their customers alike. The Finding, which was posted on the OPC site in November 2016, attempts to unravel a tangle of statutory provisions that should not have to be read by anyone making less than $300 per hour.

Basically, the Personal Information Protection and Electronic Documents Act (PIPEDA) governs how personal information is collected, used and disclosed by private sector organizations at the federal level and in all provinces that do not have their own equivalent statutes (only Quebec, B.C. and Alberta do). One of the core principles of this statute is the right of access to one’s personal information. This means that individuals may ask to be informed about the existence, use and disclosure of their personal information in the hands of an organization. They must also be given access to that information on request. Without the right of access it would be difficult for us to find out whether an organization was in compliance with its privacy policies. The right of access also allows us to verify and request correction of any erroneous information.

Another core principle of PIPEDA is consent. This means that information about us should not be collected, used or disclosed without our consent. The consent principle is meant to give us some control over our personal information (although there are huge challenges in this age of overly-long, vague, and jargon-laden privacy policies).

The hunger for our personal information on the part of law enforcement and national security officials (check out these Telco transparency reports here, here and here) has led to a significant curtailment of both the principles of access and of consent. The law is riddled with exceptions that permit private sector companies to disclose our personal information to state authorities in a range of situations without our knowledge or consent, with or without a warrant or court order. Other exceptions allow these disclosures to be hidden from us if we make access requests. What this means is that, in some circumstances, organizations that have disclosed an individual’s information to state authorities, and that later receive an access request from the individual seeking to know if their information has been disclosed to a third party, must contact the state authority to see if they are permitted to reveal that information has been shared. If the state authority objects, then the individual is not told of the disclosure.

The PIPEDA Report of Findings No. 2016-008 follows a complaint by an individual who contacted her telecommunications company and requested access to her personal information in the hands of that company. Part of the request was for “any information about disclosures of my personal information, or information about my account or devices, to other parties, including law enforcement and other state agencies.” (at para 4). She received a reply from the Telco to the effect that it was “fully in compliance with subsections 9(2.1), (2.2), (2.3) and (2.4) of [PIPEDA].” (at para 5) In case that response was insufficiently obscure, the Telco also provided the wording of the subsections in question. The individual complained to the Office of the Privacy Commissioner (OPC).

The OPC decision makes it clear that the exceptions to the access principle place both the individual and the organization in a difficult spot. Basically, an organization that has disclosed information to state authorities without the individual’s knowledge or consent, and that receives an access request regarding this disclosure, must check with the relevant state authority to see if they have any objection to the disclosure of information about the disclosure. The state authorities can object if the disclosure of the disclosure would pose a threat to national security, national defence or the conduct of international affairs, or would adversely impact investigations into money laundering or terrorist financing. Beyond that, the state authorities can also object if disclosure would adversely impact “the enforcement of any law of Canada, a province or a foreign jurisdiction, an investigation relating to the enforcement of any such law, or the gathering of intelligence for the purpose of enforcing any such law.” If the state authorities object, then the organization may not disclose the requested information to the individual, nor can they disclose that they contacted the state authorities about the request, or that the authorities objected to any disclosure. In the interests of having a modicum of transparency, the organization must inform the Privacy Commissioner of the situation.

The situation is complex enough that in its finding, the OPC produced a helpful chart to guide organizations through the whole process. The chart can be found in the Finding.

In this case, the Telco justified its response to the complainant by explaining that if pushed further by a customer about disclosures, it would provide additional information, but even this additional information would be necessarily obscure. The Commissioner found that the Telco’s approach was not compliant with the law, but acknowledged that compliance with the law could mean that a determined applicant, by virtue of repeated requests over time, could come up with a pattern of responses that might lead them to infer whether information was actually disclosed, and whether the state authority objected to the disclosure. This is perhaps not what Parliament intended, but it does seem to follow from a reading of the statute.

As a result of the complaint, the Telco agreed to change its responses to access requests to conform to the requirements outlined in the OPC’s chart.

It may well be that this kind of information-sharing offers some, perhaps significant, benefits to society, and that sharing information about information sharing could, in some circumstances, be harmful to investigations. The problem is that protections for privacy – including appropriate oversight and limitations – have not kept pace with the technologies that have turned private sector companies into massive warehouses of information about every detail of our lives and activities. The breakdown of consent means that we have little practical control over what is collected, and rampant information sharing means that our information may be in the hands of many more companies than those with which we actively do business. The imbalance is staggering, as is the risk of abuse. The ongoing review of PIPEDA must address these gaps – although there are also risks that it will result in the addition of more exceptions from the principles of access and consent.

Published in Privacy

The Supreme Court of Canada has issued a relatively rare decision on the interpretation of Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA). Although it involves fairly technical facts that are quite specific to the banking and mortgage context, the broader significance of the case lies in the Court’s approach to implied consent under PIPEDA.

The case arose in the context of the Royal Bank of Canada’s (RBC) attempt to obtain a mortgage discharge statement for property owned by two individuals (the Trangs), who defaulted on a loan advanced by the bank. The mortgage was registered against a property in Toronto, on which Scotiabank held the first mortgage. In order to recover the money owed to it, RBC sought a judicial sale of the property, but the sheriff would not carry out the sale without the mortgage discharge statement. Scotiabank refused to provide this statement to RBC on the basis that it contained the Trangs’ personal information and it could therefore not be disclosed to RBC without the Trangs’ consent.

PIPEDA allows for the disclosure of personal information without consent in a number of different circumstances. Three of these, raised by lawyers for RBC, apply where the disclosure is for the purpose of collecting a debt owed by the individual to the organization; where it is required by a court order; and where it is required by law. Ultimately, the Court only considered the second of these exceptions. Because Scotiabank refused to disclose the discharge statement, RBC had applied to a court for a court order that would enable disclosure without consent. However, it found itself caught in a procedural loop – it seemed to be asking the court to order disclosure on the basis of a court order which the court had yet to grant. Although the Court of Appeal had found the court order exception to be inapplicable because of this circularity, the Supreme Court of Canada swept aside these objections in favour of a more pragmatic approach. Justice Côté found that the court had the power to make an order and felt that an order was appropriate in the circumstances. She ruled that it would be “overly formalistic and detrimental to access to justice” to require RBC to reformulate its request for a court order in a new proceeding.

Although this would have been enough to decide the matter, Justice Côté, for the unanimous court, went on to find that the Trangs had given implied consent to the disclosure of the mortgage statement in any event. Under PIPEDA, consent can be implied in some circumstances. Express consent is generally required where information is sensitive in nature. Acknowledging that financial information is generally considered highly sensitive, Justice Côté nevertheless found that in this case the mortgage discharge statement was less sensitive in nature. She stated that “the degree of sensitivity of specific financial information is a contextual determination.” (at para 36) Here, the context included the fact that a great deal of mortgage-related financial information is already in the public domain by virtue of the Land Titles Registry, which includes details such as the amount of a mortgage recorded against the property, the interest rate, payment periods and due date. Although the balance left owing on a mortgage is not provided in the Registry, it can still be roughly calculated by anyone interested in doing so. Justice Côté characterized the current balance of a mortgage as “a snapshot at a point in time in the life of a publicly disclosed mortgage.” (at para 39)
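The rough calculation the Court alludes to can be sketched with the standard amortization identity, which combines the publicly registered terms of a mortgage (principal, rate, payment amount and period) with the number of payments elapsed since registration. The figures below are hypothetical, and the sketch assumes simple monthly compounding rather than the semi-annual compounding typical of Canadian mortgages:

```python
def remaining_balance(principal, annual_rate, payment, payments_made,
                      periods_per_year=12):
    """Estimate the balance left owing on a mortgage from its registered terms.

    Uses the standard amortization identity:
        B_k = P * (1 + i)^k  -  M * ((1 + i)^k - 1) / i
    where i is the periodic rate and k the number of payments made.
    """
    i = annual_rate / periods_per_year   # periodic interest rate
    growth = (1 + i) ** payments_made
    return principal * growth - payment * (growth - 1) / i

# Hypothetical example: a $300,000 mortgage at 4% per year with $1,500
# monthly payments, five years (60 payments) into the term.
print(round(remaining_balance(300_000, 0.04, 1_500, 60), 2))
```

This is, of course, only an estimate of the “snapshot” the Court describes, but it illustrates why the balance is recoverable in rough terms by anyone with access to the Registry details.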

Justice Côté’s implied consent analysis was also affected by other contextual considerations. These included the fact that the party seeking disclosure of the discharge statement had an interest in it; as a creditor, it was relevant to them. According to the Court, the reasonable expectations of the individual with respect to the sensitivity of any information must be assessed in “the whole context” so as not to “unduly prioritize privacy interests over the legitimate business concerns that PIPEDA was also designed to reflect”. (at para 44) The fact that other creditors have a legitimate business interest in the information in a mortgage disclosure statement is “a relevant part of the context which informs the reasonable expectation of privacy.” (at para 45) In this regard, Justice Côté observed that the identity of the party seeking disclosure of the information and the reason for which they are seeking disclosure are relevant considerations. She noted that “[d]isclosure to a person who requires the information to exercise an established legal right is clearly different from disclosure to a person who is merely curious or seeks the information for nefarious purposes.” (at para 46)

Justice Côté also found that the reasonable mortgagor in the position of the Trangs would be aware of the public nature of the details of their mortgage, and would be aware as well that if they defaulted on either their mortgage or their loan with RBC, their mortgaged property could be seized and sold. They would also be aware that a judgment creditor would have a “legal right to obtain disclosure of the mortgage discharge statement through examination or by bringing a motion.” (at para 47)

It seems that it is the fact that RBC could ultimately legally get access to the mortgage discharge statement, viewed within the broader context, that drives the Court to find that there is an implied consent to the disclosure of this information – even absent a court order. The Court’s finding of implied consent is nevertheless limited to this context; it would not be reasonable for a bank to disclose a mortgage discharge statement to anyone other than a person with a legal interest in the property to which the mortgage relates. The Court’s reasoning seems to be that since RBC is ultimately entitled to get this information and has legal means at its disposal to get the information, then the Trangs can be considered to have consented to the information being shared.

Pragmatism is often a good thing, and it is easy to be sympathetic to the Court’s desire not to create expensive legal hurdles to achieve inevitable ends in transactions that are relatively commonplace. It should be noted, however, that the same result could have been achieved by the addition of a clause in the mortgage documents that would effectively obtain the consent of any mortgagor to disclosures of this kind in those circumstances. No doubt after the earlier decisions in this case and in the related Citi Cards Canada Inc. v. Pleasance, banks had already taken steps to address this in their mortgage documents. One of the reasons for having privacy policies is to require institutions to explain to their customers what personal information is collected, how it will be used, and in what circumstances it will be disclosed. While it is true that few people read such privacy policies, they are at least there for those who choose to do so. Nobody reads implied terms because they are… well, implied. Implied consent works where certain uses or disclosures are relatively obvious. In more complicated transactions, implied consent should be relied upon sparingly.

It will be interesting to see what impact the Court’s judicial eye roll to the facts of this case will have in other circumstances where consent to disclosure is an issue. The Court is cautious enough in its contextual approach that it may not lead to a dangerous undermining of consent. Nevertheless, there is a risk that the almost exasperated pragmatism of the decision may cause a more general relaxation around consent.

Published in Privacy

In a press release issued on October 26, 2016, the Ontario Provincial Police announced that they would be adopting a new investigative technique – one that relies on cellphone tracking of ordinary members of the public. The use of this new technique is being launched in the context of the investigation of an unsolved murder that took place in Ottawa in 2015. Police are searching for leads in the case.

The OPP sought a Production Order from a justice of the peace. This order required major cellular phone service providers to furnish them with a list of cellphone numbers used in the vicinity of West Hunt Club and Merivale Road in Ottawa, between 12:30 and 3:30 p.m. on December 15, 2015. Production orders for cell phone information have become commonplace. Typically, however, they have been used to determine whether a person of interest to the police was in a certain area at a specific time. This is not the case here. In this case, the police intend to send text messages to the individual cell phone numbers provided by the phone companies. These messages will encourage recipients to visit a web site set up by the police and to respond to some questions. According to the press release, the production order did not include customer name and address information associated with the phone numbers. In theory, then, individual privacy is protected by the fact that a person who does not respond to the text message does not provide any further identifying information to the police.

There is clearly a public interest in solving crimes. Where investigations have grown cold, new techniques may be important to finding justice for victims and their families. However, it is also important that any new investigative techniques are consistent with the principles and values that are an integral part of our justice system. Privacy advocates and the public have reason to be concerned about this new investigative technique. Here are some of the reasons why:

First, production orders of this kind provide completely inadequate opportunities to hear and consider the privacy interests of affected individuals. Persons accused of crimes can always challenge in court the way in which the police went about collecting the evidence against them. They can argue that their privacy interests were violated and that search warrants should never have been issued. However, ordinary members of the public have little practical recourse when their privacy rights are infringed by investigations of crimes that have nothing to do with them. In a decision of the Ontario Superior Court (which I wrote about here) Justice Sproat reviewed production orders for massive amounts of cell phone data sought by police. He was sharply critical of both the seeking and the granting of a production order for quantities of cell phone customer data that far exceeded what was genuinely required for the purpose of the investigation. The case impacted the privacy rights of the broad public (it involved the data of over 43,000 customers) yet as is so often the case, the public had no way to learn of or challenge the production order before it was granted. In that case, it was the Telcos – Rogers and Telus – who challenged the production orders and raised privacy issues before the courts. Without this intervention, there would have been no voice for the privacy interests of ordinary citizens and no means of reviewing the legitimacy of the order.

Second, production orders of this kind come with no safeguards for the protection of data after it has been used by police. Production orders typically do not contain directions on how long data can be retained, whether it should be destroyed after a certain time, what other uses it might (or should not) be put to, or what safeguards are required to protect it while it is in the hands of police. The lack of such safeguards was commented upon by Justice Sproat in the case mentioned above. He was of the view that this was an issue for Parliament to address. Parliament has yet to do so.

In its press release, the OPP analogized what it was doing to police going through a neighborhood where a crime has taken place and knocking on doors to see if anyone has seen or heard anything that might be relevant. The analogy is problematic. The existence and location of houses and apartment units are matters of public record – they are in plain view. However, data about the cell phone usage of individuals, along with their location information, as they carry out their day-to-day activities are not. When police seek access to information that allows them to identify the locations of thousands of individuals who are not suspected of engaging in criminal activity, they are doing more than knocking on doors.

There needs to be a public conversation about how and when police get to tap into the massive volumes of data collected about the minutiae of our daily activities by private sector companies. The use of cell phone data production orders by the OPP in this case merely adds to the list of subjects for that conversation. Because the police are now using this data to identify and contact people who are themselves not the targets of criminal investigation, these individuals effectively have no way to raise privacy concerns. This is a conversation that must be led by Parliament and that most likely will require new law.

Published in Privacy

The Ontario Superior Court of Justice has just approved the settlement of a class action lawsuit against Home Depot over a data privacy breach that took place in 2014. Both the settlement agreement and the decision by Justice Perell offer some interesting insights into privacy class actions in Canada.

Between April 11, 2014 and September 13, 2014, Home Depot’s payment system was hacked by criminals who used malware to skim data from credit card purchases at self-serve stations. When Home Depot discovered the breach it notified potentially affected customers through the French and English press in Canada. It also sent out over half a million emails to potentially affected customers in Canada. The emails apologized for the breach, and confirmed that the malware had been eradicated. Customers were assured that they would not be held responsible for fraudulent charges to their credit card accounts and they were offered free credit monitoring and identity theft insurance.

Although the breach led to complaints against Home Depot being filed with the privacy commissioners of Alberta, Quebec, B.C. and Canada, the commissioners all concluded that Home Depot had not breached their respective private sector data protection statutes. The fact that Home Depot had acted quickly and decisively to notify customers and to offer them protection also clearly influenced Justice Perell in his decision on the settlement agreement. He noted that Home Depot “apparently did nothing wrong”, and that it “responded in a responsible, prompt, generous and exemplary fashion to the criminal acts perpetrated on it by the computer hackers.” (at para 74)

After the breach, which affected customers in the U.S. and Canada, a number of class action lawsuits were filed in both countries. The U.S.-based suits were consolidated into a single action which led to a settlement. The U.S. agreement was used as a template for the Canadian settlement. Under the terms of the settlement agreement put before Justice Perell, Home Depot admitted no wrongdoing. In exchange for releasing their claims against Home Depot, class members would be entitled to access a settlement fund of $250,000 available to compensate them for any actual expenses incurred as a result of the data breach up to a maximum of $5000 per claimant. The agreement also provides for class members to access free credit monitoring to a cap of $250,000. Justice Perell noted that given the cost of bulk purchases of credit card monitoring, this amount would allow for between 2,500 and 5,000 of the class members to access credit monitoring. In order to be entitled to any funds or credit monitoring, class members would have to file a claim form by October 29, 2016. Under the terms of the agreement, Home Depot would assume the costs of notifying class members and of administering the funds. Any money not distributed from the funds at the end of the claims period could be used to offset these costs. Justice Perell approved these terms of the settlement agreement.
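The arithmetic behind Justice Perell’s 2,500-to-5,000 estimate implies a bulk price of roughly $50 to $100 per credit-monitoring enrolment. The per-enrolment prices below are hypothetical figures consistent with that range; the actual contracted price is not stated in the decision:

```python
fund = 250_000  # portion of the settlement earmarked for credit monitoring

# Hypothetical bulk prices per enrolment, chosen to reproduce the court's
# 2,500-5,000 member range; actual pricing is not disclosed.
for unit_cost in (100, 50):
    covered = fund // unit_cost
    print(f"${unit_cost}/enrolment -> {covered} class members covered")
```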

The agreement also provided for a sum of $360,000 plus HST to be paid to the class action lawyers for legal fees, costs and disbursements. Small sums were also provided for in the agreement as honoraria for the representative plaintiffs in the class, although Justice Perell declined to approve these amounts, noting that honoraria were not appropriate in this case. He noted that “Compensation for a representative plaintiff may only be awarded if he or she has made an exceptional contribution that has resulted in success for the class.” (at para 80)

In assessing the settlement agreement, Justice Perell made it clear that the value of the settlement for class members was at most $400,000. He noted that, in terms of compensation, very little might actually be paid out. No class members would have had to cover the cost of fraudulent credit card charges and, in the time since the breach, there were no documented cases of identity theft related to it. He noted that the only information obtained through the hack was credit card information; other identity details used in identity theft, such as driver’s licence data or social insurance numbers, were never stolen. He thus found it “highly unlikely” that the $250,000 compensation fund would be used for damage awards. He also expressed doubt whether, given the short claims deadline in the agreement, the $250,000 credit monitoring fund would be used up.

Given the modest value of the settlement agreement, Justice Perell would not approve the $360,000 bill for legal fees and disbursements. Instead, he set the amount at $120,000, noting that to do otherwise would pay class counsel more than would be received by the class members. He noted as well that in his view the case against Home Depot was very weak: the data breach was the result of a criminal hack; the privacy commissioners had found no wrongdoing on the part of Home Depot; and Home Depot had not attempted to cover up the breach, but had instead acted promptly to notify customers and to help them mitigate any possible harm. Further, he noted that “by the time the actions against Home Depot came to be settled, there were no demonstrated or demonstrable losses by the Class Members” (at para 101). Justice Perell observed that while class counsel may have incurred higher fees than those awarded, there is a degree of risk in any class proceeding. He noted that “class counsel should not anticipate that every reasonably commenced class action will be remunerative and a profitable endeavor.” (at para 103)

The result is interesting on a number of fronts. Clearly Home Depot found it less costly to settle than to proceed with the litigation, even though Justice Perell seems to have been of the view that it would have won its case. The case illustrates just how costly data breaches can be, even for companies that have done nothing wrong and are themselves victims of criminal activity. As with many data breaches, proof of actual harm to the class members was difficult to come by, making losses quite speculative. Further, as litigation of this kind tends to proceed slowly, the lack of harm to class members becomes increasingly apparent where there is no evidence that the illegally obtained data has been used by the malefactors. The result in this case suggests that in class action lawsuits related to privacy breaches, class members who do not suffer actual pecuniary loss should not expect significant payouts, and companies that are not at fault in the breach and that act promptly to assist affected customers may substantially reduce (or eliminate) their liability. These factors may affect decisions by class counsel to launch class action lawsuits where the link between the breach and actual harm is weak, or where defendants are not obviously at fault.

Published in Privacy

Yesterday I appeared before the House of Commons’ Standing Committee on Access to Information, Privacy and Ethics, along with Professor David Lyon of Queen’s University and Professor Lisa Austin of the University of Toronto. The Committee is considering long overdue reform of the Privacy Act, and we had been invited to speak on this topic.

All three of us urged the Committee to take into account the very different technological environment in which we now find ourselves. Professor Lyon cogently addressed the changes brought about by the big data context. Although the Privacy Act as it currently stands largely addresses the collection, use and disclosure of personal information for “administrative purposes”, all three of us expressed concerns over government access to and use of information in the hands of the private sector, and the use of information in big data analytics. Professor Austin in particular emphasized the need to address not just the accuracy of the data collected by government but also “algorithmic accuracy” – the quality and appropriateness of the algorithms used to analyse large stores of data and to draw conclusions or predictions from them. She also made a clear case for bringing Charter considerations into the Privacy Act – in other words, for recognizing that in some circumstances information collection, disclosure or sharing that appears to be authorized by the Privacy Act might nevertheless violate the Canadian Charter of Rights and Freedoms. There was also considerable discussion of information-sharing practices both within government and between our government and other foreign or domestic governments.

The Committee seemed very interested and engaged with the issues, which is a good sign. Reform of the Privacy Act will be a challenging task. As a public sector data protection statute, it is sorely out of date. It is also out of context – in other words, it was drafted to address an information context that is radically different from the one in which we find ourselves today. Many of the issues raised before the Committee yesterday go well beyond the original boundaries of the Privacy Act, and the addition of a few provisions or a few tweaks here and there will not come close to solving some of these privacy problems – many of which overlap with issues of private sector data protection, criminal law and procedure, and national security.

The notes related to my own remarks to the Committee are available below.

Written Notes for Comments by Professor Teresa Scassa to the House of Commons’ Standing Committee on Access to Information, Privacy and Ethics, June 14, 2016

Thank you for the opportunity to address this Committee on the issue of reform of the Privacy Act.

I have reviewed the Commissioner’s recommendations on Privacy Act reform and I am generally supportive of these proposals. I will focus my remarks today on a few specific issues that are united by the theme of transparency. Greater transparency with respect to how personal information is collected, used and disclosed by government enhances privacy by exposing practices to comment and review and by enabling appropriate oversight and accountability. At the same time, transparency is essential to maintaining public confidence in how government handles personal information.

The call for transparency must be situated within our rapidly changing information environment. Not only does technology now enable an unprecedented level of data collection and storage, enhanced analytic capacity has significantly altered the value of information in both public and private sectors. This increased value provides temptations to over-collect personal information, to share it, mine it or compile it across departments and sectors for analysis, and to retain it beyond the period required for the original purposes of its collection.

In this regard, I would emphasize the importance of the recommendation of the Commissioner to amend the Privacy Act to make explicit a “necessity” requirement for the collection of personal information, along with a clear definition of what ‘necessary’ means. (Currently, s. 4(1) of the Privacy Act requires only that personal information “relate[] directly to an operating program or activity of the institution”.) The goal of this recommendation is to curtail the practice of over-collection of personal information. Over-collection runs counter to the expectations of the public, who provide information to government for specific and limited purposes. It also exposes Canadians to enhanced risks where negligence, misconduct or cyberattack result in data breaches. Data minimization is an important principle that is supported by data protection authorities around the world and that is reflected in privacy legislation. The principle should be explicit and up front in a reformed Privacy Act. Data minimization also has a role to play in enhancing transparency: clear limits on the collection of personal information serve transparency goals, whereas over-collection encourages the re-purposing of information, improper use and over-sharing.

The requirement to limit the collection of information to specific and necessary purposes is tied to the further requirement that government collect personal information directly from the individual “where possible” (s. 5(1)). This obviously increases transparency, as it makes individuals directly aware of the collection. However, this requirement relates only to information collected for an “administrative purpose”. There are many other purposes for which government collects information, and these fall outside the privacy-protective provisions of the Privacy Act. They include information that is disclosed to a government investigative body at its request in relation to an investigation or the enforcement of any law, or that is disclosed to government actors under court orders or subpoenas. Although such information gathering activities may broadly be necessary, they need to be considered in the evolving data context in which we find ourselves, and privacy laws must adapt to address them.

Private sector companies now collect vast stores of personal information, and this information often includes very detailed, core-biographical information. It should be a matter of great concern, therefore, that the permissive exceptions in both PIPEDA and the Criminal Code enable the flow of massive amounts of personal information from the private sector to government without the knowledge or consent of the individual. Such requests/orders are often (although not always) made in the course of criminal or national security investigations. The collection is not transparent to the individuals affected, and the practices as a whole are largely non-transparent to the broader public and to the Office of the Privacy Commissioner (OPC).

We have heard the most about this issue in relation to telecommunications companies, which are regularly asked or ordered to provide detailed information to police and other government agents. It should be noted, however, that many other companies collect personal information about individuals that is highly revelatory of their activities and choices. It is important not to dismiss this issue as less significant because of the potentially anti-social behaviour of the targeted individuals. Court orders and requests for information can and do encompass the personal information of large numbers of Canadians who are not suspected of anything. The problem of tower dump warrants, for example, was highlighted in a recent decision of the Ontario Superior Court of Justice (R. v. Rogers Communications (2016 ONSC 70)) (my earlier post on this decision can be found here). The original warrant in that case sought highly detailed personal information relating to around 43,000 individuals, the vast majority of whom had done nothing other than use their cell phones in a certain area at a particular time. Keep in mind that the capacity to run sophisticated analytics will increase the attractiveness of obtaining large volumes of data from the private sector in order to search for an individual linked to a particular pattern of activity.

Without adequate transparency regarding the collection of personal information from the private sector, there is no way for the public to be satisfied that such powers are not abused. Recent efforts to improve transparency (for example, the Department of Innovation, Science and Economic Development’s voluntary transparency reporting guidelines) have focused on private sector transparency. In other words, there has been an attempt to provide a framework for the voluntary reporting by companies of the number of requests they receive from government authorities, the number they comply with, and so on. But these guidelines are entirely voluntary, and they also only address transparency reporting by the companies themselves. There are no legislated obligations on government actors to report in a meaningful way – whether publicly or to the OPC – on their harvesting of personal information from private sector companies. I note that the recent attempt by the OPC to audit the RCMP’s use of warrantless requests for subscriber data came to an end when it became clear that the RCMP did not keep specific records of these practices.

In my view, a modernization of the Privacy Act should directly address this enhanced capacity of government institutions to access the vast stores of personal information in the hands of the private sector. The same legislation that permits the collection of personal information from private sector companies should include transparency reporting requirements where such collection takes place. In addition, legislative guidance should be provided on how government actors who obtain personal information from the private sector, either by request or under court order, should deal with this information. Specifically, limits on the use and retention of this data should be imposed.

It is true that both the Criminal Code and PIPEDA enable police forces and investigative bodies under both federal and provincial jurisdiction to obtain personal information from the private sector under the same terms and conditions, and that reform of the Privacy Act in this respect will not address transparency and accountability of provincial actors. This suggests that issues of transparency and accountability of this kind might also fruitfully be addressed in the Criminal Code and in PIPEDA, but this is no reason not to also address it in the Privacy Act. To the extent that government institutions are engaged in the indirect collection of personal information, the Privacy Act should provide for transparency and accountability with respect to such activities.

Another transparency issue raised by the Commissioner relates to information-sharing within government. Technological changes have made it easier for government agencies and departments to share personal information – and they do so on what the Commissioner describes as a “massive” scale. The Privacy Act enables personal information sharing within and between governments, domestically and internationally, in specific circumstances – for investigations and law enforcement, for example, or for purposes consistent with those for which it was collected. (Section 8(2)(a) allows for sharing “for the purpose for which the information was obtained or compiled by the institution or for a use consistent with that purpose”). Commissioner Therrien seeks amendments that would require information-sharing within and between governments to take place according to written agreements in a prescribed form. Not only would this ensure that information sharing is compliant with the legislation, it would offer a measure of transparency to a public that has a right to know whether and in what circumstances information they provide to one agency or department will be shared with another – or whether and under what conditions their personal information may be shared with provincial or foreign governments.

Another important transparency issue is mandatory data breach reporting. Treasury Board Secretariat currently requires that departments inform the OPC of data security breaches; yet the Commissioner has noted that not all comply. As a result, he is asking that the legislation be amended to include a mandatory breach notification requirement. Parliament has recently amended PIPEDA to include such a requirement. Once these provisions take effect, the private sector will be held to a higher standard than the public sector unless the Privacy Act is also amended. Any amendments to the federal Privacy Act to address data security breach reporting would have to take into account the need for both the Commissioner and affected individuals to be notified where there has been a breach that meets a certain threshold for potential harm, as will be the case under PIPEDA. The PIPEDA amendments will also require organizations to keep records of all breaches of security safeguards, regardless of whether they meet the harm threshold that triggers a formal reporting requirement. Parliament should impose a requirement on those bodies governed by the Privacy Act both to keep records of this kind and to submit them to the OPC. Such records would be helpful in identifying patterns or trends, either within a single department or institution or across departments or institutions. The ability to identify issues proactively and to address them where they arise, or across the federal government, can only enhance data security – something which is becoming even more urgent in a time of increased cybersecurity threats.

 

Published in Privacy
