Teresa Scassa - Blog


In June 2019, the Standing Senate Committee on Banking, Trade and Commerce (BANC) released its report on open banking following hearings it held in the spring of 2019. The federal government, which has been conducting its own consultation into open banking, has yet to issue a report.

For those who have not been following discussions around this issue, ‘open banking’ refers to a framework that enables consumers to share their personal financial data with financial services providers in a secure manner. The anticipated benefits of open banking include providing consumers of financial services (both individuals and small businesses) with more and better financial planning and payment options, and stimulating innovation in the fintech sector. Open banking is no small undertaking. To work, it will require major financial institutions to adopt standardized formats for data. It will also require the adoption of appropriate security measures. A regulator will have to create a list of approved open banking fintech providers. There will also need to be oversight from competition and privacy commissioners. For consumer privacy to be adequately protected, there will have to be an overhaul of Canada’s Personal Information Protection and Electronic Documents Act.

The BANC committee report reviews the testimony it heard and makes a number of recommendations. It begins by noting that approximately 4 million Canadians already make use of fintech apps to obtain financial services not otherwise available. These apps require users to provide their banking usernames and passwords in order to enable them to repeatedly access and screen-scrape financial data. It is a risky practice and one that may violate the terms of service for those customer accounts, leaving consumers vulnerable and unprotected. The Senate report notes that the legal and regulatory changes needed to implement open banking in Canada – as well as the necessary work on standards and interoperability – will take time. As a result, the first part of the report makes a number of recommendations to address, in the short term, the protection of Canadians who engage in screen-scraping.

The BANC committee notes that other countries – including Australia and the UK – are already further ahead than Canada in launching open banking initiatives. It expresses concern that Canada may be falling behind what is an international shift towards open banking, noting that “without swift action, Canada may become an importer of financial technology rather than an exporter” (at p. 14). The report makes a number of recommendations to facilitate the adoption of open banking in Canada, urging a “principles-based, industry-led open banking framework that would be integrated with existing financial sector and privacy legislation” (Recommendation III). The recommendations include work on developing standards, creating a registry of accredited providers of fintech services, legislating limits on the use of standardized and interoperable consumer financial data, creating a framework in which provincially regulated credit unions and caisses populaires can participate, improving broadband access for rural and remote communities, reforming PIPEDA, and creating appropriate regulatory oversight and enforcement mechanisms.

The BANC committee correctly links open banking to a broader data portability right. This portability right, which is present in the EU’s General Data Protection Regulation (GDPR), is one of the 10 principles articulated in the federal government’s new Digital Charter. The federal government’s recent discussion paper on PIPEDA reform also references data portability. Data portability is a mechanism by which individuals are given much greater control over their data – allowing them to ‘port’ their data from one provider to another. It also has potential to encourage competition and to stimulate innovation in the tech sector. However, for the BANC committee, consumer control is at the heart of open banking. The Committee clearly sees open banking as something that should benefit consumers. It characterizes open banking as giving consumers more control over their personal financial information, and as something that can provide them with a “more personalized, convenient digital banking experience” (at p. 37).

Indeed, the BANC committee report as a whole places consumer interests at the centre of the move towards open banking. As noted earlier, its first recommendations are oriented towards taking action to protect consumers who are engaging in screen-scraping to obtain the fintech services they want. It is also sharply critical of the federal government for not appointing a consumer advocate to its Advisory Committee on Open Banking, even though the Department of Finance indicates that it has consulted widely to obtain consumer and civil society input. The BANC committee expresses concern that not enough is known about the potential impacts of open banking on consumers, and recommends that more research on these issues be carried out as soon as possible, funded by the federal government.


On May 21, 2019, Canada’s federal government launched its Digital Charter, along with several other supporting documents, including its action plan for the Charter and proposals for modernizing the Personal Information Protection and Electronic Documents Act (PIPEDA). Together, the documents discuss the outcomes of the recent federal digital strategy consultation and chart a path forward for federal policy in this area. The documents reflect areas where the government is already forging ahead, and they touch on a number of issues that have been at the centre of media attention, as well as public and private sector engagement.

As a strategy document (which, launched less than six months away from an election, it essentially is) the Digital Charter hits many of the right notes, and its accompanying documentation reveals enough work already underway to give shape to its vision and future directions. Navdeep Bains, the Minister of Innovation, Science and Economic Development, describes the Digital Charter as articulating principles that “are the foundation for a made in Canada digital approach that will guide our policy thinking and actions and will help to build an innovative, people-centred and inclusive digital and data economy.”

The Digital Charter features 10 basic principles. Three relate to digital infrastructure: universal access to digital services; safety and security; and open and modern digital government. Another three touch on human rights issues: data and digital for good; strong democracy; and freedom from hate and violent extremism. Two principles address data protection concerns: control and consent; and transparency, portability and interoperability — although the latter principle blends into the marketplace and competition concerns that are also reflected in the principle of ensuring a level playing field. Perhaps the most significant principle in terms of impact is the tenth, an overarching commitment to strong enforcement and real accountability. Weak enforcement has undermined many of our existing laws that apply in the digital context, and without enforcement or accountability, there is little hope for a credible strategy. Taken together, the 10 principles reflect a careful and thorough synthesis of some of the issues confronting Canada’s digital future.

Yet, this digital charter might more accurately be described as a digital chart. In essence, it is an action plan, and while it is both credible and ambitious, it is not a true charter. A charter is a document that creates legal rights and entitlements. The Digital Charter does not. Its principles are framed in terms of open-ended goals: “Canadians will have control over what data they are sharing,” “All Canadians will have equal opportunity to participate in the digital world,” or “Canadians can expect that digital platforms will not foster or disseminate hate, violent extremism or criminal content.” Some of the principles reflect government commitments: “The Government of Canada will ensure the ethical use of data.” But saying that some can “expect” something is different from saying they have a right to it.

The goals and commitments in the Digital Charter are far from concrete. That is fair enough — these are complex issues — but concepts such as universal digital access and PIPEDA reform have been under discussion for a long time now with no real movement. A chart shows us the way, but it does not guarantee we’ll arrive at the destination.

It is interesting to note as well that privacy as a right is not squarely a part of the Digital Charter. Although privacy has (deservedly) been a high-profile issue in the wake of the Cambridge Analytica scandal and the controversies over Sidewalk Labs’ proposed smart city development in Toronto, this Digital Charter does not proclaim a right to privacy. A right to be free from unjustified surveillance (by public or private sector actors) would be a strong statement of principle. An affirmation of the importance of privacy in supporting human autonomy and dignity would also acknowledge the fundamental importance of privacy, particularly as our digital economy plows forward. The Digital Charter does address data protection, stating that Canadians will have control over the data they share and will “know that their privacy is protected.” They will also have “clear and manageable access to their personal data.” While these are important data protection goals, they are process-related commitments and are aimed at fostering trust for the purpose of data sharing.

Indeed, trust is at the core of the government strategy. Minister Bains makes it clear that, in his view, “innovation is not possible without trust.” Further, “trust and privacy are key to ensuring a strong, competitive economy and building a more inclusive, prosperous Canada.”

Privacy, however, is the human right; trust is how data protection measures are made palatable to the commercial sector. Trust is about relationships — in this case, between individuals and businesses and, to some extent, between individuals and governments. In these relationships, there is a disparity of power that leaves individuals vulnerable to exploitation and abuse. A trust-oriented framework encourages individuals to interact with businesses and government — to share their data in order to fuel the data economy. This is perhaps the core conundrum in creating digital policy in a rapidly shifting and evolving global digital economy: the perceived tension between protecting human rights and values on the one hand, and fostering a competitive and innovative business sector on the other. In a context of enormous imbalance of power, trust that is not backed up by strong, enforceable measures grounded in human rights principles is a flimsy thing indeed.

And this, in a nutshell, is the central flaw in an otherwise promising Digital Charter. As a road map for future government action, it is ambitious and interesting. It builds on policies and actions that are already underway, and sets a clear direction for tackling the many challenges faced by Canada and Canadians in the digital age. It presents a pre-election digital strategy that is in tune with many of the current concerns of both citizens and businesses. As a charter, however, it falls short of grounding the commitments in basic rights and enshrining values for our digital future. That, perhaps, is a tall order and it may be that a transparent set of principles designed to guide government law and policy making is as much as we can expect at this stage. But calling it a Charter misleads, and creates the impression that we have done the hard work of articulating and framing the core human rights values that should set the foundational rules for the digital society we are building.

On May 3, 2019 I was very pleased to give a keynote talk at the Go Open Data 2019 Conference in Toronto (video recordings of the conference proceedings are now available from the site). The following post includes the gist of my talk, along with hyperlinks to the different sources and examples I referenced. My talk was built around the theme of the conference: Inclusive, Equitable, Ethical, and Impactful.

In my talk this morning I am going to use the conference’s themes of Inclusive, Equitable, Ethical and Impactful to shape my remarks. In particular, I will apply these concepts to data in the smart cities context as this has been garnering so much attention lately. But it is also important to think about these in the artificial intelligence (AI) context which is increasingly becoming part of our everyday interactions with public and private sector actors, and is a part of smart cities as well.

As this is an open data conference, it might be fair to ask what smart cities and AI have to do with open data. In my view, these contexts extend the open data discussion because both depend upon vast quantities of data as inputs. They also complicate it. This is for three broad reasons:

First, the rise of smart cities means that there are expanding categories and quantities of municipal (and provincial) data that could be available as open data. There are also growing quantities of private sector data gathered in urban contexts in a variety of different ways, over which arguments for sharing could be made. Thus, there is more and more data, and issues of ownership, control and access become complex and often conflictual. Open government data used to be about the operations and activities of government, and there were strong arguments for making it broadly open and accessible. But government data is changing in kind, quality and quantity, particularly in smart cities contexts. Open data may therefore be shifting towards a more nuanced approach to data sharing.

Second, smart cities and AI are just two manifestations of the expanding demand for access to data for multiple new uses. There is not just MORE data; there are more applications for that data and more demand from public sector, private sector and civil society actors for access to it. Yet the opacity of data-hungry analytics and AI contributes to a deepening unease about data sharing.

Third, there is a growing recognition that perhaps data sharing should not be entirely free and open. Open data – under an open licence, with few if any restrictions, and with no registration requirement – was a kind of ideal, and it fit with the narrower concept of government data described earlier. But it is one that may not be best suited to our current environment. Not only are there potential use restrictions that we might want to apply to protect privacy or to limit undesirable impacts on individuals or communities, but there might also be arguments for cost recovery as data governance becomes more complex and more expensive. This may particularly be the case if use is predominantly by private sector actors – particularly large foreign companies. The lack of a registration requirement limits our ability to fully understand who is using our data, and it reduces the possibility of holding users to account for misuse. Again, this may be something we want to address.

I mentioned that I would use the themes of this conference as a frame for my comments. Let me start with the first – the idea of inclusiveness.

Inclusive

We hear a lot about inclusiveness in smart cities – and at the same time we hear about privacy. These are complicated and intertwined.

The more we move towards using technology as an interface for public and private sector services, for interaction with government, for public consultations, elections, and so on, the more we need to focus on the problem of the digital divide and what it means to include everyone in the benefits of technology. Narrowing the digital divide will require providing greater access to devices, access to Wi-Fi/broadband services, access to computer and data literacy, and access in terms of inclusiveness of differently-abled individuals. These are all important goals, but their achievement will inevitably have the consequence of facilitating the collection of greater quantities and more detailed personal information about those formerly kept on the other side of the digital divide. The more we use devices, the more data we generate. The same can be said of the use of public Wi-Fi. Moving from analog to digital increases our data exhaust, and we are more susceptible to tracking, monitoring, profiling, etc. Consider the controversial LinkNYC kiosks in New York. These large sidewalk installations include Wi-Fi access, Android tablets, charging stations, and free nation-wide calling. But they have also raised concerns about enhanced tracking and monitoring. This is in part because the kiosks are also equipped with cameras and a range of sensors.

No matter how inclusiveness is manifested, it comes with greater data collection. The more identifiable data collected, the greater the risks to privacy, dignity, and autonomy. But de-identified data also carries its own risks to groups and communities. While privacy concerns may prompt individuals to share less data and to avoid data capture, the value of inclusiveness may actually require having one’s data be part of any collection. In many ways, smart cities are about collecting vast quantities of data of many different kinds (including human behavioural data) for use in analytics in order to identify problems, understand them, and solve them. If one is invisible in the data, so are one’s particular needs, challenges and circumstances. In cases where decisions are made based on available data, we want that data to be as complete and comprehensive as possible in order to minimize bias and to make better diagnoses and decisions. Even more importantly, we want to be included/represented in the data so that our specificity is able to influence outcomes. Inclusiveness in this sense is being counted, and counting.

Yet this type of inclusion has privacy consequences – for individuals as well as groups. One response to this has been to talk about deidentification. And while deidentification may reduce some privacy risks, it does not eliminate all of them. It also does not prevent harmful or negative uses of the data (and it may evade the accountability provided by data protection laws). Nor does it address the dignity/autonomy issues that come from the sense of being under constant surveillance.

Equitable and Ethical

If we think about issues of equity and ethics in the context of the sharing of data it becomes clear that conventional open data models might not be ideal. These models are based on unrestricted data sharing, or data sharing with a bare minimum of restrictions. Equitable and ethical data sharing may require more restrictions to be placed on data sharing – it may require the creation of frameworks for assessing proposed uses to which the data may be put. And it may even require changing how access to data is provided.

In the privacy context we have already seen discussion about reforming the law to move away from a purely consent-based model to one in which there may be “no-go zones” for data use/processing. The idea is that if we can’t really control the collection of the information, we should turn our attention to identifying and banning certain inappropriate uses. Translated into the data sharing context, licence agreements could be used to put limits on what can be done with data that is shared. Some open data licences already explicitly prohibit any attempts to reidentify deidentified data. The Responsible Data Use Assessment process created by Sidewalk Labs for its proposed data governance framework for Toronto’s Quayside development similarly would require an ‘independent’ body to assess whether a proposed use of urban data is acceptable.

The problem, of course, is that licence-based restrictions require oversight and enforcement to have any meaning. I wrote about this a couple of years ago in the context of the use of social media data for analytics services provided to police services across North America. The analytics companies contracted for access to social media data but were prohibited by their terms of use from using this data in the way they ultimately did. The problem was uncovered after considerable effort by the ACLU and the Brennan Center for Justice – it was not discovered by the social media companies who provided access to their data or who set the terms of use. In the recent Report of Findings by the Privacy Commissioner of Canada into Facebook’s role in the Cambridge Analytica scandal, the Commissioner found that although Facebook’s terms of service with developers prohibited the kind of activities engaged in by Dr. Kogan, who collected the data, Facebook failed in its duty to safeguard personal information and, in particular, ignored red flags that should have told it that there was a problem. Let’s face it: companies selling access to data may have no interest in policing the behaviour of their customers or in terminating their access. An ‘independent’ body set up to perform such functions may lack the resources and capacity to monitor and enforce compliance.

Another issue that exists with ethical approaches is, of course, whose ethics? Taking an ethical approach does not mean being value-neutral and it does not mean that there will not be winners and losers. It is like determining the public interest – an infinitely malleable concept. This is why the composition of decision-making bodies and the location of decision-making power, when it comes to data collection and data sharing, is so important and so challenging.

Impactful

In approaching this last of the conference’s themes – impactful – I think it is useful to talk about solutions. And since I am almost out of time and this is the start of the day’s events, I am going to be very brief as solutions will no doubt be part of the broader discussion today.

The challenges of big data, AI and smart cities have led to a broad range of different proposed data governance solutions. Some of these are partial; for example, deidentification/anonymization or privacy by design approaches address what data is collected and how, but they do not necessarily address uses.

Some are aspirational – for example, ethical approaches to AI such as the Montreal Declaration for a Responsible Development of Artificial Intelligence. Others attempt to embed both privacy and ethics into concrete solutions – for example, the federal Directive on Automated Decision-Making for the public sector, which sets parameters for the adoption, implementation and oversight of AI deployment in government. In addition, there are a number of models emerging, including data trusts in all their variety (ODI), or bottom-up solutions such as Civic Data Trusts (see, e.g.: MaRS, Element AI, Sean McDonald), which involve access moderated by an independent (?), representative (?) body, in the public interest (?) according to set principles.

Safe sharing sites is another concept discussed by Lisa Austin and David Lie of the University of Toronto – they are not necessarily independent of data trusts or civic data trusts. Michel Girard is currently doing very interesting work on the use of data standards (see his recent CIGI paper).

Some solutions may also be rooted in law reform as there are deficiencies in our legal infrastructure when it comes to data governance. One key target of reform is data protection laws, but context-specific laws may also be required.

Many of these solutions are in the debate/discussion/development stage. Without a doubt there is a great deal of work to be done. Let’s start doing it.


On April 25 the federal Privacy Commissioner and the Privacy Commissioner of British Columbia released a joint Report of Findings in an investigation into Facebook’s handling of personal information in relation to the Cambridge Analytica scandal. Not surprisingly, the report finds that Facebook was in breach of a number of different obligations under the Personal Information Protection and Electronic Documents Act (PIPEDA). Somewhat more surprisingly, the Report also finds that the corresponding obligations under BC’s Personal Information Protection Act (PIPA) were breached. The Report criticizes Facebook for being less than fully cooperative in the investigation. It also notes that Facebook has disputed the Commissioners’ findings and many of their recommendations. The Report concludes by stating that each Commissioner will “proceed to address the unresolved issues in accordance with our authorities” under their respective statutes. Since the federal Commissioner has no order-making powers, his next step will be to apply to the Federal Court for an order to compel changes. This will be a hearing de novo – meaning that the same territory will be covered before the Court, and Facebook will be free to introduce new evidence and argument to support its position. The court will owe no deference to the findings of the Privacy Commissioner. Further, while the Federal Trade Commission in the US contemplates imposing fines on Facebook in relation to its role in this scandal, Canada’s Commissioner has no such power, nor does the Federal Court. This is the data protection law we have – it is not the one that we need. Just as the Cambridge Analytica scandal drew attention to the dynamics and scale of personal data use and misuse, this investigation and its outcomes highlight the weaknesses of Canada’s current federal data protection regime.

As for the BC Commissioner – he does have order-making powers under PIPA, and in theory he could order Facebook to change its practices in accordance with the findings in the Report. What the BC Commissioner lacks, however, with all due respect, is jurisdiction, as I will discuss below.

While the substantive issues raised in the complaint are important and interesting ones, this post will focus on slightly less well-travelled territory. (For comment on these other issues see, for example, this op-ed by Michael Geist). My focus is on the issue of jurisdiction. In this case, the two Commissioners make joint findings about the same facts, concluding that both statutes are breached. Although Facebook challenges their jurisdiction, the response, in the case of the BC Commissioner’s jurisdiction, is brief and unsatisfactory. In my view, there is no advantage to Canadians in having two different data protection laws apply to the same facts, and there is no benefit in a lack of clarity as to the basis for a Commissioner’s jurisdiction.

This investigation was carried out jointly between the federal and the BC Privacy Commissioner. There is somewhat of a BC nexus, although this is not mentioned in the findings. One of the companies involved in processing data from Facebook is Aggregate IQ, a BC-based analytics company. There is an ongoing joint investigation between the BC and federal Privacy Commissioners into the actions of Aggregate IQ. However, this particular report of findings is in relation to the activities of Facebook, and not Aggregate IQ. While that other joint investigation will raise similar jurisdictional questions, this one deals with Facebook, a company over whose activities the federal Privacy Commissioner has asserted jurisdiction in the past.

There is precedent for a joint investigation of a privacy complaint. The federal privacy commissioners of Australia and Canada carried out a joint investigation into Ashley Madison. But in that case, each Commissioner clearly had jurisdiction under their own legislation. This, I will argue, is not such a case. Within Canada, only one privacy Commissioner will have jurisdiction over a complaint arising from a particular set of facts. In this case, it is the federal Privacy Commissioner.

Unsurprisingly, Facebook raised jurisdictional issues. It challenged the jurisdiction of both commissioners. The challenge to the federal Commissioner’s jurisdiction was appropriately dismissed – there is a sufficient nexus between Facebook and Canada to support the investigation under PIPEDA. However, the challenge to the jurisdiction of the BC Commissioner was more serious. Nevertheless, it was summarily dismissed in the findings.

Uneasiness about the constitutional reach of PIPEDA in a federal state has meant that the law, which relies on the federal trade and commerce power for its constitutional legitimacy, applies only in the context of commercial activity. It applies across Canada, but it carves out space for those provinces that want to enact their own data protection laws to assert jurisdiction over the intra-provincial collection, use and disclosure of personal information. To oust PIPEDA in this sphere, these laws have to be considered “substantially similar” to PIPEDA (s. 26(2)(b)). Three provinces – BC, Alberta and Quebec – have substantially similar private sector data protection laws. Even within those provinces, PIPEDA will apply to the collection, use or disclosure of personal information by federally-regulated businesses (such as banks or airline companies). It will also apply to cross-border activities by private sector actors (whether international or inter-provincial). This split in jurisdiction over privacy can be complicated for individuals who may not know where to direct complaints, although the different commissioners’ offices will provide assistance. This does not mean there is no room for collaboration. The federal and provincial Commissioners have taken common positions on many issues in the past. These instances are conveniently listed on the website of Alberta’s privacy commissioner.

What has happened in this case is quite different. This is described as a joint investigation between the two Commissioners, and it has resulted in a joint set of recommendations and findings. Both PIPEDA and BC’s PIPA are cited as being applicable laws. In response to the challenge to the BC Privacy Commissioner’s jurisdiction, the Report tersely states that “PIPA (Personal Information Protection Act (British Columbia)) applies to Facebook’s activities occurring within the province of BC”. Yet no information is given as to what specific activities of Facebook were exclusively within the province of BC. No distinction is made at any point in the report between those activities subject to PIPA and those falling under PIPEDA. In this respect, it seems to me that Facebook is entirely correct in challenging the BC Privacy Commissioner’s jurisdiction. Facebook collects, uses and discloses personal information across borders, and its activities with respect to Canadians are almost certainly covered by PIPEDA. If that is the case, then they are not also subject to PIPA. The Exemption Order that finds PIPA BC to be substantially similar to PIPEDA provides:

1. An organization, other than a federal work, undertaking or business, to which the Personal Information Protection Act, S.B.C. 2003, c. 63, of the Province of British Columbia, applies is exempt from the application of Part 1 of the Personal Information Protection and Electronic Documents Act, in respect of the collection, use and disclosure of personal information that occurs within the Province of British Columbia.

Section 3(2) of the Personal Information Protection Act provides:

(2) This Act does not apply to the following:

(c) the collection, use or disclosure of personal information, if the federal Act applies to the collection, use or disclosure of the personal information;

The “federal Act” is defined in s. 1 of PIPA to mean PIPEDA. The scheme is quite simple: if PIPEDA applies then PIPA does not. If the federal Commissioner has jurisdiction over the activities described in the Report, the provincial Commissioner does not. The only way in which the BC Commissioner would have jurisdiction is if there are purely local, provincial activities of Facebook that would not be covered by PIPEDA. Nothing in the Findings suggests that there are. At a minimum, if there are separate spheres of legislative application, these should be made explicit in the Findings.

Jurisdictional issues matter. We already have a complex mosaic of different data protection laws (federal, provincial, public sector, private sector, health sector) in Canada. Individuals must muddle through them to understand their rights and recourses, while organizations and entities must likewise understand which laws apply to which of their activities. Each statute has its own distinct sphere of operation. We do not need the duplication that would result from the adjudication of the same complaint under two (or more) different statutes, or the confusion that might result from different results flowing from different complaint resolutions. If there are separate sets of facts giving rise to separate breaches under different statutes, this has to be spelled out.

Federal-provincial cooperation on data protection is important; it is also valuable for the different privacy commissioners to reach consensus on certain principles or approaches. But creating overlapping jurisdiction over complaints flies in the face of the law and creates more problems than it solves. We have enough data protection challenges to deal with already.

A recent decision on a motion before the Federal Court marks the progress of the Privacy Commissioner’s reference case on whether the Personal Information Protection and Electronic Documents Act (PIPEDA) includes a right to be forgotten. In an earlier report following the OPC’s consultation on digital reputation, the Privacy Commissioner had indicated that he was of the view that PIPEDA, in its unamended form, provided for a right to be forgotten that could be exercised against search engines.

The reference, launched on October 10, 2018, is linked to a complaint filed with the Office of the Privacy Commissioner (OPC) by an individual against Google. The complainant is concerned that Google searches of his name produce links to news articles that he alleges “are outdated and inaccurate and disclose sensitive information such as his sexual orientation and a serious medical condition” (at para 6). The complainant’s view is that by providing prominent links to these articles, Google is breaching PIPEDA. He is seeking to have these results de-indexed, meaning that they would no longer appear in Google search results. De-indexing does not involve the removal of content from the source websites. Basically, the articles would still be out there, but they would not appear in Google search results. Unless similar orders were made against other search engines such as Bing, the content would remain findable using those engines.

The Commissioner has referred two questions to the Federal Court. First, he seeks to know whether Google’s search engine activities constitute the “commercial activity” necessary to bring these activities within the scope of PIPEDA, which applies to the collection, use or disclosure of personal information in the course of commercial activity. The second question is whether Google’s search engine activities, even if commercial, fall within the exception to PIPEDA’s application where personal information is collected, used or disclosed “for journalistic, artistic or literary purposes and for no other purpose” (s. 4(2)(c)). Google and the Attorney General of Canada were given notice of the reference and are entitled to become parties to the reference. Google has challenged the scope of the reference. It seeks to add the question of whether, if PIPEDA does apply to the search engine’s activities, and if there is a deindexing order, such an order would violate s. 2(b) of the Canadian Charter of Rights and Freedoms. This motion to expand the scope of the reference had not yet been heard.

The CBC, along with a coalition of other Canadian media organizations, brought motions seeking to be added as parties to the original reference. Their concern is that the Commissioner’s interpretation of the scope of PIPEDA as including a right to be forgotten is a violation of the freedom of expression guaranteed by s. 2(b) of the Charter. Their argument is based on the principle that the right of expression includes the right to receive information, and that measures taken to limit access to information in the news media thus breach the Charter. By bringing their motion, the media outlets sought to be added as parties, with the right to introduce evidence and make argument before the Court.

The motion was heard by Prothonotary Tabib, who rendered her decision on March 1. She began by noting that since the motion was being heard prior to any decision on Google’s motion to expand the scope of proceedings, party status would be considered only with respect to the original reference questions. She was critical of the motion on the basis that it proceeded “from the fundamental assumption that the Court’s determination of the jurisdictional questions in a way that confers jurisdiction on the OPC to investigate the underlying complaint will inevitably result in deindexing lawful news media content from Internet search results” (at para 17). She noted that in fact the reference questions were directed towards the issue of whether the Commissioner had jurisdiction in the matter. If the outcome of the reference was a finding that there was jurisdiction, the Commissioner would still have to investigate, would have to find the complaint well-founded, and would have to determine whether de-indexing was an appropriate remedy. The Commissioner can only make non-binding orders, so no Charter rights would be violated unless the matter proceeded to a recommendation to de-index with which Google voluntarily complied. If Google refused to comply the complainant or the Commissioner could bring the matter to Federal Court seeking a binding order, but the Court would hold a hearing de novo and might reach different conclusions. Basically, the prothonotary was of the view that the matter was a long way from breaching anyone’s Charter rights. She noted that “The media parties’ reliance on assumptions as to the ultimate result to form the cornerstone of their argument conflates all subsequent steps and determinations into the preliminary issue” (at para 18).

Prothonotary Tabib considered Rule 104(1)(b) of the Federal Courts Rules, which empowers the Court to order a person to be joined as a party. She focused on the issue of whether the presence of the media parties was necessary “for a full and effectual determination” of all of the issues in the reference. The media companies argued that their presence was necessary since the results of the reference would be binding on them. Prothonotary Tabib noted:


The media parties’ arguments thus essentially rest on the underlying assumption that what is truly at issue in this reference is the constitutionality of the Privacy Commissioner’s “intended” institution of a deindexing process in respect of lawful news content from Internet search results. However, as determined above, that is not what is truly at issue in this reference. What is at issue here is only whether Google is subject to or exempt from the application of Part 1 of PIPEDA in respect of how it collects, uses or discloses personal information in the operation of its search engine service when it presents search results in response to an individual’s name. (at para 36)

She observed that the only direct effect of the outcome of the reference would be the Commissioner’s decision to proceed with the investigation of the complaint against Google. She also noted that any freedom of expression impact that might ultimately flow from this matter would be shared by all internet content providers, as well as all those who used Google’s search engines. If the Charter interests of the media entitled them to be parties, then there was virtually no limit to who could be a party – which would be an absurd and unmanageable result. In her view it would be more appropriate for the media companies to seek intervenor status. However, she found that their motion did not address the issues they would need to establish for intervenor status. In brief, they failed to show how their contributions to the argument would be distinct from what Google would provide as party to the reference case. The motions were dismissed, with leave provided for the companies to reapply for leave to intervene once Google’s motion to vary the scope of the reference is decided.


As discussed in my earlier posts here and here, Ontario’s new budget bill contains quite a number of measures related to digital, data and privacy issues. In this third post I look at the proposed new statute that will balance privacy with the openness of provincial tribunal adjudicative records.

This new statute responds to the decision in Toronto Star v. AG Ontario, discussed in an earlier post here, in which Justice Morgan of the Ontario Superior Court of Justice ruled that Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA) breached the right to freedom of expression under s. 2(b) of the Canadian Charter of Rights and Freedoms. It did so because of the way in which it applied to administrative tribunals in respect of requests for access to their adjudicative records. Some tribunals to which FIPPA applied required those seeking access to adjudicative records to file access to information requests. What breached the Charter right was the presumption in FIPPA that personal information could not be disclosed unless one of the statutory exceptions applied. This was found to clash with the open courts principle. Justice Morgan suspended the declaration of invalidity of the legislation for one year in order to give the government time to fix the problem. The year is up later this month; it is therefore not surprising that this legislative change has found its way into the omnibus bill.

The Tribunal Adjudicative Records Act, 2019 provides, as a default principle, that the adjudicative records of tribunals prescribed by regulations enacted under this statute are to be made available to the public (s. 2(1)). The definition of adjudicative records in s. 1(2) is quite broad and includes transcripts of oral evidence, documents admitted in evidence, and reasons for decision. Adjudicative records expressly do not include personal notes or draft decisions, or records related to attempts to resolve matters through alternative dispute resolution procedures.

The obligation to disclose adjudicative records will be subject to any confidentiality orders that the tribunal might make (s. 2(2)). A confidentiality order in relation to personal information can be issued where:

2(3)(b) intimate financial or personal matters or other matters contained in the record are of such a nature that the public interest or the interest of a person served by avoiding disclosure outweighs the desirability of adhering to the principle that the record be available to the public.

A confidentiality order may be applied for by a party to the proceedings or by a person who would be affected by the disclosure of the information at issue (s. 2(3)).

Section 3(1) gives tribunals the authority to make rules governing their own procedures relating to providing access or issuing confidentiality orders. Under s. 4, tribunals are, with ministerial permission, entitled to charge fees for access to their adjudicative records. The new statute also provides for consequential amendments to FIPPA that will exclude the application of that statute to “personal notes, draft decisions, draft orders and communications related to draft decisions or draft orders that are created by or for a person who is acting in a quasi-judicial capacity”. It also excludes the application of FIPPA to adjudicative records covered by the new statute.

This new statute resolves the constitutional issues at the heart of the Toronto Star decision. It does not, however, resolve other issues related to privacy and administrative tribunal decisions that have long been the subject of debate and discussion. In a recent Ontario case, for example, the personal information of third parties to a matter before the Ontario Human Rights Tribunal ended up in the tribunal’s decision. While the new Tribunal Adjudicative Records Act will allow third parties to apply for confidentiality orders, it is not clear how such individuals will know in advance that their personal information might be published. Further, many administrative tribunals deal with highly sensitive matters involving personal health or financial information. While they are urged to take privacy into account in the drafting of their decisions and in the amount of personal information shared, the trend towards providing broader access through online publication of decisions is leading to greater privacy risks for individuals that may not be properly balanced against the open courts principle. It would have been good to see in this new statute some recognition of the importance of these issues. Administrative tribunals are not courts, and government would not unduly interfere with their independence by stating in law that the disclosure of personal information should be minimized to only that which is clearly necessary to explain the reasons for decision, or by limiting the disclosure of some personal information in versions of decisions published online.

Schedule 31 and Schedule 41 of Ontario’s new omnibus Budget Bill amend the Freedom of Information and Protection of Privacy Act (FIPPA) and the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA) respectively. One change to both statutes will expand the ability of public sector bodies to share personal information with law enforcement without consent. A more extensive set of amendments to FIPPA constitutes another piece of the government’s digital and data strategy, which is further developed in the Simpler, Faster, Better Services Act, another piece of the budget bill discussed in my post here.

FIPPA and MFIPPA set the rules for the collection, use and disclosure of personal information by the public sector. MFIPPA applies specifically to municipalities, and FIPPA to the broader public sector. Both statutes prohibit the disclosure of personal information under the custody or control of a public body unless such a disclosure falls under an exception. Currently, both statutes have an exception related to investigations which reads:

(g) if disclosure is to an institution or a law enforcement agency in Canada to aid an investigation undertaken with a view to a law enforcement proceeding or from which a law enforcement proceeding is likely to result;

The Budget Bill will amend this exception by replacing it with:

(g)  to an institution or a law enforcement agency in Canada if,

(i)  the disclosure is to aid in an investigation undertaken by the institution or the agency with a view to a law enforcement proceeding, or

(ii)  there is a reasonable basis to believe that an offence may have been committed and the disclosure is to enable the institution or the agency to determine whether to conduct such an investigation;

Paragraph (g)(i) is essentially the same as the original provision. What is new is paragraph (g)(ii). It broadens the circumstances in which personal information can be shared with law enforcement. Not only that, it does so in the squishiest of terms. There must be a reasonable basis to believe that an offence may have been committed. This is different from a reasonable basis to believe that an offence has been committed. Not only does it lower the threshold in the case of individuals, it may also open the door to the sharing of personal information for law enforcement fishing expeditions. After all, if enough people file for certain benefits, it might be reasonable to believe that an offence may have been committed (there’s always someone who tries to cheat the system, right?). The exception could enable the sharing of a quantity of personal information to permit the use of analytics to look for anomalies that might suggest the commission of an offence. The presence of this amendment in an omnibus budget bill that will receive very little scrutiny or debate contradicts the government’s own statement, in its announcement of its data strategy consultation, that “Data privacy and protection is paramount.” This is not a privacy-friendly amendment.

The other set of amendments to FIPPA contained in the budget bill is aimed at something labelled “data integration”. This is a process meant to allow government to derive greater value from its stores of data by generating useful data, including statistical data, for government and its departments and agencies. It allows for the intra-governmental sharing of data for preparing statistics for the purposes of resource management or allocation, as well as the planning and evaluation of the delivery of government funded programs and services, whether they are funded “in whole or in part, directly or indirectly” (s. 49.2(b)).

Because these amendments contemplate the use of personal information, there are measures specifically designed to protect privacy. For example, under s. 49.3, personal information is not to be used for data integration unless other data will not serve the purpose, and no more personal information shall be used than is reasonably necessary to meet the purpose. Public notice of the indirect (i.e. not directly from the individual) collection of personal information must be provided under s. 49.4. Any collection of personal information can only take place after data standards provided for in s. 49.14 have been approved by the Privacy Commissioner (s. 49.5). Once collected, steps must be taken to deidentify the personal information. The amendments include a definition of deidentification, which involves the removal of direct identifiers as well as any information “that could be used, either alone or with other information, to identify an individual based on what is reasonably foreseeable in the circumstances” (s. 49.1). Section 49.8 specifically prohibits anyone from using or attempting to use “information that has been de-identified under this Part, either alone or with other information, to identify an individual”.

Provision is made for the disclosure of personal information collected through the data integration scheme in limited circumstances – this includes the unfortunately worded exception discussed above where “there is a reasonable basis to believe that an offence may have been committed”. (s. 49.9(c)(ii)).

In terms of transparency, a new s. 49.10 provides for notice to be published on a website setting out information about any collection of personal information by a ministry engaged in data integration. The information provided must include the legal authority for the collection; the type of personal information that may be collected; and the information sources, the purpose of any collection, use or disclosure, as well as the nature of any linkages that will be made. Contact information must also be provided for someone who can answer any questions about the collection, use or disclosure of the personal information. Contact information must also be provided for the Privacy Commissioner. Data standards developed in relation to data integration must also be published (s. 49.14(2)), and any data integration unit that collects personal information must publish an annual report setting out prescribed information (s. 49.13).

Section 49.11 mandates the safe storage and disposal of any personal information, and sets retention limits. It also provides for data breach notification to be made to affected individuals as well as to the Commissioner. The Commissioner has the power, under s. 49.12 to review the practices and procedures of any data integration unit if the Commissioner “has reason to believe that the requirements of this Part are not being complied with”. The Commissioner has power to make orders regarding the discontinuance or the modification of practices or procedures, and can also order the destruction of personal information or require the adoption of a new practice or procedure.

The amendments regarding data integration are clearly designed to facilitate a better use of government data for the development and delivery of programs and services and for their evaluation. These are important measures and seem to have received some careful attention in the amendments. Once again, however, these seem to be important pieces of the data strategy for which the government has recently launched a consultation process that seems to be becoming more irrelevant by the day. Further, as part of an omnibus budget bill, these measures will not receive much in the way of discussion or debate. This is particularly unfortunate for two reasons. First, as the furore over Statistics Canada’s foray into using personal information to generate statistical data shows, transparency, public input and good process are important. Second, the expansion of bases on which personal information shared with government can be passed along to law enforcement merits public scrutiny, debate and discussion. Encroachments on privacy slipped by on the sly should be particularly suspect.

Schedule 56 of the Budget Bill introduces a new statute, the Simpler, Faster, Better Services Act, 2019 (SFBSA), that, once passed, will take effect when proclaimed by the Lieutenant Governor. That its passage is a foregone conclusion is evidenced by the fact that the role of Chief Digital and Data Officer, created under the statute, has already been filled with the announcement of the appointment of Hillary Hartley. The goal of the SFBSA is to “promote the transformation of government services in Ontario” (s. 1). Among other things, the Act provides for the appointment of a Chief Digital and Data Officer (CDDO) who is tasked with promoting the development and implementation of public sector digital services; providing advice to public sector organizations on digital services; assessing the design, development and effectiveness of these services; and promoting the use of data and effective data management (s. 3(1)). The CDDO will also promote the proactive publication of data by public sector organizations and involve the public in the design and implementation of digital services. Under s. 3(3) of the Act, the CDDO must also establish a digital and data action plan which, in broad terms, will develop initiatives to promote the adoption of digital services and the improvement of existing services. The action plan will also promote the development of “effective data management and data sharing across public sector organizations”, and will specifically promote the use of technology that is scalable and interoperable. The action plan must also set targets and indicators for the evaluation of progress, and is to be reviewed and adapted as necessary at least every three years.

The CDDO is also charged, under s. 4 of the Act with setting standards for digital services and for open data. The open data standards can include “requirements to make specified datasets publicly available”, and will also include formal and technical standards for the data. This can include standards with respect to metadata, as well as the frequency and manner by which data sets are to be made public.

Interestingly, while this section is described as addressing “open data standards”, the requirements in the SFBSA actually relate to making public sector data “publicly available”. This is subtly different from open data in the classic sense. For example, s. 4(3)(d) allows the CDDO to set “the terms by which a public sector organization shall grant licences for the use of the datasets it publishes”. This suggests that some data might be made publicly available under more restrictive terms and conditions than traditional open data. Examples of possible restrictions might include non-commercial use limitations, or requirements that no attempts be made to reidentify deidentified data in the licensed data set. They might even include fees for access to some data sets, as nothing in the SFBSA actually requires the data to be provided free of charge. The statute also provides for the enactment of regulations, and these regulations can formalize the adopted standards.

The CDDO is also charged with maintaining a catalogue listing and describing all public sector datasets, including those that are required to be publicly available. The only exceptions relate to information that must be kept confidential under a law of Canada or Ontario, or information relating to “confidential law enforcement activities or other matters involving public safety or security” (s. 4(10)). The inventory and the standards developed for public sector data must also be made publicly available.

The SFBSA sets out, in s. 5, principles that must be followed by public sector organizations in developing and using digital services. Section 5(2) identifies principles that should guide the management of data and its public release.

The CDDO has some enforcement powers under the legislation in the sense that she may find organizations to be non-compliant and order them to change their practices, and can provide notice of non-compliance to the Management Board of Cabinet.

It should be noted that this statute is meant to apply both to public sector organizations (government ministries and public bodies), as well as “broader public sector organizations”. This latter category will include organizations referred to in a Schedule to the SFBSA, notably municipalities, school boards and universities, and some health services facilities.

Overall, this is a very interesting piece of public policy. Although provincial, federal and municipal governments across Canada have made commitments to open data, Ontario is the first to legislate open data requirements (or at least ‘publicly available data’ requirements). The establishment of a CDDO with a legislated mandate is also a positive commitment to improving digital and data services in the province. The principles that will guide digital services development and delivery as well as data management are important, straightforward, and public-interest oriented. The importance of this legislation, as Amanda Clarke says in her excellent post (with more to follow), “is exactly why this policy change demands broad and sustained scrutiny”.

While the substance of this statute is interesting and important, the process behind it is problematic. In February 2019 the Ontario government launched its data strategy consultation. The first step (which ended in March) was to accept submissions from the public. The second was to establish an advisory panel that would continue consultations and ultimately report in the Fall of 2019. Yet the SFBSA seems to contain precisely the kinds of measures contemplated by the data strategy consultation. In doing so it calls into question the genuineness of the consultation process. The process deficiencies are further reinforced by the fact that the SFBSA is crammed into an omnibus budget bill which will ultimately pass with a minimum of scrutiny and debate. It’s an interesting statute and an important piece of public policy, but the public and democratic process around it is not good.

Thursday, 04 April 2019 12:54

Open Banking & Data Ownership

On April 4, 2019 I appeared before the Senate Standing Committee on Banking, Trade and Commerce (BANC), which has been holding hearings on Open Banking following the launch of a public consultation on Open Banking by the federal government. Open banking is an interesting digital innovation initiative with both potential and risks. I wrote earlier about open banking and some of the privacy issues it raises here. I was invited by the BANC Committee to discuss ‘data ownership’ in relation to open banking. The text of my opening remarks to the committee is below. My longer paper on Data Ownership is here.

_______________

Thank you for this invitation and opportunity to meet with you on the very interesting subject of Open Banking, and in particular on data ownership questions in relation to open banking.

I think it is important to think about open banking as the tip of a data iceberg. In other words, if Canada moves forward with open banking, this will become a test case for rendering standardized data portable in the hands of consumers with the goal of providing them with more opportunities and choices while at the same time stimulating innovation.

The question of data ownership is an interesting one, and it is one that has become of growing importance in an economy that is increasingly dependent upon vast quantities of data. However, the legal concept of ‘ownership’ is not a good fit with data. There is no data ownership right per se in Canadian law (or in law elsewhere in comparable jurisdictions, although in the EU the idea has recently been mooted). Instead, we have a patchwork of laws that protect certain interests in data. I will give you a very brief overview before circling back to data portability and open banking.

The law of confidential information exists to protect interests in information/data that is kept confidential. Individuals or corporations are often said to ‘own’ confidential information. But the value of this information lies in its confidentiality, and this is what the law protects. Once confidentiality is lost, so is exclusivity – the information is in the public domain.

The Supreme Court of Canada in 1988 also weighed in on the issue of data ownership – albeit in the criminal law context. It ruled in R. v. Stewart that information could not be stolen for the purposes of the crime of theft, largely because of its intangible nature. Someone could memorize a confidential list of names without removing the list from the possession of its ‘owner’. The owner would be deprived of nothing but the confidentiality of and control over the information.

It is a basic principle of copyright law that facts are in the public domain. There is good reason for this. Facts are seen as the building blocks of expression, and no one should have a monopoly over them. Copyright protects only the original expression of facts. Under copyright law, it is possible to have protection for a compilation of facts – the original expression will lie in the way in which the facts are selected or arranged. It is only that selection or arrangement that is protected – not the underlying facts. This means that those who create compilations of fact may face some uncertainty as to the existence and scope of any copyright. The Federal Court of Appeal, for example, recently ruled that there was no copyright in the Ontario Real Estate Board’s real estate listing data.

Of course, the growing value of data is driving some interesting arguments – and decisions – in copyright law. A recent Canadian case raises the possibility that facts are not the same as data under copyright law. This issue has also arisen in the US. Some data are arguably ‘authored’, in the sense that they would not exist without efforts to create them. Predictive data generated by algorithms are an example, or data that require skill, judgment and interpretation to generate. Not that many years ago, Canada Post advanced the argument that they had copyright in a postal code. In the US, a handful of cases have recognized certain data as being ‘authored’, but even in those cases, copyright protection has been denied on other grounds. According ownership rights over data – and copyright law provides a very extended period of protection – would create significant issues for expression, creation and innovation.

The other context in which the concept of data ownership arises is in relation to personal information. Increasingly we hear broad statements about how individuals ‘own’ their personal information. These are not statements grounded in law. There is no legal basis for individuals to be owners of their personal information. Individuals do have interests in their personal information. These interests are defined and protected by privacy and data protection laws (as well as by other laws relating to confidentiality, fiduciary duties, and so on). The GDPR in Europe was a significant expansion/enhancement of these interests, and reform of PIPEDA in Canada – if it ever happens – could similarly enhance the interests that individuals have in their personal data.

Before I speak more directly of these interests – and in particular of data portability – I want to briefly explain why it is difficult to conceive of interests in personal data in terms of ownership.

What personal data could you be said to own, and what would it mean? Some personal data is observable in public contexts. Do you own your name and address? Can you prevent someone from observing you at work every day and deciding you are regularly late and have no dress sense? Is that conclusion your personal information or their opinion? Or both? If your parents’ DNA might reveal your own susceptibility to particular diseases, is their DNA your personal information? If an online bookstore profiles you as someone who likes to read Young Adult Literature – particularly vampire themed – is that your personal information or is it the bookstore’s? Or is it both? Data is complex and there may be multiple interests implicated in the creation, retention and use of various types of data – whether it is personal or otherwise. Ownership – a right to exclusive possession – is a poor fit in this context. And the determination of ownership on the basis of the ‘personal’ nature of the data will overlook the fact that there may be multiple interests entangled in any single datum.

What data protection laws do is define the nature and scope of a person’s interest in their personal information in particular contexts. In Canada, we have data protection laws that apply with respect to the public sector, the private sector, and the health sector. In all cases, individuals have an interest in their personal information which is accompanied by a number of rights. One of these is consent – individuals generally have a right to consent to the collection, use or disclosure of their personal information. But consent for collection is not required in the public sector context. And PIPEDA has an ever-growing list of exceptions to the requirements for consent to collection, use or disclosure. This shows how the interest is a qualified one. Fair information principles reflected in our data protection laws place a limit on the retention of personal information – when personal information collected by an organization is no longer required for the purpose for which it was collected, the organization’s obligation is to securely dispose of it – not to return it to the individual. The individual has an interest in their personal information, but they do not own it. And, as data protection laws make clear, the organizations that collect, use and disclose personal information also have an interest in it – and they may also assert some form of ownership rights over their stores of personal information.

As I mentioned earlier, the GDPR has raised the bar for data protection world-wide. One of the features of the GDPR is that it greatly enhances the nature and quality of the data subject’s interest in their personal information. The right to erasure, for example, limited though it might be, gives individuals control over personal information that they may have, at one time, shared publicly. The right of data portability – a right that is reflected to some degree in the concept of open banking – is another enhancement of the control exercised by individuals over their personal information.

What portability means in the open banking context is that individuals will have the right to provide access to their personal financial data to a third party of their choice (presumably from an approved list). While technically they can do that now, it is complicated and not without risk. In open banking, standard data formats will make portability simple, and will enhance the ability to bring the data together for analysis and to provide new tools and services. Although individuals will still not own their data, they will have a further degree of control over it. Thus, open banking will enhance the interest that individuals have in their personal financial information. This is not to say that it is without risks or challenges.


Ongoing litigation in Canada over the recovery by provincial governments of health care costs related to tobacco use continues to raise interesting issues about the intersection of privacy, civil procedure, and big data analytics. A March 7, 2019 decision of the New Brunswick Court of Queen’s Bench (Her Majesty the Queen v. Rothmans Inc.) picks up the threads left hanging by the rather muted decision of the Supreme Court of Canada in British Columbia v. Philip Morris International, Inc.

The litigation before the Supreme Court of Canada arose from the BC government’s attempt to recover tobacco-related health care costs in that province. The central issue concerned the degree of access to be provided to one of the big tobacco defendants, Philip Morris International (PMI), to the databases relied upon by the province to calculate tobacco-related health care costs. PMI wanted access to the databases in order to develop its own experts’ opinions on the nature and extent of these costs, and to challenge the opinions to be provided by provincial experts who would have full access to the databases. Although the databases contained aggregate, de-identified data, the government denied access, citing the privacy interests of British Columbians in their health care data. As a compromise, it offered limited and supervised access to the databases at a Statistics Canada Research Data Centre. While the other tobacco company defendants accepted this compromise, PMI did not, and sought a court order granting it full access.

The Supreme Court of Canada’s decision was a narrow one. It interpreted the applicable legislation as making health care records and documents of individuals non-compellable in litigation for recovery of costs based on aggregate health care data. The Court considered the health databases to be “records” and “documents” and therefore not compellable. However, its decision touched only on the issue of whether PMI was entitled to access the databases to allow its own experts to prepare opinions. The Court did not address whether a defendant would be entitled to access the databases in order to challenge the plaintiff’s expert’s report that was created using the database information. Justice Brown, who wrote for the unanimous Court, stated: “To be clear, the databases will be compellable once "relied on by an expert witness": s. 2(5)(b). A "statistically meaningful sample" of the databases, once anonymized, may also be compelled on a successful application under ss. 2(5)(d) and 2(5)(e).” (at para 36) In response to concerns about trial fairness, Justice Brown noted the early stage of the litigation, and stated that: “Within the Act, the Legislature has provided a number of mechanisms through which trial fairness may be preserved. Specifically, s. 2(5)(b) itself requires that any document relied upon by an expert witness be produced.” (at para 34) He also observed that:


[Section] 2(5)(d) permits a court, on application, to order discovery of a "statistically meaningful sample" of any of the records and documents that are otherwise protected by s. 2(5)(b). No defendant has yet made such an application and thus no court has yet had reason to consider what would constitute a "statistically meaningful sample" of the protected documents. (at para 35)

The Supreme Court of Canada therefore laid the groundwork for the motions brought to the New Brunswick Court of Queen’s Bench under essentially similar legislation. Section 2 of New Brunswick’s Tobacco Damages and Health Care Costs Recovery Act is more or less identical to the provisions considered by the Supreme Court of Canada. Sections 2(5)(b), (d) and (e) of the Act provide:

2(5). . .

(b) the health care records and documents of particular individual insured persons or the documents relating to the provision of health care benefits for particular individual insured persons are not compellable except as provided under a rule of law, practice or procedure that requires the production of documents relied on by an expert witness,

. . .

(d) notwithstanding paragraphs (b) and (c), on application by a defendant, the court may order discovery of a statistically meaningful sample of the documents referred to in paragraph (b) and the order shall include directions concerning the nature, level of detail and type of information to be disclosed, and

(e) if an order is made under paragraph (d), the identity of particular individual insured persons shall not be disclosed and all identifiers that disclose or may be used to trace the names or identities of any particular individual insured persons shall be deleted from any documents before the documents are disclosed.

Thus, the provisions allow for discovery of documents relied upon by the government, subject to an obligation to deidentify them.

An expert witness for the Province of New Brunswick had produced several reports relying on provincial health care data. The province maintained that for privacy reasons the defendant should not have direct access to the data, even though it was deidentified in the database. It offered instead to provide access through a Statistics Canada Research Data Centre. The defendant sought “a "statistically meaningful sample" of clinical health care records concerning 1,273 individual insured persons in New Brunswick, under the authority of subsections 2(5)(d) and (e) of the Act.” (at para 2) It also sought a production order for “all Provincial administrative databases and national survey data” relied upon by the Province’s expert witness in preparing his reports. In addition, it sought access to data from other provincial health databases that were not relied upon by the expert in his report – the defendant was interested in assessing the approaches he chose not to pursue in addition to those he actually pursued. The province argued that it had provided sufficient access to relevant data through the Statistics Canada RDC, which implemented appropriate safeguards to protect privacy.

Justice Petrie first considered whether the access via Statistics Canada was adequate and he concluded that it was not. He noted that one of the other defendants in the litigation had filed an access to information request with Statistics Canada and had thereby learned of some of the work carried out by the province’s expert witness, including some “calculations and analysis” that he had chosen not to rely upon in his work. While the defendants were not prejudiced by this disclosure, they used it as an example of a flaw in the system administered by Stats Canada since its obligations under the Access to Information Act had led to the disclosure of confidential and privileged information. They argued that they could be prejudiced in their own work through Stats Canada by access to information requests from any number of entities with interests adverse to theirs, including other provincial governments. Justice Petrie sided with the defendants. He found that: “the Province's production of the data and materials relied upon by Dr. Harrison only within the confines and authority of a third party to this litigation, StatsCan/RDC poses a real risk to the confidentiality and privilege that must be accorded to the defendants and their experts.” (at para 66) He also stated:


The risk of potential premature or inadvertent disclosure, as determined by StatsCan, presents an unfair obstacle to the defendants' experts if required to undertake their analysis only within StatsCan/RDC. In short, the StatsCan Agreement terms and conditions are overly restrictive and likely pose a serious risk to trial fairness. I am of the view that less restrictive options are available to the Court and ones that more fairly balance trial fairness with the risks to any privacy breach for individual New Brunswickers. (at para 65)

These less restrictive options stem from the Court’s own power to “provide for directions on production and to protect the personal and sensitive information of individuals.” (at para 68) Justice Petrie found that “there are no applicable restrictions under privacy legislation to prohibit the Court from ordering document production outside of the StatsCan/RDC in the circumstances.” (at para 72) He rejected arguments that the Statistics Act prevented such disclosures, ruling that custody and control over the health data remained shared between the province and Stats Canada, and that the court could order the province to disclose it. Further, he found:


Where, as here, the Province has served the defendants with five expert reports of Dr. Harrison and indicated their intention to call him as a witness at trial, I find that subsection 2(5)(b) of the Act expressly requires production of the materials "relied upon" by the expert in the ordinary course. I am confident that the Court is capable of fashioning an order which would adequately address any privacy or reidentification concerns while, at the same time, imposing more balanced measures on the defendants and/or their experts. (at para 82)

These measures could include a direction by the court that no party attempt to identify specific individuals from the deidentified data.

On the issue of the disclosure of a statistically meaningful sample of health records, the defendant sought a sample of records from over 1,200 New Brunswick patients. The legislation specifically provides in s. 2(5)(d) that a court may order discovery of “a statistically meaningful sample of the documents”, so long as they are deidentified. Justice Petrie found that there was a statutory basis for making this order, so long as privacy could be preserved. He rejected the province’s argument that the only way to do this was through the Stats Canada RDC. Instead, he relied upon the court’s own powers to tailor orders to the circumstances. He stated: “I am of the view that there is a satisfactory alternative to the StatsCan/RDC Agreement on terms that can allow for any re-identification risks to be properly addressed by way of a consent order preferably, and if not, by way of further submissions and ruling of this Court.” (at para 131)

On the issue of privacy and the deidentified records in the statistically meaningful sample, Justice Petrie stated:


Even if individuals might be able to be re-identified, which I am not convinced, it is not clear why the defendants would ever do so. [. . .] With respect to this request for an individual's personal health records, the Province has suggested no other alternative to such a sample, nor any alternative to the suggested approach on "anonymization" of the information. (at para 141)

He granted the orders requested by the defendants and required the parties to come to terms on a consent order to protect privacy in a manner consistent with his reasons.

This decision raises issues that are more interesting than those that were before the Supreme Court of Canada, mainly because the court in this case was required to specifically address the balance between privacy and fairness in litigation. The relevant legislation clearly does not require defendants to accept the plaintiff’s analyses of health data at face value; they are entitled to conduct their own analyses to test the plaintiff’s evidence, and they are permitted to do so using the data directly and not through some intermediary. While this means that sensitive health data, although anonymized, will be in the hands of the defendant tobacco companies, the court is confident that the rules of the litigation process, including the implied undertaking rule and the power of the court to set limits on the parties’ conduct, will be sufficient to protect privacy. Although this court seems to believe that reidentification is not likely to be possible (a view that is certainly open to challenge), even if it were possible, a direction from the court that no analyses designed to permit identification be undertaken is considered sufficient.

