Teresa Scassa - Blog

Displaying items by tag: personal health information

A recent communication from the Office of the Information and Privacy Commissioner of Ontario (IPC) highlights how rapidly evolving and widely available artificial intelligence-enabled tools can pose significant privacy risks for organizations.

The communication in question was a letter to an unnamed hospital (“the hospital”) which had reported a data breach to the IPC. The letter reviewed the breach, set out a series of recommendations for the hospital, and requested an update on the hospital’s response to the recommendations by late January 2026. Although the breach occurred in the health sector, with its strict privacy laws, lessons extend more broadly to other sectors as well.

The breach involved the use of a transcription tool of a kind now regularly in use by many physicians to document physician-patient interactions. AI Scribe tools record and transcribe physician-patient interactions and generate summaries suitable for inclusion in electronic medical records. These functions are designed to relieve physicians of significant note-taking and administrative burdens. Although many task-specific AI Scribe tools are now commercially available, the tool used in this case was Otter.ai, a general transcription tool designed for use in a broad range of contexts.

This breach was complicated by the fact that the Otter.ai tool acted as an AI agent of the physician who had downloaded it. AI agents can perform a series of tasks with a certain level of autonomy. In this case, the tool can be integrated with different communications platforms, as well as with the user’s digital calendar (such as Outlook). Essentially, Otter.ai can scan a user’s digital calendar and join scheduled meetings. The tool then transcribes and summarizes the meeting. It can also share both the summary and the transcription with other meeting participants – all without direct user intervention.

The physician had downloaded Otter.ai and provided it with access to his calendar over a year after he left the hospital that reported the breach. Because he had used his personal email, rather than his hospital email, for internal communications while at that hospital, his departure in 2023 and the deactivation of his hospital email account had not led to the removal of his personal email from meeting invitation lists. When he downloaded Otter.ai in September 2024 and gave it access to his digital calendar, he was still receiving invitations from the hospital to hepatology rounds. Although the physician did not attend these rounds following his departure, his AI agent did. It attended a September 2024 meeting, produced a transcript and meeting summary, and emailed the summary with a link to the full transcript to all 65 individuals on the meeting invitation list. The breach was presumably reported to the hospital by one or more of the email recipients. Seven patients had been seen during the hepatology rounds, and the transcript and summary contained their sensitive personal health information.

The hospital took immediate action to address the breach. It cancelled the digital invitation to the physician and contacted all recipients of the summary and transcript, asking them to promptly delete all copies of the rogue email and attachments. It also sent a notice to all staff reminding them that they are not permitted to use non-approved tools in association with their hospital credentials and/or devices. It contacted the physician who had used Otter.ai and ensured that he removed all digital connections with the hospital. It also requested that he contact Otter.ai to ask that all information related to the meeting be deleted from its systems. Patients affected by the breach were also notified by the hospital. To prevent future breaches, the hospital created firewalls to block on-site access to non-approved scribing tools, updated its training materials to address the use of unapproved tools, and revised its Appropriate Use of Information and Information Technology policy. The revised policy emphasizes the importance of using only hospital-approved IT resources. It also advises regular review of participant lists for meetings to ensure that AI tools or automated agents are not included.

In addition to these steps, the IPC made further recommendations, including that the hospital itself contact Otter.ai to request the deletion of any patient information that it may have retained. Twelve of the sixty-five email recipients had not confirmed that they had deleted the emails, and the IPC recommended that the hospital follow up to ensure this had been done. Updates to the hospital’s breach protocol were also recommended, as were changes to offboarding procedures to ensure that access to hospital information systems is “immediately revoked” when personnel leave the hospital. The IPC also recommended the use of mandatory meeting lobbies for all virtual meetings so that unauthorized AI agents are not permitted access to meetings.

This incident highlights some of the important challenges faced by hospitals – as well as by many other organizations – with the development of widely available generative and agentic AI tools. Where sophisticated and powerful tools in the workplace were once more easily controlled by the employer, it is increasingly the case that employees have independent access to such tools. Shadow AI usage is a growing concern for organizations, as it may pose unexpected – and even undetected – risks to the privacy and confidentiality of information. Rapidly evolving agentic AI tools – with their capacity to act independently – may also create challenges, particularly where employees are not familiar with their full range of functions or default settings.

Medical associations and privacy commissioners’ offices have begun developing guidance for the use of AI Scribes in medical practice (see, e.g., guidance from the Saskatchewan and Alberta OIPCs). OntarioMD has even gone so far as to develop a list of approved AI scribe vendors – ones that it considers to meet privacy and security standards. However, the tool used in this case was designed for all contexts and is available in both free and paid versions, which only serves to highlight the risks and challenges in this area. The widespread availability of such tools poses important governance issues for privacy and security conscious organizations. Even where an organization subscribes to a particular tool that has been customized to its own privacy and security standards, employees still have access to many other tools that they might already use in other contexts. The risk that an employee will simply decide to use a tool with which they are already familiar and comfortable must be considered.

More generic transcription tools may also pose other risks in the medical context, since they are not specifically trained or designed for a particular context such as health care. For example, they may be less adept at dealing with medical terminology, prescription drug names, or other terms of art. This could increase the incidence of errors in any transcriptions or summaries.

The risk that data collected through unauthorized tools may be used to train AI systems also underscores the potential consequences for privacy and confidentiality. Under Ontario’s Personal Health Information Protection Act (PHIPA), a health information custodian is not authorized to share personal health information with third parties without the patient’s express consent to do so. Using health-care related transcription or voice recordings to train third party AI systems without this express consent is not permitted. Although some services indicate that they only use “de-identified” information for system training, the term “de-identified” may not be defined in the same way as in PHIPA. For example, stripping information of all direct identifiers (names, ID numbers, etc.) does not amount to de-identification under PHIPA, which also requires the removal of information “for which it is reasonably foreseeable in the circumstances that it could be utilized, either alone or with other information, to identify the individual”.

This incident highlights the vulnerability of sensitive personal information in a context in which a proliferation of novel (and evolving) technological tools for personal and professional use is rampant. Organizations must act quickly to assess and mitigate risks, and this will require regular engagement with and training of personnel.

Note: A pre-print version of my research paper with Daniel Kim on AI Scribes can be found here.


Published in Privacy

Ontario plans to introduce digital identity services (Digital ID) to provide Ontarians with better access to their personal health information (PHI) in the provincial Electronic Health Record (EHR). This is being done through proposed amendments to the Personal Health Information Protection Act (PHIPA) introduced in Schedule 6 of Bill 231, currently before the legislature. Schedule 6 replaces proposed amendments to PHIPA regulations that were introduced in the summer of 2024 and that were substantively criticized by Ontario’s Privacy Commissioner. In introducing Bill 231, Health Minister Sylvia Jones stated that the goal is “to provide more people with the right publicly funded care in the right place by making it easier to access your health care records”.

Digital ID is an electronic means of verifying a person’s identity. Typically, such systems include some form of biometric data (for example, a face-print) to create a secure and verifiable ID system. We are becoming increasingly used to consuming products and services from both public and private sector sources in mobile and online contexts. Digital ID has the potential to improve secure access to these services.

Digital ID is already in place in many countries, but adoption has been slow in Canada. This may be in part because Digital ID raises concerns among some about the empowerment of a surveillance state. There are rumours that Ontario retreated from plans to introduce a more ambitious public sector Digital ID system over concerns about potential backlash, although it is quietly moving ahead in Bill 231 with the Digital Health ID. Unfortunately, Digital ID is most advantageous where a single Digital ID can be used to access multiple sites and services, eliminating the need to manage numerous usernames and passwords (with the security risks such management can entail). It is important to note that under Bill 231, the Digital Health ID will be single purpose, significantly reducing its advantages.

There is no doubt that Digital ID systems raise important privacy and security issues. They must be carefully implemented to ensure that the sensitive personal information they incorporate and the identities they represent are not misappropriated. They also raise equity issues. If Digital ID provides better and faster access to information and services, those who are not able to make use of Digital ID – because of age, disability, or the digital divide – will be at a disadvantage. Attention must be paid to ensuring that services and information are still available to those who must use other forms of identification – and that those other forms of identification remain accessible so long as they are needed.

Ontario’s Privacy Commissioner, in her comments on Bill 231, indicates that she fully supports the Ontario government’s goal in introducing Digital ID for the Electronic Health Record. She notes the importance of “enabling meaningful access to one’s health records” and agrees that “EHR access can help Ontarians better manage their health, and in turn, help create efficiencies in the health care system”. However, while she endorses the objectives, the Commissioner is highly critical of Bill 231. Her detailed comments note that the proposed amendments to PHIPA have the potential to reduce rights of access to personal health information in the EHR; that the bill contains no parameters on how, why and by whom the Digital ID scheme will be used; and that it includes broad regulation and directive making powers that could unravel rights and requirements already in place under PHIPA. She also observes that it conflates and converges the role of Ontario Health with respect to health data and Digital ID, and that it creates inconsistent and incomplete powers that will hinder enforcement and oversight. These are important concerns, articulately expressed by the head of perhaps the only independent body in the province capable of making sense of Bill 231’s Schedule 6.

Schedule 6 is brutally difficult to read and comprehend. This is largely because the introduction of Digital Health ID is being done as a series of amendments to an already (overly) complex piece of health privacy legislation. New legislation often has a narrative structure that – although not gripping reading – is at least relatively easy to understand and to follow. Bills that amend existing legislation can also generally be understood by those who work with them. You can cross-reference and see where new powers are added, and where the wording of clauses has been changed. But Schedule 6 of Bill 231 is an ugly hybrid. It introduces a complex new Digital Health ID scheme as an amendment to existing health privacy legislation, even though Digital Health ID is more than just a privacy issue. There is no doubt that such a system would have to be compliant with PHIPA and that some amendments might be required. However, Digital Health ID creates a new system for accessing health data in the EHR. It could have been introduced as a separate bill. Such an approach would have been clearer, more transparent and more accessible than the convoluted and incomplete scheme that has been shoe-horned into PHIPA by Bill 231.

It is not just the lack of transparency caused by such a contorted set of amendments that is a problem. A 2019 presentation by Assistant Deputy Minister of Health Hein on the government’s “Digital First for Health” program promised to “[m]odernize PHIPA to make it easier for Ontarians to access their information, streamline information sharing processes, and support the use of data for analytics and planning.” One of the goals of PHIPA modernization was “[r]educing barriers to patient access by enabling patients to more easily access, use, and share their personal health information, empowering them to better manage their health.” This sets up Digital ID as part of the PHIPA modernization process. But Digital ID is not a “solution” to barriers caused by privacy laws. For Digital ID, the real barriers to better access to health data are structural and infrastructural issues in health data management.

Let me be clear that I am not suggesting that the Ontario government’s health system reform goals are not important. They are. But Digital Health ID should not be framed as “PHIPA modernization”. The objectives of such a system are not about modernizing health privacy legislation; they are about modernizing the health care system. They will have privacy implications which will need to be attended to, but framing them as “PHIPA modernization” means that you end up where we are now: with changes to the health care system being implemented through complicated and problematic amendments to legislation that is first and foremost meant to protect the privacy of personal health information.

Australia and New Zealand have both introduced government-backed digital ID systems through specific digital identity legislation. Admittedly both statutes address digital identity more broadly than just in the health sector. Nevertheless, these laws are examples of how legislation can clearly and systematically set out a framework for digital identity that includes all the necessary elements – including how the law will protect privacy and how it dovetails with existing privacy laws and oversight. This kind of framework facilitates public debate and discussion. It makes it easier to understand, critique and propose improvements to the Bill. In her comments on Bill 231, for example, the Privacy Commissioner notes that “[c]larity and coherence of the many roles of Ontario Health would also assist my office’s oversight and enforcement role.” She observes that Schedule 6 “is inconsistent and incomplete in its approach to my office’s oversight and enforcement authority”. These are only two examples of places in her comments where it is evident that the lack of clarity regarding the proposed Digital Health ID scheme hampers its assessment.

Schedule 6 also leaves much of its substance to future regulations and directives. This is part of a disturbing trend in law-making in which key details of legislation are left to behind-the-scenes rulemaking. As the Privacy Commissioner notes in her comments, some of the matters left to these subordinate forms of regulation are matters of policy for which public consultation and engagement are required. As she so aptly puts it: “Directives are appropriate for guiding the implementation of legal requirements, not for establishing the very legal requirements to be implemented.”

Clearly, technology moves fast, and it is hard to keep laws relevant and applicable. There may be a need in some cases to resort to different tools or strategies to ensure that the laws remain flexible enough to adapt to evolving and emerging technologies. The challenge is, however, to determine which things belong in the law, and which things can be ‘flexed’. There is a difference between building flexibility into a law and enacting something that looks like a rough draft with sticky notes in places where further elaboration will be needed. Schedule 6 of Bill 231 is a rough draft of a set of amendments to an already overly-complex law. It should be its own statute, carefully coordinated with PHIPA and its independent oversight.

Digital Health ID may be important to improve access to health information for Ontarians. It will certainly carry with it risks that should be properly managed. As a starting point, Ontarians deserve a clear and transparent law that can be understood and debated. Further, privacy law should not be set up as a problem that stands in the way of reforming the health care system. Such an approach does not make good law, nor does it bode well for the privacy rights of Ontarians.


Published in Privacy

Given that we are in the middle of a pandemic, it is easy to miss the amendments to Ontario’s Personal Health Information Protection Act (PHIPA) and the Freedom of Information and Protection of Privacy Act (FIPPA) that were part of the omnibus Economic and Fiscal Update Act, 2020 (Bill 188) which whipped through the legislature and received Royal Assent on March 25, 2020.

There is much that is interesting in these amendments. The government is clearly on a mission to adapt PHIPA to the digital age, and many of the new provisions are designed to do just that. For example, although many health information custodians already do this as a best practice, a new provision in the law (not yet in force) will require health information custodians that use digital means to manage health information to maintain an electronic audit log. Such a log must detail the identity of anyone who deals with the information, as well as the date and time of any access or handling of the personal information. The Commissioner may request a custodian to provide him with the log for audit or review. Clearly this is a measure designed to improve accountability for the handling of digital health information and to discourage snooping (which is also further discouraged by an increase in the possible fine for snooping found later in the bill).

The amendments will also create new obligations for “consumer electronic service providers”. These companies offer services to individuals to help manage their personal health information. The substance of the obligations remains to be further fleshed out in regulations; the obligations will not take effect until the regulations are in place. The Commissioner will have a new power to order that a health information custodian or class of custodians cease providing personal health information to a consumer electronic service provider. Presumably this will occur in cases where there are concerns about the privacy practices of the provider.

Interestingly, at a time when there is much clamor for the federal Privacy Commissioner to have new enforcement powers to better protect personal information, the PHIPA amendments give the provincial Commissioner the power to levy administrative penalties against “any person” who, in the opinion of the Commissioner, has contravened the Act or its regulations. The administrative penalties are meant either to serve as ‘encouragement’ to comply with the Act, or as a means of “preventing a person from deriving, directly or indirectly, any economic benefit as a result of contravention” of PHIPA. The amount of the penalty should reflect these purposes and must be in accordance with regulations. The amendments also set a two-year limitation period from the date of the most recent contravention for the imposition of administrative penalties. In order to avoid the appearance of a conflict of interest, administrative penalties are paid to the Minister of Finance of the province. These provisions await the enactment of regulations before taking effect.

The de-identification of personal information is a strategy relied upon to carry out research without adversely impacting privacy, but the power of data analytics today raises serious concerns about re-identification risk. It is worth noting that the definition of “de-identify” in PHIPA will be amended, pending the enactment of regulations, so that it can require the removal of any information “in accordance with such requirements as may be prescribed.” The requirements for de-identification will thus be made more adaptable to changes in technology.

The above discussion reflects some of the PHIPA amendments; readers should be aware that there are others, and these can be found in Bill 188. Some take effect immediately; others await the enactment of regulations.

I turn now to the amendments to FIPPA, which is Ontario’s public sector data protection law. To understand these amendments, it is necessary to know that the last set of FIPPA amendments (also pushed through in an omnibus bill) created and empowered “inter-ministerial data integration units”. This was done to facilitate inter-department data sharing with a view to enabling a greater sharing of personal information across the government (as opposed to the more siloed practices of the past). The idea was to allow the government to derive more insights from its data by enabling horizontal sharing, while still protecting privacy.

These new amendments add to the mix the “extra-ministerial data integration unit”, which is defined in the law as “a person or entity, or an administrative division of a person or entity, that is designated as an extra-ministerial data integration unit in the regulations”. The amendments also give to these extra-ministerial data integration units many of the same powers to collect and use data as are available to inter-ministerial data integration units. Notably, however, an extra-ministerial data integration unit, according to its definition, need not be a public-sector body. It could be a person, a non-profit, or even a private sector organization. It must be designated in the regulations, but it is important to note the potential scope. These legislative changes appear to pave the way for new models of data governance in smart city and other contexts.

The Institute for Clinical Evaluative Sciences (ICES) is an Ontario-based independent non-profit organization that has operated as a kind of data trust for health information in Ontario. It is a “prescribed entity” under s. 45 of PHIPA, which has allowed it to collect “personal health information for the purpose of analysis or compiling statistical information with respect to the management of, evaluation or monitoring of, the allocation of resources to or planning for all or part of the health system, including the delivery of services.” It is a trusted institution, but public sector data protection laws have limited its ability to expand its data analytics to integrate other relevant data. In many ways, these amendments to FIPPA are aimed at better enabling ICES to expand its functions, and it is anticipated that ICES will be designated in the regulations. However, the amendments are cast broadly enough that there is room to designate other entities, enabling the sharing of municipal and provincial data with newly designated entities for the purposes set out in FIPPA, which include: “(a) the management or allocation of resources; (b) the planning for the delivery of programs and services provided or funded by the Government of Ontario, including services provided or funded in whole or in part or directly or indirectly; and (c) the evaluation of those programs and services.” The scope for new models of governance for public sector data is thus expanded.

Both sets of amendments – to FIPPA and to PHIPA – are therefore interesting and significant. They are also buried in an omnibus bill. Last year, the Ontario government launched a Data Strategy Consultation that I have criticized elsewhere for being both rushed and short on detail. The Task Force was meant to report by the end of 2019; not surprisingly, given the unrealistic timelines, it has not yet reported. It is not even clear that a report is still contemplated.

While it is true that technology is evolving rapidly and that there is an urgent need to develop a data strategy, the continued lack of transparency and the failure to communicate clearly about steps already underway is profoundly disappointing. One of the pillars of the data strategy was meant to be privacy and trust. Yet we have already seen two rounds of amendments to the province’s privacy laws pushed through in omnibus bills with little or no explanation. Many of these changes would be difficult for the lay person to understand or contextualize without assistance; some are frankly almost impenetrable. Ontario may have a data strategy. It might even be a good one. However, it seems to be one that can only be discovered or understood by searching for clues in omnibus bills. I realize that we are currently in a period of crisis and resources may be needed elsewhere at the moment, but this obscurity predates the pandemic. Transparent communication is a cornerstone of trust. It would be good to have a bit more of it.

Published in Privacy

The Supreme Court of Canada has just granted leave to appeal a decision of the British Columbia Court of Appeal in a case involving evidentiary issues in the province’s lawsuit to recover health care costs from the tobacco industry. The lawsuit was brought under the Tobacco Damages and Health Care Costs Recovery Act – a law passed specifically for the purpose of recovering health care costs from the industry. The case raises interesting issues regarding the balance between privacy rights and fairness in litigation; it also touches on issues of re-identification risk in aggregate health care data.

Under the B.C. statute, the province has two options for recovering health care costs. It can recover actual costs for particular identified individuals, or it can recover costs on an aggregate basis “for a population of insured persons as a result of exposure to a type of tobacco product.” (s. 2(1)) The province chose the second option. Under s. 2(5) of the Act, if this route is chosen, the province is not required to identify specific individuals or to establish tobacco-related illnesses with respect to those individuals. Further, the health records of specific individuals need not be provided as part of the litigation. However, if aggregate data is relied upon, the court retains the right to “order discovery of a statistically meaningful sample” of the records, and can issue “directions concerning the nature, level of detail and type of information to be disclosed.” The court must nevertheless ensure that the identities of the specific individuals to whom the data pertain are not disclosed.

The province generated aggregate statistical data regarding costs from its databases of health care services provided to insured persons, and indicated its intention to rely upon this data to prove its case. The defendant tobacco companies sought access to the data relied upon by the province. The province declined to provide the data directly. Instead it arranged for a limited form of access through third party intermediaries, which included Statistics Canada employees. Although some of the defendants accepted this approach, Philip Morris International (PMI) did not. It argued that it was entitled to access the data itself in order to assess the reliability and accuracy of the province’s analyses. Both the court at first instance and the B.C. Court of Appeal ultimately sided with PMI.

The B.C. Information and Privacy Commissioner, who intervened in the appeal before the B.C. Court, argued that “the interpretation of a statutory provision aimed at protecting personal privacy must be approached in light of the importance of protection of privacy as a fundamental value in Canadian society” (at para 25 of the BCCA decision). He maintained that the court should rely upon the Freedom of Information and Protection of Privacy Act (FIPPA) in interpreting the Tobacco Act, and that FIPPA required the terms “personal information” and “record” to be given a broad interpretation. The Court of Appeal summarily rejected this argument, stating that “FIPPA does not limit the information available by law to a party to a proceeding (s. 3(2)) and has no role in the interpretation of s. 2(5)(b).” (at para 25)

The Court of Appeal noted that the Tobacco Act provided two routes for the province to establish damages, one that required consideration of individual health records and one that did not. The province chose the second route, which means that in general terms, individual health records are not compellable. The province argued that its decision to choose this route was motivated by a desire to protect the privacy of affected individuals. The Information and Privacy Commissioner argued that a requirement to disclose the aggregate data “has privacy implications for millions of insured persons who are not involved as litigants in the underlying action.” (at para 28) The Court of Appeal noted, however, that the legislation established the ‘playing field’ on which the litigation would take place and that there was no indication that this playing field was not intended to be even. It observed that the legislation does not make privacy a “paramount concern” (at para 31) since it did provide the province with the option to choose a route that would involve consideration of thousands of specific records. Had this route been chosen, the Court noted, “all of the individualized persons’ health care records would be subject to discovery and disclosure notwithstanding any privacy concerns that such disclosure might raise.” (at para 31)

With an aggregate action, the focus is not on individualized health care records. Section 2(5)(b) protects the privacy of individuals if such a route is chosen, and prevents “the aggregate action from becoming bogged down with “individual forms of discovery” in which the defendants could demand voluminous records of thousands or millions of people.” (at para 34) However, the Court noted that in following this route, the province will rely upon the data generated from its databases to establish both causation and damage. This makes the databases highly relevant to the litigation. The Court noted that s. 2(5)(b) “is not intended to block the discovery of the cumulative data contained in the databases, which data is essential to prove causation and damages.” (at para 35)

The Court ruled that the anonymized data on which the province would base its analyses would pose “no realistic threat to personal privacy.” (at para 36) Further, the defendants would be bound not to disclose the information provided to them as part of the litigation-related implied undertaking. The Court also observed that the identity of the specific individuals would be of no interest to the defendants, making it highly unlikely any attempts at re-identification would be made.

The Court of Appeal was particularly concerned about the unfairness that might result if “[t]he only data available to the defendants would be the data the Province offers up on restrictive terms, or the data the Province’s testifying experts eventually choose to rely on in their reports.” (at para 37) It found that fairness required that the databases be produced.

It should be noted that in reaching its decision, the B.C. Court of Appeal declined to follow a judgment from the New Brunswick Supreme Court in a very similar case under nearly identical legislation. In Her Majesty the Queen in Right of the Province of New Brunswick v. Rothmans Inc., the judge had dismissed an application by the defendant tobacco companies for the production of anonymized health care data in the same circumstances. The judge in that case had access to the decision of the B.C. Supreme Court which had ordered production of the databases, but had declined to follow that decision on the basis that the anonymization of the data would not be sufficient to protect privacy, and that the database was “a document containing information that relates to the provision of health care benefits for “particular individuals””. (BCCA decision at para 20) In declining to follow the New Brunswick decision, the B.C. Court of Appeal observed that the New Brunswick judge had relied entirely on the privacy provisions and “did not attempt to read the provisions in the New Brunswick Act as a harmonious whole.” (at para 39) The New Brunswick Court of Appeal declined leave to appeal. With two conflicting decisions from two different provinces, the matter is now heading to the Supreme Court of Canada.


Published in Privacy
