Thursday, November 19, 2020

10 Ways Canada’s Consumer Privacy Protection Act Will Impact Privacy Practices

We just posted this on the McInnes Cooper client information site:
10 Ways Canada’s Consumer Privacy Protection Act Will Impact Privacy Practices

November 19, 2020

By Sarah Anderson Dykema, CIPP/C, Lawyer at McInnes Cooper,

David Fraser, Privacy Lawyer | Partner at McInnes Cooper

On November 17, 2020, the federal government proposed dramatic changes to how Canada will enforce privacy law, ushering in a new legal regime to protect individuals’ personal information – and to regulate organizations’ privacy practices. Bill C-11, the Digital Charter Implementation Act, creates the Consumer Privacy Protection Act (CPPA) to replace the federal Personal Information Protection and Electronic Documents Act (PIPEDA) and to codify in law organizations’ obligations respecting the collection, use and disclosure of personal information, rather than merely relying on the Canadian Standards Association (CSA) Model Code. The federal government estimates it will take 18 months for the CPPA to go through the legislative process and become law, though this is always difficult to gauge. It might be derailed by, for example, a federal election or the ongoing COVID-19 pandemic – but it might not.

It’s still early days, but if the CPPA (or some form of it) passes, it will take organizations time to put the necessary compliance processes in place. Here are 10 ways the Consumer Privacy Protection Act will impact organizations’ Canadian privacy practices.

1. Big Penalties. There will be significant penalties for non-compliance with the CPPA. It authorizes administrative monetary penalties and fines of up to 5% of global revenue or $25 million, whichever is higher, for the most serious offences. Currently, PIPEDA only authorizes penalties for breach of the Digital Privacy Act, and those are markedly lower than those under the CPPA: the maximum fine for breaching the Digital Privacy Act is $100,000 per violation (though if there were multiple violations, which would not be uncommon, the fines could add up).

2. Privacy Commissioner Powers. In a move away from the traditional ombudsman model, the CPPA gives the federal Privacy Commissioner broad power to make orders against organizations and to recommend penalties to a new “Personal Information and Data Protection Tribunal”. Under PIPEDA, the Privacy Commissioner only has the power to make recommendations to a breaching organization.

3. New Tribunal. A new “Personal Information and Data Protection Tribunal” will determine and levy any penalties – which will have the effect of a court order – and hear appeals from orders of the Privacy Commissioner.

4. Global Application. The new law takes an expansive approach to applicability, expressly applying to all personal information an organization collects, uses or discloses, including interprovincially or internationally. This reflects the increased digitization and globalization of the economy, which knows no borders, and which the COVID-19 pandemic has accelerated.

5. New Right of Action. It creates a new privacy breach legal claim. Where the Privacy Commissioner decides an organization violated an individual’s privacy under the CPPA, and the Personal Information and Data Protection Tribunal upholds that finding, that individual can sue the organization (within 2 years) for compensation for the violation.

6. Data Portability & Deletion. It provides for new individual rights of data portability and deletion. Consumers can require an organization to transfer their data to another organization (subject to regulations that aren’t yet available), which is likely to be a boon to open banking. Individuals can also require that an organization delete the personal information it’s collected about them, subject to some limitations, in what appears to be a limited form of the “right to erasure”.

7. Algorithmic Transparency. It requires algorithmic transparency. Consumers would now have the right to require an organization to explain how an automated decision-making system made a prediction, recommendation or decision.

8. Consent Exceptions. It “simplifies” consent requirements for organizations by making some (potentially broad) exceptions to when an organization must obtain an individual’s consent to the collection, use or disclosure of the individual’s personal information, such as where the use of personal information is core to the delivery of a product or service. This could impact, for example, the information an organization must communicate in a privacy policy.

9. Data De-Identification. It makes new rules around the de-identification of data – including allowing for organizations to use an individual’s personal information without their consent in order to de-identify their data, but appears to limit other uses of de-identified data. Under certain circumstances, organizations can also disclose de-identified data to public entities for socially beneficial purposes.

10. Codes of Practice. It introduces the concept of “Codes of Practice”. The CPPA allows private organizations to establish a “code” and internal certification programs for complying with the law, subject to the Privacy Commissioner’s approval. Once approved, the “code” will effectively establish the organization’s legal compliance obligations.

Wednesday, November 18, 2020

Presentation: Privacy and Cybersecurity - latest trends and legal obligations

I was invited to speak at the 2nd Annual Atlantic Technology Summit on the topic of cybersecurity, privacy and the law. Not surprisingly, the entire conference this year was online, but it was well attended.

In case it is of interest to others, here's the presentation I gave, which started with a few case studies and then an overview of the current environment affecting legal risk. Of course, the slides were prepared before C-11 dropped, though I was able to comment during the presentation that the stakes will get even higher with any breach of security safeguards.

Wednesday, October 07, 2020

Presentation: Little Brother - Surveillance Technology and Privacy Law

I had the pleasure of speaking at the University of New Brunswick Law School's weekly speaker hour, on the topic of non-police use of surveillance technology and how that intersects/collides with Canadian privacy laws. Here are the slides in case it's of wider interest ...

Friday, April 10, 2020

Privacy best practices in a pandemic public health emergency

Since the early days of the COVID-19 pandemic, privacy questions have been in the headlines. International media reported stories from Asia about smartphones being used to enforce quarantine orders. In Ontario, Premier Ford suggested using telecom data to track social isolation compliance and, more recently, the Quebec police announced that they had arrested a woman who was violating a quarantine order, having tracked her down via her cellphone.

Companies are wondering what information they can require from employees about their health, diagnosis or risk factors, and what information they can provide to public health authorities if asked. Companies also have similar questions about customer information.

What privacy laws apply?

Since Canada has a patchwork of privacy laws, the first question is always whether a privacy law applies at all and if so, which one. In Atlantic Canada, public sector employers and “federal works, undertakings and businesses” are subject to privacy regulation for employee information, but the private sector is only covered for customer information. The majority of private-sector employers in Canada (other than in British Columbia, Alberta and Quebec) fall in the gap without privacy regulation for the workplace. Even if no law applies, this does not mean that privacy should be thrown out the window.

Companies should be guided by the privacy best practices described below, all of which are embodied in privacy statutes across Canada. These best practices align closely with what employees have come to expect regarding the handling of their personal information. Organizations that adopt these principles generally avoid negative reactions from employees who feel their personal information has been misused. Transparency also encourages honest reporting, as individuals are usually more comfortable disclosing personal information to an organization that is forthright about how it proposes to use the information.

Organizations should also be concerned about the relatively new common law causes of action for “intrusion upon seclusion” and “public disclosure of private facts”. Given that health information is particularly sensitive, and given the irrational stigma that seems to attach to COVID-19, one might allege that disclosing a person's infection risk or status to others meets the “highly offensive to a reasonable person” threshold for these torts. Applying best practices would minimise the risk of liability.

Balancing privacy with public and occupational health

For employers, what should emerge is a careful balance between privacy principles and legitimate occupational health and safety concerns. The occupational health and safety imperative is a legal one, on both the employer and the employees, as the Occupational Health and Safety Act of Nova Scotia places obligations on both sides to ensure a safe workplace. Given the mode of transmission of the novel coronavirus, employers have a responsibility to keep employees who are at risk of spreading infection out of their workplaces. Some companies have decided to take the temperature of everyone entering the premises and to exclude anyone with a fever. Others have adopted questionnaires or mandatory reporting of risk factors. Each of these scenarios involves the collection of personal information, so tread carefully.

What practices to adopt should be informed by the following privacy best practices:

(i) the collection of personal information must be justified, reasonable and non-discriminatory;

(ii) individuals should be given notice of the purposes for the collection, use and disclosure through policy or other direct communications such as signage;

(iii) personal information collected should be restricted to the minimum that is reasonable in the circumstances;

(iv) personal information should only be used for those purposes and should not be disclosed further than necessary; and

(v) the personal information should be accurate, as it will be used to make a decision of whether the employee, contractor or visitor will be permitted to work in the workplace.

What is justifiable and reasonable should be informed by the latest information from public health.

Disclosing personal information to public health authorities

Until recently, public health officials have largely been out of the spotlight, but they have been discreetly and diligently working to contain public health hazards, such as sexually transmitted infections. They have often been given special powers to do so, including the ability to require personal information from others. For example, in Nova Scotia, section 15 of the Health Protection Act gives the Chief Medical Officer of Health or his delegate broad powers to order information from third parties. Every privacy law in Canada permits disclosures where required by law, and many also permit disclosures where reasonably necessary for the health and safety of the individual. Obviously, check your local statutes.

That said, we have to be very, very careful about attempts to get data in bulk, such as location data from telcos.

While health and safety are of course top of mind in this pandemic, privacy considerations should also be taken into account.

[Note: This post is based on an upcoming article for the Canadian Bar Association - Nova Scotia's Nova Voce magazine.]

Monday, February 17, 2020

Ontario court adopts the "false light publicity" privacy tort

Regular readers of this (irregular) blog will recall the milestone case of Jones v Tsige, in which the Ontario Court of Appeal imported into Canada the US privacy torts. That list includes:

1. Intrusion upon the plaintiff's seclusion or solitude, or into his private affairs.
2. Public disclosure of embarrassing private facts about the plaintiff.
3. Publicity which places the plaintiff in a false light in the public eye.
4. Appropriation, for the defendant's advantage, of the plaintiff's name or likeness.

The fourth cause of action, commercial appropriation of the plaintiff's image, was already alive and well in Canadian tort law. The Court in Jones applied the "intrusion upon seclusion" tort and subsequent cases have applied "public disclosure of private facts" (See Ontario court explicitly adopts new privacy tort: public disclosure of private facts.)

In December 2019, the Ontario Superior Court of Justice explicitly recognized the "false light" privacy tort. In Yenovkian v. Gulian, 2019 ONSC 7279, Justice Kristjanson was dealing with an unpleasant family law case in which the husband had made wild accusations against his former spouse, particularly related to their two children. The judge noted, with respect to the list of privacy torts:

[170] With these three torts all recognized in Ontario law, the remaining item in the “four-tort catalogue” of causes of action for invasion of privacy is the third, that is, publicity placing the plaintiff in a false light. I hold that this is the case in which this cause of action should be recognized. It is described in § 652E of the Restatement as follows:
Publicity Placing Person in False Light

One who gives publicity to a matter concerning another that places the other before the public in a false light is subject to liability to the other for invasion of his privacy, if

(a) the false light in which the other was placed would be highly offensive to a reasonable person, and

(b) the actor had knowledge of or acted in reckless disregard as to the falsity of the publicized matter and the false light in which the other would be placed.

[171] I adopt this statement of the elements of the tort. I also note the clarification in the Restatement’s commentary on this passage to the effect that, while the publicity giving rise to this cause of action will often be defamatory, defamation is not required. It is enough for the plaintiff to show that a reasonable person would find it highly offensive to be publicly misrepresented as they have been. The wrong is in publicly representing someone, not as worse than they are, but as other than they are. The value at stake is respect for a person’s privacy right to control the way they present themselves to the world.

While I don't propose to list all the misconduct the husband was found to have carried out, this summary at the beginning of the decision is helpful for context:

[2] It is also about a father, Mr. Vem Yenovkian, who has engaged in years of cyberbullying of the mother, Ms. Sonia Gulian on websites, YouTube videos, online petitions and emails. It is about a father who videotapes court-ordered access visits with the children—both in-person and on Skype—and edits and posts those access visits and photographs of the children on the internet, with commentary. It is about a father who publicly posts on YouTube a video of his son cowering under a table while the father harangues him over Skype on a court-ordered access visit. It is about a father who posts videos of him describing his daughter, who suffers from a neurological disorder, as looking drugged, when she used to be “normal,” and posting that his daughter has a “broken” mind.

[3] Despite court orders prohibiting posting, the father continues his cyberbullying campaign abusing Ms. Gulian and her parents. He seeks to undermine the administration of justice through an online campaign to “unseat” a judge of this Honourable Court for rulings made, internet attacks on trial witnesses and the wife’s lawyer, and by flouting court orders and family law disclosure obligations.

The Court in this case did not follow the $20,000 "cap" on non-pecuniary damages set out in Jones v Tsige, but rather followed the divergent train of thought started with the Doe case:

[186] There is no claim for pecuniary damages; the only issue is non-pecuniary damages. The infliction of mental suffering and invasion of privacy are based on many of the same facts.

[187] On damages for intrusion on seclusion, the Court of Appeal in Jones v. Tsige held at paragraphs 87-88 that damages for intrusion upon seclusion in cases where the plaintiff has suffered no pecuniary loss should be modest, in a range up to $20,000. The important distinction with the two invasion of privacy torts in issue here, however, is that intrusion on seclusion does not involve publicity to the outside world: they are damages meant to represent an invasion of the plaintiff’s privacy by the defendant, not the separate and significant harm occasioned by publicity.

[188] The two Jane Doe cases have recognized that the cap on damages for intrusion upon seclusion may not apply to the other forms of invasion of privacy: Jane Doe 2016 at para. 58; Jane Doe 2018 at paras. 127-132. In this case, as is in those, the “modest conventional sum” that might vindicate the “intangible” interest at stake in Jones v. Tsige, para. 71, would not do justice to the harm the plaintiff has suffered.

[189] In Jane Doe 2016, at para. 52, Stinson J. turned to sexual battery cases for guidance in arriving at an award, and Gomery J. in Jane Doe 2018, at paras. 127-128 followed the same approach. In support of this approach, Stinson, J. pointed to the similarity of the psychological and emotional harm the plaintiff had suffered to that experienced by victims of sexual assault.

[190] I likewise adopt the method of looking to the factors applied to decide damage awards for a tort causing harms analogous to those the present plaintiff has suffered for invasion of privacy. The harm arising from the invasion of privacy in the present case is akin to defamation. Accordingly, in arriving at an award of non-pecuniary damages, I am guided by the factors described by Cory J. in Hill v Church of Scientology, at para. 187, which I am adapting to the tort of publicity placing a person in a false light:

a) the nature of the false publicity and the circumstances in which it was made,
b) the nature and position of the victim of the false publicity,
c) the possible effects of the false publicity statement upon the life of the plaintiff, and
d) the actions and motivations of the defendant.

[191] In this case, the false publicity is egregious, involving alleged criminal acts including by Ms. Gulian against her children. The false publicity is widely disseminated on the internet, as well as through targeted dissemination to church friends and business associates. Ms. Gulian has suffered damage as a mother, as an employee, in the Armenian community, and in her church community. She is peculiarly vulnerable as the spouse of the disseminator of false publicity. The false publicity has had a detrimental effect on Ms. Gulian’s health and welfare, caused her humiliation and fear, and could be expected as well to affect her social standing and position. Mr. Yenovkian has not apologized, nor has he retracted the outrageous comments despite court orders.

[192] The damages for intentional infliction of mental suffering are intended to be compensatory. I award $50,000 compensatory damages for intentional infliction of mental suffering, relying on Boucher v. Wal-Mart Canada Corp., 2014 ONCA 419.

Wednesday, December 11, 2019

Privacy Commissioner again upends the consensus on transfers for processing in Aggregate IQ investigation

You may recall that earlier this year the Canadian Privacy Commissioner upended the previous consensus by concluding that a "transfer for processing" is a disclosure that requires consent, as does any cross-border transfer of personal information. The Canadian privacy and business community was shocked by this reversal, and the Commissioner eventually backtracked, returning to the status quo.

Once again, the OPC has upended the consensus on using contractors to process information on behalf of a client.

The Privacy Commissioner of Canada and the Information and Privacy Commissioner of British Columbia together released their reports of findings into Aggregate IQ on November 24, 2019, following their joint investigation of the company. You may recognize the name of the company, as it was implicated in the many international Cambridge Analytica investigations. It was a contractor to the now infamous company that was implicated in a range of mischief related to the Brexit campaign and the US 2016 presidential election.

As a Canadian company, it should not be surprising that Aggregate IQ would come under scrutiny in Canada. What is surprising is that the result of the investigation essentially turns a whole lot of Canadian thinking about privacy and contracting out of services on its head, and also seems to ignore binding precedent from the Federal Court of Canada.

Aggregate IQ is essentially a data processing company that works on behalf of political parties and political campaigns. They take data from the campaigns, sort it, supplement it and sometimes use it on behalf of their clients. The key is that they do this work on behalf of clients.

Superficially, it may make sense to conclude that a Canadian company is subject to Canadian privacy laws. But the working assumption has always been that companies that collect, use and disclose personal information on behalf of clients are subject to the laws that govern their clients and their clients' activities. Those laws "trickle down" through the chain of contracts and sub-contracts. What's shocking is that the OPC has concluded that compliance with those laws is not enough. Processors in Canada, they say, must also comply with Canadian laws even when those laws are incompatible with the laws that regulate the client.

For example, Aggregate IQ did work on a mayoral campaign in Newfoundland. No privacy law applies to a mayoral campaign in Newfoundland, but the OPC nevertheless says that Aggregate IQ needed consent for its use of the information on behalf of the candidate. The campaign did not need consent, but the OPC concluded that by using a contractor, the campaign becomes subject to more laws and additional burdens than the government of Newfoundland has concluded are necessary. Similarly, the OPC says that Aggregate IQ needed consent under PIPEDA for what it was doing on behalf of US and UK campaigns, even though the activity is largely unregulated in the US and consent is not required in the UK (which permits legitimate bases for processing under the GDPR). Setting aside whether the campaigns were actually complying with their local laws, the conclusion from the OPC is that additional Canadian requirements will be overlaid on top of the laws that should matter and that have a close connection to what is really going on.

Until this point, the consensus has generally been that when a contractor is handling data for a customer, the obligations that lie on the customer flow down to the contractor – similar to the “controller” and “processor” scheme in the GDPR.

Canadian privacy law applies to the collection, use and disclosure of personal information in the course of commercial activity. And you'd think that Aggregate IQ is engaged in commercial activity so PIPEDA would apply. But that's not the case. If a contractor is collecting, using or disclosing personal information on behalf of a client, you have to look at that client's purposes. The Canadian Federal Court clearly concluded this in State Farm v Privacy Commissioner.* In that case, the OPC asserted its jurisdiction over an insurance company because they were clearly commercial, even when acting on behalf of an individual defendant in a car accident lawsuit. The Federal Court firmly disagreed. One has to look at what's really going on. State Farm was not handling personal information on its own behalf, but on behalf of its insured who was not subject to any privacy regulation for that activity. The same principle applies here. If Newfoundland has decided not to regulate how mayoral candidates collect and use personal information, it makes no difference if they use that information themselves or hire a contractor to do that.

This upends what has been understood to be the way things work. And it has worked.

And it is really bad public policy. It puts Canadian companies at a significant disadvantage in very competitive industries. While many people say that GDPR is much more privacy protective, there are many circumstances where personal data can be processed without consent, but based on a legitimate interest. A company or campaign in Europe would be much better off hiring a European company if hiring a Canadian company meant that the legitimate interest is disregarded and a Canadian consent requirement were superimposed. The same would apply to a Canadian campaign: the campaign that complies with whatever laws apply to it directly is suddenly subject to additional rules if it hires a contractor to carry out what would otherwise be a compliant and lawful activity.

It is also really bad public policy because if you take it to the logical conclusion, it means that Canadian governments cannot hire contractors to process or use personal information on their behalf. All Canadian public sector privacy laws are based on "legitimate purposes", so consent is not required where the collection, use or disclosure is lawfully authorized and legitimate. But this finding by the OPC would say that the contractor has to get consent under PIPEDA for whatever they do for their public sector client. This is not workable and I hope is an unintended consequence.

Beyond that, I'm not sure what to say. It appears that Aggregate IQ has agreed to follow the Commissioner's recommendations, so this will not be given the chance to be corrected by the Federal Court.

How this will play out in future cases remains to be seen.

* I should note that I was counsel to State Farm in that case.

Saturday, November 23, 2019

Presentation: Access to Government Information

On Friday, November 23, 2019, I had the pleasure of presenting on the topic of access to government information with Janet Curry of the Workers Compensation Board at the Canadian Bar Association - Nova Scotia branch annual professional development conference. In case it's of interest, here's our presentation.


David Fraser, McInnes Cooper; Janet Curry, Workers’ Compensation Board of Nova Scotia

The panel will share their perspectives on advising clients on requests for access to information held by government and public bodies. They will share best practices and tips from both sides – those making requests for access to information, and those responding to such requests.

You can download it in PDF format here.

Wednesday, November 20, 2019

Presentation: Surveillance tech and privacy laws

I was honoured to be asked to give a breakfast presentation to the Canadian Security Association Atlantic Chapter on surveillance and security technology and the law. In case it's of broader interest, here's the presentation:

You can also grab it as a PDF here.

Monday, October 14, 2019

What a CLOUD Act agreement will look like for Canada

The United States Department of Justice and the United Kingdom Home Office have announced that the two countries have signed a bilateral agreement “On Access to Electronic Data for the Purpose of Countering Serious Crime”. The Agreement is intended to be a bilateral agreement of the type anticipated under the CLOUD Act. Passed in March 2018, partially to address the litigation against Microsoft related to evidence in Ireland, the CLOUD Act authorizes the United States to enter into executive agreements with other countries that meet specific criteria related to rule of law, civil rights and privacy. Once an agreement is laid before Congress and approved, the effect is to lift each party’s legal barriers that prevent one country’s legal processes from being recognized in the other. Many countries have been seeking an alternative to the traditional channels of mutual legal assistance, which are seen as time consuming and cumbersome.

When it comes to orders directed at US custodians of information, the main barrier to be overcome is the American Stored Communications Act, which prohibits most US service providers from providing the content of communications except in response to a US court order. These can be obtained via the mutual legal assistance system, but all the steps required to obtain these orders are seen by law enforcement as cumbersome and time consuming. Under a CLOUD Act executive agreement, US service providers will no longer be prohibited from providing such content in response to an appropriate foreign order. It is very important to note that the CLOUD Act does not make foreign orders enforceable (with the full force of a domestic court order) in the United States, but merely removes this barrier.

On the UK side of the equation, changes were made in UK law to permit this under the Crime (Overseas Production Orders) Act 2019, which received Royal Assent in February 2019. The Agreement will enter into force following a six-month Congressional review period mandated by the CLOUD Act, and the related review by the UK’s Parliament.

Australia has already announced that it is seeking its own CLOUD Act executive agreement, and Canada is rumoured to be in similar discussions.

The Canadian Association of Chiefs of Police have been lobbying pretty hard for an executive agreement between Canada and the US, and called for it in their 2018 Annual Resolutions:

BE IT FURTHER RESOLVED that the Canadian Association of Chiefs of Police urges the Government of Canada to negotiate a bilateral data-sharing agreement with the United States of America who are authorized to do so pursuant to the CLOUD Act, and;

BE IT FURTHER RESOLVED that the Canadian Association of Chiefs of Police seeks a commitment from the Government of Canada for meaningful consultation with the CACP during the development of these instruments.

So what would this look like for Canada? The CLOUD Act and executive agreements are based on reciprocity, meaning that not only can Canadian law enforcement obtain information from US-based service providers, but American law enforcement can obtain information from Canadian-based information custodians. Currently, that’s mostly a no-go except through the MLAT.

In order for Canada to sign an executive agreement and give it effect, it would have to amend the Criminal Code and other statutes to give Canadian production orders extraterritorial effect or to create a new class of production orders, in a manner that is similar to the UK Crime (Overseas Production Orders) Act 2019. Notwithstanding the wishful thinking of many in Canada’s law enforcement community (relying, in part, on the wrongly-decided Brecknell decision from BC), Canadian production orders do not operate extraterritorially.

Removing Canadian legal barriers to foreign court orders that are subject to the bilateral executive agreement will likely be the most controversial part of the process. Canadians likely do not mind if Canadian law enforcement are able to obtain data about Canadian suspects in Canadian criminal investigations from foreign service providers. They likely will care about whether US law enforcement can obtain access to information from Canadian service providers.

Currently, all Canadian privacy laws prevent disclosure to foreign law enforcement under foreign orders. That includes private sector privacy laws, like the federal Personal Information Protection and Electronic Documents Act and its provincial equivalents. The list would also include the health privacy laws in effect in most Canadian provinces, and each public sector privacy law. The public sector laws in British Columbia and Nova Scotia specifically prohibit disclosures in response to “foreign demands for disclosure”. These prohibitions will either have to be removed, or Canada will need to negotiate an exception in its executive agreement with the US to carve out information that is subject to public sector privacy laws.

What will likely be lost in the discussion and debate is the fact that CLOUD Act agreements are not intended to simply give effect to all orders from the other state. They are intended to create a form of passing lane in the MLAT for certain kinds of orders where the requesting state has a strong interest in the data and the receiving state has a minimal interest. For example, Canadian authorities can’t use a qualifying order to get information about a US suspect from a US service provider. Those would still have to go through the MLAT, subject to close scrutiny by American authorities. Likewise, US authorities should not be able to obtain information about Canadians from a Canadian service provider under this arrangement.

What also needs to be emphasised is that any Canadian amendments should go no further than mirroring the changes made in the US law. The CLOUD Act does not make foreign orders enforceable in the United States (with the full force of a domestic court order), but merely removes certain barriers. Canadian amendments should do the same and make sure that a Canadian service provider has resort to Canadian courts and the Charter to review any foreign demands. And these orders should be limited to serious crimes.

I expect it will be an interesting discussion when it is finally announced. I would hope there is meaningful discussion, rather than just unveiling it as a fait accompli.

Thursday, August 22, 2019

Another privacy class action dismissed due to lack of compensable damages

Privacy class actions seem to be having a bit of a rough time as of late.

“The need to change a password at a higher frequency cannot give rise to a serious compensable loss claim.”

Following a trend that has become reasonably well established in Québec and is expanding across Canada, the province’s Superior Court has refused to certify a privacy class action on the basis that the representative plaintiff did not experience any compensable harm. In Bourbonnière c. Yahoo! Inc., Justice Tremblay considered a certification application brought by a putative class of individuals affected by a range of data breaches suffered by Yahoo! Inc. and Yahoo! Canada Corp. Yahoo! had announced a number of incidents, including one in 2014 that saw information about 500 million users stolen, another in 2013 that also involved information theft, and unauthorized access to account data using a forged digital cookie file.

The representative plaintiff testified that she had no reason to believe that she had been a victim of identity theft or fraud as a result, and had not identified any suspicious financial transactions. In addition, she continues to use her Yahoo! mail account and has not signed up for any identity theft protection or credit monitoring products.

The Court summarized her harm at paragraphs 36 and 37:

[36] In summary, Plaintiff has not incurred any out-of-pocket costs associated with the protection of her personal and/or financial information.

[37] The only prejudice suffered by the Plaintiff relates to the inconvenience of having to change her passwords in all of the accounts associated with her Yahoo email address and the alleged embarrassment suffered as a result of spam emails that were sent to her friends. The Court is of the view that such prejudice is insufficient to justify a class action.

This conclusion was based on a growing line of authorities in Québec. The Court referred to the Supreme Court of Canada’s decision in Mustapha v. Culligan of Canada Ltd., which stands for the proposition that “compensable injury must be ‘serious and prolonged’ and rise above the ordinary annoyances, anxieties and fears that a person living in society may experience”.

[42] Similarly, in Mazzonna, a case involving the loss of data tape, the Superior Court concludes that the anxiety felt by the plaintiff upon and after learning that her personal information had been lost and the modification of habits in the manner in which she managed her bank account, is not enough to meet the threshold, even on a prima facie basis, of the existence of compensable damages.

[43] The present case can be distinguished from other data security incident cases such as Zuckerman and Belley since, unlike these two other cases, Plaintiff has not incurred any expenses for credit monitoring services nor was she a victim of identity theft.

[44] The transient embarrassement [sic] and inconveniences invoked by the Plaintiff are of the nature of ordinary annoyance and do not constitute compensable damages recoverable under the applicable law. Indeed, the need to change a password at a higher frequency cannot give rise to a serious compensable loss claim.

The Court also had issues with the composition of the class, particularly a subclass referred to as the “Collateral Victims”, being “all other persons, businesses, entities, corporations, financial institutions or banks who suffered damages or incurred expenses as a result of the data security incidents”. As the plaintiffs had not identified any single “Collateral Victim”, the court concluded that this particular subclass was “artificial” and questioned its existence.

The application for certification was dismissed. It is notable that a parallel Ontario proceeding is ongoing.

A previous version of this was written for the Canadian Technology Law Association newsletter.