Friday, September 20, 2024

Appeal court reverses Facebook’s Canadian privacy win

On September 9, 2024, the Federal Court of Appeal did something very interesting. A unanimous three-judge panel fully reversed the factual conclusions of the court below to conclude that Facebook had violated the Personal Information Protection and Electronic Documents Act in connection with the Cambridge Analytica scandal. And then, rather than sending the matter back to the Federal Court to be determined by another judge, the panel reached its own conclusions and invited submissions on remedy.

It is not common for an appeals court to reverse factual findings like that to begin with. It can only do so if it finds a “palpable and overriding error” on questions of fact or mixed fact and law. And it’s pretty clear that’s what the appeals court found.

The earlier decision, rendered in April 2023, was – I thought – pretty embarrassing to the Privacy Commissioner.

You may recall that in 2019, the Privacy Commissioner of Canada and the Information and Privacy Commissioner of British Columbia released, with as much fanfare as possible, the results of their joint investigation into Facebook related to the Cambridge Analytica incident. That incident took place around 2013 to 2015, so roughly ten years ago.

Both of the Commissioners concluded, at that time, that Facebook had violated the federal and British Columbia privacy laws, principally related to transparency and consent.

Because Facebook did not accept that finding, the Privacy Commissioner of Canada commenced an application in the Federal Court to have the Court make the same determination and issue a whole range of orders against the social media company.

The hearing of that application took place in March 2023, and the Federal Court released its decision just over a month later. It concluded that the Privacy Commissioner did not prove that Facebook violated our federal privacy law in connection with the Cambridge Analytica incident.

Just a little bit of additional procedural information: under our current privacy law, the Privacy Commissioner of Canada does not have the ability to issue any orders or to levy any penalties. What can happen after the Commissioner has released his report of findings is that the complainant, or the Commissioner with the complainant’s okay, can commence an application in the Federal Court of Canada. This is what is called a de novo proceeding.

The finding from the Privacy Commissioner below can be considered as part of the record, but it is not a decision being appealed from. Instead, the applicant, in this case, the Privacy Commissioner, has the burden of proving to a legal standard that the respondent has violated the federal privacy legislation.

This has to be done with evidence, which is where the trial judge concluded the privacy commissioner fell significantly short in the Facebook case. I did a video on that decision, which I’ll link to below.

The Old Facebook

To understand this decision, we have to understand what it was all about. It has to be remembered that the events being investigated took place almost 10 years ago, and the Facebook platform is substantially different now compared to what it looked like then. If you were a Facebook user at that time, you probably remember a whole bunch of apps running on the Facebook platform. You probably were annoyed by friends who were playing Farmville and sending you invitations and updates. Well, those don't exist anymore. Facebook is largely no longer a platform on which third-party apps run.

At the time, users could install apps and the app developers could get access to that user’s personal information. Those apps could also get access to some information related to the friends of the installing user. The installing user had some knowledge and control, but that person’s friends were largely unaware of this and had no control over it.

In a nutshell, at the time, one of the app developers that used the Facebook platform was a researcher later associated with a company called Cambridge Analytica. They had an app running on the platform called “This Is Your Digital Life”. It operated for some time in violation of Facebook's terms of use for app developers, hoovering up significant amounts of personal information and then selling and/or using that information for, among other things, profiling and advertising targeting.

The Appeal Court's Decision

The questions put to the Federal Court of Appeal were whether the judge below had made a reviewable error when he concluded that there was not sufficient evidence to prove that Facebook did not get adequate consent from users, and whether Facebook had failed to safeguard user data.

The Court of Appeal concluded that there was sufficient evidence to reach these conclusions, and that the judge below made an error in not seeing it as sufficient.

The standards for consent are objective

At the Federal Court level, the judge said that the Privacy Commissioner had failed to bring sufficient evidence to prove that Facebook did not get adequate user consent for the collection, use and disclosure of their personal information and that of their friends. The judge below said it would have been helpful to have expert evidence on users’ expectations and what Facebook could have done differently.

The Federal Court of Appeal essentially said that’s asking the wrong question, based on a premise that the standard of consent is subjective. PIPEDA uses the term “reasonable” in a number of places. The Federal Court of Appeal said the standard for consent in PIPEDA is an objective standard, and does not require that sort of evidence.

On a daily basis, courts deal with determining what is reasonable in a whole range of cases without that specialized evidence. Judges can apply common sense. The Court described the legal “reasonable person” at paragraph 63:

[63] The reasonable person is a fictional person. They do not exist as a matter of fact. The reasonable person is a construct of the judicial mind, representing an objective standard, not a subjective standard. Accordingly, a court cannot arbitrarily ascribe the status of “reasonable person” to one or two individuals who testify as to their particular, subjective perspective on the question. As Evans J.A. wrote for this Court: “determining the characteristics of the ‘reasonable person’ presents difficulties in a situation where reasonable people may view a matter differently, depending, in part, on their perspective… However, the view of the reasonable person in legal tests represents a normative standard constructed by the courts, not an actuality that can be empirically verified” (Taylor v. Canada (Attorney General) (C.A.), 2003 FCA 55, [2003] 3 F.C. 3 at para. 95).

The Court of Appeal said, at paragraphs 60 and 61 of the decision:

[60] Subjective evidence does not play a role in an analysis focused on the perspective of the reasonable person.

[61] The meaningful consent clauses of PIPEDA, along with PIPEDA’s purpose, pivot on the perspective of the reasonable person. Section 6.1 of PIPEDA protects an organization’s collection, use, or disclosure of information only to the extent that a reasonable person would consider appropriate in the circumstances. Clause 4.3.2 of PIPEDA asks whether an individual could have “reasonably underst[ood]” how their information would be used or disclosed. (See also section 3 and clause 4.3.5 of PIPEDA).

The Court of Appeal then said, at paragraph 70:

[70] It was the responsibility of the Court to define an objective, reasonable expectation of meaningful consent. To decline to do so in the absence of subjective and expert evidence was an error.

I think the court is both right and wrong here. There are times in PIPEDA where a clearly objective standard is created. Look at section 5(3), which refers to the mythical, legal “reasonable person”:

Appropriate purposes

(3) An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.

We are clearly looking at THE REASONABLE PERSON, i.e., the judge applying this legal fiction.

But in other places in PIPEDA, we’re talking about conclusions about THE person who is being asked to give consent to the collection, use and disclosure of personal information:

6.1 For the purposes of clause 4.3 of Schedule 1, the consent of an individual is only valid if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.

Is it reasonable (objective standard) to expect that THE individual at issue understands? That’s an objective assessment of a subjective situation.

In the consent principle, “reasonable” is used a few times.

4.3.2 - The principle requires “knowledge and consent”. Organizations shall make a reasonable effort to ensure that the individual is advised of the purposes for which the information will be used. To make the consent meaningful, the purposes must be stated in such a manner that the individual can reasonably understand how the information will be used or disclosed.

“Reasonable” is used twice here. Was the effort reasonable? That’s objective, but it is measured in light of “the individual” in question, which seems subjective. But then it says “that the individual can reasonably understand”, which again seems to refer to the individual at issue, whose understanding is subjective.

A little further down, clause 4.3.5 seems to apply a mixed objective/subjective assessment:

4.3.5 - In obtaining consent, the reasonable expectations of the individual are also relevant.

You’re looking at the reasonable expectations of THE INDIVIDUAL. It doesn’t say the reasonable expectations of a reasonable person, or the expectations of the reasonable person.

So I think, applied to this case, we’re really talking about the reasonable expectations of a Facebook user in 2014, not a fictional creature like the “man on the Clapham omnibus.”

Friends’ information 

The next conclusion of the Federal Court of Appeal was much, much easier to reach on the record before it. The Federal Court of Appeal very strongly determined that Facebook did not get adequate consent from friends of users who installed apps on the platform where those apps collected the personal information of those friends. When users installed apps, the app developer was required to inform the user about what personal information would be collected by the developer and how it would be used. At least in theory, that user could make an informed decision about whether to use the app, or could calibrate what data the app could access. However, if the app collected information about that user’s friends, those friends were not given notice or an opportunity to consent. On this point, the Federal Court of Appeal said:

[76] This distinction between users and friends of users is fundamental to the analysis under PIPEDA. The friends of users could not access the [Granular Data Permissions] process on an app-by-app basis and could not know or understand the purposes for which their data would be used, as required by PIPEDA.

[77] The only conclusion open to the Federal Court on the evidence was that Facebook failed to obtain meaningful consent from friends of users to disclose their data, and thus breached PIPEDA. This finding hinges mainly on Facebook’s different consent practices for users of apps and those users’ friends, and Facebook’s user-facing data policies and practices with third-party apps more broadly. To the extent this evidence was acknowledged by the Federal Court, it made a palpable and overriding error in its conclusion that there was no breach of PIPEDA.

[78] Facebook did not afford friends of users who downloaded third-party apps the opportunity to meaningfully consent to the disclosure of their data, since friends of users were simply unable to review those apps’ data policies prior to disclosure. This is not in accordance with PIPEDA: clause 4.3.2 of PIPEDA requires that organizations “make a reasonable effort to ensure that the individual is advised of the purposes for which the information will be used”.

The fact that users were informed via the privacy policy that this could happen was not, in the Court’s view, sufficient.

Yeah, it seems very difficult to reach the conclusion that the friends had given knowledgeable, informed consent when they were not knowledgeable about the sharing of their personal information and were never given an opportunity to consent.

Privacy Policies are not sufficient

The Federal Court of Appeal raised one of the big challenges of privacy in the modern, online age: privacy policies are too long and nobody reads them. In a nutshell, long privacy policies are not the foundation for informed consent.

Trust but verify

On the question of whether Facebook fulfilled its obligations to safeguard users’ personal information, the Court of Appeal found that Facebook failed to safeguard user data because it did not review the privacy policies of third-party apps and did not act on red flags raised by apps like “This Is Your Digital Life”. That failure to review privacy policies or act on red flags amounted to a failure to adequately monitor third-party apps.

While companies can generally rely on the good faith performance of contracts, like the terms that all app developers had to agree to, the Court in this case raises the bar to “trust but verify” where you know there’s a risk of bad actors who will not adhere to those terms. The Court of Appeal specifically said at paragraph 117:

[117] Facebook is entitled to rely on the good faith performance of contracts, but only to a point. As discussed above, Mark Zuckerberg admitted that it would be difficult to guarantee that there were no “bad actors” using its Platform. It is incongruent to expect a bad actor to carry out a contract in good faith. Facebook therefore should have taken further measures to monitor third-party contractual compliance.

In particular, the Court noted that there were a number of red flags with this particular app developer that should have been pursued more promptly and perhaps with greater consequences to the developer.

This does change the analysis a bit. My take, which I think aligns with many of Facebook’s arguments, is that once someone has consented to their information being disclosed to a third party like an app developer, that app developer is the custodian of the data and is now 100% responsible for securing it and living up to the legal obligations related to that data. The party that disclosed it, with consent, ceases to be responsible for it in the hands of the app developer.

I think what the Court of Appeal is getting at is that in a case like this, where Facebook can’t be sure of the bona fides of the developer, Facebook continues to have some responsibility for the data. And since the Court concluded Facebook had reason to know that the people behind this particular app were perhaps – maybe likely – bad actors, permitting them to continue on the platform was inadequately safeguarding users’ data.

Takeaways

So the first main takeaway from this decision is that consent and expectations of privacy seem to be fully objective, assessed without regard to the individual at issue or the population we’re talking about. I’m not sure I fully agree with this.

The Court of Appeal is also clear – about something I’ve agreed with for some time – that privacy policies are not where you get knowledgeable, informed consent. People don’t read them, and courts are starting to understand this. You need just-in-time consent, clearly articulated to the individual and, I would still say, tailored to the particular audience with whom you’re dealing.

And finally, trust but verify. You can rely on the good faith of third parties to live up to their obligations under an agreement, unless you can’t. If there’s evidence to suggest they are bad actors, you may not be able to rely on that. Watch out for red flags. Trust but verify.

It will be interesting to see if Facebook seeks leave to appeal to the Supreme Court of Canada. And it will be very interesting to see what remedies the OPC and Facebook agree to. And if they can’t agree, it’ll be interesting to see what remedies the FCA imposes. THAT will also be precedent-setting.
