Saturday, May 17, 2025

Alberta's privacy law unconstitutionally violates freedom of expression -- again -- in a decision that has implications for all Canadian privacy laws

You may have seen some headlines saying that Alberta’s privacy law has been declared unconstitutional. Yup, it’s true that at least part of it was, and here’s why…

This case involves Clearview AI Inc. ("Clearview"), a U.S.-based facial recognition company, challenging an order issued by Alberta’s Information and Privacy Commissioner. The order, based on findings from a joint investigation by Canadian federal and provincial privacy regulators, required Clearview to cease offering services in Alberta, stop collecting, using, and disclosing images and biometric data of Albertans, and delete the relevant data already in its possession.

Clearview sought judicial review of the order on a number of grounds: that it is not subject to the jurisdiction of Alberta and that the Personal Information Protection Act (aka “PIPA”) does not apply to it; that the Commissioner adopted an unreasonable interpretation of the words “publicly available” in PIPA and the Personal Information Protection Act Regulation (the “PIPA Regulation”); and that the Commissioner’s finding that Clearview did not have a reasonable purpose for collecting, using, and disclosing personal information is unreasonable. Clearview further asserted that the Commissioner’s interpretation of PIPA and the PIPA Regulation is unconstitutional because it is contrary to Charter s 2(b), which guarantees freedom of expression. That last argument is the one we’re going to focus on.

One thing that is really interesting about the case is that the Court did not really have to address the Charter issues. The Commissioner found that Clearview’s purposes were not reasonable, and a reasonable purpose is a precondition for a company to collect, use or disclose personal information at all. The Court agreed, and could have simply said “not reasonable!”, declined to decide the Charter question, and told Clearview to go follow the Commissioner’s order. But the Court delved into the Charter question as well.

It’s also notable that this is the second time the Alberta statute has been declared to violate the Charter because its “publicly available information” exception, as defined in the Act and the Regulations, is too narrow. The first time was the Supreme Court of Canada’s decision in Alberta (Information and Privacy Commissioner) v. United Food and Commercial Workers, Local 401, where the Act was being applied to video recording by a union at a picket line.

The company at issue in this case, Clearview AI, has been the subject of many privacy investigations around the world. It collects facial images from publicly accessible websites, including social media, and uses them to create a biometric facial recognition database, marketed primarily to law enforcement. In 2020, privacy commissioners from Alberta, B.C., Quebec, and Canada investigated Clearview’s operations and concluded in a joint report that its practices violated privacy laws.

In December 2021, Alberta’s Commissioner issued an order directing Clearview to cease operations in Alberta, based on violations of PIPA. The Commissioner essentially said that Clearview must do for Alberta what it agreed to do in settling a lawsuit in Illinois (which is notorious for its biometric laws).

Clearview AI then brought an application for judicial review in the Court of King’s Bench, contesting:

  • The jurisdiction of Alberta’s Commissioner,
  • The reasonableness of the Commissioner's interpretation of "publicly available" under PIPA,
  • The constitutionality of PIPA's consent-based restrictions on the collection, use, and disclosure of personal information.

It should be noted that the British Columbia Commissioner issued a similar order, which was upheld by the Supreme Court of British Columbia last year.

In Alberta, as far as the jurisdiction argument went, the Court upheld the Commissioner’s jurisdiction, finding a "real and substantial connection" between Clearview’s activities and Alberta. Clearview had marketed its services in Alberta and its database included images of Albertans. The bar for jurisdiction in Canada is pretty low.

On the statutory interpretation issue, the Court accepted as reasonable the Commissioner’s interpretation that images scraped from the internet, including social media, are not "publicly available" within the meaning of the PIPA Regulation. The Commissioner employed a purposive approach, interpreting the relevant provisions narrowly in light of the quasi-constitutional status of privacy rights.

PIPA, like other privacy regulatory regimes in Canada, provides that consent must be obtained to collect and use “personal information” unless certain exceptions apply. One of the exceptions in PIPA is that the information is “publicly available.” PIPA uses the term “publicly available,” but those words are defined in s 7(e) of the PIPA Regulation, which provides:

7 ... personal information does not come within the meaning of ... “the information is publicly available” except in the following circumstances: ... (e) the personal information is contained in a publication, including, but not limited to, a magazine, book or newspaper, whether in printed or electronic form, but only if (i) the publication is available to the public, and (ii) it is reasonable to assume that the individual that the information is about provided that information.

The private sector privacy laws of Alberta, British Columbia and the federal jurisdiction have similar, but not identical, definitions of “publicly available” information that does not require consent for its collection and use. There are other categories, but this decision turned on information in a publication. Here are the three different definitions:

In Alberta, the regulation says:

the personal information is contained in a publication, including … but not limited to … a magazine, book or newspaper, whether in printed or electronic form, but only if (i) the publication is available to the public, and (ii) it is reasonable to assume that the individual that the information is about provided that information;

In British Columbia, it does not use “including but not limited to”:

personal information that appears in a printed or electronic publication that is available to the public, including a magazine, book or newspaper in printed or electronic form.

Under PIPEDA’s regulation, the analogous provision reads:

personal information that appears in a publication, including a magazine, book or newspaper, in printed or electronic form, that is available to the public, where the individual has provided the information.

Canadian privacy regulators have interpreted “publication” to exclude social media sites like Facebook and LinkedIn, where Clearview harvests much of its information.

Clearview argued that this narrow interpretation under the Alberta statute and regulation violated its freedom of expression rights under section 2(b) of the Charter of Rights and Freedoms, and could not be saved as a reasonable limitation under section 1 of the Charter.

The Court agreed that:

  • Clearview’s activities (compiling and using data to deliver a service) were expressive, and
  • the consent requirement effectively operated as a prohibition on expression where obtaining consent was impractical.

This amounted to a prima facie infringement of s. 2(b) of the Charter.

I should note that the Alberta Commissioner – ridiculously in my view – argued that the Charter wasn’t even engaged. Here’s what the Court said.

[107] The Commissioner submits that if Clearview’s activity is expressive, it should be excluded from constitutional protection because “the method – mass surveillance – conflicts with the underlying s 2(b) values.” Clearview’s activity, according to the Commissioner, conflicts with the purposes of Charter s 2(b) including the pursuit of truth, participation in the community, self-fulfillment, and human flourishing. The Commissioner offered no authority to support the position that expressive activity could be excluded from protection based on a conflict with underlying constitutional values. Short of violence, all expressive activity is protected by Charter s 2(b).

It’s just a dumb argument to make, in my view.

So once a prima facie infringement is made out, the burden shifts to the government to justify it as a reasonable limit, prescribed by law, that can be demonstrably justified in a free and democratic society. This analysis follows something called the Oakes test:

The test involves a two-stage analysis: first, the objective of the law must be pressing and substantial; second, the means used to achieve that objective must be proportionate, which requires

  1. a rational connection between the law and its objective,
  2. minimal impairment of the right or freedom, and
  3. proportionality between the law’s benefits and its negative effects on rights.

In this case, the Court found that there was a pressing and substantial objective: protecting personal privacy is valid and important. The Court also found that the requirement of consent is logically connected to privacy protection, and thus rationally connected to that objective.

The law failed on the “minimal impairment” part of the analysis. The dual requirement of consent and a reasonable purpose, without an exception for publicly available internet data, was overly broad.

In a nutshell, the court has to consider what expressive activities are captured – how broadly the net is cast – and whether everything that is caught in that net is necessary or rationally connected to the pressing and substantial objective.

The Court summarized Clearview’s argument at paragraph 129:

“Clearview asserts that people who put their personal information on the internet without protection do not have a reasonable expectation of privacy. Where there is no reasonable expectation of privacy, the protection of privacy is not a pressing and substantial state objective.”

The Court noted that the net cast by the Act and the Regulations not only captures Clearview’s web-scraping, but also captures legitimate indexing by beneficial search engines. The Commissioner’s interpretation would exclude search engines from the “publicly available” exception, meaning that they would have to get consent for all collection, use and disclosure of personal information obtained from websites.

Here’s what the Court said at paragraph 132 of the decision:

[132] A difficulty with the PIPA consent requirement for personal information publicly available on the internet is that it applies equally to Clearview’s search technology used to create a facial recognition database and regular search engines that individuals use to access information on the internet. … For the most part, people consider Google’s indexing of images and information to be beneficial. And certainly, Albertans use Google and similar search engines for expressive purposes. But according to my interpretation of PIPA and the PIPA Regulation and the Commissioner’s interpretation of those same instruments, Google and similar search engines cannot scrape the internet in Alberta for the purpose of building and maintaining an index of images of people without consent from every individual whose personal information is collected.

The Court then went on to say at paragraphs 136 and 137:

[136] PIPA and the PIPA Regulation are overbroad because they limit valuable expressive activity like the operation of regular search engines. There is no justification for limiting use of publicly available personal information by regular search engines just as there was no justification to limit use of publicly available personal information for reasonable purposes by the union in UFCW Local 401.
[137] Alberta has a pressing and substantial interest in protecting personal information where individuals post images and information to websites and social media platforms subject to terms of service that preserve a reasonable expectation of limited use. This pressing and substantial interest, however, does not extend to the operation of regular search engines. A reasonable person posting images and information to a website or social media platform subject to terms of service but without using privacy settings expects that such images and information will be indexed and retrieved by internet search engines; indeed, that is sometimes the point of posting images and information to the internet without using privacy settings.

Then, at paragraph 138, the Court concluded that the “publicly available” exception was too narrow because, as written, the consent requirement captures general search engines, to which the pressing and substantial objective does not extend:

[138] The public availability exception to the consent requirement in PIPA and the PIPA Regulation is source-based, not purpose-based. Because it is source-based, it applies to regular internet search engines that scrape images and information from the internet like Clearview even if they use images and information for a different purpose. I find that PIPA and the PIPA Regulation are overbroad because the definition of “publication” in PIPA Regulation s 7(e) is confined to magazines, books, newspapers, and like media. Without a reasonable exception to the consent requirement for personal information made publicly available on the internet without use of privacy settings, internet search service providers are subject to a mandatory consent requirement when they collect, use, and disclose such personal information by indexing and delivering search results. There is no pressing and substantial justification for imposing a consent requirement on regular search engines from collecting, using, and disclosing unprotected personal information on the internet as part of their normal function of providing the valuable service of indexing the internet and providing search results.

The court essentially concluded that it was OK to limit what Clearview is doing, but it is NOT OK to limit what search engines are doing. The law, as written, does not distinguish between the “bad” and the “good”, and as a result, the law did not “minimally impair” this important Charter right.

On the final balancing, the Court concluded that the harm to freedom of expression was not outweighed by the benefit to privacy.

The Court declared that PIPA ss. 12, 17, and 20 and PIPA Regulation s. 7 unjustifiably infringed s. 2(b) of the Charter and could not be saved under s. 1 of the Charter, to the extent that they prohibited the use of publicly available internet data for reasonable purposes.

The Court upheld the Commissioner’s jurisdiction and found her statutory interpretation reasonable. However, the impugned provisions of PIPA and the Regulation were declared unconstitutional insofar as they infringed freedom of expression by unduly restricting the use of publicly available information online.

I fully expect that this decision will be appealed, and I don’t know if the British Columbia decision has been appealed.

In the big picture, though this decision is not binding on the federal Commissioner, it pretty strongly stands for the proposition that PIPEDA’s publicly available information exception is also unconstitutional. This has implications for “the right to be forgotten” and for collecting data to train AI models, both of which are currently before the federal Commissioner.
