Thursday, June 26, 2025

Past Canadian "lawful access" attempts, both by Liberal and Conservative governments

2005 (Lib - Paul Martin - Minister Anne McLellan) - C-74 (38-1) - LEGISinfo - Parliament of Canada - Short title: Modernization of Investigative Techniques Act (Did not pass)


Library of Parliament Legislative Summary for Bill C-74


2009 (Con - Stephen Harper - Minister Peter Van Loan) - C-47 (40-2) - LEGISinfo - Parliament of Canada - Short title: Technical Assistance for Law Enforcement in the 21st Century Act (Did not pass)


Library of Parliament Legislative Summary for Bill C-47 


2011 (Con - Stephen Harper / Minister Vic Toews) - C-52 (40-3) - LEGISinfo - Parliament of Canada - Short title: Investigating and Preventing Criminal Electronic Communications Act (Did not pass)


Library of Parliament Legislative Summary for Bill C-52


2012 (Con - Stephen Harper / Minister Vic Toews) - C-30 (41-1) - LEGISinfo - Parliament of Canada - Short title: Protecting Children from Internet Predators Act (Did not pass)


Library of Parliament Legislative Summary for Bill C-30


2013 (Con - Stephen Harper / Minister Peter MacKay) - C-13 (41-2) - LEGISinfo - Parliament of Canada - Short title: Protecting Canadians from Online Crime Act (Passed)


Library of Parliament Legislative Summary for Bill C-13


Monday, June 23, 2025

Materially misleading statements in the Charter Statement for Bill C-2's Lawful Access provisions

The government of Canada – specifically the Minister of Justice – just released its “Charter Statement” regarding Bill C-2, the Strong Borders Act. I’m particularly focused on the “lawful access” provisions in the Bill, and I read the Statement with interest to see how the government thinks the expanded access to data is compatible with Section 8 of the Charter, which prohibits unreasonable searches and seizures.

In the Charter Statement, the Minister significantly mischaracterizes his own bill in a manner that makes it appear more Charter-compliant. It could be a handful of honest mistakes, but I’m getting more cynical as my hair gets more grey. (The two may be connected, now that I think about it.)

Anyways, it’s not a huge “GOTCHA!”, but they should acknowledge the mistakes and fix them.

Some background on what Charter Statements are about can be found in the Charter Statement itself:

Section 4.2 of the Department of Justice Act requires the Minister of Justice to prepare a Charter Statement for every government bill to help inform public and Parliamentary debate on government bills. One of the Minister of Justice’s most important responsibilities is to examine legislation for inconsistency with the Canadian Charter of Rights and Freedoms. By tabling a Charter Statement, the Minister is sharing some of the key considerations that informed the review of a bill for inconsistency with the Charter. A Statement identifies Charter rights and freedoms that may potentially be engaged by a bill and provides a brief explanation of the nature of any engagement, in light of the measures being proposed.

So in this particular Charter Statement, there are a couple of troubling and significant misstatements about the Lawful Access provisions which – surprise! surprise! – make the Bill appear more Charter-compliant.

When discussing the new production order for Subscriber Information, it says:

The judge would have to be satisfied that an offence has or will be committed and that there are reasonable grounds to suspect that the information will assist in the investigation of an offence.

This is not true. Not even close. The conditions for issuing an order are set out in the new, proposed subsection 487.0142(2), which says:

(2) Before making the order, the justice or judge must be satisfied by information on oath in Form 5.‍004 that there are reasonable grounds to suspect that

(a) an offence has been or will be committed under this Act or any other Act of Parliament; and (b) the subscriber information is in the person’s possession or control and will assist in the investigation of the offence.

The judge only has to be satisfied, based on a cop’s sworn say-so, that there are reasonable grounds to suspect that an offence has been or will be committed and that the subscriber information will assist in the investigation. This is far from the judge having to be “satisfied” that an offence has been committed. The cop swearing the application doesn’t even have to be satisfied that an offence has been or will be committed. It’s enough that the judge believes that there are reasonable grounds to justify the cop’s tingling “Spidey sense”.

In the next paragraph about the production order for subscriber information, the Charter Statement says that this power will be used to “generate leads”, which sounds like a fishing expedition to me. I don’t think that’s a mistake.

We’ve been told that this power is to be used if the police have an IP address associated with someone they suspect is victimizing children, so they can identify THAT person, do an investigation and then get a search warrant. That’s not “generating leads”, as far as I understand that terminology.

The next material misstatement is in the last sentence of that paragraph, which says “if [the judge] chooses to issue an order, the judge would have discretion as to what information is specified in it.” I’m pretty sure that’s incorrect.

The new order power says it is for

ALL the subscriber information that relates to any information, including transmission data, that is specified in the order and that is in their possession or control when they receive the order.

In other words, ALL the subscriber information that relates to the identifier specified in the order. The form of the order, which is prescribed in the Act, does exactly that: it calls for ALL subscriber information, which is horribly broadly defined. I’m not seeing any discretion here.

I have some issues with the way certain things are characterized, like saying that information that can be subject to a warrantless demand by a cop is not sensitive information.

The way this provision is drafted, it can include going to a family doctor and asking “Do you provide services to David Fraser? What specialists (like psychiatrists) also provide him with services?” I would say I have a high expectation of privacy in that information. They can go to your bank, and the definition of subscriber information can compel the bank to provide a list of all companies you do business with. That “merely” identifies the client and the services the client receives. But that’s sensitive information, and it goes well beyond going to a telco and asking “Do you provide service to this number, and what city does the customer live in?”

This is either sloppy or intended to be deceptive. If the government thinks this is defensible, they should defend it on its own actual, honest merits. In just about every lawful access provision in the Bill, they are lowering the bar to make it easier to get information, while widening the net to capture more information than they say they need.

I’ve said it before and I’ll say it again: Parts 14 and 15 need to be taken out of the Bill and put in their own Bill so we can discuss them. I want to have an honest debate with someone who is interested in an HONEST debate. Think about this: Bill C-2 is the FIRST substantial bill that Mark Carney’s new government introduced in the House of Commons after getting elected. Correct me if I’m wrong – but I’m pretty sure I’m not – no Liberal candidate, nor the present Prime Minister, campaigned on any of the new police and national security powers mentioned in Parts 14 and 15 of Bill C-2.

Saturday, May 17, 2025

Alberta's privacy law unconstitutionally violates freedom of expression -- again -- in a decision that has implications for all Canadian privacy laws

You may have seen some headlines saying that Alberta’s privacy law has been declared unconstitutional. Yup, it’s true that at least part of it was, and here’s why …

This case involves Clearview AI Inc. ("Clearview"), a U.S.-based facial recognition company, challenging an order issued by Alberta’s Information and Privacy Commissioner. The order, based on findings from a joint investigation by Canadian federal and provincial privacy regulators, required Clearview to cease offering services in Alberta, stop collecting, using, and disclosing images and biometric data of Albertans, and delete the relevant data already in its possession.

Clearview sought judicial review of the order on a number of grounds, including that it is not subject to the jurisdiction of Alberta and that the Personal Information Protection Act (aka “PIPA”) does not apply to it; that the Commissioner adopted an unreasonable interpretation of the words “publicly available” in PIPA and the Personal Information Protection Act Regulation (the “PIPA Regulation”); and that the Commissioner’s finding that Clearview did not have a reasonable purpose for collecting, using, and disclosing personal information is unreasonable. Clearview further asserted that the Commissioner’s interpretation of PIPA and the PIPA Regulation is unconstitutional, contrary to Charter s 2(b), which guarantees freedom of expression. That last argument is the one we’re going to focus on.

One thing that is really interesting about the case is that the Court did not really have to address the Charter issues. The Commissioner found that Clearview’s purposes were not reasonable, which is necessary for a company to even collect, use or disclose personal information. The Court agreed, and could have just said “not reasonable!” – don’t have to decide the Charter question – just go follow the Commissioner’s order. But the Court delved into the Charter question as well.

It’s also notable that this is the second time that the Alberta statute has been declared to violate the Charter based on “publicly available information” in the Act and the Regulations as being too narrow. That was done by the Supreme Court of Canada in Alberta (Information and Privacy Commissioner) v. United Food and Commercial Workers, Local 401, when the Act was being applied to video recording by a union at a picket line.

The company at issue in this case, Clearview AI, has been the subject of many privacy investigations around the world. They collect facial images from publicly accessible websites, including social media, and use them to create a biometric facial recognition database, marketed primarily to law enforcement. In 2020, privacy commissioners from Alberta, B.C., Quebec, and Canada investigated Clearview’s operations and concluded in a joint report that its practices violated privacy laws.

In December 2021, Alberta’s Commissioner issued an order directing Clearview to cease operations in Alberta, based on violations of PIPA. The Commissioner essentially said that Clearview must do for Alberta what it agreed to do in settling a lawsuit in Illinois (which is notorious for its biometric laws).

Clearview AI then brought an application for judicial review in the Court of King’s Bench, contesting:

  • Jurisdiction of Alberta’s Commissioner,
  • The reasonableness of the Commissioner's interpretation of "publicly available" under PIPA,
  • The constitutionality of PIPA's consent-based restrictions on the collection, use, and disclosure of personal information.

It should be noted that the British Columbia Commissioner issued a similar order, which was upheld by the Supreme Court of British Columbia last year.

In Alberta, as far as the jurisdiction argument went, the Court upheld the Commissioner’s jurisdiction, finding a "real and substantial connection" between Clearview’s activities and Alberta. Clearview had marketed its services in Alberta and its database included images of Albertans. The bar for jurisdiction in Canada is pretty low.

On the statutory interpretation issue, the Court accepted as reasonable the Commissioner’s interpretation that images scraped from the internet, including social media, are not "publicly available" within the meaning of the PIPA Regulation. The Commissioner employed a purposive approach, interpreting the relevant provisions narrowly in light of the quasi-constitutional status of privacy rights.

PIPA, like other privacy regulatory regimes in Canada, provides that consent must be obtained to collect and use “personal information” unless certain exceptions apply. One of the exceptions provided for in PIPA is that the information is “publicly available.” PIPA uses the term “publicly available,” but the definition for those words is found in PIPA Regulation section 7(e). PIPA Regulation s 7(e) provides:

7 ... personal information does not come within the meaning of ... “the information is publicly available” except in the following circumstances: ... (e) the personal information is contained in a publication, including, but not limited to, a magazine, book or newspaper, whether in printed or electronic form, but only if (i) the publication is available to the public, and (ii) it is reasonable to assume that the individual that the information is about provided that information.

The private sector privacy laws of Alberta and British Columbia, and the federal PIPEDA, have similar, but not identical, definitions of what is “publicly available” information that does not require consent for its collection and use. There are other categories, but this decision turned on information in a publication. Here are the three different definitions:

In Alberta, it says:

the personal information is contained in a publication, including … but not limited to … a magazine, book or newspaper, whether in printed or electronic form, but only if (i) the publication is available to the public, and (ii) it is reasonable to assume that the individual that the information is about provided that information;

In British Columbia, it does not use “including but not limited to”:

personal information that appears in a printed or electronic publication that is available to the public, including a magazine, book or newspaper in printed or electronic form.

Under PIPEDA’s regulation, the analogous provision reads:

personal information that appears in a publication, including a magazine, book or newspaper, in printed or electronic form, that is available to the public, where the individual has provided the information.

Canadian privacy regulators have interpreted “publication” to exclude social media sites like Facebook and LinkedIn, where Clearview harvests much of its information.

Clearview argued that this narrow interpretation under the Alberta statute and regulation violated its freedom of expression rights under section 2(b) of the Charter of Rights and Freedoms, and could not be saved as a reasonable limitation under section 1 of the Charter.

The Court agreed that:

  • Clearview’s activities (compiling and using data to deliver a service) were expressive, and
  • The consent requirement effectively operated as a prohibition on expression where obtaining consent was impractical.

This amounted to a prima facie infringement of s. 2(b) of the Charter.

I should note that the Alberta Commissioner – ridiculously in my view – argued that the Charter wasn’t even engaged. Here’s what the Court said.

[107] The Commissioner submits that if Clearview’s activity is expressive, it should be excluded from constitutional protection because “the method – mass surveillance – conflicts with the underlying s 2(b) values.” Clearview’s activity, according to the Commissioner, conflicts with the purposes of Charter s 2(b) including the pursuit of truth, participation in the community, self-fulfillment, and human flourishing. The Commissioner offered no authority to support the position that expressive activity could be excluded from protection based on a conflict with underlying constitutional values. Short of violence, all expressive activity is protected by Charter s 2(b).

It’s just a dumb argument to make, in my view.

So once a prima facie infringement is made out, the burden shifts to the government to justify it as a reasonable limit, prescribed by law, that can be demonstrably justified in a free and democratic society. This follows something called the Oakes test:

The test involves a two-stage analysis: first, the objective of the law must be pressing and substantial; second, the means used to achieve that objective must be proportionate, which requires

  1. a rational connection between the law and its objective,
  2. minimal impairment of the right or freedom, and
  3. a proportionality between the law’s benefits and its negative effects on rights.

In this case, the Court found that there was a pressing and substantial objective: protecting personal privacy is valid and important. The Court also found that the requirement of consent is logically connected to privacy protection, and thus rationally connected to that objective.

The law failed on the “minimal impairment” part of the analysis. The dual requirement of consent and a reasonable purpose, without an exception for publicly available internet data, was overly broad.

In a nutshell, the court has to consider what expressive activities are captured – how broadly the net is cast – and whether everything that is caught in that net is necessary or rationally connected to the pressing and substantial objective.

The Court summarized Clearview’s argument at paragraph 129:

“Clearview asserts that people who put their personal information on the internet without protection do not have a reasonable expectation of privacy. Where there is no reasonable expectation of privacy, the protection of privacy is not a pressing and substantial state objective.”

The Court noted that the net cast by the Act and the regulations captures not only Clearview’s web-scraping, but also legitimate indexing by beneficial search engines. On the Commissioner’s interpretation, search engines would fall outside the “publicly available” exception, meaning that they would have to get consent for all collection, use and disclosure of personal information obtained from websites.

Here’s what the Court said at paragraph 132 of the decision:

[132] A difficulty with the PIPA consent requirement for personal information publicly available on the internet is that it applies equally to Clearview’s search technology used to create a facial recognition database and regular search engines that individuals use to access information on the internet. … For the most part, people consider Google’s indexing of images and information to be beneficial. And certainly, Albertans use Google and similar search engines for expressive purposes. But according to my interpretation of PIPA and the PIPA Regulation and the Commissioner’s interpretation of those same instruments, Google and similar search engines cannot scrape the internet in Alberta for the purpose of building and maintaining an index of images of people without consent from every individual whose personal information is collected.

The Court then went on to say at paragraphs 136 and 137:

[136] PIPA and the PIPA Regulation are overbroad because they limit valuable expressive activity like the operation of regular search engines. There is no justification for limiting use of publicly available personal information by regular search engines just as there was no justification to limit use of publicly available personal information for reasonable purposes by the union in UFCW Local 401.
[137] Alberta has a pressing and substantial interest in protecting personal information where individuals post images and information to websites and social media platforms subject to terms of service that preserve a reasonable expectation of limited use. This pressing and substantial interest, however, does not extend to the operation of regular search engines. A reasonable person posting images and information to a website or social media platform subject to terms of service but without using privacy settings expects that such images and information will be indexed and retrieved by internet search engines; indeed, that is sometimes the point of posting images and information to the internet without using privacy settings.

Then, at paragraph 138, the court concluded that the “publicly available” exception was too narrow because it would capture general search engines, which do not engage the pressing and substantial objective:

[138] The public availability exception to the consent requirement in PIPA and the PIPA Regulation is source-based, not purpose-based. Because it is source-based, it applies to regular internet search engines that scrape images and information from the internet like Clearview even if they use images and information for a different purpose. I find that PIPA and the PIPA Regulation are overbroad because the definition of “publication” in PIPA Regulation s 7(e) is confined to magazines, books, newspapers, and like media. Without a reasonable exception to the consent requirement for personal information made publicly available on the internet without use of privacy settings, internet search service providers are subject to a mandatory consent requirement when they collect, use, and disclose such personal information by indexing and delivering search results. There is no pressing and substantial justification for imposing a consent requirement on regular search engines from collecting, using, and disclosing unprotected personal information on the internet as part of their normal function of providing the valuable service of indexing the internet and providing search results.

The court essentially concluded that it was OK to limit what Clearview is doing, but it is NOT OK to limit what search engines are doing. The law, as written, does not distinguish between the “bad” and the “good”, and as a result, the law did not “minimally impair” this important Charter right.

On the final balancing, the Court concluded that the harm to freedom of expression was not outweighed by the benefit to privacy.

The Court declared that PIPA ss. 12, 17, and 20 and PIPA Regulation s. 7 unjustifiably infringed s. 2(b) of the Charter and could not be saved under s. 1 of the Charter, to the extent that they prohibited the use of publicly available internet data for reasonable purposes.

The Court upheld the Commissioner’s jurisdiction and found her statutory interpretation reasonable. However, the impugned provisions of PIPA and the Regulation were declared unconstitutional insofar as they infringed freedom of expression by unduly restricting the use of publicly available information online.

I fully expect that this decision will be appealed, and I don’t know if the British Columbia decision has been appealed.

In the big picture, though this decision is not binding on the Federal Commissioner, it pretty strongly stands for the proposition that PIPEDA’s publicly available information exception is also unconstitutional. This has implications for “the right to be forgotten” and for collecting data for training AI models, both of which are currently before the federal commissioner.