Sunday, January 18, 2026

BC Privacy Commissioner finds city's use of public surveillance cameras unlawful ... off to court

The Information and Privacy Commissioner of British Columbia just found that the City of Richmond, in BC's Lower Mainland, broke the law when it installed ultra-high-definition cameras in public places that capture faces, licence plates, and other identifiers. The Commissioner recommended that the City take down the cameras and delete all the recordings. The City said “nope”, so the Commissioner issued a binding order requiring it to stop collection, delete the recordings, and dismantle the system.

This is definitely going to court. The City of Richmond issued a statement saying they think it is lawful and appropriate, and are looking to have the legality of all of this determined by the Courts. I think that’s a good thing … the more clarity we have from the superior courts on the interpretation of our privacy laws, the better.

I should note that while these laws are generally consistent from province to province, there is big variation in how police services are delivered. Not all of the conclusions of this finding will necessarily be applicable in other provinces or municipalities.

The City of Richmond in British Columbia began field testing its “Public Safety Camera System” – or PSCS – in early 2025 at the intersection of Minoru Boulevard and Granville Avenue.

The City’s stated sole purpose was to collect and disclose video footage to the RCMP to assist in identifying criminal suspects. That point—sole purpose—is central to the Commissioner’s analysis. There was no other rationale for the City of Richmond to put up these cameras in these locations. 

Operationally, the system involved multiple high-resolution cameras capturing:

  • licence plate numbers,
  • high-definition images of vehicle occupants,
  • pedestrians,
  • vehicle identifying features, and
  • location/time information tied to the intersection.

The cameras recorded continuously, and the City retained footage for 48 hours before deletion.

The field test included capabilities like licence plate recognition, pan-tilt-zoom variants, panoramic/multi-sensor configurations, and other detection features; the City confirmed it did not use facial recognition or built-in audio recording during field testing, though some cameras had those capabilities.

The City’s goal for the field test was essentially procurement-and-design: evaluate camera tech, decide numbers and placement, assess performance in different conditions, and confirm the PSCS could generate “usable” footage for law enforcement use later.

Under BC FIPPA, public bodies can’t collect personal information just because it seems useful. Collection has to fit within a listed authorization—most importantly here, s. 26.

The Commissioner situates that within a broader privacy-protective approach: privacy rights are treated as quasi-constitutional, and public bodies should only compromise privacy where there’s a compelling state interest.

Richmond relied on three possible authorities:

  • s. 26(b) (law enforcement),
  • s. 26(c) (authorized program/activity + necessity),
  • s. 26(e) (planning/evaluating a program/activity).

The Commissioner rejected all three, finding there simply was not legal authority for the collection of personal information – and without legal authority, there’s no lawful collection.

Richmond first said they were authorized under s. 26(b):

26          A public body may collect personal information only if

(b)          the information is collected for the purposes of law enforcement,

Note the use of the word “only”. Unless section 26 permits it, a public body cannot collect personal information.

Richmond’s theory was straightforward: the definition of “law enforcement” includes policing, and the PSCS was meant to support policing by helping identify suspects—so it’s “for law enforcement.” That was their alleged purpose.

The Commissioner accepted there’s a connection: the information might be used by the RCMP in policing. But the Commissioner says that’s not the end of the inquiry, because the collector is the City—and the City must have a law enforcement mandate of its own to rely on s. 26(b).

This is a recurring theme in Canadian privacy oversight: a public body can’t bootstrap a law-enforcement collection power merely because another entity with a law-enforcement mandate might find the data useful.

The City may pay for law enforcement, and it may provide resources to law enforcement, but it does not have a law enforcement mandate of its own.

The report describes three arguments Richmond advanced:

  1. RCMP mandate should be imputed to the City (because the City “provides” policing by contracting with the RCMP to do it).
  2. The City has a mandate to collect information for the RCMP.
  3. The City has its own independent mandate to police through the cameras.

The Commissioner’s response is pretty technical: under the Police Act and the Municipal Police Unit Agreement framework, municipalities fund and resource policing, but policing authority and law enforcement functions remain with the police, operating independently of the municipality.

He underscores that the Police Act sets out specific ways a municipality provides policing—such as establishing a municipal force or contracting with the RCMP—and “running a surveillance camera system for the police to use” is not among those statutory options.

He also points to the RCMP’s peace-officer functions and the Municipal Police Unit Agreement structure as vesting law enforcement responsibilities in the RCMP, not the City, and he reads the legislative set-up as intentionally keeping policing independent from municipal control.

So this argument advanced by the City failed: the City lacked the necessary law-enforcement mandate, so it could not collect under s. 26(b)—even if the police might later use the footage.

Section 26(c) is the classic “public body operational authority” provision: even if a statute doesn’t explicitly say “collect this kind of personal information,” a public body can collect personal information if it is both:

  • directly related to an authorized program or activity, and
  • necessary for that program or activity.

Richmond framed its program as essentially: an intersection camera program to identify criminal suspects following criminal incidents, pointing to broad service powers under its Community Charter.

But the Commissioner rejected that program characterization as “authorized,” because—again—of the Police Act structure. In the Commissioner’s view, “collecting evidence to identify criminals that the RCMP may rely on” isn’t part of how the City is authorized to provide policing services or resources under the Police Act framework.

So, the analysis fails at the first step: if the underlying “program” isn’t authorized, 26(c) can’t save the collection.

The report goes further and addresses necessity. The Commissioner emphasizes that the City’s record was limited in establishing that: (a) unresolved crime was “real, substantial, and pressing,” (b) existing measures were ineffective, or (c) less intrusive means had been seriously examined.

He characterizes the intrusion into privacy as “vast,” relative to the limited evidentiary foundation offered to justify necessity.

The net effect was that the Commissioner was not satisfied that the City demonstrated that mass capture of high-definition identifying footage from “tens of thousands of people each day” who had nothing to do with any sort of crime was necessary for the purported municipal activity.

Richmond also argued: the field test is just planning and evaluation, and s. 26(e) specifically authorizes collection necessary for planning/evaluating a program.

The Commissioner’s treatment of 26(e) is crisp: 26(e) presupposes that the program being planned or evaluated is otherwise authorized. You can plan or evaluate an authorized program, but if the program ain’t authorized, you can’t collect personal information to plan or evaluate it. Richmond itself largely accepted that proposition, and the Commissioner agreed.

Because the Commissioner had already found the PSCS was not authorized under 26(b) or 26(c), Richmond could not rely on 26(e) to do “planning” for an unauthorized program.

It makes sense that you can’t use the planning/evaluation clause as an end-run around the core requirement of lawful authority. Otherwise, everything under the sun could be said to be for planning or evaluation. 

FIPPA generally requires notice of purpose and authority when collecting personal information. Richmond tried to avoid notice by invoking s. 27(3)(a)—the idea that a notice is not required where the information is “about law enforcement.”

The Commissioner gives two responses.

First: the City couldn’t rely on law enforcement as its underlying authorization in the first place—so that alone undermined the attempt to rely on the exception.

Second, and more fact-specific: during the field testing phase, the City had confirmed it was not using the information for actual public safety or enforcement purposes—only to test and evaluate camera technical capabilities.

So even reading “about law enforcement” broadly, the Commissioner questioned whether the testing-phase collection qualified as “about law enforcement,” because it would not be used to enforce any laws, and there was no compelling enforcement purpose weighing against notice.

Richmond did install signs, but the Commissioner describes them as a “courtesy” and finds them legally inadequate.

The sign said “PUBLIC SAFETY CAMERA TESTING / FIELD TESTING IN PROGRESS AT THIS INTERSECTION” with contact information for the City’s Director of Transportation.

The Commissioner’s critique is twofold:

  1. Content deficiency: the signs did not clearly notify people that cameras were recording and collecting personal information, and did not include the purposes and legal authority for the collection, as required by s. 27(2).
  2. Placement deficiency: the signage was vehicle-focused, placed for eastbound and westbound approaches, but did not address entries from other directions and did not notify pedestrians, despite the system’s capacity to capture pedestrians and pan widely, including multi-direction recording.

The Commissioner’s conclusion is direct: the City did not adequately notify individuals when it collected their personal information during field testing.

The report notes that disclosure under s. 33(2) generally depends on lawful collection in the first place, and because the collection lacked authority, the City could not rely on “consistent purpose” disclosure to the RCMP for evaluation.

On security, the Commissioner acknowledges the City described a reasonably robust set of safeguards, and that even where collection is unlawful, the City still has a duty under s. 30 to protect personal information in its custody or control.

But safeguards don’t cure lack of authority. They are necessary, not sufficient.

The OIPC’s recommendations were blunt:

  1. stop collecting personal information through the PSCS,
  2. delete all recordings, and
  3. dismantle the equipment.

Richmond advised it would not comply, and the Commissioner issued Order F26-01, requiring immediate compliance and written evidence of compliance by a specific date.

My takeaway is that the Commissioner’s reasoning is primarily structural and jurisdictional: the City tried to create a surveillance-for-police capability, but the Commissioner reads BC’s legal framework as drawing a hard line between municipal services and police law-enforcement authority—particularly when the activity is mass surveillance in public space.

If you’re a public body contemplating “pilot projects” with high-capability cameras, the report is a reminder that planning provisions don’t let you pilot an unauthorized program, and that “law enforcement adjacent” doesn’t equal “law enforcement authorized.”

For a public body, every collection of personal information has to be directly authorized by law. It’s worth noting that the “law enforcement” provision in most public sector privacy laws is wide enough to drive a truck through. The RCMP in Richmond could have paid for and put up those cameras all over the place, since they have a law enforcement mandate. 

Criminal courts are pretty adept at dealing with privacy invasions on a case-by-case basis using section 8 of the Charter, but we actually need a better way to evaluate proportionality, necessity and appropriateness when it comes to proposed police programs that hoover up data on hundreds, thousands or maybe millions of innocent people in the name of “law enforcement”.

It’ll be interesting to see how the courts deal with this.

 

Sunday, January 11, 2026

Canada's new proposed law to outlaw explicit deepfakes: Bill C-16

A number of years ago, the Parliament of Canada amended our Criminal Code to create a criminal offence related to the non-consensual distribution of intimate images. Last month, the Government of Canada proposed to further amend the Criminal Code to include so-called deepfake intimate images, and to create an offence of threatening to disclose intimate images, deepfake or not.

Section 162.1, which was added to the Criminal Code in 2014, makes it an offence to publish, distribute, transmit, sell, make available or advertise an intimate image without the consent of the individual depicted in the image.


And a number of provinces have put in place laws that create civil remedies for the non-consensual distribution of intimate images. 


With some variation, they generally have the same definition of “intimate image”, but they really haven’t kept up with the explosion of synthetic, AI-generated intimate imagery. Synthetic images are created by generative AI systems that can “learn” what a person looks like and use that information to create new images that resemble that person.


If you look at the definition of what is an intimate image, it clearly presupposes that it is a recording of an actual person and that the actual person was involved, or at least present at its recording.


Criminal Code – 2014 Amendments

Definition of intimate image

(2) In this section, intimate image means a visual recording of a person made by any means including a photographic, film or video recording,

(a) in which the person is nude, is exposing his or her genital organs or anal region or her breasts or is engaged in explicit sexual activity;

(b) in respect of which, at the time of the recording, there were circumstances that gave rise to a reasonable expectation of privacy; and

(c) in respect of which the person depicted retains a reasonable expectation of privacy at the time the offence is committed.


It refers to an image or recording where the person “is exposing” certain body parts or “is engaged” in explicit sexual activity. It talks about a “reasonable expectation of privacy” at the time the image is recorded and at the time the offence is committed.


This definition would not capture synthetic, “deep fake” intimate images.


The province of British Columbia has the newest provincial statute creating a civil framework of remedies for the non-consensual distribution of intimate images. The definition there is clearly modelled on the definition from the Criminal Code of Canada, but it also includes images where the person is merely depicted as engaged in a particular activity, regardless of whether the image has been altered. So the BC law would cover a situation where an actual image of a person has been altered, in any way, to depict the person as nude or as engaging in certain acts.


Intimate Images Protection Act (British Columbia)

“intimate image” means a visual recording or visual simultaneous representation of an individual, whether or not the individual is identifiable and whether or not the image has been altered in any way, in which the individual is or is depicted as

(a) engaging in a sexual act,

(b) nude or nearly nude, or

(c) exposing the individual's genital organs, anal region or breasts,

and in relation to which the individual had a reasonable expectation of privacy at,

(d) in the case of a recording, the time the recording was made and, if distributed, the time of the distribution, and

(e) in the case of a simultaneous representation, the time the simultaneous representation occurred;

But this updated definition does not cover purely synthetic images, meaning images that are original and are not simply alterations of existing images. You may recall a little while ago when AI generated sexualized images of superstar Taylor Swift were posted online. If I recall correctly, these were images that were not alterations of existing images but were rather the result of the AI image generator having ingested many, many images of Taylor Swift and “knowing” what she looks like. Those images would not have been captured by the current Criminal Code or even the newer definition in the British Columbia intimate images law. 

In December, the Government of Canada introduced Bill C-16, called the “Protecting Victims Act”, which makes a number of amendments to Canadian criminal and related laws. Included in Bill C-16 are proposed amendments that would expand the existing definition of “intimate image” to include synthetic deepfakes.


So here’s the new definition from Bill C-16, but it’s more helpful to compare it to the existing language of the Criminal Code. In the redline below, text being removed is marked [struck out: …] and text being added is marked [added: …]. So we see in subsection (2)(a)(i), which deals with what has to be in an image or recording for it to be considered an “intimate image”, that they’ve removed “his or her genital organs or anal region or her breasts” and replaced it with “their sexual organs”.


Bill C-16 Proposed amendments (redline)

Definition of intimate image
(2) In this section, intimate image means

(a) a visual recording of a person made by any means including a photographic, film or video recording,

(i) in which the person is nude, is exposing [struck out: his or her genital organs or anal region or her breasts] [added: their sexual organs] or is engaged in explicit sexual activity,

(ii) in respect of which, at the time of the recording, there were circumstances that gave rise to a reasonable expectation of privacy, and

(iii) in respect of which the person depicted retains a reasonable expectation of privacy at the time the offence is committed; or

[added: (b) a visual representation that is made by any electronic or mechanical means and that shows an identifiable person who is depicted as nude, as exposing their sexual organs or as engaged in explicit sexual activity, if the depiction is likely to be mistaken for a visual recording of that person.]

That change to the wording in (a)(i) doesn’t really do what it appears to do, because they’ve also added a new defined term in section 150 of the Code, which defines specific terms for Part V of the Code dealing with sexual offences:

“sexual organs” include breasts that are or appear to be female breasts and the anal region; 


So this isn’t really a material change, as far as I can see. 


Subsection (2)(b) is where they scope in deepfakes:


(b) a visual representation that is made by any electronic or mechanical means and that shows an identifiable person who is depicted as nude, as exposing their sexual organs or as engaged in explicit sexual activity, if the depiction is likely to be mistaken for a visual recording of that person.


So this part doesn’t depend on the reasonable expectation of privacy in the image or recording. Which makes sense. An actual image of an actual person will be associated with that actual person’s expectations of what would happen with that image. A purely made-up image doesn’t have that. 


The key parts are that it is a visual representation that depicts the same sorts of body parts or conduct as in subsection (2)(a)(i), and that it has to be sufficiently realistic that the depiction “is likely to be mistaken for a visual recording of that person.”


It can’t be cartoon-ish or of such poor quality that you’d know immediately that it is not really that person. 


The scope of what counts as an intimate image could arguably have been made broader, but we have to be mindful of freedom of expression. Unfortunately, as of January 10 when I’m recording this, no Charter statement related to Bill C-16 has been released by the Canadian Department of Justice. (It’s been more than a month since the Bill was tabled in Parliament, so one should have been released by now.)


The creation and distribution of intimate images is an expressive act and would be protected by the freedom of expression guarantee in section 2(b) of the Charter of Rights and Freedoms. But protected expression can be subject to “reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society”. In order to justify the limitation, the objective of the legislature has to be pressing and substantial: is the objective sufficiently important to justify limiting a Charter right? And then there has to be proportionality between the objective and the means used to achieve it.


This has three parts: first, the limit must be rationally connected to the objective. There must be a causal link between the measure and the pressing and substantial objective.


Second, the limit must impair the right or freedom no more than is reasonably necessary to accomplish the objective. The government will be required to show that there are no less rights-impairing means of achieving the objective “in a real and substantial manner”. 


Third, there must be proportionality between the deleterious and salutary effects of the law.


I think there is some risk that this expanded definition of “intimate images” may be vulnerable to being struck down as an unjustified infringement of freedom of expression. The law doesn’t create an offence of creating explicit deepfakes for “personal use”, so that’s not an issue. Though there is a defence related to “serving the public good” in section 162.1(3), I don’t think it’s broad enough to address the potential use of deepfakes in political satire and commentary.


Whether you like it or not, and regardless of whether you think it’s tasteful, AI-generated imagery is being used to produce political commentary and satire. And yes, some of it does veer into depicting body parts and activities that would be captured by the new definition of “intimate image.” And you generally can’t outlaw expression just because it’s tasteless. At the end of the day, I don’t think the existing defence of “serving the public good” shields such political expression, which leaves this provision vulnerable to a successful Charter challenge.


Before I wrap up, I should note that the Protecting Victims Act also proposes to create an offence of threatening to publish or distribute an intimate image. This is the new section 162.1(1.1):


Everyone who, with the intent to intimidate or to be taken seriously, knowingly threatens to publish, distribute, transmit, sell, make available or advertise an intimate image of a person knowing that the person depicted in the image would not give their consent to that conduct, or being reckless as to whether or not that person would give their consent to that conduct, is guilty of an offence.


This goes beyond what is typically described as “sextortion”, where a bad guy threatens to release intimate images unless the victim provides more such images or money. “Sextortion” is captured by the general offence of extortion. This new offence would capture a threat even where the person making the threat doesn’t expect or demand anything in return. It’s a reasonable addition to the criminal law.