Sunday, January 11, 2026

Canada's new proposed law to outlaw explicit deepfakes: Bill C-16

A number of years ago, the Parliament of Canada amended our Criminal Code to create a criminal offense related to the non-consensual distribution of intimate images. Last month, the Government of Canada proposed to further amend the Criminal Code to include so-called deepfake intimate images, and to create an offence of threatening to disclose intimate images, deepfake or not.

Section 162.1, which was added to the Criminal Code in 2014, makes it an offence to publish, distribute, transmit, sell, make available or advertise an intimate image without the consent of the individual depicted in the image.


And a number of provinces have put in place laws that create civil remedies for the non-consensual distribution of intimate images. 


With some variation, they generally share the same definition of “intimate image”, but they haven’t kept up with the explosion of synthetic, AI-generated intimate imagery. Synthetic images are created by generative AI systems that can “learn” what a person looks like and use that information to create new images that resemble that person.


If you look at the definition of “intimate image”, it clearly presupposes that the image is a recording of an actual person and that the person was involved in, or at least present at, its recording.


Criminal Code – 2014 amendments

Definition of intimate image
(2) In this section, intimate image means a visual recording of a person made by any means including a photographic, film or video recording,

(a) in which the person is nude, is exposing his or her genital organs or anal region or her breasts or is engaged in explicit sexual activity;

(b) in respect of which, at the time of the recording, there were circumstances that gave rise to a reasonable expectation of privacy; and

(c) in respect of which the person depicted retains a reasonable expectation of privacy at the time the offence is committed.


It refers to an image or recording where the person “is exposing” certain body parts or “is engaged” in explicit sexual activity. It talks about “reasonable expectations of privacy” at the time the image is recorded and at the time the offence is committed.


This definition would not capture synthetic, “deep fake” intimate images.


The province of British Columbia has the newest provincial statute creating a civil framework and civil remedies for the non-consensual distribution of intimate images. Its definition is clearly modeled on the one in the Criminal Code of Canada, but it also includes images in which the person is depicted as engaged in a particular activity, regardless of whether the image has been altered. So the BC law would cover a situation where an actual image of a person has been altered, in any way, to depict the person as engaging in certain acts or as nude.


Intimate Images Protection Act (British Columbia)

“intimate image” means a visual recording or visual simultaneous representation of an individual, whether or not the individual is identifiable and whether or not the image has been altered in any way, in which the individual is or is depicted as

(a) engaging in a sexual act,

(b) nude or nearly nude, or

(c) exposing the individual's genital organs, anal region or breasts,

and in relation to which the individual had a reasonable expectation of privacy at,

(d) in the case of a recording, the time the recording was made and, if distributed, the time of the distribution, and

(e) in the case of a simultaneous representation, the time the simultaneous representation occurred;

But even this updated definition does not cover purely synthetic images, meaning images that are original creations and not simply alterations of existing images. You may recall that a little while ago, AI-generated sexualized images of superstar Taylor Swift were posted online. If I recall correctly, these were not alterations of existing images but were rather the result of an AI image generator having ingested many, many images of Taylor Swift and “knowing” what she looks like. Those images would not be captured by the current Criminal Code definition or even the newer definition in the British Columbia intimate images law.

In December, the Government of Canada introduced Bill C-16, called the “Protecting Victims Act”, which makes a number of amendments to Canadian criminal and related laws. Included in Bill C-16 are proposed amendments that would expand the existing definition of “intimate image” to include synthetic deepfakes.


So here’s the new definition from Bill C-16, though it’s more helpful to compare it to the existing language of the Criminal Code. In the redline below, I’ve marked what’s being removed and what’s being added. We see in subsection (2)(a)(i), which deals with what has to be in an image or recording for it to be considered an “intimate image”, that they’ve removed “his or her genital organs or anal region or her breasts” and replaced it with “their sexual organs”.


Bill C-16 Proposed amendments (redline)

Definition of intimate image
(2) In this section, intimate image means

(a) a visual recording of a person made by any means including a photographic, film or video recording,

(i) in which the person is nude, is exposing [removed: his or her genital organs or anal region or her breasts] [added: their sexual organs] or is engaged in explicit sexual activity,

(ii) in respect of which, at the time of the recording, there were circumstances that gave rise to a reasonable expectation of privacy, and

(iii) in respect of which the person depicted retains a reasonable expectation of privacy at the time the offence is committed; or

(b) a visual representation that is made by any electronic or mechanical means and that shows an identifiable person who is depicted as nude, as exposing their sexual organs or as engaged in explicit sexual activity, if the depiction is likely to be mistaken for a visual recording of that person.

That change doesn’t really do what it appears to do, because they’ve added a new defined term in section 150 of the Code, which defines specific terms for Part V of the Code, the part dealing with sexual offences.

“sexual organs” include breasts that are or appear to be female breasts and the anal region; 


So this isn’t really a material change, as far as I can see. 


Subsection (2)(b) is where they scope in deepfakes:


(b) a visual representation that is made by any electronic or mechanical means and that shows an identifiable person who is depicted as nude, as exposing their sexual organs or as engaged in explicit sexual activity, if the depiction is likely to be mistaken for a visual recording of that person.


So this part doesn’t depend on a reasonable expectation of privacy in the image or recording, which makes sense. An actual image of an actual person is associated with that person’s expectations of what would happen with that image. A purely made-up image doesn’t have that.


The key parts are that it is a visual representation that depicts the same sorts of body parts or conduct as in subsection (2)(a)(i), and that it has to be sufficiently realistic that the depiction “is likely to be mistaken for a visual recording of that person.”


It can’t be cartoon-ish or of such poor quality that you’d know immediately that it is not really that person. 


The scope of what counts as an intimate image could arguably be broader, but we have to be mindful of freedom of expression. Unfortunately, as of January 10 when I’m recording this, no Charter statement related to Bill C-16 has been released by the Canadian Department of Justice. (It’s been more than a month since the Bill was tabled in Parliament, so it should have been released by now.)


The creation and distribution of intimate images is an expressive act and would be protected by the freedom of expression guarantee in section 2(b) of the Charter of Rights and Freedoms. But protected expression can be subject to “reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society”. To justify the limitation, the legislature’s goal has to be pressing and substantial: is the objective sufficiently important to justify limiting a Charter right? And then there has to be proportionality between the objective and the means used to achieve it.


The proportionality analysis has three parts. First, the limit must be rationally connected to the objective: there must be a causal link between the measure and the pressing and substantial objective.


Second, the limit must impair the right or freedom no more than is reasonably necessary to accomplish the objective. The government will be required to show that there are no less rights-impairing means of achieving the objective “in a real and substantial manner”. 


Third, there must be proportionality between the deleterious and salutary effects of the law.


I think there is some risk that this expanded definition of “intimate image” is vulnerable to being struck down as an unjustified infringement of freedom of expression. The Bill doesn’t create an offence of creating explicit deepfakes for “personal use”, so that’s not an issue. But while there is a defence related to “serving the public good” in section 162.1(3), I don’t think it’s broad enough to address the potential use of deepfakes in political satire and commentary.


Whether you like it or not, and regardless of whether you think it’s tasteful, AI-generated imagery is being used to produce political commentary and satire. And yes, some of it does veer into depicting body parts and activities that would be captured by the new definition of “intimate image.” You generally can’t outlaw expression just because it’s tasteless. At the end of the day, I don’t think the existing defence of “serving the public good” shields such political expression, which leaves this provision vulnerable to a successful Charter challenge.


Before I wrap up, I should note that the Protecting Victims Act also proposes to create an offence of threatening to publish or distribute an intimate image. This is the new section 162.1(1.1):


Everyone who, with the intent to intimidate or to be taken seriously, knowingly threatens to publish, distribute, transmit, sell, make available or advertise an intimate image of a person knowing that the person depicted in the image would not give their consent to that conduct, or being reckless as to whether or not that person would give their consent to that conduct, is guilty of an offence.


This goes beyond what is typically described as “sextortion”, where a bad guy threatens to release intimate images unless the victim provides more such images or money. “Sextortion” is already captured by the general offence of extortion. This new offence would capture a threat even where the person making the threat doesn’t expect or demand anything in return. It’s a reasonable addition to the criminal law.