PROVINCIAL COURT OF NOVA SCOTIA
Citation: R v MSK, 2026 NSPC 12
Date: 20260309
Docket: 8898884
Registry: Halifax
Between:
His Majesty the King
v.
M.S.K.
Restrictions on Publication:
Criminal Code s. 486.4
Any information that could identify the complainants shall not be published in any document or transmitted in any way.
Common law (Sherman Estate v Donovan, 2021 SCC 25)
Restriction on the publication of the images exhibited at trial.
TRIAL DECISION
Judge: The Honourable Judge Bronwyn Duffy

Heard: February 9, 2026, in Halifax, Nova Scotia

Decision: March 9, 2026

Charges: Section 162.1(1) of the Criminal Code

Counsel: Adam McCulley, Senior Crown Attorney, for the PPS-NS
Mark Knox, KC, for the Defence
The court hearing this matter directs that the following notice be attached to the file.
This case is subject to the following restrictions on publication set out in s. 486.4 of the Criminal Code:
Order restricting publication — sexual offences
486.4 (1) Subject to subsection (2), the presiding judge or justice may make an order directing that any information that could identify the victim or a witness shall not be published in any document or broadcast or transmitted in any way, in proceedings in respect of
(a) any of the following offences:
(i) an offence under section 151, 152, 153, 153.1, 155, 160, 162, 163.1, 170, 171, 171.1, 172, 172.1, 172.2, 173, 213, 271, 272, 273, 279.01, 279.011, 279.02, 279.03, 280, 281, 286.1, 286.2, 286.3, 346 or 347, or
(ii) any offence under this Act, as it read from time to time before the day on which this subparagraph comes into force, if the conduct alleged would be an offence referred to in subparagraph (i) if it occurred on or after that day; or
(b) two or more offences being dealt with in the same proceeding, at least one of which is an offence referred to in paragraph (a).
Mandatory order on application
(2) In proceedings in respect of the offences referred to in paragraph (1)(a) or (b), the presiding judge or justice shall
(a) at the first reasonable opportunity, inform any witness under the age of eighteen years and the victim of the right to make an application for the order; and
(b) on application made by the victim, the prosecutor or any such witness, make the order.
Victim under 18 — other offences
(2.1) Subject to subsection (2.2), in proceedings in respect of an offence other than an offence referred to in subsection (1), if the victim is under the age of 18 years, the presiding judge or justice may make an order directing that any information that could identify the victim shall not be published in any document or broadcast or transmitted in any way.
Mandatory order on application
(2.2) In proceedings in respect of an offence other than an offence referred to in subsection (1), if the victim is under the age of 18 years, the presiding judge or justice shall
(a) as soon as feasible, inform the victim of their right to make an application for the order; and
(b) on application of the victim or the prosecutor, make the order.
Limitation
(4) An order made under this section does not apply in respect of the disclosure of information in the course of the administration of justice when it is not the purpose of the disclosure to make the information known in the community.
By the Court:
[1] This matter involves an allegation of knowingly publishing intimate images of several complainants without their consent, contrary to s. 162.1(1) of the Criminal Code, RSC 1985, c C-46 [Code].
[2] The case is unique in that the alleged intimate images are sexually explicit ‘deepfakes’ – that is, highly realistic fake sexual images of the complainants created using publicly accessible photos of their faces from social media and generative artificial intelligence [AI].
[3] The core issue is whether the definition of intimate images in s. 162.1(2) captures such AI-generated content that depicts fake images of real people in sexually explicit scenarios.
[4] I rendered this decision orally on March 9, 2026, and informed the parties of my intention to publish my reasons subject to revisions to form (grammar, structure, readability, organization, and the provision of full citations and footnotes) but not substance.
Restrictions on Publication
[5] In accordance with s. 486.4 of the Code, I will refrain from providing any details that may identify the complainants.
[6] Please note that the Court also imposed an additional common law restriction on publication at the commencement of trial with respect to the exhibited images of the complainants.
[7] The Crown made the application for this narrow common law restriction on publication and the Defence did not contest it.
[8] While the requested sealing of these exhibits was not a matter of dispute between counsel, all proposed common law restrictions on the open court principle must be evaluated using the legal framework set out in Sherman Estate v Donovan, 2021 SCC 25.[1]
[9] In granting the proposed restriction, I considered the three branches of the Sherman Estate test:
• whether court openness posed a serious risk to an important public interest;
• whether the order sought was necessary to prevent this serious risk to the identified interest because reasonably alternative measures will not prevent this risk; and
• whether, as a matter of proportionality, the benefits of the order outweighed its negative effects.
[10] The key benefit is that the publication ban will protect the complainants from the release of explicit, nefarious images, the disclosure of which I am satisfied would be injurious to their dignity. The negative effect is the trimming away of transparency in court proceedings, a mark of a functional system that serves public accountability and supports fairness in proceedings, which in turn ensures stability of justice administration. Upon weighing the Sherman Estate factors, the Court was of the view that public confidence in the system would not be defeated by a limited common law restriction on publication of images exhibited at trial depicting nudity.
[11] The sealing of this information does not impact the public’s ability to discern the reasons for judgment. Written descriptions regarding the nature of the images provide sufficient information for understanding the resolution of the core legal issue.
[12] Furthermore, neither counsel identified any value to publication of this material, and I agree with their assessment. The benefit of the ban, to protect the serious risk to the complainants’ privacy interests and their dignity, and the lack of a viable alternative method to address these concerns, outweighed the negative effect on the open court principle by means of a limited curtailing of public access to information.
Procedural Background
[13] The Crown proceeded by way of indictment with respect to a three-count replacement information sworn on March 6, 2025, alleging offences under ss. 264(1), 163(1), and 162.1(1) of the Code. The alleged offences took place between November 25, 2023, and February 14, 2024.
[14] On March 13, 2025, the Defence elected to have the matter heard in the Provincial Court of Nova Scotia and the accused entered guilty pleas to the charges of criminal harassment (s. 264(1)) and sending an obscene picture (s. 163(1)).
[15] The remaining allegation of distributing intimate images without consent (s. 162.1(1)) was ultimately set down for trial with a narrow focus on whether the AI-generated images constituted intimate images within the meaning of s. 162.1(2).
[16] Counsel did not lead any viva voce evidence at trial. The evidentiary record consisted of an Agreed Statement of Fact (ASF) and the alleged intimate images were exhibited by consent.
Statutory Provisions
[17] The applicable legislative provisions came into force on March 9, 2015, and read as follows:
Publication, etc., of an intimate image without consent
162.1 (1) Everyone who knowingly publishes, distributes, transmits, sells, makes available or advertises an intimate image of a person knowing that the person depicted in the image did not give their consent to that conduct, or being reckless as to whether or not that person gave their consent to that conduct, is guilty
(a) of an indictable offence and liable to imprisonment for a term of not more than five years; or
(b) of an offence punishable on summary conviction.
Definition of “intimate image”
(2) In this section, “intimate image” means a visual recording of a person made by any means including a photographic, film or video recording,
(a) in which the person is nude, is exposing his or her genital organs or anal region or her breasts or is engaged in explicit sexual activity;
(b) in respect of which, at the time of the recording, there were circumstances that gave rise to a reasonable expectation of privacy; and
(c) in respect of which the person depicted retains a reasonable expectation of privacy at the time the offence is committed.
Admissions of Fact
[18] A summary of the s. 655 admissions of fact follows.
1. Date, time and jurisdiction.

2. The complainants - AK, BLE, AG, JF, and NH - all attended high school with the accused in Halifax, Nova Scotia, and graduated in 2018. All individuals were enrolled in the International Baccalaureate program.

3. The female complainants knew of the accused in high school but were not friends with him.

4. The accused searched social media for the complainants and located images of them, including on Facebook.

5. The accused obtained images of the complainants from social media. The accused then altered the images using artificial intelligence software, such that it appeared the complainants were naked.

6. The software that the accused used to alter the images included “Undress” and “Clothoff.io”.

7. The original images and the altered artificial intelligence images are appended to the ASF. The known identities of the complainants in the images are as follows:

a. Page 1 – AK
b. Page 2 – AK
c. Page 3 – AK
d. Page 4 – AG and AK
e. Page 5 – AK
f. Page 6 – AK
g. Page 7 – BLE

8. After the accused altered the images, he distributed them as follows:

a. He sent the altered images of AK to AK;
b. He sent the altered image of BLE to AM;
c. He sent the altered image of AG to AK;
d. He sent the altered image of JF to AM;
e. He sent the altered image of NH to AG.

9. The complainants did not give the accused permission to alter their images or distribute them.
Analysis
[19] Counsel have helpfully narrowed the focus of my analysis.
Essential Elements – Section 162.1 – Distributing Intimate Images without Consent
[20] The elements that the Crown must prove are as follows:
a. The accused published, distributed, transmitted, sold, made available, or advertised an image.

b. The image depicted a person.

c. The image was an "intimate image" as defined in s. 162.1(2), reproduced in full above.

d. The complainant did not consent to the accused’s conduct.

e. The accused knew or was reckless to the fact that the person depicted did not give their consent to the prohibited conduct.
[21] Counsel agreed that the only element in dispute is whether the images meet the definition of “intimate images” as set out in s. 162.1(2). Furthermore, within the criteria in that section, there is no dispute with respect to subsection (a), that the person was nude, exposing breasts/genital organs/anal regions/engaged in explicit sexual activity.
Elements of the Definition of Intimate Images in Dispute
[22] The elements requiring determination by this Court in relation to the definition of intimate image in s. 162.1(2) include:

a. whether the images are visual recordings of a person made by any means;

b. whether the circumstances gave rise to a reasonable expectation of privacy when the recording was made; and

c. whether the depicted person retained a reasonable expectation of privacy at the time the accused committed the offence; that is, upon distribution of the images.
[23] Counsel were not able to identify any binding cases that addressed the relevant issues.
[24] The Defence did provide a helpful decision from the Ontario Court of Justice involving altered intimate images and s. 162.1: R v NK, 2025 ONCJ 542 [NK]. The Court concluded that “digitally altered images” were not protected by this section of the Code. I will return to this case later in these reasons.
Principles of Statutory Interpretation
[25] I turn now to the principles of statutory interpretation that are employed by courts in discerning the meaning of legislation. The Supreme Court of Canada provides guidance in how the words of a statute are to be interpreted. Legislation must be read in its entire context, in its grammatical and ordinary sense, harmoniously with the scheme of the Act, its object, and the intent of Parliament.[2] While the goal is to achieve harmony between the words and the objective of the statute, the objective cannot be reached “at all costs.”[3] The ordinary meaning of text, while a starting point, is not determinative, and statutory interpretation is incomplete without considering context, purpose, and relevant legal norms.[4]
[26] Ambiguities can arise when otherwise plain words are read in context (R v Alex, 2017 SCC 37 at para 31 [Alex]). The Court in Alex adopted the following passage from Ruth Sullivan in Sullivan on the Construction of Statutes at para 32:
At the end of the day ... the court must adopt an interpretation that is appropriate. An appropriate interpretation is one that can be justified in terms of (a) its plausibility, that is, its compliance with the legislative text; (b) its efficacy, that is, its promotion of legislative intent; and (c) its acceptability, that is, the outcome complies with accepted legal norms; it is reasonable and just.
[27] The interpretation must be sufficiently clear to avoid uncertainties and ambiguities, given that a principle of fundamental justice is that citizens must have substantive notice of the law to understand when their actions run afoul of it.[5]
[28] Notably, where there is ambiguity in a provision to the extent it can have multiple plausible interpretations, that ambiguity must be resolved in favour of the accused.[6]
Visual Recording of a Person
[29] The prosecution offers the case of R v Walsh, 2021 ONCA 43 [Walsh], to support its submission that the deepfake images are visual recordings within the definition of s. 162.1 of the Code.
[30] In concluding that a FaceTime call is a visual recording, the Court looked at the language of s. 162.1, noting that it is intentionally broad, prohibiting a wide range of conduct, and that nothing in the section suggests that the image must be capable of reproduction. The Crown emphasized the conclusion by the Ontario Court of Appeal that restricting the meaning of "recording" to outdated technology — by requiring that it be capable of reproduction — would fail to respond to the ways in which modern technology permits sexual exploitation through the non-consensual sharing of intimate images. By limiting the interpretation in this manner, it would undermine the objects of s. 162.1 and the intention of Parliament in enacting it (para. 68).
[31] Drawing upon the Supreme Court’s statement in R v Jarvis, 2019 SCC 10 [Jarvis], that sexual offences are enacted to protect personal autonomy and sexual integrity, the Court concluded that according ‘visual recording’ a broad and inclusive interpretation best suits the objects of the provision in s. 162.1 and aligns with Parliament’s intention.
[32] The Crown submitted that the use of expansive language such as “by any means” suggests that the definition of ‘visual recording’ should be given a broad meaning with respect to the scope of content it captures. Mr. McCulley argued that Walsh demonstrates that the image need not be actually recorded.
[33] Moreover, Walsh supports the view that the subject does not have to have control over the image (paras. 63 and 67). Furthermore, the interpretation should accord with criminalizing the distribution of images that strike at human privacy, basic dignity, and self-worth.
[34] Following that line of reasoning, the Crown argued that the images produced by the accused, though altered and fake, qualify as visual recordings because they identify real people.
[35] While the Defence conceded that a visual recording is not restricted to photos, films, or video recordings, they disagreed that content altered or created by generative AI is captured by the definition.[7]
[36] Counsel agreed that Parliament introduced the legislation in 2015 in response to the burgeoning issue of cyberbullying. However, the Defence argued that generative AI was not in Parliament’s sightline at that time, because the use of artificial intelligence to create images and videos did not begin until 2017. Further, widespread awareness of these tools emerged only in 2022 with the introduction of ChatGPT.[8]
[37] To that end, the Defence argued that an appreciation of the method of operation for generative AI is important to the inquiry. Although a definition of “deepfake” is not firmly cast, it is commonly described as the generation of “false but convincingly realistic material” using artificial intelligence.[9] One of the differences in image production between cameras and AI is that the images produced by the latter are not drawn, painted, animated, or captured by camera, but rather are created by the technology. The question is whether this distinction is material when it comes to the definition of a ‘visual recording’.
[38] The Court’s view in Walsh was that requiring a “visual recording” to be capable of reproduction would be tantamount to suspending the legislation in time and rendering it incapable of adapting to keep pace with developing technology.
[39] The Defence argued that if the Court were to allow such a broad and flexible definition here, the Court would be moving into the territory of legislative action as opposed to a mere incremental development of the common law. The Defence submitted that, unlike with AI, FaceTime technology is the same as previous video imaging technology, which suggests the evolution of the definition was within the realm of incremental change in that context.
[40] There are competing views in academia. Professor Suzie Dunn, Faculty of Law, Dalhousie University, authored an article entitled, “Legal Definitions of Intimate Images in the Age of Sexual Deepfakes and Generative AI”, published in the McGill Law Journal in October, 2024. In that article, Professor Dunn examines non-consensual synthetic intimate images [NSII], which are images created using technology such as AI or photoshop without the consent of the person depicted (page 399). While acknowledging that a case has yet to test whether the definition could be interpreted to include NSII, Professor Dunn notes that it appears, on a plain reading of the definition of “intimate image” currently in section 162.1(2), that only “authentic images of a person” would be captured under the offence (page 408).
[41] Professor Dunn goes on to review the provincial statutes that have been introduced over the last decade to provide a civil remedy when intimate images are shared without consent. Professor Dunn suggests that provincial governments should consider expanding the definition of intimate images to include synthetic or false, but reasonably convincing, intimate images. Manitoba has done so. This was also proposed in the federal Online Harms Act Bill. Part of the discussion centres on the demarcation of the threshold for “reasonably convincing” images as opposed to crude renderings.
[42] Professor Robert Diab, Faculty of Law, Thompson Rivers University, holds a contrary view. In his blog article entitled “Are Sexual Deepfakes Not a Crime in Canada?”, dated January 11, 2025, (2025) 73 Criminal Law Quarterly, he acknowledges Professor Dunn’s concerns, but, citing R v Paré, [1987] 2 SCR 618, concludes as follows:
I agree that on a plain reading of 162.1 of the Criminal Code, the intimate images must be of the person themselves. But the Supreme Court of Canada has endorsed departures from the principle of strict construction in criminal law where a narrow reading would give rise to arbitrariness or defeat the larger aim or purpose of the provision.
[43] Professor Diab argues it would make little sense to be able to circumvent the legislation simply by doctoring an image of one’s partner nude before posting it online, allowing one to say that it is not actually the person’s body. Professor Diab poses the question: if the offence does not include pictures that look to be the real person, then how do we distinguish between a grainy picture of a person, good enough to make out, and a doctored picture of the same person that seems real enough to be convincing? Why would non-consensual distribution of the one be criminalized and not the other? (page 3)
[44] The definition of “child pornography” in s. 163.1(1) differs from the definition of an “intimate image” in s. 162.1. Section 163.1(1) states that child pornography is “a photographic, film, video or other visual representation, whether or not it was made by electronic or mechanical means”.
[45] Notably, the definition employs the words visual ‘representation’ rather than visual ‘recording’.
[46] Professors Dunn and Diab agree that the wording of s. 163.1 captures deepfake images. Two cases have confirmed this: R c Larouche, 2023 QCCQ 1853 [Larouche] and R v Legault, 2024 BCPC 29, though both are sentencing decisions following guilty pleas. The Court in Larouche puts it in pointed terms – whether the material was made using technology to create deepfakes, or more common methods, offenders will not receive leniency for being “mere tinkerers” [translation] of existing images of abused children (para 76).
[47] The Crown argued that a purposive interpretation considering the sexual integrity of the person, leads to the only appropriate reasoning that the images, where they clearly identify an individual engaged in sexual activity or where they appear nude or semi-nude, should be captured as a “visual recording of a person”.
[48] It would be premature to conclude whether these images are visual recordings before examining recent legislative bills to assess whether Parliament views the legislation in its current form to be incomplete as it relates to intimate images. I am of the view that these elements cannot be addressed as entirely discrete issues, so I intend to first complete a preliminary review of the reasonable expectation of privacy component.
Reasonable Expectation of Privacy
[49] Section 162.1 has a temporal requirement for a reasonable expectation of privacy as the Supreme Court set out in R v Downes, 2023 SCC 6. The reasonable expectation of privacy must be present at the ‘time of the recording’ and at the ‘time the offence is committed’.
[50] Jarvis addressed s. 162(1), which employs language that is, in several respects, similar to that in s. 162.1, but lacks the two temporal components. It follows, then, that its application is somewhat broader, as it does not include these restrictive elements in relation to time. Also, I agree with the Defence that the voyeurism provisions offer protection to both children and adults, whereas s. 162.1 has application to adults only.
[51] Nevertheless, if not exactly on point, Jarvis remains instructive to courts as to the factors to be considered in evaluating whether a reasonable expectation of privacy exists.
[52] In Jarvis, the Court was of the view that for the purpose of s. 162(1), the circumstances that give rise to a reasonable expectation of privacy are those in which the person would reasonably expect not to be the subject of the type of observation or recording that, in fact, occurred (para. 28).
[53] The Court, in offering a general concept of privacy as “freedom from unwanted scrutiny, intrusion or attention”, directed a non-exhaustive list of factors to consider in the evaluation (para 29):
a. The location the person was in when they were observed or recorded;

b. The nature of the impugned conduct, that is, whether it consisted of observation or recording;

c. Awareness of or consent to potential observation or recording;

d. The manner in which the observation or recording was done;

e. The subject matter or content of the observation or recording;

f. Any rules, regulations, or policies that governed the observation or recording;

g. The relationship between the person who was observed or recorded and the person who did the observing or recording;

h. The purpose for which the observation or recording was done;

i. The personal attributes of the person who was observed or recorded.
[54] The Supreme Court went on to detail in Jarvis the premium placed upon personal privacy in society, and the connection between personal privacy and human dignity.
[55] The inquiry is to be answered in view of societal norms of conduct. It is not to be determined solely on the basis of whether there was a risk that the person could be observed or recorded.
[56] In the context of voyeurism, the concurring approach in Jarvis instructs that if the surreptitious recording or observation diminished the subject’s ability to maintain control over their image, and if this type of recording or observation infringed their sexual integrity, then the circumstances give rise to a reasonable expectation of privacy under s. 162(1) of the Code.
[57] Returning to the Jarvis factors, the accused used photographs of the complainants’ faces that he accessed on social media. I agree with the prosecution that, on review of the AI-generated images, it is not easily apparent, if apparent at all, that the bodies generated by AI are altered or fake. In assessing the nature of the impugned conduct, the accused created altered images with realistic appearances, using unaltered faces identifiable as the complainants, and distributed them. The Crown argued that if I accept these as recordings, that weighs heavily in favour of the complainants having a reasonable expectation of privacy in these images.
[58] Awareness of or consent to the image is another factor to consider. It is agreed that there was no consent to create these images, and moreover, the complainants were unaware of the images until they were distributed to them. I am satisfied that despite being unaware of the altered creations, it is reasonable for the complainants to expect that their likeness would not be used for an iniquitous purpose. Privacy includes freedom from unwanted scrutiny, attention, or intrusion, all of which was brought to bear by the generation and distribution of these images, which intruded on the complainants’ sexual integrity. A person posting benign and clothed images on Facebook is entitled to the reasonable expectation that those images will not be adulterated and contaminated, in a way that affronts the sensibility of the public.
[59] The manner in which the accused generated the images also balances in favour of a reasonable expectation of privacy – they were generated in secret, the complainants were not informed, and the resulting sexualized images were far removed from the benign originals that were posted for public consumption. Furthermore, the content consists of sustained images rather than a fleeting observation, and was enhanced by technology, as Chief Justice Wagner discussed at para 63 of Jarvis. In the context of technology potentially allowing a person to see or hear more acutely, it can transform what is “reasonably expected and intended to be a private setting” into a setting that is not. Technology can and does make accessibility, storage, and distribution of information easier, but that does not equate to a corresponding reduction in a reasonable expectation of privacy in that same information.[10] Similarly, though technology now exists that can take an ordinary photo suitable for a public setting and transform it into something decidedly out of the ordinary, this does not, in and of itself, adjust the boundaries of what is reasonably expected to be done with publicly accessible images on social media. Indeed, this is exactly the sort of unscrupulous tampering that is not reasonably expected.
[60] The subject matter involved the most private of personal attributes – nude images, with up-close, realistic, easily visible breasts and genitalia, imposed on the real faces of these complainants. Such images strike at the core of individual dignity, and in my view, attract the highest level of privacy expectation.
[61] The Crown argued that the purpose for which the images were created was sexual in nature. Although there was a reference in the Crown brief to sexualized remarks made by the accused, the Agreed Statement of Facts makes no such reference. The only evidence I have in that regard is the images themselves, which, while I am satisfied constitute an affront to their sexual integrity, I am not of the view there is sufficient information to make the inferential leap that the accused created them for a sexual purpose.
[62] The personal attributes of the subjects are another factor to consider. The complainants are youthful females. I agree with the prosecution that I should be mindful that the legislation was proposed and effected in response to cyberbullying, and in particular, the deaths by suicide of two young females who were victims of online cyberbullying. These complainants, by making available unobjectionable, fully clothed photos of themselves on social media, could not reasonably have expected to be subjected to nude renderings of themselves.
[63] The accused chose to create these images, distribute them to the complainants, and thereby intrude upon their dignity and sexual integrity. This factor weighs in favour of a reasonable expectation of privacy existing in the images.
[64] Both counsel have asked me to also consider the concurring approach in Jarvis by Justices Rowe, Côté, and Brown. While concurring in the result, they took issue with the use of the conceptual framework for defining Charter rights to apply to Code offences. The concurring Justices were of the view that courts should not expand criminal liability by reference to Charter jurisprudence.
[65] The concurring judgment included reference to two questions, which, if answered in the affirmative, mean that the subject observation or recording occurred in circumstances that gave rise to a reasonable expectation of privacy under s. 162(1):
1. Did the surreptitious observation or recording diminish the subject’s ability to maintain control over their image?
2. And if so, did this type of observation or recording infringe the sexual integrity of the subject?
[66] The prosecution included for my review the transcript of an unreported decision of Judge Murphy of the Nova Scotia Provincial Court where she convicted an offender, Pawel Marczak [Marczak], of a voyeurism allegation relating to events at a local beach. Mr. Marczak attended the beach and hid a video camera in a bag with holes cut into it so that he could videotape beachgoers. He recorded young females and used zoom features on his camera to focus on the sexual organs of the young females. There was no nudity.
[67] The above two questions are answered in the affirmative in this case. If the image is indeed a visual recording, the actions of the accused in taking the social media post displaying a clothed complainant and corrupting it, without their knowledge or consent, and then taking the step to distribute it, vitiated the subject’s control over it. The harm consequent to this loss of control is inflamed by the way in which it was manifested – that is, in a breach of the sexual integrity of the complainants. Notably, this does not involve a ‘sexual purpose’ inquiry. The intent of the accused in undertaking this action may be relevant but is not determinative.[11] The inquiry is instead objective – the images are a rendering of women’s intimate body parts, their breasts and genitals, in close range, and using the faces of the complainants.
[68] In Marczak, the Court concluded that exposure of individuals in a way that they may not have had comfort with or expected is an infringement of privacy when it occurs without a person’s knowledge. In R v Taylor, 2015 ONCJ 449, the Court expressed that it is reasonable to expect that “close-ups of your private areas will not be captured as a permanent record for the photographer, and potentially millions of others online” (para 32). Likewise, this situation encapsulates circumstances that give rise to a breach of a reasonable expectation of privacy, if the images at issue are indeed within the meaning of a visual recording.
[69] The non-exhaustive list of Jarvis factors – including the nature of the conduct (i.e. rendering the nude images), the lack of awareness or consent of the complainants, the intimate subject matter of the images, and the deeply personal attributes of the people who were the subject of the images – all weigh in favour of the existence of a reasonable expectation of privacy, both at the time the images were created and upon distribution, which is when the alleged offence was committed.
[70] The more challenging question is whether these images that the accused created using generative artificial intelligence qualify as visual recordings. A review of the evolution of the statutory provisions is necessary to assist in this evaluation.
Legislative Evolution
[71] When the scope of a statutory provision is in dispute, the origin and evolution of the legislation may assist in distilling its meaning. As our Court of Appeal expressed in R v Cosh, 2015 NSCA 76 at para 35:
Legislation is not created, nor amended, in a vacuum. Context can provide important signposts on the path to discerning the intention of Parliament in the words used, and their consequent meaning. …
Bill C-13 – Hansard Records
[72] Section 162.1 of the Code was introduced through Bill C-13 in 2014. As statutory interpretation requires consideration of the purpose, intent, and context of the legislation, the discussion surrounding its introduction is relevant to the analysis.
[73] The Bill was introduced to combat cyberbullying in response to the tragic suicides of Rehtaeh Parsons and Amanda Todd following the online distribution of their intimate images without their consent. During the second reading of the Bill on 27 November 2013, the Honourable Peter MacKay said:
…there is no offence in the Criminal Code that specifically addresses the contemptible form of cyberbullying that has emerged involving the distribution of sexual images without the consent of the person depicted in that image. Addressing this gap in the Criminal Code is one of the goals of Bill C-13. The bill proposes a new Criminal Code offence prohibiting the non-consensual distribution of intimate images. Essentially, this offence would prohibit the sharing of sexual or nude images without the consent of the person depicted. (p. 1436)
[74] The language the government used to introduce the Bill was broad in scope. The government emphasized that the proposed offence would prohibit “all manner” of distributing, sharing, or making available intimate images without the consent of the person depicted.
[75] During the same debate, Mr. Bob Dechert, Parliamentary Secretary to the Minister of Justice, said that the Bill had two main goals: to create the new Criminal Code offence of non-consensual distribution of intimate images; and to modernize the investigative powers in the Criminal Code. He said that it would:
…fill a gap related to a form of serious cyberbullying behaviour with respect to the sharing or distribution of nude or sexual images that are later used without the consent of the person depicted. (pp. 1448-1449)
[76] The House of Commons passed the Bill and it then moved to the Senate for debate. During second reading in the Senate on October 23, 2014, the Honourable Tom McInnis noted that it was important to acknowledge that this offence was not intended to criminalize consensual "sexting". Rather, the target of this new offence was the unauthorized and non-consensual distribution of these images. He emphasized that the Bill contained amendments to the Code to “ensure that our laws are suitable for the technologically advanced world in which we live” (p. 2301).
[77] A working group prepared a report for the federal, provincial, and territorial Ministers of Justice in relation to cyberbullying. The report, entitled “Cyberbullying and the Non-consensual Distribution of Intimate Images”, was completed in June 2013 and provides valuable insight into the drafting and tabling of Bill C-13. On page 17 of the report, the working group agreed that the “person(s) depicted should be a real and identifiable person.”
[78] The following passage on page 17 is particularly notable:
The Working Group further agreed that the person(s) depicted should be a real and identifiable person: cartoons and other creative works that do not impact the depicted person’s privacy interest would be excluded. However, there was considerable concern that altered images could provide an easy defence to the accused if the definition of intimate image is too restrictive (i.e., the offence should not require that the image be unaltered). The Working Group suggested that the identity of the person depicted could be verified by various means and not only by the victim’s face (i.e. including by other identifying information such as text). The definition of an intimate image should be crafted in a manner that does not create a hurdle to a successful prosecution. [Emphasis added.]
[79] In written submissions, the Crown helpfully distilled the language used in the debates during the evolution of the legislation. The consistent thread was that the offence was intended to capture nude photos of real, identifiable people that were distributed without their consent, and that it should be sufficiently broad to keep pace with developing technology. A snapshot of some of the language used during the debates is of assistance:
a. “…without consent of the person depicted”
b. “…prohibit all manner of distributing”
c. “…any prohibited conduct done through any form of telecommunication”
d. Modernization
e. “fill a gap”
f. Sharing of photos containing nudity without consent.
g. “…no consent of the person depicted”
h. “…any visual recording”
i. Fighting 21st century crime.
j. “…images that relate to the core of a person’s privacy interest”
k. Offence should not require that the image be unaltered.
Bill C-63
[80] The government introduced Bill C-63, the Online Harms Act, in February of 2024. The proposed legislation promoted the online safety of Canadians and was concerned with the harm associated with online content. It also contained measures to hold social media services accountable. The proposed changes to the Code focused on hate crime and propaganda.
[81] Bill C-63 also included a definition of “intimate content” to address the publication of “deepfakes”, which explicitly captured false images that were reasonably convincing. It did not propose amending the definition of intimate image in s. 162.1 of the Code.
[82] Bill C-63 died on the Order Paper when Parliament was prorogued in 2025.
Private Member’s Bill – C-216
[83] Bill C-216, the Promotion of Safety in the Digital Age Act, was proposed by Conservative MP Michelle Garner on June 19, 2025. Its objective is to amend the Code by creating a new offence, section 162.1(1.1), and it includes a definition of “false intimate image” to criminalize the distribution of false images of people created using computer software (s. 162.1(2.1)). The proposed wording expressly addresses generative AI material. The Bill is currently outside the Order of Precedence and does not have a second reading date.
Other Jurisdictions
[84] Several common law jurisdictions have amended legislation that criminalizes non-consensual sharing of nude images, specifically codifying its application to images which have been altered or created with the use of generative AI. The Defence noted that the United States[12], the United Kingdom[13], and Australia[14] have enacted federal legislation that expressly prohibits sharing intimate images, including those created using artificial intelligence. The Defence argued that it is instructive that Canada had not followed suit as of the time of this hearing.
Bill C-16
[85] The November 2024 House Standing Report, Harms Caused by Illegal Sexually Explicit Material Online, emphasizes that the rise of AI-generated deepfake technology presents a serious concern. As stated at page 30:
…A number of witnesses agreed that deepfakes should be explicitly added to the Criminal Code but that international cooperation, platform accountability and other measures were also urgently needed to confront the new technology and its abuses.
[86] Recommendation #5 of the Standing Report at page 32 states:
That section 162.1(2) of the Criminal Code, which defines “intimate image”, be amended to include the concept of sexually explicit deepfakes.
[87] The Hansard transcripts involving Bill C-16 (House of Commons Debate – January 26, 2026) are also of assistance on this point:
Page 4991:
Another issue that is garnering much attention these day [sic] is the use of artificial intelligence to create deepfakes and the sharing of intimate images that have been created with that technology. A gap exists in the law today that we must address by expanding the definition of an intimate image to those created through artificial intelligence. Deepfake technologies are expanding rapidly, and we have to ensure that our laws evolve rapidly to address this emerging threat.
[Translation]
Changing the rules as technology evolves is crucial. There is currently a problem because the law does not reflect the technologies that exist today. [Emphasis added.]
[English]
…
By changing the definition of an intimate image to include AI deepfakes, we can better protect people against this emerging threat. …
…
Page 4997:
There are elements of the bill that move in the right direction. Banning the creation and distribution of deepfake images is necessary and long overdue. [Emphasis added.]
…
Page 5004:
Bill C-16 would criminalize the distribution of non-consensual sexual deepfakes: AI-generated images or videos that depict someone in sexually explicit scenarios without their consent. These deepfakes can destroy reputations, cause profound psychological harm and, in some cases, be used to extort victims. Our laws must catch up to this disturbing reality. [Emphasis added.]
[88] I have also considered the following exchange that occurred in the House of Commons Debate on January 29, 2026:
Page 5214:
Madam Speaker, the Criminal Code must meet the public's expectations and evolve with the times.
Bill C-16 proposes criminalizing the creation of deepfakes. When someone's face is manipulated to create deepfakes using tools such as AI and then disseminated, it can have a significant impact on the victim's life. Bill C-16 will criminalize that. It also criminalizes threatening to distribute non-consensual intimate images, because not only can distribution have an impact on victims' lives, so can the threat of doing so.
I think these are important changes to the Criminal Code. I would like to hear my colleague's opinion. Does she also think that this is an important change to ensure that the Criminal Code meets the public's expectations and reflects today's reality?
[English]
Madam Speaker, yes, I do appreciate that criminalizing deepfakes is in the bill and that there will be things we will work on in committee, things we agree on, but why has it taken this long to address crime when our communities have been crying for help for years? This is a move in the right direction, but why did it take a decade to get here? [Emphasis added.]
Deepfakes as Visual Recordings within the definition of Intimate Image
[89] The root of this issue is whether the correct interpretation of s. 162.1 captures AI-generated material, or whether that is a reach that would amount to courts standing in the stead of Parliament and effectively legislating. In my view, one step in resolving this issue is determining whether Parliament acknowledges that there is currently a legislative gap in the Code with respect to protecting against deepfakes.
[90] I agree with Mr. McCulley that there is nothing in the legislation expressly requiring the image to be ‘real’.
[91] It is my view that the answer to the question of whether these complainants would have expected not to be the subject of the doctored images is a clear yes.
[92] The language in Bill C-13 that was the germ of the current legislation is indeed broad, prohibiting “all manner” of distribution and sharing of intimate images without consent.
[93] Further, the Court has not lost sight of the considerable concern in the debates leading up to the introduction of this legislation that altered images could provide an easy defence if the definition of intimate images is too restrictive.
[94] I have considered all of this. However, two issues remain.
[95] The Crown acknowledged that the current Bill C-16 is not supportive of the Crown’s position. Indeed, why would Parliament be amending the Code if images created by generative AI were already captured under its banner?
[96] The Crown argued that there is nothing identifying where that gap is – just that it exists. In essence, the Crown submitted that the statute in its current iteration is sufficient to capture the impugned content in this case. The argument is that such an interpretation does not preclude Parliament from shoring up, or clarifying, the legislation with amendments in due course in an effort to remain responsive to technological developments. Stated differently, reasonable statutory interpretation of the existing provisions should capture alterations in real images using generative AI.
[97] Part of this argument is a question of threshold: does any alteration to an image, however minute, put it in the realm of a fake? Does a photoshopped fingernail cross the threshold from a real image to a fake, thus removing it from the protection of the legislation?
[98] That is the question for determination: where is that line appropriately drawn? I return to NK. That court was addressing an image in which the complainant’s head was digitally manipulated onto a naked body that was not her own. Beginning at para 24, the Court stated that this:
…photo…does not meet the definition of “intimate image”. The reason being, in short, because it is not her nude body and, it is not her breasts – both of which are necessary to meet that definition.
If this type of photo were meant to be captured by this section, Parliament would have specifically done so. [Emphasis in original.]
The Court went on to state that Parliament would not have chosen the words “the person” or “his or her” if fake images were intended to be protected by the statute (para 27).
[99] The legislation in its current form is not ideal. This is clear from the collection of bills that have been tabled to address its deficiencies, specifically with respect to the need to include the work product of generative AI. The issues with the current legislation are also highlighted by the academic discourse on the subject. These considerations give me pause in interpreting the legislation to encompass AI-generated or manipulated images of real people.
[100] To draw from Professor Coughlan,[15] “our system does not function on the assumption that there is a simple binary division into acceptable and unacceptable behaviour.” Much that is unacceptable is not criminal. It is the responsibility of Parliament to set the parameters of criminal liability. The question of threshold, where to fix a limit on criminal activity, is an essential inquiry that is difficult to resolve. That is precisely why clear legislation is vital.
[101] I appreciate that it is an impossible task for Parliament to be omniscient: it cannot anticipate the results of technology years in the future, the evolution of which occurs at breakneck speed. The capabilities of current generative AI were the stuff of fancy in 2015 when s. 162.1 was enacted.
[102] Statutory interpretation, to a degree, is a tool of moderate gains, addressing incremental change as our society evolves. But generative artificial intelligence did not impose upon society moderate change – it was monumental. The public profits from its convenience and astonishing capabilities, but it has the capacity for substantial harm if left unchecked.
[103] The second concern is the fair notice principle raised by the Defence. The actions of the accused have caused harm to these complainants – they are indeed victims, as guilty pleas have been entered on these facts to counts of criminal harassment and sending an obscene picture. It is a principle of fundamental justice that citizens have notice of what actions offend the law. Other jurisdictions have been responsive to this issue by enacting legislation that expressly targets generative AI images.
[104] In reviewing Bill C-16 – currently at consideration in committee in the House of Commons - and Bills C-216 and C-63 before it, this Court is satisfied that Parliament has assessed a need to criminalize the distribution of non-consensual intimate image deepfakes that are not prohibited by the current legislation.
[105] The law does not reflect the technology that exists today, and combatting advancing technologies in the commission of offences requires the dictates of Parliament. It is not within the proper jurisdiction of this Court to extend the reach of the legislation so that an image of a nude body entirely generated by AI, using the real, identifiable face of a person, becomes a ‘visual recording of a person’ within the definition of intimate image. It would be a patchwork assembly to take the enormity of artificial intelligence and force-feed it into incompatible legislation. The statute must be revised to respond to such concerns.
[106] Drawing upon the tools of statutory interpretation, the altered images at issue in this case do not meet the definition of “intimate image” and, as such, the accused is acquitted of the allegation contrary to s. 162.1.
[107] I acknowledge the diligent work of counsel who took great care in locating and synthesizing valuable information to assist the Court in its determination, a task that would have been notably more difficult without their meticulous efforts.
Bronwyn Duffy, JPC
[1] See also Dagenais v Canadian Broadcasting Corp [1994] 3 SCR 835, Canadian Broadcasting Corp v Named Person, 2024 SCC 21, and Toronto Star Newspapers Ltd v Ontario, 2005 SCC 41.
[2] R v Wolfe, 2024 SCC 34; Rizzo & Rizzo Shoes Ltd. (Re), [1998] 1 SCR 27, at para 21, quoting E. A. Driedger, Construction of Statutes (2nd ed. 1983), at p 87; Bell ExpressVu Limited Partnership v Rex, 2002 SCC 42, at para 26.
[3] Sun Indalex Finance, LLC v. United Steelworkers, 2013 SCC 6, at para 174.
[4] Wolfe, supra, at para 32.
[5] R v Levkovic, 2013 SCC 25, per Fish J, at para. 1; Canada v Pharmaceutical Society (Nova Scotia), [1992] 2 SCR 606, per Gonthier J, at para. 29.
[6] R v McIntosh, [1995] 1 SCR 686, per Lamer J at para 39.
[7] Defence brief, p. 6.
[8] Canada, Parliamentary Information and Research Services, Deep Fakes: What Can Be Done About Synthetic Audio and Video? (Library of Parliament, 2019) https://lop.parl.ca/sites/PublicWebsite/default/en_CA/ResearchPublications/201911E
[9] Australia, Parliamentary Research Services, Sexually explicit deepfakes and the criminal law in NSW (Parliament of New South Wales, 2025) at p 5.
[10] Jarvis, supra, at para 63, citing in part R. v. Rudiger, 2011 BCSC 1397, at paras. 93-98.
[11] Jarvis, supra, paras. 141-145; R. v. Chase, 1987 CanLII 23.
[12] Take It Down Act, Public Law No 119-12, s 2 (United States).
[13] Online Safety Act 2023, c 50, part 10, s 188 (United Kingdom). This legislation includes images generated or altered with AI and other programs, but it does not prohibit the creation of such images. The Non-Consensual Sexually Explicit Images and Videos (Offences) Bill [HL Bill 26] was introduced in the House of Lords in 2024. This Bill would expand the prohibition to include the creation of, or solicitation to create, digitally produced images and film, which are sexually explicit, and created without consent. A case has applied these amendments: Essex Police, “Women thank Essex Police after man who shared deep fake images is jailed” (4 April 2025) https://www.essex.police.uk/news/essex/news/news/2025/april/man-jailed-for-sharing-deep-fake-images/ (accessed 6 August 2025).
[14] Criminal Code Act 1995, v 2, sch 1, s 474.17 (Australia). Some (but not all) Australian states have introduced legislation which explicitly targets the distribution of sexually explicit images created or altered with digital technology: Crimes Act 1900, No 40 [NSW], part 3, div 15C at ss 91P-91R (Australia).
[15] Criminal Reports – Comments, p. 4, re Jarvis, 2019 SCC 10; 2019 CarswellOnt 1921.